This question applies to all measurements, but for concreteness, here's an example. Suppose a Gaussian wavefunction initially moves towards a position-detector screen. How do we obtain the 'time of arrival' distribution? Should the mean arrival time be inversely proportional to the mean of the momentum distribution?
What does quantum mechanics predict for the distribution of arrival times? If the velocity (momentum) distribution has a wide spread, does the arrival-time distribution also have a wide spread? That seems intuitive; however, the particle doesn't actually have a definite velocity during the journey.
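To make the question concrete, here is a numerical sketch of one candidate answer: take the quantum probability current $J(x_0,t)$ at the screen and normalize it over time to get an arrival-time density. This "flux proposal" is only one of several contested definitions (it can even go negative when the packet has negative-momentum components), and the parameters (`sigma`, `k0`, `x0_screen`) are illustrative choices of mine, not anything canonical:

```python
import numpy as np

hbar, m = 1.0, 1.0
sigma, k0, x0_screen = 1.0, 5.0, 20.0  # packet width, mean momentum, screen position

# spatial grid and conjugate momentum grid
N, L = 4096, 200.0
x = np.linspace(-L/2, L/2, N, endpoint=False)
dx = x[1] - x[0]
k = 2*np.pi*np.fft.fftfreq(N, d=dx)

# initial Gaussian packet centered at x=0, mean velocity k0/m
psi0 = (2*np.pi*sigma**2)**-0.25 * np.exp(-x**2/(4*sigma**2) + 1j*k0*x)
psi0_k = np.fft.fft(psi0)

def psi_at(t):
    # free evolution: each plane wave picks up a phase exp(-i hbar k^2 t / 2m)
    return np.fft.ifft(np.exp(-1j*hbar*k**2*t/(2*m)) * psi0_k)

# probability current J(x0, t) = (hbar/m) * Im[ psi* d(psi)/dx ] at the screen
i0 = np.argmin(np.abs(x - x0_screen))
times = np.linspace(0.01, 10.0, 400)
J = []
for t in times:
    p = psi_at(t)
    dpdx = np.gradient(p, dx)
    J.append((hbar/m) * np.imag(np.conj(p[i0]) * dpdx[i0]))
J = np.array(J)

dt = times[1] - times[0]
Pi = J / (J.sum() * dt)  # normalized arrival-time density (flux proposal)

t_mean = (times * Pi).sum() * dt
print(f"mean arrival time ~ {t_mean:.3f}, classical x0/(k0/m) = {x0_screen/(k0/m):.3f}")
```

For this packet the mean arrival time comes out close to the classical value $x_0/(k_0/m)$, and the spread of $\Pi(t)$ does grow with the momentum spread, which is exactly the intuition I'm asking about; but nothing here tells us *when* (or whether) collapse happens at the screen.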
How do we know at what time t the wavefunction will collapse? Could the wavefunction 'pass through' the screen without collapsing?
To rephrase the question more philosophically: why does QM require the time of measurement to make predictions, when the time of measurement is itself not a free variable? Moreover, how can this be reconciled with relativity if there is no account of the time elapsed between two measurements? On further scrutiny, is the position of measurement even a free variable? It is obviously the position of the 'position-measurement device', but how do you actually define that?