Article ID | Journal ID | Publication Year | English Article | Full Text |
---|---|---|---|---|
978867 | 933309 | 2010 | 12-page PDF | Free download |
Based on the modelling of quantum systems with the aid of (classical) non-equilibrium thermodynamics, both the emergence and the collapse of the superposition principle are understood within one and the same framework. Both are shown to depend in crucial ways on whether or not an average orthogonality is maintained between reversible Schrödinger dynamics and irreversible processes of diffusion. Moreover, the said orthogonality is already in full operation when dealing with a single free Gaussian wave packet. In an application, the quantum mechanical “decay of the wave packet” is shown to simply result from sub-quantum diffusion with a specific diffusivity varying in time due to a particle’s changing thermal environment. The exact quantum mechanical trajectory distributions and the velocity field of the Gaussian wave packet, as well as Born’s rule, are thus all derived solely from classical physics.
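For orientation, the textbook free-particle Gaussian spreading that the abstract reinterprets as sub-quantum diffusion can be written as follows. This is a standard quantum mechanics result, not quoted from the paper itself; the identification of the diffusivity as $D = \hbar/2m$ is the usual choice in stochastic and hydrodynamic approaches and is assumed here:

```latex
% Width of a free Gaussian packet of initial width \sigma_0 and mass m:
\sigma(t) \;=\; \sigma_0 \sqrt{\,1 + \left(\frac{D\,t}{\sigma_0^{2}}\right)^{2}},
\qquad D = \frac{\hbar}{2m}.

% Associated velocity field for a packet initially at rest,
% centred at the origin (the "decay of the wave packet"):
v(x,t) \;=\; \frac{\dot\sigma(t)}{\sigma(t)}\, x .
```

The trajectory distributions mentioned in the abstract then follow from transporting the initial Gaussian density along this velocity field, which reproduces $|\psi(x,t)|^{2}$ in agreement with Born's rule.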
Journal: Physica A: Statistical Mechanics and its Applications - Volume 389, Issue 21, 1 November 2010, Pages 4473–4484