The quantum choice of a thought is not exactly a brain's wavefunction collapse, but rather the reduction of a mixed quantum brain state:
quote: ... each increment in knowledge is associated with a reduction of the quantum state to one that is compatible with the new knowledge. The quantum brain is an ensemble of quasi-classical components. As just noted, this structure is similar to something that occurs in classical statistical mechanics, namely a "classical statistical ensemble." But a classical statistical ensemble, though structurally similar to a quantum brain, is fundamentally a different kind of thing. It is a representation of a set of truly distinct possibilities, only one of which is real. A classical statistical ensemble is used when a person does not know which of the conceivable possibilities is real, but can assign a "probability" to each possibility. In contrast, ALL of the elements of the ensemble that constitute a quantum brain are equally real: no choice has yet been made among them. Consequently, and this is the key point, the entire ensemble acts as a whole in the determination of the upcoming mind-brain event. Each thought is associated with the actualization of some macroscopic quasi-stable features of the brain. Thus the reduction event is a macroscopic happening. Moreover, this event involves, dynamically, the entire ensemble of quasi-classical brain states.
Just to start this off, I thought I'd briefly give an account of the interpretations of Quantum Mechanics (henceforth QM) that are still valid. By "still valid" I mean that no research on QM has yet ruled them out. There were several early interpretations of QM that originally held promise, but have since been ruled out via a combination of mathematical and experimental work. I also don't want to over-describe them, so I'm cutting out various "sub-interpretations" where people disagree on minor points.
There are really five interpretations. I'll illustrate them with an experiment to measure an electron's spin. The electrons are produced by superheating silver in an oven. They then stream out of the oven into the magnetic field between two powerful electromagnets. Spin-up electrons will have their paths curled upward; spin-down electrons will have their paths curled downward. Past the magnets lies a photographic plate which the electrons strike, leaving a white mark. Spin-up electrons will strike the top of the plate, and spin-down electrons will strike the bottom.
(a) Modern Copenhagen: Microscopic particles do not intrinsically possess any properties independent of measurement. Everything we measure about them (aside from their mass, total spin and some other properties) is a property of the interaction between the particles and a large macroscopic classical object, not an intrinsic property of the particle itself.
In the experiment above, I have in fact simply generated the event of a certain kind of mark appearing on the photographic plate in the presence of a magnet and a silver oven; I cannot say anything about the electrons themselves.
Furthermore, the event generated is random, i.e. has no underlying cause, but the distribution of events is predictable, e.g. the mark will appear at a certain location on the plate ~35% of the time.
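The two halves of that claim (individual events random, frequencies predictable) can be sketched numerically with the Born rule for spin-1/2. The 35% figure and the corresponding tilt angle below are illustrative assumptions, not values from any specific experiment:

```python
import math
import random

def p_up(theta):
    """Born rule: probability of measuring 'spin up' along an axis
    tilted by angle theta from the spin's preparation axis."""
    return math.cos(theta / 2) ** 2

# Illustrative: pick the tilt angle that gives a 35% top-mark rate.
theta = 2 * math.acos(math.sqrt(0.35))

# Each individual mark is irreducibly random; only the frequency
# over many electrons is predictable.
random.seed(1)
trials = 100_000
top_marks = sum(random.random() < p_up(theta) for _ in range(trials))
print(f"fraction of marks at top: {top_marks / trials:.3f}")  # close to 0.35
```

The point of the sketch is that nothing in the model assigns a cause to any single mark; only the ensemble statistics are fixed.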
(b) Many-Worlds: Particles possess multiple values for their properties, e.g. they are spin-up and spin-down at the same time; one could say their properties are multi-valued. When a large object interacts with them (in a measurement, say), the large object itself becomes multi-valued. In the example above, the photographic plate develops a mark on both the top and the bottom, in response to both values of the electron's spin.
At a deeper level, the photographic plate develops two completely different atomic structures, one corresponding to a mark at the top and one corresponding to a mark at the bottom. Initially these atomic states can interact and both states are influenced by both electron spins. Very rapidly however, each state of the plate becomes unable to interact with the "wrong" electron spin (i.e. the state with a bottom mark is no longer influenced by the spin-up value of the electron) and both become (almost) unable to interact with or influence each other.
If a camera then records the state of the plate, it develops two different atomic states that rapidly decouple, and so on in a sphere expanding close to the speed of light.
The world essentially gains two parallel states, one in which the electron's measured value is spin-up and one in which it is spin-down.
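The decoupling of the two branches can be caricatured with a two-state density matrix whose off-diagonal entry measures how strongly the "top mark" and "bottom mark" states can still interfere. The exponential decay and the rate below are toy assumptions; real decoherence rates depend on the details of the environment:

```python
import math

GAMMA = 1.0  # toy decoherence rate (assumption, arbitrary units)

def plate_state(t):
    """2x2 density matrix for the plate: equal weight on 'mark on top'
    and 'mark on bottom', with the coherence between the two branches
    decaying exponentially in time."""
    coherence = 0.5 * math.exp(-GAMMA * t)
    return [[0.5, coherence],
            [coherence, 0.5]]

# At t=0 the branches can fully interfere; later they are effectively
# independent "parallel" states that no longer influence each other.
for t in (0.0, 2.0, 20.0):
    print(f"t={t:>4}: coherence = {plate_state(t)[0][1]:.2e}")
```

Note the branches are never exactly decoupled, matching the "(almost) unable to interact" wording above: the coherence becomes astronomically small but never exactly zero.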
These first two interpretations are, in a sense, the only interpretations of the mathematics of QM. The following three interpretations are actually alternate theories, that is, they have different mathematics but agree with QM as far as we have tested it. They're all similar in that the world is "set up" in some way to simulate QM.
(c) Non-local classical theories: This is really a whole class of theories. In these theories there is no randomness and no multiple-worlds. They're essentially classical theories in the sense of Newtonian Mechanics and Electromagnetism. They all have three common features:
(i) All particles in the theory can communicate with each other faster than light.
(ii) All particles carry an infinite amount of information (this information supports the superluminal communication).
(iii) The communication mechanism, and the information behind it, is completely inaccessible to measurement or outside detection.
These three features are necessary (due to a result known as Bell's theorem) for a classical theory to reproduce the features of QM.
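Bell's theorem can be made concrete with the CHSH quantity S: any local classical theory lacking the features above must satisfy |S| <= 2, while QM predicts |S| = 2*sqrt(2) for a spin singlet measured along suitably chosen magnet angles. The singlet correlation formula and the angle choice below are the standard textbook ones:

```python
import math

def E(a, b):
    """Quantum correlation of spin measurements along angles a and b
    for two particles in a singlet state."""
    return -math.cos(a - b)

# Standard angle choice that maximizes the CHSH quantity.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ≈ 2.83, exceeding the classical bound of 2
```

Since |S| exceeds 2, no theory with locally carried, finite, pre-set values can match these correlations, which is why the class of theories in (c) must resort to superluminal signalling.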
There are several such theories, e.g. Bohmian Mechanics.
The reason this superluminal communication is necessary is that QM produces a different pattern depending on the set-up of the magnets in front of the plate (e.g. two electromagnets versus one, electromagnets parallel versus perpendicular), something the oven has no knowledge of. So the atoms of the experimental apparatus in front of the oven communicate with the atoms of the oven faster than light, altering its thermal distribution so as to correctly reproduce the results of QM.
(d) Retrocausal theories: In these theories the future can communicate with and alter the past. So when the electrons stream out of the oven toward the plate, the atoms of the plate, together with the electrons, send signals back in time to the oven, altering its heat distribution so that it fires out electrons that correctly replicate the QM pattern on the photographic plate.
Cramer's Transactional Interpretation is one theory in this family.
Again this permits the oven to produce the correct pattern of electrons since the electrons can signal backward in time what set of circumstances they encountered after leaving the oven.
(e) Super-deterministic theories: If I perform the experiment with two sets of magnets, versus one set of magnets, QM will produce different patterns on the plate. In the theories above, the oven somehow becomes "aware" of the set-up and alters its thermal distribution accordingly.
In super-deterministic theories however, a set of gamma rays arrives from the M31 galaxy, let's say, just in time to alter the electron beam in such a manner as to replicate the appropriate pattern. In this case, the oven does not gain knowledge of its surroundings; rather, the electron beam is altered mid-flight by something from outside the experiment, always in such a way as to reproduce the predictions of QM.
A set of such coincidences must occur for all experiments at all times. Basically, something always "messes up" the experiment in a way that looks like QM.
't Hooft's cellular automata theory is one such theory.
The "Process Formulation" of QM seems to reconcile all the interpretations (if Many-Worlds is understood as the Potentiality), introducing the distinction between "Einstein time" and "process time":
quote:To comprehend the significance of time in modern physics one must distinguish two very different kinds of time. The first I call "process time," the second "Einstein time." Process time is the time associated with a cumulative process whereby things gradually become fixed and settled. Einstein time is the time part of the space-time continuum of contemporary physics. Contemporary physical theory establishes no connection at all between these two kinds of time, for it says nothing about process. It deals rather with the content of observations. Each observation has a content that includes, in principle, a clock and ruler reading. These readings assign to the observation a place in the space-time continuum. But whether the data represented in one observation become fixed and settled before or after the data represented in some other observation is not determined by contemporary physics: one can equally well imagine either that everything becomes fixed and settled all at once, in some single act of creation, and hence that neither process nor process time exists, or, alternatively, that things become fixed and settled in some definite order. These two possibilities are not empirically distinguishable. Indeed, Einstein's analysis of the meaning of time in physics made it clear that time enters physics only through the content of observations that say nothing at all about the order in which things become fixed and settled. His analysis effectively banished the concept of process from the physical theory of his era. Of course, in the deterministic framework within which Einstein himself worked, process could be no real issue. The deterministic laws ensured that everything was fixed and settled by the initial act of creation. Thus there could be no process. Hence the real impact of Einstein's analysis of time came only later, when quantum theory introduced indeterminism. In this latter context the idea of process arises naturally, at least at the conceptual level. 
But the founders of quantum theory, following Einstein's lead, circumvented the problems associated with process by asserting that the quantum-mechanical formalism "merely offers rules of calculation for the deduction of expectations about observations obtained under well-defined experimental conditions specified by classical physical concepts." Quantum theory has, nevertheless, one feature that suggests that it should be formulated as a theory of process. The wave function of the quantum theory is most naturally interpreted as representing "tendencies" or "potentia" for actual events. This intuitive idea of the meaning of the wave function was first made explicit by David Bohm in his 1951 textbook, Quantum Theory. ... The process formulation of quantum theory contains no explicit dependence on human observers: it allows quantum theory to be regarded as a theory describing the actual unfolding or development of the universe itself, rather than merely a tool by which scientists can, under special conditions, form expectations about their observations. The quantum theory of process is in general in accord with the ideas of the physicist, logician, and process philosopher Alfred North Whitehead. In particular, the actual is represented not by an advancing, infinitely thin slice through the space-time continuum, but rather by a sequence of actual becomings, each of which refers to a bounded spacetime region: event number n is represented, within physical theory, by a restriction on the set of classical fields allowed in the bounded space time region R(n). We have, therefore, neither becoming in three-dimensional space nor being in the four-dimensional world, but rather becoming in the four-dimensional world. Event number n is represented in physical theory by a restriction upon the classical fields allowed in the bounded space-time region R(n). This restriction induces, through the quantum formalism, changes in the tendencies for the next event. 
These changes in tendencies are manifested over all of space-time -- i.e., even if the region R(n+1) is spacelike situated relative to R(n). This change in tendencies is the nonlocal change that is associated with the collapse of the wave function in some formalisms and, more generally, with Bell's theorem. The tendencies are calculated in the quantum formalism by using Feynman's sum over all space-time paths. In the S-matrix formulation, these paths extend in time, in general, from minus infinity to plus infinity: they do not terminate in the region R(n). Thus as regards tendencies the entire space-time continuum of relativity theory is involved in each step of the process of becoming. But as regards actualities each actual event is associated with a bounded region in space-time. The conception of process described above differs from Whitehead's because it has no place for his "contemporary events." However, because Whitehead stressed so often that these awkward contemporary events were forced upon him by the physics of his time, rather than by his general principles, it is, I think, safe to infer that, had he known about the nonlocal connections entailed by Bell's theorem, he never would have mutilated his theory by the introduction of these contemporary events.
The quotations above are from the book "Physics and the Ultimate Significance of Time", pages 264, 267.
What if the Multiverse is like a tube of toothpaste? When one end is squeezed it affects every place and every dimension and every particle simultaneously. There are no distances because everything is intrinsically connected. (Bell's Theorem, quantum entanglement, the holographic principle, etc.)
The tube is possibly 10 dimensional and the inner and outer surface of the tube is in play. (Kaluza-Klein theory/String Theory/SST/MTheory)
The paste is the quantum foam/strings/branes that fulminate within this tube, resonating infinite feedback loops that appear as spontaneous quantum events manifesting universes/fundamental forces/spacetime/matter.
Everything that is and will be is contained in this tube, and no observations can be made externally, because to do so would take the observer out of the mix. Maybe that is why all the various theories tend to support some properties affirmed by QM. That could explain why the Higgs comes in at the mass it has, or why all probabilities can happen and still collapse into a measurement, or why, perhaps, retro-causality can exist. Even why information cannot be lost.
The laws of physics could possibly differ across various regions of the tube, e.g. what is weak gravity in our universe being a strong force in a neighboring universe. Universes existing on the surface of the tube cannot be ruled out, nor can additional curled-up dimensions, but I don't want to think about that.
The tube is most likely self-existent and finite, but its contents are in infinite flux and infinite stages of form.
Spitballing can be fun.
"You were not there for the beginning. You will not be there for the end. Your knowledge of what is going on can only be superficial and relative" William S. Burroughs
For all the American slang I've learned on EvC, I've never heard this expression before.
What if the Multiverse is like a tube of toothpaste?
Just to be clear, when you say Multiverse, do you mean "loads of different universes, with different properties" or do you mean the QM version, that is, multiple timelines of the same universe?
In essence though, yes, what you describe is a resolution to the quantum interpretation problem. A world with no probabilities, but with objects which possess infinite possible states and which are all intrinsically connected.
Such models have not been developed in as much detail as ones obeying the many-worlds or Copenhagen interpretations (standard QM); they can currently only replicate non-relativistic particle physics, but there is no theorem blocking their development.
Note though, that none of the modern interpretations really have an "observer" as part of the theory; that's a confusion from older QM, i.e. pre-1980s.
do you mean "loads of different universes, with different properties" or do you mean the QM version, that is multiple timelines of the same universe.
There is no actual need to distinguish between them. Just imagine that you have a really great computer simulating, all at once, the whole mathematical object of physics: ALL the universes, with all their different properties and all their "timelines" (each "timeline" from its very beginning to its very end), appear together in your computer's memory at once.
Your computer represents your toy reality as a multitude of events ready to be observed. The first simple game rule is that, in choosing an event to observe, you automatically dismiss all the events that are physically incompatible with your chosen one.
Now, the most interesting thing to observe appears to be the good nature on a certain planet. You can't help choosing it, of course (thus killing irreversibly all the other variants of universes with different properties, and all the other variants of evolution in your chosen universe), and there you are, having just created your brand new world with beautiful creatures on the earth and ugly fossils under the earth (and "big-bang" radiation from the sky). It is the very first day of actual observation, although some new creatures (esp. serpents) may well babble about billions of years; they all naturally emerged with such memory. ... After that, you continue your game of observation, introducing new players who later discover "quantum mechanics" and become puzzled about "wavefunction collapse"...
quote:There is no actual need to distinguish between them.
The reason for the distinction, I believe, is that one is a multiverse flavor and the other a supersymmetry flavor.
The Higgs coming in at the value it has favors supersymmetry, but just barely.
I really believe it is a matter of just saying "the cosmos" as opposed to "a multiverse" that could probably never be tested for validity.
No one wants to think there could ever be information that we could never test or have access to, but that seems to be the way reality manifests itself more often than not. Which I prefer as well. How boring would it be if we knew all the answers?