Understanding through Discussion


Author Topic:   The Uncertainty Principle - is it real?
Son Goku
Member
Posts: 1150
From: Ireland
Joined: 07-16-2005
Member Rating: 7.2


Message 31 of 48 (281288)
01-24-2006 3:00 PM
Reply to: Message 30 by randman
01-24-2006 2:46 PM


Re: same old issue
Yes, measurement causes the wavefunction to take a discrete form in layman's terms. However, a wavefunction with a discrete form isn't anything special or amazing; it just resembles the classical world a bit more than a generic one does. There are also no completely discrete wavefunctions: the electron will always be spread over a range of positions. Measurement just tightens the spread.

One note, though:
Measurement doesn't require a conscious being. There are systems where no human is involved and measurement still occurs.


This message is a reply to:
 Message 30 by randman, posted 01-24-2006 2:46 PM randman has responded

Replies to this message:
 Message 32 by randman, posted 01-24-2006 3:07 PM Son Goku has not yet responded

  
randman 
Suspended Member (Idle past 3158 days)
Posts: 6367
Joined: 05-26-2005


Message 32 of 48 (281290)
01-24-2006 3:07 PM
Reply to: Message 31 by Son Goku
01-24-2006 3:00 PM


Re: same old issue
Yes, measurement causes the wavefunction to take a discrete form in layman's terms.

But Wheeler and some others claim it is not mere measurement in delayed-choice experiments, but rather observation, or even the potential to be observed, correct?

The way I have heard it is that some physicists thought it was the intrusive act of measuring that caused the wave function to collapse, so to speak, but that the delayed-choice experiments showed that this was not the case, and that the mere potential for observation caused the collapse.

On the conscious aspect not necessarily being involved (and this is getting a little off-topic), I recognize that; it is demonstrated by the idea of mere potential for observation. At the same time, there is always consciousness there observing. It gets a little complicated, and generally I have to stop what I am doing if I go down this path too far, but I can see where consciousness-based interpretations can be considered a subset of observer-participancy and not necessarily equivalent to it.


This message is a reply to:
 Message 31 by Son Goku, posted 01-24-2006 3:00 PM Son Goku has not yet responded

  
1.61803
Member
Posts: 2905
From: Lone Star State USA
Joined: 02-19-2004
Member Rating: 5.1


Message 33 of 48 (281294)
01-24-2006 3:39 PM
Reply to: Message 22 by randman
01-24-2006 11:49 AM


Re: ID as fundamental?
Hello,
semantics...that darn word.
Design, initial states of being, pre-actualization, probabilities, wavefunction, wavicle, unmanifested reality; the list is endless. What one scientist may argue is nature, another will concede is God, or spiritual. A self-existing, unmanifested, uncreated reality. Sounds pretty spiritual to me. Except atheists, agnostics, or dyed-in-the-wool fundamentalists will state otherwise.

If one were to listen to physics theorists talk candidly, the words and concepts can and do sound spiritual in nature.
How can one speak of these concepts without sounding like some whacked-out Hindu Brahmin?

Reminds me of an ancient Hindu story: A bird on the forest floor hears a beautiful birdsong. He has tiny wings and can't fly, so he climbs up to a branch in the tree. He listens and hears the song again. He hops higher into the tree, where he thinks he hears this bird. He thinks it must be the most beautiful of all creatures, because its song is so beautiful. He strains to listen, hears it again, and hops up to the very top, but does not see the bird. He loses his balance and falls, and as he falls he realizes the beautiful bird is he.

What this story means to me is that we spend all our time and energy looking for everything outside of ourselves, whatever it may be: God, love, truth... But is it possible that the things we seek already reside within us? Whether you choose to call it spirituality or not, does it matter? Whether one calls it nature or God, does the word change what you personally feel or believe? Is calling existence spiritual or a natural state of being any different when you get right down to the nitty-gritty?


This message is a reply to:
 Message 22 by randman, posted 01-24-2006 11:49 AM randman has responded

Replies to this message:
 Message 34 by randman, posted 01-24-2006 4:02 PM 1.61803 has not yet responded

  
randman 
Suspended Member (Idle past 3158 days)
Posts: 6367
Joined: 05-26-2005


Message 34 of 48 (281301)
01-24-2006 4:02 PM
Reply to: Message 33 by 1.61803
01-24-2006 3:39 PM


Re: ID as fundamental?
Beautiful story and some good comments. One reason to point out the spiritual parallels is that we as people should be (and some are) in search of truth. I think recognizing that in QM science is really engaging the deeper reality known by religion as "spiritual" is, imo, helpful for understanding that truth has been found elsewhere, outside of science, for a long time. Science and spiritual traditions are not dealing with separate realities; science is simply a more defined area and mechanism for researching reality.

But beyond any practical considerations, it is an amazing time for science and religion/spirituality, imo, and the joy and excitement of that alone is worth discussing.

I think your emphasis on self-discovery is right, or at least needful, but I wouldn't say it is the whole shebang. We still need the Saviour and his redemptive act on the Cross, and God is still both transcendent and immanent. At the same time, Christ within us is the hope of glory.

Just as a quick OT comment: I think one can become a spiritual person without having accepted Christ, that this has happened over and over again, that many spiritual principles in various spiritual traditions are the same, and even that some forms of Christianity in practice are detrimental to continued spiritual development.

At the same time, the spiritual man, imo, is knowing Christ without knowing His name, and should recognize His sacrifice and atonement and be "saved", but we're getting off-topic. Suffice it to say, the way I see the Christian-versus-other-religions question is that if a Hindu or Buddhist is in touch with God or the divine, he or she is getting in touch with the One, Jesus Christ.


This message is a reply to:
 Message 33 by 1.61803, posted 01-24-2006 3:39 PM 1.61803 has not yet responded

  
Son Goku
Member
Posts: 1150
From: Ireland
Joined: 07-16-2005
Member Rating: 7.2


Message 35 of 48 (281368)
01-24-2006 9:29 PM


But Wheeler and some others claim it is not mere measurement in delayed-choice experiments, but rather observation, or even the potential to be observed, correct?

The way I have heard it is that some physicists thought it was the intrusive act of measuring that caused the wave function to collapse, so to speak, but that the delayed-choice experiments showed that this was not the case, and that the mere potential for observation caused the collapse.


You're referencing parametric down-conversion experiments or the Elitzur-Vaidman bomb tester, or at least those demonstrate what you're talking about.
This would take quite a while to explain and, to be fair, I don't think I would do it justice. I might attempt it eventually if I manage to come up with a decent way to explain it.

Essentially, what you are saying is factually correct. However, what "mere potential for observation" means in this circumstance is far from obvious. It took me a while to take it in when I first read about it, and it nearly drove me mad for a week.
It is undoubtedly the weirdest thing in quantum mechanics. Even now I can look at it and think, "Okay, what the hell is going on there?"

I'll try my best to think of a half-decent demonstration, although it might be a lengthy post. I wouldn't like to leave you hanging on this because you're obviously interested; I just don't know if my explanatory powers are up to it.*

Hopefully I'll be back with a response soon.

(*It's very hard to word it in a way that doesn't presume a serious understanding of the wavefunction.)


Replies to this message:
 Message 36 by randman, posted 01-25-2006 12:24 AM Son Goku has not yet responded

  
randman 
Suspended Member (Idle past 3158 days)
Posts: 6367
Joined: 05-26-2005


Message 36 of 48 (281391)
01-25-2006 12:24 AM
Reply to: Message 35 by Son Goku
01-24-2006 9:29 PM


I can wait....
I think I actually got it, on the experiments, but then again, it took a while and perhaps I don't really get it completely. It's a difficult topic.

This message has been edited by randman, 01-25-2006 12:25 AM


This message is a reply to:
 Message 35 by Son Goku, posted 01-24-2006 9:29 PM Son Goku has not yet responded

  
nipok
Inactive Member


Message 37 of 48 (345339)
08-31-2006 2:48 AM
Reply to: Message 11 by Chiroptera
01-11-2006 5:17 PM


SMOKE AND MIRRORS / Uncertainty and Normalization are FUDGEs
Is anyone here familiar enough with the normalization of Feynman diagrams and the Uncertainty Principle of quantum mechanics to provide solid arguments that the limits of our current scientific precision are not the real reason these FUDGEs of mathematics take place?

Is it not more realistic, more logical, or just plain common sense that we fudge the results to meet our expectations solely because of our current degree of scientific precision, when in reality there is an alternative that does not force us to FUDGE the data?

The reality is that we can't see or measure everything in real time/real space, due to the relativity between us and that which we observe. If, however, time could be slowed down RELATIVE to that which we are trying to observe, and distance/space could be measured in increments RELATIVE to that which we are observing, then it would seem common sense that the Uncertainty Principle is a farce, used as a crutch to fit what we see and record to what we expect.

Well, I say that what we expect is wrong because we are ignoring that which we can't see or measure. BUT the fact that we can't see or measure something does not mean that it does not exist. Once we accept the reality that there exist forces caused by objects we cannot measure (YET), we can begin to approach quantum mechanics and particle physics at a scientific level and remove the smoke and mirrors that have been clouding our ability to understand the true nature of the universe.

And when that happens, string theory and membranes collapse. String theory and membranes are like houses of cards: one level on top of the other, yet a simple breeze or sneeze and it all falls down. Both rest heavily on accepted deductions that we can't prove. If we blindly accept the fallacies or shortcomings in the underlying physics that permeate most common thought (which I guarantee will be disproved in the next 100 years), then we will be stuck without significant progress until then. It is not until the people with the resources to prove or disprove what is happening at the most fundamental levels of our universe step back from their current direction and accept the possibility that we have a house of cards in front of us that they may be brave enough to say the emperor is wearing no clothes. And those of you who can appreciate my argument can, I hope, appreciate my analogy.


This message is a reply to:
 Message 11 by Chiroptera, posted 01-11-2006 5:17 PM Chiroptera has not yet responded

Replies to this message:
 Message 38 by Son Goku, posted 08-31-2006 3:45 AM nipok has responded

  
Son Goku
Member
Posts: 1150
From: Ireland
Joined: 07-16-2005
Member Rating: 7.2


Message 38 of 48 (345345)
08-31-2006 3:45 AM
Reply to: Message 37 by nipok
08-31-2006 2:48 AM


Re: SMOKE AND MIRRORS / Uncertainty and Normalization are FUDGEs
The reality is that we can't see or measure everything in real time/real space, due to the relativity between us and that which we observe. If, however, time could be slowed down RELATIVE to that which we are trying to observe, and distance/space could be measured in increments RELATIVE to that which we are observing, then it would seem common sense that the Uncertainty Principle is a farce, used as a crutch to fit what we see and record to what we expect.

The Uncertainty Principle was predicted by Heisenberg before it was detected, and a lot of people honestly didn't expect to see it in experiment. It was predicted, then observed, so there isn't really any fudging.
A lot of experiments are performed in a particle's rest frame, but that doesn't affect the uncertainty principle.

The Uncertainty Principle comes from the fact that for some pairs of measurable quantities a particle cannot be in a classically well-defined state for both quantities at the same time.
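To make that concrete, here is a rough numerical sketch (my own illustration, not anything from an experiment, with hbar set to 1 and an arbitrary grid): put a Gaussian wavefunction on a grid, get its momentum amplitudes with a Fourier transform, and compute both spreads. Their product comes out at about hbar/2, the minimum the principle allows; squeezing the position spread fattens the momentum spread and vice versa.

```python
import numpy as np

# Illustration only (hbar = 1, arbitrary grid): a Gaussian wavepacket's
# position spread and momentum spread multiply to ~hbar/2, the minimum
# allowed by the uncertainty principle.
hbar = 1.0
N = 2048
x = np.linspace(-20.0, 20.0, N)
dx = x[1] - x[0]
sigma = 1.5
psi = np.exp(-x**2 / (4 * sigma**2)).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)          # normalise

prob_x = np.abs(psi)**2
std_x = np.sqrt(np.sum(x**2 * prob_x) * dx)          # <x> = 0 by symmetry

# Momentum amplitudes via FFT; fftfreq supplies the matching momentum grid
p = 2.0 * np.pi * hbar * np.fft.fftfreq(N, d=dx)
prob_p = np.abs(np.fft.fft(psi))**2
prob_p /= prob_p.sum()                               # discrete probabilities
std_p = np.sqrt(np.sum(p**2 * prob_p))               # <p> = 0 for a real psi

print(std_x, std_p, std_x * std_p)                   # ~1.5, ~0.333, ~0.5
```

Try changing sigma: std_x scales up with it, std_p scales down, and the product stays pinned at 0.5.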

Edited by Son Goku, : No reason given.

Edited by Son Goku, : No reason given.


This message is a reply to:
 Message 37 by nipok, posted 08-31-2006 2:48 AM nipok has responded

Replies to this message:
 Message 39 by nipok, posted 08-31-2006 3:57 AM Son Goku has not yet responded
 Message 40 by nipok, posted 08-31-2006 5:07 AM Son Goku has responded

  
nipok
Inactive Member


Message 39 of 48 (345346)
08-31-2006 3:57 AM
Reply to: Message 38 by Son Goku
08-31-2006 3:45 AM


Re: SMOKE AND MIRRORS / Uncertainty and Normalization are FUDGEs
But can you say (or anyone say) with any degree of certainty that this is not due to our inability to observe an event with sufficient scientific precision? I do not think we have adequate tools or resources at our disposal to monitor these events with the clarity needed to see what is really happening behind the scenes. We see the before and the after.

We can't see, or measure, or record the multitude of in-betweens because of their fleeting nature. They are either too small or exist for too short a time for us to capture, record, or observe. That does not mean that they do not exist. That does not mean that they do not have an impact on the objects or events that we can observe and record.

Nor is that any validation of Heisenberg's predictive ability regarding events that he expected our scientific precision to be capable of observing. If anything, his prediction further supports our reliance on a model that is based on a faulty foundation.


This message is a reply to:
 Message 38 by Son Goku, posted 08-31-2006 3:45 AM Son Goku has not yet responded

  
nipok
Inactive Member


Message 40 of 48 (345352)
08-31-2006 5:07 AM
Reply to: Message 38 by Son Goku
08-31-2006 3:45 AM


Re: SMOKE AND MIRRORS / Uncertainty and Normalization are FUDGEs
Besides the uncertainty principle, let's also think about the underlying reasons for the Casimir effect, the Einstein-Podolsky-Rosen paradox, and possibly even Compton scattering. Where is there any proof that these are not due to fallacies of the underlying foundation, built upon each other like Band-Aids? It all boils down to our inability to record or observe an event within a sufficient relative time frame and distance to have meaning. All of quantum mechanics builds further and further upon these assumptions.
To assume that particles smaller than those we can detect do not exist and do not impact or react with those we can detect is a crack in the foundation that we must overcome to move forward. Look back at the history of particle physics and quantum mechanics and ask, really, where have we come in 50 years? Yes, we know much more about the elementary particles we can detect, but so much of the real physics relies on estimations and acceptance of that which we think can never be measured. I disagree. Once we develop the tools to measure things in trillionths of a second, on a scale that would make a boson look like a football field, we will have a better idea of a small part of the underlying fabric of spacetime, but we will never be able to map out or grasp the infinite chain of smaller and smaller particles that we will never be able to detect.

This message is a reply to:
 Message 38 by Son Goku, posted 08-31-2006 3:45 AM Son Goku has responded

Replies to this message:
 Message 41 by Son Goku, posted 08-31-2006 2:00 PM nipok has responded

  
Son Goku
Member
Posts: 1150
From: Ireland
Joined: 07-16-2005
Member Rating: 7.2


Message 41 of 48 (345461)
08-31-2006 2:00 PM
Reply to: Message 40 by nipok
08-31-2006 5:07 AM


Re: SMOKE AND MIRRORS / Uncertainty and Normalization are FUDGEs
But can you say (or anyone say) with any degree of certainty that this is not due to our inability to observe an event with sufficient scientific precision?

What do you think the Heisenberg Uncertainty Principle says?

Nor is that any validation of Heisenbergs’s predictive ability to events that he expected our scientific precision to be capable of observing.

Heisenberg didn't make any assumptions about our limits of observation. The uncertainty principle is simply a statement about non-commuting observables:
a well-defined state of one is not a well-defined state of the other.

If |x> is a state of definite position, then:
|x> = c1|p1> + c2|p2> + c3|p3> + ...

where the p states are states of definite momentum and the c's are constants.

Using this you can show that measurements of momentum and position have a certain standard deviation. The standard deviation predicted is the standard deviation observed.
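As a toy illustration of that expansion (my own finite-dimensional analogue, with N sites on a ring standing in for continuous space): the momentum eigenstates are the Fourier modes, and a state of definite position expands over all of them with equal weight |c_k|^2 = 1/N, i.e. its momentum is completely undefined.

```python
import numpy as np

N = 8
x_state = np.zeros(N, dtype=complex)
x_state[3] = 1.0   # |x>: definite position at site 3

# Rows of F are the momentum eigenstates <p_k| on a ring of N sites
k = np.arange(N)
F = np.exp(-2j * np.pi * np.outer(k, k) / N) / np.sqrt(N)

c = F @ x_state            # c_k = <p_k|x>, the coefficients in the expansion
weights = np.abs(c) ** 2

print(weights)             # every |c_k|^2 equals 1/N: momentum totally spread out
```

A perfectly sharp position is therefore a maximally flat momentum distribution, which is the expansion above in miniature.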

Where is there any proof that these are not due to fallacies of the underlying foundation, built upon each other like Band-Aids?

The fact that they work.
Where is there any proof that these are due to fallacies of the underlying foundation, built upon each other like Band-Aids?

It all boils down to our inability to record or observe an event within a sufficient relative time frame and distance to have meaning. All of quantum mechanics builds further and further upon these assumptions.

Upon what assumptions?

To assume that particles smaller than those we can detect do not exist

Who does this?
You should also understand that a particle's size has very little to do with how detectable it is. What matters more is its lifetime and the energy scale required to create it.

Look back at the history of particle physics and quantum mechanics and ask, really, where have we come in 50 years?

Quantum Electrodynamics, Quantum Chromodynamics, Electroweak Theory, and the Standard Model, as well as a much greater ability to manipulate the quantum field theory formalism.

but so much of the real physics relies on estimations and acceptance of that which we think can never be measured.

It does? Where exactly is this acceptance of what can't be measured present, and how does it affect the theory's results?

Edited by Son Goku, : Placing emphasis.

Edited by Son Goku, : No reason given.

Edited by Son Goku, : My crap spelling.


This message is a reply to:
 Message 40 by nipok, posted 08-31-2006 5:07 AM nipok has responded

Replies to this message:
 Message 42 by nipok, posted 08-31-2006 9:27 PM Son Goku has responded

  
nipok
Inactive Member


Message 42 of 48 (345557)
08-31-2006 9:27 PM
Reply to: Message 41 by Son Goku
08-31-2006 2:00 PM


Re: SMOKE AND MIRRORS / Uncertainty and Normalization are FUDGEs
What do you think the Heisenberg Uncertainty Principle says? ...Heisenberg didn't make any assumptions about our limits of observation.

In a quick nutshell, the principle claims that we cannot with any certainty determine both the position and momentum of a particle, partly because of its dual state as both a particle and a wave. And it most certainly makes a huge assumption about our limits of observation. He may not have said it directly, but it's easy to read between the lines that it is the limits of our current scientific precision that cause this inability to perform an accurate observation. Our need to interact with or disturb a state in order to make an observation is due to our inability to record energy, mass, velocity, angular momentum, distance, and time on a scale ten thousand times smaller than a lepton or boson.

As far as what we have achieved in the last 50 years, I realize that's an odd way to phrase my point, but what I meant was that between 1910 and 1960 there were 10-15+ major discoveries and theories coming out each decade. From 1960 forward, yes, we have made significant progress, discovered new particles, and expanded the Standard Model, but the significant theories and discoveries have been fewer, and more of what has occurred in the last 50 years came about because we tried to fit our observations into the existing model, blindly assuming that the model must have a solid foundation. We all know that there are holes in the model which is why the constructs I mention in this thread were created. In the true model of our universe these fudges and false constructs are no longer required, but the true model will not be known until our scientific precision expands 10 or 20 times beyond where we are now, and the problem is that we may never reach that level. We will continue to add Band-Aids to fix the holes and continue to steer ourselves away from the truth.

Edited by nipok, : No reason given.


This message is a reply to:
 Message 41 by Son Goku, posted 08-31-2006 2:00 PM Son Goku has responded

Replies to this message:
 Message 43 by Son Goku, posted 09-01-2006 3:54 PM nipok has responded

  
Son Goku
Member
Posts: 1150
From: Ireland
Joined: 07-16-2005
Member Rating: 7.2


Message 43 of 48 (345780)
09-01-2006 3:54 PM
Reply to: Message 42 by nipok
08-31-2006 9:27 PM


Examples Please.
From 1960 forward, yes, we have made significant progress, discovered new particles, and expanded the Standard Model, but the significant theories and discoveries have been fewer, and more of what has occurred in the last 50 years came about because we tried to fit our observations into the existing model, blindly assuming that the model must have a solid foundation.

Where and when has this happened? In the formulation of what theory? I can think of a few examples over the past 50 years, but none of them are from particle physics.
The Standard Model has been working for thirty years with no data against it thus far.

We all know that there are holes in the model which is why the constructs I mention in this thread were created.

What constructs were created in response to what holes?
Can you please provide examples?

In a quick nutshell, the principle claims that we cannot with any certainty determine both the position and momentum of a particle, partly because of its dual state as both a particle and a wave. And it most certainly makes a huge assumption about our limits of observation. He may not have said it directly, but it's easy to read between the lines that it is the limits of our current scientific precision that cause this inability to perform an accurate observation. Our need to interact with or disturb a state in order to make an observation is due to our inability to record energy, mass, velocity, angular momentum, distance, and time on a scale ten thousand times smaller than a lepton or boson.

I don't mean to sound abrasive, but this is entirely incorrect. The uncertainty principle states that noncommuting observables don't have simultaneous eigenstates, which leads to a minimum in the product of their standard deviations.
It has nothing to do with us disturbing something or leaving it unaffected during our measurement.

In the example of position and momentum, if you have a single position you are in a superposition of many momenta, and vice versa.
To use lax language: think of a particle as having a position wavefunction and a momentum wavefunction; the more collapsed one is, the more uncollapsed the other is.
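Here is a small sketch of the "minimum in the product of their standard deviations" statement, using spin instead of position/momentum so that it fits in a few lines (the operators and the state are my choice, purely for illustration): for the noncommuting Pauli observables sigma_x and sigma_y, in an eigenstate of sigma_z, the product of the two spreads exactly meets the Robertson lower bound |<[sigma_x, sigma_y]>|/2. No measurement disturbance appears anywhere; the bound comes from the operators alone.

```python
import numpy as np

# Pauli matrices: sigma_x and sigma_y do not commute ([sx, sy] = 2i*sz)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

psi = np.array([1, 0], dtype=complex)        # eigenstate of sigma_z

def stdev(op, state):
    """Standard deviation of an observable in a (normalised) state."""
    mean = np.real(state.conj() @ op @ state)
    mean_sq = np.real(state.conj() @ (op @ op) @ state)
    return np.sqrt(mean_sq - mean ** 2)

lhs = stdev(sx, psi) * stdev(sy, psi)        # product of spreads
comm = sx @ sy - sy @ sx                     # the commutator
rhs = abs(psi.conj() @ comm @ psi) / 2       # Robertson lower bound

print(lhs, rhs)                              # 1.0 1.0 -- bound saturated here
```

Both spreads are nonzero no matter which state you pick: a sharp sigma_x state smears sigma_y, exactly mirroring position and momentum.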


This message is a reply to:
 Message 42 by nipok, posted 08-31-2006 9:27 PM nipok has responded

Replies to this message:
 Message 44 by nipok, posted 09-06-2006 12:21 AM Son Goku has not yet responded
 Message 45 by nipok, posted 09-06-2006 1:58 AM Son Goku has responded

  
nipok
Inactive Member


Message 44 of 48 (346880)
09-06-2006 12:21 AM
Reply to: Message 43 by Son Goku
09-01-2006 3:54 PM


Re: Examples Please.
What constructs were created in response to what holes?
Can you please provide examples?

Dark Matter is a hole that is plugged by phantom particles and phantom energy, when in reality it is Aetheric Density that accounts for our inability to directly match Newtonian physics to our recorded observations, and things as common as planets, comets, moons, star dust, and asteroids that account for the missing matter in the universe.

The need for normalization in Feynman diagrams is another hole that is plugged with assumptions. Ditch the assumptions, account for the true fabric of space and time, and the need for normalization goes away. (OK, it does not go away, but as the precision of our scientific observations continues to increase, we may someday be able to account for the need to normalize through real constructs that can be defined, observed, and measured.)

The uncertainty principle, although capable of producing expected results and helping mathematical formulas work, was still at its core used to plug a hole. Whether you see it or not, it is our interpretation of quanta and particle duality that lets it work. If we rethink the interpretation within a wider context, this becomes self-evident.

Imaginary particles and the Casimir effect are a hole that again tries to fit observations into a theory that is flawed and based on deductions accepted by consensus, not direct proof. Now don't get me wrong: I am in no way claiming that the Standard Model, quantum physics, general or special relativity, or any of the other constructs that have evolved upon them are flawed as a whole. I would say there is more than enough evidence to support the constructs as a whole. What I don't agree with is the continued failure to accept the true infinite nature of the universe. Once we accept the true nature of the infinite universe we live in, we can stop trying to plug holes with phantoms and start thinking logically again. Imaginary particles, or more so the effects they have on observable matter, are exactly what my theories would expect. So the fact that we can observe their impact would almost seem to prove the existence of particles that we can't detect yet.

Other holes exist; anyone who has studied particle physics and quantum mechanics is aware that we don't yet know enough to predict with any accuracy the results we would like at all levels of scientific precision. As precision increases we close some questions, but new ones will pop up. Instead of relying on phantom constructs to put one Band-Aid after another on top of a flawed model, we need to separate that which science can prove from that which is assumption. Once we can split proof and real science from speculation and assumption, we will be left with a clearer model of the base physics that govern the universe. I am not saying that my own musings are not speculation and assumption; I am quite sure they very much are, but we will soon see that they are more correct than the current paradigm. Once we can prove that our universe and all we know of it had initial velocity at the point of expansion, it will prove that our pocket of space/time exists inside a larger pocket of space/time. Along with an infinite number of larger and an infinite number of smaller pockets of space/time, we exist as a fleeting speck of nothingness in the overall grand picture. To the entire Universe, not just our universe, we are a point in time, a point in space, a point in mass, and a point in energy, by the definition of a point.

Edited by nipok, : No reason given.

Edited by nipok, : No reason given.


This message is a reply to:
 Message 43 by Son Goku, posted 09-01-2006 3:54 PM Son Goku has not yet responded

  
nipok
Inactive Member


Message 45 of 48 (346901)
09-06-2006 1:58 AM
Reply to: Message 43 by Son Goku
09-01-2006 3:54 PM


Re: Examples Please.
The uncertainty principle states that noncommuting observables don't have simultaneous eigenstates, which leads to a minimum in the product of their standard deviations

Another example of interpretation of observations misleading the logical mind.

Eigenstates, eigenvectors, and a multitude of other constructs are being used in formulas that rely too much on planes, lines, and false frames of reference. Every object exists within its own relative pocket of spacetime, and every pocket of spacetime has its own relative angular momentum, in one way, shape, or form, with respect to every other relative center of mass. Every object we observe, and every interaction between two objects, exists within its own relative frame of reference. Einstein's analogy of the curvature of time and space relates not only to the effects of gravity but also to the orbit that every object attempts to maintain around some other object. All objects in space will attempt to attain a natural orbit, and this includes inner space and outer space in an infinite number of directions.

We must stop thinking in Cartesian coordinates. The universe is made up of an infinite number of polar coordinates with an infinite number of centers. Some could be a center of mass, others a center of density, others a center of energy, and still others, although I am not sure how to visualize it, a center of time. Once we come to understand the true nature of space and time, we will begin to apply these relative centers to standard equations, and then we will learn how to fill in the holes without fudging the results and creating phantom constructs to make observations fit the existing paradigm, instead of re-evaluating the paradigm as a whole.

Argue all you want; anyone can try as hard as they like, but nobody can say with any certainty that our current scientific precision and our false interpretations of what we see, augmented by the fact that we don't take polar coordinates, angular momentum, Aetheric density, and infinite quantum sizes and energies into every single equation, are not the true reason why we are having so much trouble coming up with a grand unification theory. The sooner resources are given to explore these possibilities with significant funding, the sooner we may be able to unite the forces under the single force of electromagnetism.


This message is a reply to:
 Message 43 by Son Goku, posted 09-01-2006 3:54 PM Son Goku has responded

Replies to this message:
 Message 46 by fallacycop, posted 09-06-2006 8:21 AM nipok has not yet responded
 Message 47 by Son Goku, posted 09-06-2006 12:41 PM nipok has responded

  
Copyright 2001-2018 by EvC Forum, All Rights Reserved
