
Uncertainty Principle

Eustace H. P | 21:07 Sat 05th May 2007 | Science
4 Answers
I'm looking for a street-smart egg-head (is that an oxymoron?)

I'm after a snap-shot understanding of the Uncertainty Principle that'll make sense to someone who can blag intelligence by obviously unnecessarily employing words like 'oxymoron' (above) and 'sesquipedalian' (just then) but who is in truth so dumb that when someone genuinely smart tried to explain the notion of time being scientifically abstract and indeed a falsehood (or at least I think that was it), I got so confused that I smiled to feign understanding when really I was singing a Chas n Dave song in my head to myself. I tried to find out on the notoriously unreliable Wikipedia; however, all I got was confusing, sterile jargon resembling computer code, and on another search I found something more accessible, but it assumed some prior knowledge. The reason I ask is that there seems to be a Futurama joke (how low-brow - or "Lo-brau"! - am I?!) suggesting one cannot accurately measure objects in relation to each other, and that to do so in fact gives a suggestion of relativity further from any reality. However, the layman suggestion seemed to be that this isn't actually the reality of the theory (law?). And I do so hate to be outside of the gag circle. Anyone? (Indeed, anyone simply still reading?!)

(And while we're at it and now I come to think of it, I think I know the difference between a theory and a law, so where do principles sit amongst these?)

Answers


Best Answer

No best answer has yet been selected by Eustace H. P. Once a best answer has been selected, it will be shown here.

You can think of it as a limit on what you can do. For example, you may right now be able to roll a ball and measure how fast it's moving, using some sort of equipment. You may not think there is a limit to how accurately you can do this -- you might use standard equipment and get a speed of 1mph, then do it again with better equipment and get 1.1mph, then 1.08mph, then 1.086mph, etc. You're just getting more and more accurate, and closer to the true speed.

However, Heisenberg (the guy who came up with the maths supporting the idea) said that there is a limit to how accurate you can be, and wrote it down as two inequalities:

[uncertainty in position] x [uncertainty in momentum] ≥ h/4π


[uncertainty in energy] x [uncertainty in time] ≥ h/4π

Here h is Planck's constant, a number that appears throughout quantum mechanics -- just see it as some (very small) number. What the inequalities mean is that, for example, the more accurately you measure the momentum of something, the less accurately you can know its position (at the same time). It's a balancing act. The same goes for energy and time: the more precisely you know its exact energy, the less precisely you know exactly when in time you measured it.
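The position-momentum trade-off can be put into numbers with a few lines of code. This is a minimal sketch using the standard modern form of the bound, Δx·Δp ≥ ħ/2 (where ħ = h/2π); the momentum-uncertainty figure is made up purely for illustration:

```python
import math

H = 6.62607015e-34        # Planck's constant in J*s (exact, by SI definition)
HBAR = H / (2 * math.pi)  # reduced Planck's constant, hbar

def min_position_uncertainty(delta_p):
    """Smallest possible position uncertainty (metres) allowed by
    Heisenberg's inequality, for a given momentum uncertainty (kg*m/s)."""
    return HBAR / (2 * delta_p)

# Illustrative value: pin an electron's momentum down to 1e-24 kg*m/s...
dp = 1e-24
dx = min_position_uncertainty(dp)
print(f"delta_x >= {dx:.2e} m")  # roughly 5.3e-11 m, about the size of an atom

# The balancing act: halve the momentum uncertainty and the
# minimum position uncertainty doubles.
assert math.isclose(min_position_uncertainty(dp / 2), 2 * dx)
```

The point is only the shape of the trade-off: tighten one side of the product and the other side must loosen.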

The reason you never come up against this limit in practical situations (like measuring the ball, for instance) is that Planck's constant is so damned small. So you can still be 'very accurate' with your measurements; it's just a concept you have to realise: that it's not an endless journey to being more and more precise.
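That difference in scale can be checked with rough numbers. A sketch, assuming illustrative masses and speed uncertainties (the bound used is the standard Δx·Δp ≥ ħ/2):

```python
HBAR = 1.054571817e-34  # reduced Planck's constant, J*s

def min_dx(mass, dv):
    """Heisenberg's lower bound on position uncertainty (metres)
    when momentum uncertainty is delta_p = mass * dv."""
    return HBAR / (2 * mass * dv)

# The rolled ball: 0.1 kg, speed known to within 1 mm/s.
ball = min_dx(0.1, 1e-3)
print(f"ball:     {ball:.1e} m")      # ~5e-31 m: far smaller than a
                                      # proton, so utterly unmeasurable

# An electron (about 9.1e-31 kg) with speed known to within 1 m/s.
electron = min_dx(9.109e-31, 1.0)
print(f"electron: {electron:.1e} m")  # ~6e-5 m: a very noticeable blur
```

For everyday objects the bound is absurdly tiny, which is why nobody rolling balls ever notices it; for an electron it is enormous compared with the electron itself.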

It's a bit weird but those that use it just get on and use it. It works.
Greater men than I have come up with conclusions. Try Professor Stephen Hawking.

http://www.hawking.org.uk/lectures/warps3.html
Idiot.
Thirty years in one job does not make you an expert. You could be a complete plonker!

