An ultra-precise measurement of a transition in the hearts of thorium atoms gives physicists a tool to probe the forces that bind the universe.
So, I’m not quite sure I understand. I know that they use CZM atoms for atomic clocks, and they are extremely accurate. So, will this be used for atomic clocks too, or is it just more accurate? Or is this something different entirely? It seems to me like it is something else, but I don’t see why it couldn’t be used for an atomic clock if it’s even more accurate than Seism.
My understanding is that current atomic clocks work by changing the state of the atom as a whole, via its electrons.
Whereas this new method changes the state of the nucleus of the atom.
Basically, smaller is more precise. However, given that current atomic clocks are only about one second out over something like a billion years, I’ve no idea what benefit this extra precision will give us.
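To put rough numbers on that (a back-of-the-envelope sketch; the ~8.4 eV thorium-229 isomer energy is the commonly quoted value, not a figure from this thread): one second per billion years is a fractional error of about 3 × 10⁻¹⁷, and the nuclear transition oscillates roughly 200,000 times faster than cesium’s microwave transition, which is part of why it could in principle do better.

```python
# Back-of-the-envelope: what "1 second per billion years" means,
# and how the two clock frequencies compare.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

fractional_error = 1 / (1e9 * SECONDS_PER_YEAR)  # ~3e-17
print(f"1 s over a billion years ≈ {fractional_error:.1e} fractional error")

# Cesium-133 hyperfine transition (defines the SI second).
f_cs = 9_192_631_770            # Hz, exact by definition

# Thorium-229 nuclear isomer: assumed ~8.4 eV (commonly quoted value).
h = 6.626e-34                   # J*s, Planck constant
eV = 1.602e-19                  # J per electron volt
f_th = 8.4 * eV / h             # ~2e15 Hz

print(f"Cs clock frequency: {f_cs:.3e} Hz")
print(f"Th-229 transition:  {f_th:.3e} Hz (~{f_th / f_cs:.0f}x higher)")
```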
We’ll probably start noticing really weird shit when we look at time that precisely. That’s generally what’s happened when we get into the quantum scale of things.
Yeah the simulation breaks down when you reach quantum scales. The engine will start trying to render things it doesn’t know how to render and things just kind of fall apart (particle-wave duality and all that).
If you stay in the macro scale there are efficient functions that handle the world physics very well.
I’m more impressed with the concurrency of the simulation than anything else. But tbqh it could all be running on a single thread and we probably wouldn’t be able to tell. Again, unless we get to the quantum scales.
The fact that it could be a simulation hints that there is an underlying set of rules that could be used to generate that simulation. That underlying set of rules could also be considered the most fundamental laws that govern the universe.
Cesium?
Yes, damn dictation
See Zed ‘Em?
It’s that Th-229 can have its nucleus excited using far less energy than nuclear excitations normally require.
That allows ultra-precise excitation at laser wavelengths, because the changes in the fundamental forces within the nucleus nearly cancel each other out.
That lets us monitor those forces, which depend in part on mathematical ‘constants,’ at a sensitivity of roughly one part in a trillion. In the excited state we can measure whether there’s even the smallest variance in the forces, which in turn may mean that some ‘constants’ aren’t.
However, the real test of that is still in the future, as they estimate that a sensitivity of one part in ten trillion is needed.
Theory described a door, research defined the door and possibly what’s behind it, and experimentation just opened the door.
The Idaho researchers observed that reversing the intrinsic angular momentum, or “spin,” of thorium-229’s outermost neutron seemed to take 10,000 times less energy than a typical nuclear excitation. The neutron’s altered spin slightly changes both the electromagnetic and strong forces, but those changes happen to cancel each other out almost exactly. Consequently, the excited nuclear state barely differs from the ground state. Lots of nuclei have similar spin transitions, but only in thorium-229 is this cancellation so nearly perfect.
Basically, thorium-229 can be excited by conventional lasers instead of gamma rays. Instead of millions of electron volts, it takes less than 10, which means it’s more reliable and more precise.
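To make that energy gap concrete (a rough sketch; the ~8.4 eV isomer energy is the commonly quoted value, not a figure from this thread): a photon’s wavelength is λ = hc/E, so ~8 eV lands in the vacuum-ultraviolet range that lasers can reach, while an MeV-scale nuclear excitation would need picometre-wavelength gamma rays.

```python
# Photon wavelength for a given transition energy: lambda = h*c / E.
H_C_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm

def wavelength_nm(energy_ev: float) -> float:
    """Return the photon wavelength (nm) for a transition energy in eV."""
    return H_C_EV_NM / energy_ev

# Assumed ~8.4 eV for the Th-229 isomer (commonly quoted value).
print(f"Th-229 isomer (~8.4 eV):  {wavelength_nm(8.4):.0f} nm (vacuum UV, laser-accessible)")
# A 'typical' nuclear excitation, order 1 MeV, for contrast.
print(f"1 MeV nuclear excitation: {wavelength_nm(1e6) * 1000:.1f} pm (gamma ray)")
```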
This seems to be millions of times more accurate, according to the article.
So what if constants aren’t constants?
I imagine a few physicists will be a little upset.
Except the one who gets the Nobel Prize for proving it
Eh, theorists just work in units where they’re all 1 anyway. And experimentalists round to the nearest order of magnitude lol
They will be called “variables” in the future, and scientists will try to figure out how they tick. And: they were seen as constants for a sufficiently long time, so still treating them like constants won’t hurt, as their values will probably only vary over long stretches of time or under unlikely/uncommon circumstances like relativistic speeds.
We treat g = 9.81 m/s², knowing full well that this changes depending on height and location. But this value is totally sufficient for everyday purposes, and no bridge will ever collapse just because of local deviations from 9.81. The precise local value of g is only relevant for a very small range of applications.
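As a quick illustration of how small those deviations are (a sketch using the simple inverse-square approximation g(h) = g₀·(R/(R+h))² with a mean Earth radius of ~6371 km, ignoring latitude, rotation, and local geology): even a kilometre of altitude only shifts g by a few hundredths of a percent.

```python
# How much does g change with altitude? Simple inverse-square estimate.
G0 = 9.81            # m/s^2, nominal surface value
R_EARTH = 6_371_000  # m, mean Earth radius

def g_at_altitude(h_m: float) -> float:
    """Approximate gravitational acceleration h_m metres above the surface."""
    return G0 * (R_EARTH / (R_EARTH + h_m)) ** 2

for h in (0, 100, 1_000, 10_000):
    g = g_at_altitude(h)
    print(f"h = {h:>6} m: g ≈ {g:.4f} m/s^2 ({(G0 - g) / G0:.4%} lower)")
```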
It won’t affect much except bleeding edge theoretical physics. Much the way we don’t need relativity to make airplanes fly (but round-earth gravity models help for long distance flights).
Physical laws are mathematical models that reflect natural forces and predict outcomes (accurately enough that we can fling cans of passengers across the world safely). It wouldn’t be the first time we discovered that some previously constant forces are actually variable (much the way the force of gravity is affected by distance, noticeable only when you lob something high enough). We shrug and change the variables, and some physicists near retirement may balk and say it’s ridiculous, as Einstein did regarding Heisenberg’s probability-based quantum mechanics.
It wouldn’t be the first time we discovered that some previously constant forces are actually variable (much the way the force of gravity is affected by distance, noticeable only when you lob something high enough.)
More specifically to your example, we discovered that gravity isn’t a force at all; it’s a literal curvature of space-time caused by objects with mass, which is why its effects aren’t constant.
Yes, but that model is late in the gravity game. We were toying with the two-body problem before heliocentrism, let alone higher-dimensional curvature of space.
In math context, we’d just call it variable and move on.