Just a handful of numbers underpin the laws of physics. That's why scientists have for decades looked for any discrepancies in these so-called fundamental constants. Finding such a variation would rock the very foundations of modern science.
Not to mention, it would guarantee at least one lucky researcher a free trip to Stockholm, a shiny new gold medal and a million bucks.
Recently, a pair of astronomers turned to one of the oldest stars in the universe to test the constancy of the number governing one of the four fundamental forces of nature — gravity. They looked back in time over the past few billion years for any inconsistencies.
Not to give away the full story, but no Nobel Prizes will be awarded just yet.
We take Newton's gravitational constant (denoted simply by "G") for granted, probably because gravity is pretty predictable. We call it Newton's gravitational constant because Newton was the first person to really need it, when he wrote down his famous law of universal gravitation. Using his newly invented calculus, he was able to extend his laws of motion to explain the behavior of everything from apples falling from a tree to the orbits of the planets around the sun. But nothing in his math told him just how strong gravity ought to be — that had to be measured experimentally and slipped in to make the laws work.
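Newton's law packs all of that into one formula, F = G·m₁·m₂/r², with G sitting there as a measured plug-in number. As a quick sketch (the masses, radius and value of G below are standard reference figures, not from this article), here's the pull on a falling apple:

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r**2
# G must be measured experimentally -- nothing in the theory fixes its value.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2 (CODATA value)
M_EARTH = 5.972e24   # mass of the Earth, kg
R_EARTH = 6.371e6    # mean radius of the Earth, m

def gravitational_force(m1, m2, r):
    """Attractive force in newtons between masses m1, m2 (kg) separated by r (m)."""
    return G * m1 * m2 / r**2

# A 0.1-kg apple at the Earth's surface feels about 1 newton of pull,
# which works out to the familiar surface gravity of roughly 9.8 m/s^2.
apple = 0.1  # kg
force = gravitational_force(M_EARTH, apple, R_EARTH)
print(f"Force on the apple: {force:.2f} N")      # about 0.98 N
print(f"Implied g: {force / apple:.2f} m/s^2")   # about 9.82 m/s^2
```

The same one-line formula with the same G covers the apple and the planets alike; only the masses and the distance change.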
And it's basically been that way for centuries — measuring G on its own and plugging it into the equations when needed. Nowadays, we have a more sophisticated understanding of gravity, thanks to Einstein's theory of general relativity, which describes how gravity arises from the distortion of space-time itself. And one of the cornerstones of relativity is that physical laws should stay the same in all reference frames.
This means that if one observer in a particular reference frame — say, someone standing on the surface of the Earth, or floating out in the middle of space — measures a particular strength of gravity (Newton's G), then that same value should apply equally all throughout space and time. It's simply baked into the mathematics and fundamental working assumptions of Einstein's theory.
On the other hand, we know that general relativity is an incomplete theory of gravity. It doesn't apply to the quantum realm — for instance, to itty-bitty subatomic particles like electrons and quarks — and the search is on for a true quantum theory of gravity. One candidate for such a theory is string theory, and in string theory there is no such thing as a number that just needs to be tossed in.
In string theory, everything we know about nature, from the number of particles and forces to all their properties, including the gravitational constant, must arise naturally and elegantly from the mathematics itself. If this is true, then Newton's gravitational constant isn't just some random number — it's an outgrowth of some complicated process operating at the subatomic level, and it doesn't have to be constant at all. And so in string theory, as the universe grows and changes, the fundamental constants of nature might just change along with it.
All of this raises the question: Is Newton's constant really constant? Einstein gives a firm and clear yes, and the string theorists give a firm and clear maybe.
It's time to do some tests.
Einstein on trial
Over the past few years, scientists have devised very sensitive experiments to measure the strength of gravity on Earth and in our cosmic neighborhood. These experiments give some of the tightest constraints on variations in G, but they cover only a short window of time. It could be that Newton's constant varies incredibly slowly, and we just haven't been looking carefully for long enough.
Related: 6 Weird Facts About Gravity
On the other end of the spectrum, if you monkey around with the fundamental constants of nature, you're going to start messing up the physics of the early universe, which is visible to us in the form of what's called the cosmic microwave background. This is the afterglow light pattern from when the universe was only a few hundred thousand years old. Detailed observations of that background light also place constraints on the gravitational constant, but these constraints are much less precise than those found from tests we can do in our own backyard.
Recently, astronomers have concocted a test of variations in G that strikes a good middle ground between these two extremes, which they describe in a paper posted to the preprint server arXiv. It's a relatively high-precision test: not as precise as the Earth-based ones, but far better than the cosmic ones. And it has the benefit of spanning literally billions of years.
It turns out that we can look for changes in Newton's gravitational constant by looking at the wobbling of one of the oldest stars in the universe.
It's in the wiggle
The Kepler space telescope is famous for hunting exoplanets, but more generally it's just really good at staring at stars for long periods of time, looking for even the slightest variation. And some of those variations come from the fact that stars, well, vary in brightness. In fact, stars pulse and quiver from sound waves crashing around inside them, much like earthquakes ripple through our planet — both stars and the Earth are made of materials (a superhot, dense plasma in the case of a star) that can vibrate.
These quakes and quivers on the surface of the star affect its brightness and tell us about the interior structure. A star's interior depends on its mass and age. As stars evolve, both the size of the core and the dynamics of all its inner layers change; those changes affect what's going on at the surface.
Related: 15 Amazing Images of Stars
And if you start messing around with the constants of nature, like Newton's G, it changes how stars evolve over the course of their lifetimes. If Newton's constant really is constant, then stars should slowly increase in brightness and temperature over time: as they burn the hydrogen in their cores, they leave behind an inert lump of helium. This helium gets in the way of the fusion process, reducing its efficiency and forcing stars to burn at a faster pace to maintain equilibrium, growing hotter and brighter in the process.
If Newton's constant is slowly decreasing with time, this process of brightening and heating will operate on much faster timescales. But if Newton's constant behaves the opposite way and steadily increases with time, stars will actually dip in temperature for a while, then hold that temperature fixed while ratcheting up in brightness as they age.
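To get a feel for just how sensitive stars are to the value of G, a classic back-of-the-envelope homology scaling for Sun-like stars puts luminosity at roughly L ∝ G⁷ (the exact exponent depends on assumptions about the star's opacity; this toy figure is an illustration, not the detailed stellar models the astronomers actually used). Even a tiny drift in G compounds into a noticeable change in brightness:

```python
# Toy homology scaling for a Sun-like star: L roughly proportional to G**7.
# An illustrative textbook approximation, not the paper's actual modeling.

def luminosity_ratio(delta_g_fraction, exponent=7):
    """Relative luminosity L/L0 after G changes by the given fraction."""
    return (1.0 + delta_g_fraction) ** exponent

# A mere 1% increase in G would brighten such a star by about 7%...
print(f"{(luminosity_ratio(0.01) - 1) * 100:.1f}% brighter")   # ~7.2%

# ...while a 1% decrease would dim it by almost as much.
print(f"{(1 - luminosity_ratio(-0.01)) * 100:.1f}% dimmer")    # ~6.8%
```

That amplification — a 1% nudge to G becoming a 7% change in brightness — is exactly why stellar evolution makes such a sharp probe of a varying gravitational constant.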
But these changes are really apparent only over very long time periods, so we can't really look to our own sun — which is about 4.5 billion years old — as a good example. Also, big stars don't have long lives, and they also have incredibly complicated interiors that are difficult to model.
Enter KIC 7970740: a star only three-quarters the mass of our sun that has been burning for at least 11 billion years. A perfect laboratory.
The astronomers took years of Kepler data on this star and compared it with various models of the star's evolution, including some in which Newton's G varies, tying those models to observations of the seismology — the wiggles — on its surface. Based on their observations, Newton's constant really is constant, at least as far as they can tell, with no changes detected at the level of 2 parts in a trillion (like knowing the distance between Los Angeles and New York City to within the width of a single bacterium) over the past 11 billion years.
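That analogy holds up arithmetically. Taking the Los Angeles-to-New York distance as roughly 4,000 kilometers and a bacterium as several micrometers long (both round reference figures, not numbers from the paper), the ratio lands right at the quoted precision:

```python
# Sanity-checking the precision analogy: width of a bacterium vs. the
# Los Angeles-to-New York distance (round, illustrative figures).

LA_TO_NYC_M = 4.0e6    # roughly 4,000 km, in meters
BACTERIUM_M = 8.0e-6   # a largish bacterium, ~8 micrometers

ratio = BACTERIUM_M / LA_TO_NYC_M
print(f"Relative precision: {ratio:.0e}")  # 2e-12, i.e. 2 parts in a trillion
```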
Where does Newton's constant come from and how does it remain so constant? We don't have an answer to that question, and as far as we can tell, Newton isn't going anywhere anytime soon.
- The 18 Biggest Unsolved Mysteries in Physics
- 11 Fascinating Facts About Our Milky Way Galaxy
- One Number Shows Something Is Fundamentally Wrong with Our Universe
Originally published on Live Science.