New physics research from CU Boulder keeps the theory of light and matter fractious
Light can’t seem to keep itself together—at least in two dimensions—according to new findings from the University of Colorado Boulder.
This research found that when a dimension is removed from light’s space—reducing it from three dimensions to two—and the electron charge is turned way up, particles of light break down to half of their former selves.
These results, published today in Physical Review Letters, send cracks through a foundational pillar of modern physics: the theory of light and matter. They also mark the first time this type of behavior has been seen in a quantum field theory, outside of one that resulted in a Nobel Prize.
“Quantum field theories are truly weird,” remarked Paul Romatschke, an associate professor of physics at CU Boulder and the author of the study. “They have properties that we haven’t really thought about, imagined or expected, even though the theory itself is over 50 years old.”
“It’s a theory that many people have studied their whole lives and yet there are new things that still appear that nobody really expected. It’s very much alive and unruly.”
Much like Romatschke’s other recent publication on string theory’s puzzle of three-quarters, this work takes a well-known physics theory, quantum electrodynamics (the quantum field theory of how light and matter interact), and pushes it further by tackling a previously unsolvable question: What happens when an electron’s charge, or its coupling (interaction strength) to light, is turned incredibly high, even toward infinity?
To do this, he turned to the same method used in his previous publication, in which he reduced the number of dimensions from three (the normal number) to two.
“This (new research) is not something like string theory that’s hard to observe in nature. It is our theory of light,” said Romatschke. “This is one of our most well-tested theories in three dimensions.”
What he found is that when he reduces the dimensions, the photon, or particle of light, becomes very odd—so odd that it fractures to half of its former self.
This behavior is remarkable, according to Romatschke, for two reasons. First, the photon becomes a non-integer, and second, it is a simple fraction—both of which shouldn’t be possible in quantum physics.
And yet, these findings are not alone. Simply by being non-integers, they bear more than a superficial similarity to another Nobel Prize-winning discovery: the quantum Hall effect.
The quantum Hall effect describes, through a setup remarkably similar to the one Romatschke used, how the electrical conductivity of certain systems jumps in discrete steps tied to the electron charge. Basically, if the number of electron charges changes, so too does the conductivity.
The fractional quantum Hall effect then takes this one step further: the jumps can correspond to a fraction of the electron charge, something that, at the time, had never been demonstrated in quantum physics.
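The contrast between the two effects can be stated compactly. As a sketch of the textbook result (not a formula from Romatschke’s paper), the measured Hall conductivity is quantized as

\[
\sigma_{xy} = \nu \, \frac{e^2}{h},
\]

where \(e\) is the electron charge and \(h\) is Planck’s constant. In the integer quantum Hall effect, \(\nu = 1, 2, 3, \ldots\); in the fractional quantum Hall effect, \(\nu\) instead takes simple fractional values such as \(1/3\), which is why those discoveries, like the new photon result, defied the expectation that such quantities come only in whole numbers.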
“We’re brought up in quantum mechanics to say this (non-integers) is forbidden,” said Romatschke. “For free particles, quantum mechanics says this cannot be true.”
And now it’s been seen again—this time with light.
“I think this (new research) tells us something very deep about what happens when quantum mechanics is pushed far… far away from where we usually can solve things,” said Romatschke. “I think there are some deeper meanings behind this. It’s telling us something.”