One hundred years ago, in September 1923, a young aristocrat named Louis de Broglie presented the French Academy of Sciences with a note called Waves and Quanta. In total it was about three pages long. It contained a few brief thought experiments about the motion of microscopic particles. The accompanying equations were certainly advanced from a layman’s perspective, but they hardly would have fazed an expert audience like the members of the Academy. There was nothing too elaborate or far-fetched in de Broglie’s argument. But the implications of what he had to say would transform the most rudimentary assumptions of modern science about how reality is structured and what the universe is made of. He was simply upending the foundations of the world.

De Broglie’s note was the first published defense of what is now called “wave-particle duality”: the idea that the minute components of the physical world are not simply very small lumps of matter (“particles”), but also patterns of cyclical change (“waves”). This perplexing notion has become fundamental to the set of theories known as quantum physics. By 1923, it had already been acknowledged as a possibility in certain special cases. But de Broglie proposed that wave-particle duality could apply even to subatomic particles like electrons—that is, even to particles with mass.

Since at least the 17th century, mass had been considered a fundamental property of matter. It was a measure applied to physical objects, expressing the fact that they existed in space and would move over time in predictable ways. In Isaac Newton’s Mathematical Principles of Natural Philosophy (the Principia, first published in 1687), the term for mass is “quantity of matter”: it describes an amount of stuff. And the whole point of the Principia is that all equivalent amounts of stuff, no matter how small or large, behave in certain regular ways. Ten grams of wood or ten grams of stone are still ten grams of matter. As any high school physics teacher will explain, two chunks of matter will jostle against each other when they meet; they will not occupy the same space as one another; they will accelerate under the influence of force. They will behave the way objects behave in our everyday experience, falling to the earth or bursting into flame, melting or colliding as the occasion demands.

The atoms that make up these many chunks of stuff—and eventually the protons, neutrons, and electrons making up the atoms—were supposed to be no different in this essential respect. They were very small, but they were still stuff. De Broglie claimed, on the basis of pure theory, that this was only part of the truth. The next year, in 1924, he submitted his doctoral thesis on the same subject. It was an acrobatic leap of intuition and mathematical reasoning, not yet supported by experimental results.

By 1929 de Broglie had won the Nobel Prize “for his discovery of the wave nature of electrons.” But the intervening years were a chaos of disagreement and disarray among experts as they struggled to understand what was happening to physics. In 1927 the field’s greatest minds—Albert Einstein, Werner Heisenberg, Erwin Schrödinger, Paul Ehrenfest, and Niels Bohr among them—discussed de Broglie’s proposals at the fifth Solvay Conference, the world’s premier meeting of quantum scientists. Near the end of the gathering, as physicists Guido Bacciagaluppi and Antony Valentini recount in Quantum Theory at the Crossroads (2009), Ehrenfest walked to the chalkboard and wrote down a passage from the Book of Genesis: “And the Lord said:…Go to, let us go down, and there confound their language, that they may not understand one another’s speech.” It was as if a jealous God had descended to crush another Tower of Babel, throwing the grand designs of men into a hopeless confusion. The hard and solid particles of the scientist’s reality had dissolved into a tangle of waves, and the clear tones of pure mathematics had fractured into unintelligible static. What did it all mean?

Ethereal Visions

It had begun with light. In 1846 Michael Faraday, a self-educated prodigy from a rural hamlet to the south of London, speculated tentatively that light and electricity might be the same phenomenon at some basic level. By that time Faraday was a celebrated lecturer in the city, enthralling the public with the wonders he could produce using simple magnets. Pass a magnet through a coil of wire, for example, and an electric pulse will shoot through the wire: the magnetic field and the induced current loop around at right angles to each other. They are, in effect, two sides of the same coin. What Faraday wondered was whether magnetism, electricity, and light might all be forms of vibration in the “lines of force which connect particles, and consequently masses of matter together” (“Thoughts on Ray-Vibrations”).

For this theory to work, Faraday had to assume a “view of the nature of matter which considers its ultimate atoms as centres of force, and not as so many little bodies surrounded by forces.” The world, said Faraday, is not a collection of tiny solid objects exerting forces on one another, but a continuous sea of the forces themselves. If the forces are strong enough and concentrated enough, you have a solid object—a portion of space which will push irresistibly back against you when you touch it. If the forces are extended more loosely, we perceive them as empty space—but only in a manner of speaking. No point in the universe is really empty, thought Faraday, because every point is pulsing with some amount of force: “I do not perceive in any part of space…anything but forces and the lines in which they are exerted.” If so, then perhaps light was a shiver of change in the intensity and direction of the forces—a ripple gliding along the contours of space itself.

It would take a more rigorously trained mathematician, the Scotsman James Clerk Maxwell, to support some of these proposals with formal equations. The result was a comprehensive description of electricity and magnetism as waves fluctuating together along perpendicular planes, traveling simultaneously in the same direction. If an electromagnetic wave is moving from one end of a hallway to another, and its electric field is oscillating up and down toward the ceiling and the floor, then its magnetic field is oscillating side to side toward the walls of the hallway. Maxwell soon showed that in a vacuum the whole wave will always move down the hallway at a speed of roughly 300,000,000 meters per second—i.e., the speed of light. “We can scarcely avoid the inference,” wrote Maxwell in an 1862 paper “On Physical Lines of Force,” “that light consists in the transverse undulations of the same medium which is the cause of electric and magnetic phenomena” (emphasis in the original). In a matter of decades, Faraday was vindicated: light is an electromagnetic wave.
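Stated in the modern notation of the textbooks rather than in Maxwell’s own, the inference takes a single line. The equations fix the speed of any electromagnetic wave in a vacuum in terms of two constants measured at the laboratory bench, the electric and magnetic constants of the vacuum:

\[
c \;=\; \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \;\approx\; 3.00 \times 10^{8} \ \text{m/s},
\]

which is the measured speed of light.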

Talk of “waves” is fairly common in physics, and it’s easy to toss the word around as if we know what it means. But what is a wave made of? Sometimes it is difficult to get a straight answer. Strictly speaking, waves are not objects but patterns of change—they are disturbances traveling through a medium. That’s relatively satisfying if the medium is a material like water: then the wave is a cycle of motion that the water goes through. When wind passes over the ocean, it pushes the water into a circular movement that slides from region to region. Successive areas on the surface will rise and fall as the cycle continues, but it’s the wave overall that travels in a horizontal direction. The water stays roughly in place while the wave itself, that pattern of energy and movement, rolls through the water toward the shore.

Fair enough when it’s a case of matter in motion—objects pushing one another around as in old-fashioned Newtonian mechanics. But what is moving when light travels through the distant regions of space? One old answer, still preferred by Maxwell and many of his contemporaries, was “the ether”: a delicate fluid so subtle that it fills the whole of the cosmos invisibly. René Descartes had proposed an alternative in an unpublished treatise on The World (written between 1629 and 1633), which he brooded over in secret during the tense years after Galileo’s condemnation. Perhaps light shuddered through particles of air which “all touch and press one another as much as possible,” so that their contact would pass the ray from one region to the next in an unbroken chain. But whatever the physical explanation, energy couldn’t simply hang suspended in nothingness: it must move through something, as the crest of an ocean wave moves through the sea.

No end of experimental effort was spent to detect some trace of light’s medium, some whisper of the fine substance through which it flows. It was all in vain. To carry light at its blistering speeds through space, the ether would have to be extraordinarily rigid—and yet it was also supposed to be so slick and yielding that its presence was imperceptible. As the contradictions mounted, the ether became like a glitch in the system, a jagged mismatch between the solid objects of Newton’s mechanical world and the smooth flow of Faraday’s energy fields. In trying to move from one picture of things to the other, the mind sputtered and caught on this impossible specter—this ether that was supposed to be lighter than a breath and stiffer than steel, undetectable yet omnipresent, like some spirit of the air.

The Quantum Hitch

At the turn of the last century, a young eccentric sat in a Swiss patent office, worrying over these paradoxes like a dog with a bone. Though the world had judged him a layabout, Albert Einstein could see before almost anyone else that electrodynamics was on a collision course with classical mechanics. The whole structure of physics would have to be torn down to the studs, and he would be the man to do it. The year 1905 is known among historians of science as Einstein’s annus mirabilis: his miracle year. Scrounging together what journals and books he could get, working almost alone, he published four papers in the Annalen der Physik. Each one presented a result that would prove momentous. The first (“On a Heuristic Viewpoint Concerning the Production and Transformation of Light”) engaged with the work of the theorist Max Planck. And Planck, almost against his will, had found a snag in the electromagnetic field.

If light is indeed just an energy transfer, then any amount of it, however weak, should beam continuously through space and onto any surface it might meet. Electrons on the surface should build up energy gradually until they burst the bonds that keep them attracted to the positive charge at the center of the atom. But Planck’s work on radiation implied that at low frequencies, with relatively few wave crests per second, the energy of light did not flow smoothly. It would collide with a surface that should have eventually surrendered up its electrons, and then—nothing. Not until it reached a certain frequency threshold would light carry enough energy to set electrons free.

Light’s energy seemed to come in chunks or “quanta” (singular “quantum”), rather than in an unbroken flux. Planck had been trying to stamp out this little grain of indivisible activity, to simplify his equations and remove from them the quantum constant that now bears his name. But the quantum kept popping back up, like a stubborn air bubble under a tablecloth—it could be pushed from place to place, but never quite flattened out. Einstein showed that if the quantum were really fundamental, light’s behavior was mathematically identical to that of a particle moving through space. If light rays were streams of very small bodies, each with a certain momentum, then only light particles with enough energy could “knock” electrons loose from a reflecting surface. These light particles, called “photons,” had to be imagined as having momentum, but no mass. They carried an energy proportional to the frequency of the corresponding electromagnetic wave. From a mathematical standpoint, it made a certain degree of sense to describe the phenomenon as either a wave or a series of particles. But only one version of the picture—light as a bundle of tiny, discrete objects—explained the strangely pixelated contours of the energy transfer.
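In the form now taught to students (the notation is the modern textbook convention, not Einstein’s), the bookkeeping is simple: each photon carries an energy proportional to the frequency of the wave, with Planck’s constant h as the factor of proportionality, and an electron escapes only if that single packet exceeds the energy W binding it to the surface:

\[
E = h\nu, \qquad E_{\text{kinetic}} = h\nu - W.
\]

Below the threshold frequency \( \nu_0 = W/h \), no amount of added brightness will free a single electron; above it, even the faintest beam will.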

Here was the knotted tangle in the fabric of things, the hitch in the tapestry of existence. The “corpuscular” theory of light, which built it up out of small solid objects, had seemed like a relic of the 18th century. Experiment and theory alike had shown that light behaved like a wave of forces, rising and falling seamlessly over time. But now that pattern of change was behaving in ways that only a very small object could. It was as if the strangest of Faraday’s speculations was coming true—as if forces themselves were cohering into something very much like matter.

The third paper of Einstein’s miracle year was called “On the Electrodynamics of Moving Bodies.” It is now known simply as the special theory of relativity, a fantastically successful effort to adjust the laws of motion for the unique behavior of light that Maxwell’s equations implied. Among its consequences, elaborated in Einstein’s fourth paper, was the famous equation E = mc². This implied that mass and energy could morph into one another under the right conditions. The boundaries between energy and matter were beginning to blur.

That was all de Broglie would need. Taken together, Einstein’s papers of 1905 had the makings of a discovery that would unsettle him and all his colleagues profoundly. Using the relativity equations, de Broglie would show that a particle with mass can be considered mathematically equivalent to a wave whose frequency is proportional to its energy, since the mass of the particle and the energy of the wave are effectively interchangeable. Not only could waves of energy congeal into mass, but particles of mass could melt into patterns of energy. A crack was shivering through the Tower of Babel’s foundation stone; the orderly world picture of classical physics had begun to shift and bend into something like a surrealist painting. No wonder mayhem ensued.
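The relation that carries the argument, given here in its standard modern statement rather than in the notation of de Broglie’s thesis, pairs every particle with a wave by way of the same constant h that Planck and Einstein had introduced for light:

\[
E = h\nu, \qquad \lambda = \frac{h}{p},
\]

where p is the particle’s momentum and λ is the wavelength of its associated wave. For a slow-moving particle, p is simply mass times velocity, so the heavier and faster the object, the shorter its wavelength; for anything much larger than an electron the wave is far too fine to detect, which is why no one had noticed it before.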

The Medium of Matter

Among the most tormenting questions was one of the most basic ones: if a particle can be a wave, what is it a wave of? What medium is it moving through? When it came to wave-particle duality, no issue could be stickier. It became more troublesome still after the discoveries of Erwin Schrödinger, a debonair cosmopolitan who seized on the insights that would make his name during the Christmas and New Year’s season, 1925-26. It was a positively Byronic winter: holed away in a Swiss villa with one of his several mistresses and a copy of de Broglie’s Ph.D. thesis, Schrödinger would stuff a single pearl in each ear to drown out the noise around him as he worked. The result was six papers outlining the mathematical rules governing quantum waves.

Schrödinger’s wave function, for which he would win the Nobel Prize, is a mathematical expression that describes a set of “standing waves”—waves that repeat one pattern within a set region, like a guitar string vibrating up and down. The wave equation produces evolving patterns of numbers over time. Schrödinger could prove that those numbers described a permanent wrinkle in the bedrock of existence. He had rummaged around in the dark heart of things long enough to outline the contours of particle waves. But he had nothing to say just yet about what those waves were moving through.
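The guitar-string analogy can be made exact. In the simplest case now printed in every textbook (one particle in one dimension, in modern notation rather than Schrödinger’s original), the equation is an eigenvalue problem: only certain wave patterns ψ, paired with certain energies E, satisfy it, just as only certain vibrations fit a string clamped at both ends:

\[
-\frac{\hbar^{2}}{2m}\,\frac{d^{2}\psi}{dx^{2}} \;+\; V(x)\,\psi(x) \;=\; E\,\psi(x),
\]

where ħ is Planck’s constant divided by 2π, m is the particle’s mass, and V(x) is the energy of the particle’s surroundings at each point x.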

It was the German physicist Max Born who proposed an answer: the Schrödinger wave describes the probability of finding a given particle in a particular place, or possessing a particular momentum, at any given time. Solutions to the wave functions indicate how likely it is that a particle will be found at any point in space—if the particle is an electron belonging to an atom in the Grand Canyon, Schrödinger’s values will be higher in certain regions near the nucleus of that atom, lower in other regions, and effectively zero in a Seattle coffee shop. The same goes for the particle’s momentum: the equation tells you which values you are likely to find if you measure it—but none that you are certain to find. The wave function is, in Schrödinger’s words, “the means for predicting probability of measurement results.”
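Born’s rule, as it is usually written today, makes that reading precise. The wave function ψ does not say where the particle is; its squared magnitude gives the probability of finding the particle in any small region when a measurement is made:

\[
P(x)\,dx \;=\; \lvert \psi(x) \rvert^{2}\,dx,
\]

the chance of finding the particle between x and x + dx. Add up those chances over the Grand Canyon and the total comes to nearly one; over the Seattle coffee shop, to practically zero.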

This was a monumental achievement. It was also a catastrophe. The language of numbers and equations was supposed to refer, however distantly, to things in the world. Even calculations of probability, when they emerged, were supposed to describe the temporary limits of human knowledge—not the inherent limits or, worse yet, the real state of affairs. We might not be able to predict exactly where each molecule in an expanding cloud of gas is going to travel, but each one is going to travel somewhere, and if we knew enough we could say exactly where. Now the possibility was opening up that we might never know enough—that until we observe the universe, there is only so much about it to be known.

At the dawn of modern science, Galileo drew a sharp distinction between what are called “primary” and “secondary” qualities. The secondary ones are mere functions of the human mind: subjective experiences like color, taste, smell, and so on. These qualities, he allowed, would vanish like smoke when no mind was there to perceive them. But the primary qualities of objects, the real facts of their existence, were supposed to be independent of any human experience. “The nature of matter or body in its universal aspect,” wrote Descartes in his Principia Philosophiae (1644), “does not consist in its being hard, or heavy, or colored, or anything that affects our senses in any other way, but solely in the fact that it is a substance extended in length, breadth, and depth.” These basic quantities could furnish sure knowledge about how things really are when we’re not looking. Calculations we make about position in space and time, stones falling and water flowing, should hold good whether we are there to see them or not. The world is not supposed to melt away when we turn our backs.

Yet quantum physics raises the possibility that what goes on beyond our sight is not the ticking of a clockwork machine, but a flutter of potential and chance. Because of course an object’s position and extension in space do “affect our senses”—that is how we know where it is. If it is not affecting anyone’s senses, perhaps it is not exactly anywhere: perhaps it only has the potential to be somewhere. That potential—the range of where things could be, what they could be doing—is the medium through which quantum waves move, the field whose changing values describe the basic building blocks of the universe. The only known thing that is sure to resolve that potential is the experience of a conscious observer.

This concept has met with passionate resistance. Einstein would insist resolutely in Relativity: The Special and General Theory (1916) that the universe should submit to an objective mathematical account, “a theory which describes exhaustively physical reality, including four-dimensional space.” Yet it became clear to everyone, as Schrödinger put it to the Berlin Congress of the Society for Philosophical Instruction in 1931, that “the mathematical apparatus derived by Newton is inadequately adapted to nature” (“Indeterminism in Physics”). At the relentless prompting of their colleague Niels Bohr, the pioneers of quantum physics were forced to grapple with the possibility that their discoveries might—in the words of Schrödinger’s translator and interviewer James Murphy—“reduce the last building stones of the universe to something like a spiritual throb that comes as near as possible to our concept of pure thought” (Lectures on Physics and the Nature of Scientific Knowledge).

In experiment, the world’s smallest particles—now multiplied in number by the Large Hadron Collider at the European Organization for Nuclear Research (CERN)—have kept on refusing to behave exactly like solid objects when out of our sight. Sent one at a time through a barrier cut with thin slits and onto a screen where each impact can be detected, a single electron will not simply land straight on the other side. Each electron will scatter into one of several regions predicted by its wave function, as though its many possible trajectories had gone on tossing and lapping against each other right up until they crashed and broke upon the rock of human perception. The technologies that depend on these kinds of strange facts—lasers, MRI machines, semiconductors—are already commonplace. The quantum world is not the world of what we can see and touch. But it is real.
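The arithmetic behind those regions is worth seeing once, in the simplest two-slit version of the experiment and in standard textbook notation. If ψ₁ and ψ₂ are the wave contributions reaching a point on the screen from the two slits, the probabilities do not merely add; a cross term appears, and it is this interference term that carves out the alternating bands where electrons do and do not land:

\[
P \;=\; \lvert \psi_{1} + \psi_{2} \rvert^{2} \;=\; \lvert\psi_{1}\rvert^{2} + \lvert\psi_{2}\rvert^{2} + 2\,\operatorname{Re}\!\left(\psi_{1}^{*}\,\psi_{2}\right).
\]

Block one slit and the cross term vanishes, and the pattern with it.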

A Factory of Idols

To this day, the interpretation of quantum physics is a hotly debated question: what’s at stake is nothing less than the nature of reality and our place in the universe. Those committed to saving a purely mechanical view of nature have come up with various ways to do so under quantum conditions—perhaps, for instance, the multiple possibilities that hover outside our vision are all playing out at once in parallel worlds. But all this must of necessity remain a matter of speculation. If the evidence does not compel, it does at least justify the conclusion that things are simply different when we cannot perceive them. That may be the most profound implication of wave-particle duality. Certainly, it is the one that most severely threatens the premises of the scientific revolution. And those premises are not simply the stock-in-trade of a select professional class. Newtonian mechanics achieved such dazzling success, for so many years, that its picture of the world took on a special authority as the definitive truth about things. “Nature, and Nature’s laws lay hid in night,” wrote the poet Alexander Pope: “God said, Let Newton be! and all was light.” Benighted generations past might have imagined the universe as a disc of earth between two infinite oceans, or a set of interlocking crystal spheres. But now at last the real vision had come into focus, sharp and clear as a mathematical equation: reality was a great infinity of space, occupied by bodies in motion. It is not too much to say that classical physics attained an almost scriptural authority over what could be considered absolutely real, as the Church in Galileo’s day feared it would.

Every kind of physics begins by asking us to picture the world a certain way. “Imagine everything riding on the back of a turtle.” “Picture a grid extended infinitely in all directions.” “Think of Venus fixed within a hollow orb.” The test of the picture is how exactly it predicts the future: we cannot see the turtle, or the grid, or the orb. But if they were there, what would happen to the things we can see? The hope and the promise of classical physics was to settle on a final picture once and for all, not simply as a convenient working model but as an absolute truth. That was the point of primary qualities: certain aspects of the world might be figments of human experience, but others would stand fixed for all time in the absolute certainty of mathematics. “Philosophy is written in this grand book, the universe,” declared Galileo in The Assayer (1623). “It is written in the language of mathematics, and its characters are triangles, circles, and other geometric figures.” If it could be quantified, it could be counted on: the raw truth of things was bodies moving through space. We might bear witness to them, but we had no part in their creation.

This in itself was a revolution. An older approach was to use mathematics only for the sake of “saving the appearances.” This meant finding pictures and models that would account for how things appear to us, without asking what is “behind” the appearances. However closely we peer at things, however carefully we parse our sensations, we will, by definition, only ever be experiencing a set of human perceptions. The numerical abstractions we use to predict and describe these experiences are not some “deeper” reality “beneath” our perceptions: our mathematical models really are models. However sophisticated they may be, they are not the bedrock of existence. “A map is not the territory it represents,” wrote the scientist Alfred Korzybski in 1933 (Science and Sanity). All models are wrong, but some are useful. Thoughts never occur without images, as Aristotle understood (De Anima 431a); but the image is not the thought.

We know this. But it is easy to forget. What we long to do is reach beyond the veil of our humanity, to touch and see the world behind the screen of our perceptions. What we do instead is take our pictures for reality, confusing our physical models for the immaterial things they represent. In the Christian tradition this is called “idolatry,” a confusion between an eidōlon—i.e., an image—and the thing it depicts. A statue of a god becomes a god. The power of a human king replaces the divine power that it stands in for. Matter replaces spirit. Pictures replace the truth.

The picture of things that emerged from the scientific revolution—a world stripped naked of human feeling, its contents churning through space like parts in a machine—came to stand for generations as the absolute truth, the reality behind which there is nothing else. At the bottom of existence there are “quantities of matter”: that is still the world picture taken for granted in our pop metaphysics. “We’re all just made of molecules and we’re hurtling through space right now,” cheered comedienne Sarah Silverman as she accepted an Emmy award in 2014. “Atoms in our bodies trace to the remnants of exploded stars,” mused the celebrity physicist Neil deGrasse Tyson in a viral tweet. “All things are made of atoms—little particles that move around in perpetual motion,” asserted Richard Feynman, one of the greatest physicists of the last century, in his undergraduate lectures at Caltech. The picture of matter in motion has stuck in the popular imagination as the final truth of all things.

For a century and more, this picture of the world has been dissolving. It was only ever a model—another image of the world beyond our sight, useful but ultimately false. But like the priests of some old stone god, the most enthusiastic acolytes of classical physics took its imagery for literal reality. Even as that imagery has frayed around the edges, its popular appeal has not waned. As a result we have been living—we are still living—on the painted stage of a pagan universe, imagining that our pictures spring to life when we’re not looking.

Yet it’s only when we are looking that the pictures have any meaning at all. In the age of quantum physics, appearances have come back to haunt us. The entities we talk about are all entities we experience, ways in which the outer world makes an impression on us. Even words like “quark” and “electron” refer to our encounters with the smallest regions of space we can find when we inspect them very closely. Before that, they do exist—but not in any way we can comfortably imagine or depict. Our mathematics is beginning to describe the outer limits of our possible knowledge, the sum total of the information we could ever conceivably acquire. And those limits end precisely at the borders of our sight, in the domain where matter meets mind. The world, so far as we can speak of it, is the world of our experience.

We have yet to incorporate these new discoveries into the casual mythology we have built around science. Perhaps we are afraid to do so. Perhaps the result would be an iconoclasm so shattering that we could not let it near our cherished idols. The thoughtless world of matter in motion, Sarah Silverman’s world of “molecules hurtling through space,” is an imaginary fiction. The world of color and light and sound, of memories and dreams and desires—this world of human experience is the realest one there is. Beyond that we can only say how things can affect us once we encounter them. What can it even mean to speak about the stars except as points of light in the eyes of a night watcher, or blazes of color against a telescope lens? What are molecules or atoms, what is time itself, if no creature can experience “before” and “after”? What is the world without mind?

It has become customary to speak of the universe as existing for “billions of years” before the advent of conscious life—an empty cathedral built by no one, hurled into existence by a great burst of energy. The various competing explanations of this process all depend on resolving the many quantum possibilities of a tiny infant universe into a timeline of definite unfolding events, from the appearance of the first photons to the blazing fusion that would eventually create the first stars. But since those possibilities are manifold and indeterminate until observed—since things like “years,” “energy,” “photons,” and “atoms” are exactly the kinds of things that cannot quite exist unseen—it may turn out that we have been talking mostly about how these things would have behaved if there was someone there to watch them. The most fearsome heresy of all, in an age committed to materialism, is that indeed there was someone there. “Men commonly believe that all things are known or perceived by God, because they believe the being of a God,” wrote Bishop George Berkeley in his Dialogues (1713), long before wave-particle duality was ever suspected. “I, on the other side, immediately and necessarily conclude the being of a God, because all sensible things must be perceived by him.” Perhaps the earth was indeed “formless and void” until it came into full view at his command, illuminated by a burst of radiance from the regions where mind gives form to matter, and for the first time there was light.