The Most Important Unsolved Problem in Computer Science


Here’s a look at the million-dollar math problem at the heart of computation


When the Clay Mathematics Institute put individual $1-million prize bounties on seven unsolved mathematical problems, they may have undervalued one entry—by a lot. If mathematicians were to resolve, in the right way, computer science’s “P versus NP” question, the result could be worth worlds more than $1 million—they’d be cracking most online-security systems, revolutionizing science and even mechanistically solving the other six of the so-called Millennium Problems, all of which were chosen in the year 2000. It’s hard to overstate the stakes surrounding the most important unsolved problem in computer science.

P versus NP concerns the apparent asymmetry between finding solutions to problems and verifying solutions to problems. For example, imagine you’re planning a world tour to promote your new book. You pull up Priceline and start testing routes, but each one you try blows your total trip budget. Unfortunately, as the number of cities grows on your worldwide tour, the number of possible routes to check skyrockets exponentially, rapidly making it infeasible even for computers to exhaustively search through every case. But when you complain, your agent writes back with a solution sequence of flights. You can easily verify whether or not their route stays in budget by simply checking that it hits every city and summing the fares to compare against the budget limit. Notice the asymmetry here: finding a solution is hard, but verifying a solution is easy.
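The verification step really is just a few lines of code. Here is a minimal sketch with invented city names and fares (the `fares` table and `verify_route` helper are illustrative, not from any real booking system):

```python
# Hypothetical fares between city pairs, for illustration only.
fares = {
    ("NYC", "London"): 450,
    ("London", "Tokyo"): 700,
    ("Tokyo", "Sydney"): 500,
    ("Sydney", "NYC"): 900,
}

def verify_route(route, cities, budget):
    """Check a proposed tour: it must visit every city exactly once
    and the summed fares must stay within budget."""
    if set(route) != set(cities) or len(route) != len(cities):
        return False
    total = 0
    for leg in zip(route, route[1:] + route[:1]):  # close the loop
        if leg not in fares:
            return False  # no known flight for this leg
        total += fares[leg]
    return total <= budget

print(verify_route(["NYC", "London", "Tokyo", "Sydney"],
                   ["NYC", "London", "Tokyo", "Sydney"], 2600))  # True
```

The check runs in time proportional to the number of cities, no matter how the route was found. Finding a good route in the first place is the hard part.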

The P versus NP question asks whether this asymmetry is real or an illusion. If you can efficiently verify a solution to a problem, does that imply that you can also efficiently find a solution? Perhaps a clever shortcut can circumvent searching through zillions of potential routes. For example, if your agent instead wanted you to find a sequence of flights between two specific remote airports while obeying the budget, you might also throw up your hands at the similarly immense number of possible routes to check, but in fact, this problem contains enough structure that computer scientists have developed a fast procedure (algorithm) for it that bypasses the need for exhaustive search.
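The article doesn't name the fast procedure, but the classic polynomial-time algorithm for cheapest point-to-point routes is Dijkstra's shortest-path algorithm. A sketch over a hypothetical fare graph (airport names and fares are made up):

```python
import heapq

def cheapest_route(fares, start, goal):
    """Dijkstra's algorithm: cheapest total fare from start to goal.
    Runs in polynomial time -- no exhaustive search over all routes."""
    best = {start: 0}
    heap = [(0, start)]
    while heap:
        cost, city = heapq.heappop(heap)
        if city == goal:
            return cost
        if cost > best.get(city, float("inf")):
            continue  # stale heap entry
        for nxt, fare in fares.get(city, {}).items():
            new_cost = cost + fare
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                heapq.heappush(heap, (new_cost, nxt))
    return None  # goal unreachable

# Hypothetical fare graph for illustration.
fares = {"A": {"B": 100, "C": 300}, "B": {"C": 120}, "C": {}}
print(cheapest_route(fares, "A", "C"))  # 220
```

The structural difference from the tour problem is that a cheapest path never needs to revisit a node, so greedy expansion by total cost is safe; no analogous shortcut is known for visiting *every* city.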

You might think this asymmetry is obvious: of course finding a solution is sometimes harder than verifying one. But researchers have been surprised before: problems they assumed were hard to solve have turned out to admit fast algorithms after all. And every attempt to resolve the question for even a single scenario has only exposed how monumentally difficult it is to prove one way or the other. P versus NP also rears its head everywhere we look in the computational world, well beyond the specifics of our travel scenario—so much so that it has come to symbolize a holy grail in our understanding of computation.

In the subfield of theoretical computer science called complexity theory, researchers try to pin down how easily computers can solve various types of problems. P represents the class of problems they can solve efficiently, such as sorting a column of numbers in a spreadsheet or finding the shortest path between two addresses on a map. NP represents the class of problems for which computers can verify solutions efficiently. Our book tour problem, called the Traveling Salesperson Problem by academics, lives in NP because we have an efficient procedure for verifying that our agent’s solution worked.

Notice that NP actually contains P as a subset because solving a problem outright is one way to verify a solution to it. For example, how would you verify that 27 x 89 = 2,403? You would solve the multiplication problem yourself and check that your answer matches the claimed one. We typically depict the relationship between P and NP with a simple Venn diagram:

Venn diagram shows one large circle labeled “NP” encompassing a smaller one labeled “P.” The entire circle is labeled “Problems with solutions that computers can verify easily.” The area inside of P is labeled “Problems with solutions that computers can find easily.” The area in NP but outside of P is labeled “Problems with solutions that computers can verify but not find easily.”
Credit: Amanda Montañez

The region inside of NP but not inside of P contains problems that can’t be solved with any known efficient algorithm. (Theoretical computer scientists use a technical definition for “efficient” that can be debated, but it serves as a useful proxy for the colloquial concept.) But we don’t know if that’s because such algorithms don’t exist or we just haven’t mustered the ingenuity to discover them. Here’s another way to phrase the P versus NP question: Are these classes actually distinct? Or does the Venn diagram collapse into one circle? Do all NP problems admit efficient algorithms? Here are some examples of problems in NP that are not currently known to be in P:

  • Given a social network, is there a group of a specified size in which all of the people in it are friends with one another?
  • Given a varied collection of boxes to be shipped, can all of them be fit into a specified number of trucks?
  • Given a sudoku (generalized to n x n puzzle grids), does it have a solution?
  • Given a map, can the countries be colored with only three colors such that no two neighboring countries are the same color?

Ask yourself how you would verify proposed solutions to some of the problems above and then how you would find a solution. Note that approximating a solution or solving a small instance (most of us can solve a 9 x 9 sudoku) doesn’t suffice. To qualify as solving a problem, an algorithm needs to find an exact solution on all instances, including very large ones.
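For the map-coloring entry, the verification half is again easy. A minimal sketch with a made-up three-country map (the `neighbors` table and color names are illustrative):

```python
def verify_coloring(neighbors, coloring):
    """Verify a proposed 3-coloring: every country gets one of three
    colors, and no two bordering countries share a color. Runs in time
    proportional to the number of borders -- easy, as membership in NP
    promises."""
    if any(c not in {"red", "green", "blue"} for c in coloring.values()):
        return False
    return all(coloring[a] != coloring[b]
               for a in neighbors for b in neighbors[a])

# A tiny hypothetical map: three mutually bordering countries.
neighbors = {"X": ["Y", "Z"], "Y": ["X", "Z"], "Z": ["X", "Y"]}
print(verify_coloring(neighbors, {"X": "red", "Y": "green", "Z": "blue"}))  # True
```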

Each of the problems can be solved via brute-force search (e.g., try every possible coloring of the map and check if any of them work), but the number of cases to try grows exponentially with the size of the problem. This means that if we call the size of the problem n (e.g., the number of countries on the map or the number of boxes to pack into trucks), then the number of cases to check looks something like 2^n. The world’s fastest supercomputers have no hope against exponential growth. Even when n equals 300, a tiny input size by modern data standards, 2^300 exceeds the number of atoms in the observable universe. After hitting “go” on such an algorithm, your computer would display a spinning pinwheel that would outlive you and your descendants.
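A brute-force solver for the same toy map-coloring problem shows exactly where the blowup comes from: with three colors and n countries there are 3^n assignments to try (the 2^n in the text is the analogous count for problems with two-way choices). This sketch uses the same invented map as above:

```python
from itertools import product

def brute_force_3coloring(countries, neighbors):
    """Try every assignment of 3 colors to n countries: 3**n cases in
    the worst case. Fine for a handful of countries, hopeless as n grows."""
    for colors in product(["red", "green", "blue"], repeat=len(countries)):
        coloring = dict(zip(countries, colors))
        if all(coloring[a] != coloring[b]
               for a in neighbors for b in neighbors[a]):
            return coloring
    return None  # no valid 3-coloring exists

countries = ["X", "Y", "Z"]
neighbors = {"X": ["Y", "Z"], "Y": ["X", "Z"], "Z": ["X", "Y"]}
print(brute_force_3coloring(countries, neighbors) is not None)  # True
```

At n = 3 the loop inspects at most 27 cases; at n = 300 it would face 3^300, which is where the spinning pinwheel comes in.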

Thousands of other problems belong on our list. From cell biology to game theory, the P versus NP question reaches into far corners of science and industry. If P = NP (i.e., our Venn diagram dissolves into a single circle) and we obtain fast algorithms for these seemingly hard problems, then the whole digital economy would become vulnerable to collapse. This is because much of the cryptography that secures such things as your credit card number and passwords works by shrouding private information behind computationally difficult problems that can only become easy to solve if you know the secret key. Online security as we know it rests on unproven mathematical assumptions that crumble if P = NP.

Amazingly, we can even cast math itself as an NP problem because we can program computers to efficiently verify proofs. In fact, legendary mathematician Kurt Gödel first posed the P versus NP problem in a letter to his colleague John von Neumann in 1956, and he expressed (in older terminology) that P = NP “would have consequences of the greatest importance. Namely, it would obviously mean that … the mental work of a mathematician concerning yes-or-no questions could be completely replaced by a machine.”

If you’re a mathematician worried for your job, rest assured that most experts believe that P does not equal NP. Aside from the intuition that sometimes solutions should be harder to find than to verify, thousands of the hardest NP problems that are not known to be in P have sat unsolved across disparate fields, glowing with incentives of fame and fortune, and yet not one person has designed an efficient algorithm for a single one of them.

Of course, gut feeling and a lack of counterexamples don’t constitute a proof. To prove that P is different from NP, you somehow have to rule out all potential algorithms for all of the hardest NP problems, a task that appears out of reach for current mathematical techniques. In fact, the field has coped by proving so-called barrier theorems, which say that entire categories of tempting proof strategies to resolve P versus NP cannot succeed. Not only have we failed to find a proof but we also have no clue what an eventual proof might look like.

The Mystery at the Heart of Physics That Only Math Can Solve


The accelerating effort to understand the mathematics of quantum field theory will have profound consequences for both math and physics.


Over the past century, quantum field theory has proved to be the single most sweeping and successful physical theory ever invented. It is an umbrella term that encompasses many specific quantum field theories — the way “shape” covers specific examples like the square and the circle. The most prominent of these theories is known as the Standard Model, and it is this framework of physics that has been so successful.

“It can explain at a fundamental level literally every single experiment that we’ve ever done,” said David Tong, a physicist at the University of Cambridge.

But quantum field theory, or QFT, is indisputably incomplete. Neither physicists nor mathematicians know exactly what makes a quantum field theory a quantum field theory. They have glimpses of the full picture, but they can’t yet make it out.


“There are various indications that there could be a better way of thinking about QFT,” said Nathan Seiberg, a physicist at the Institute for Advanced Study. “It feels like it’s an animal you can touch from many places, but you don’t quite see the whole animal.”

Mathematics, which requires internal consistency and attention to every last detail, is the language that might make QFT whole. If mathematics can learn how to describe QFT with the same rigor with which it characterizes well-established mathematical objects, a more complete picture of the physical world will likely come along for the ride.

“If you really understood quantum field theory in a proper mathematical way, this would give us answers to many open physics problems, perhaps even including the quantization of gravity,” said Robbert Dijkgraaf, director of the Institute for Advanced Study (and a regular columnist for Quanta).


Nor is this a one-way street. For millennia, the physical world has been mathematics’ greatest muse. The ancient Greeks invented trigonometry to study the motion of the stars. Mathematics turned it into a discipline with definitions and rules that students now learn without any reference to the topic’s celestial origins. Almost 2,000 years later, Isaac Newton wanted to understand Kepler’s laws of planetary motion and attempted to find a rigorous way of thinking about infinitesimal change. This impulse (along with revelations from Gottfried Leibniz) birthed the field of calculus, which mathematics appropriated and improved — and today could hardly exist without.

Now mathematicians want to do the same for QFT, taking the ideas, objects and techniques that physicists have developed to study fundamental particles and incorporating them into the main body of mathematics. This means defining the basic traits of QFT so that future mathematicians won’t have to think about the physical context in which the theory first arose.

The rewards are likely to be great: Mathematics grows when it finds new objects to explore and new structures that capture some of the most important relationships — between numbers, equations and shapes. QFT offers both.

“Physics itself, as a structure, is extremely deep and often a better way to think about mathematical things we’re already interested in. It’s just a better way to organize them,” said David Ben-Zvi, a mathematician at the University of Texas, Austin.

For 40 years at least, QFT has tempted mathematicians with ideas to pursue. In recent years, they’ve finally begun to understand some of the basic objects in QFT itself — abstracting them from the world of particle physics and turning them into mathematical objects in their own right.

Yet it’s still early days in the effort.

“We won’t know until we get there, but it’s certainly my expectation that we’re just seeing the tip of the iceberg,” said Greg Moore, a physicist at Rutgers University. “If mathematicians really understood [QFT], that would lead to profound advances in mathematics.”

Fields Forever

It’s common to think of the universe as being built from fundamental particles: electrons, quarks, photons and the like. But physics long ago moved beyond this view. Instead of particles, physicists now talk about things called “quantum fields” as the real warp and woof of reality.

These fields stretch across the space-time of the universe. They come in many varieties and fluctuate like a rolling ocean. As the fields ripple and interact with each other, particles emerge out of them and then vanish back into them, like the fleeting crests of a wave.

“Particles are not objects that are there forever,” said Tong. “It’s a dance of fields.”

To understand quantum fields, it’s easiest to start with an ordinary, or classical, field. Imagine, for example, measuring the temperature at every point on Earth’s surface. Combining the infinitely many points at which you can make these measurements forms a geometric object, called a field, that packages together all this temperature information.

In general, fields emerge whenever you have some quantity that can be measured uniquely at infinitely fine resolution across a space. “You’re sort of able to ask independent questions about each point of space-time, like, what’s the electric field here versus over there,” said Davide Gaiotto, a physicist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada.

Quantum fields come about when you’re observing quantum phenomena, like the energy of an electron, at every point in space and time. But quantum fields are fundamentally different from classical ones.

While the temperature at a point on Earth is what it is, regardless of whether you measure it, electrons have no definite position until the moment you observe them. Prior to that, their positions can only be described probabilistically, by assigning values to every point in a quantum field that captures the likelihood you’ll find an electron there versus somewhere else. Prior to observation, electrons essentially exist nowhere — and everywhere.

“Most things in physics aren’t just objects; they’re something that lives in every point in space and time,” said Dijkgraaf.

A quantum field theory comes with a set of rules called correlation functions that explain how measurements at one point in a field relate to — or correlate with — measurements taken at another point.

Each quantum field theory describes physics in a specific number of dimensions. Two-dimensional quantum field theories are often useful for describing the behavior of materials, like insulators; six-dimensional quantum field theories are especially relevant to string theory; and four-dimensional quantum field theories describe physics in our actual four-dimensional universe. The Standard Model is one of these; it’s the single most important quantum field theory because it’s the one that best describes the universe.

There are 12 known fundamental particles that make up the universe. Each has its own unique quantum field. To these 12 particle fields the Standard Model adds force fields for three of the four fundamental forces: electromagnetism and the strong and weak nuclear forces (gravity, the fourth, lies outside the Standard Model). It combines these fields in a single equation that describes how they interact with each other. Through these interactions, fundamental particles are understood as fluctuations of their respective quantum fields, and the physical world emerges before our eyes.

It might sound strange, but physicists realized in the 1930s that physics based on fields, rather than particles, resolved some of their most pressing inconsistencies, ranging from issues regarding causality to the fact that particles don’t live forever. It also explained what otherwise appeared to be an improbable consistency in the physical world.

“All particles of the same type everywhere in the universe are the same,” said Tong. “If we go to the Large Hadron Collider and make a freshly minted proton, it’s exactly the same as one that’s been traveling for 10 billion years. That deserves some explanation.” QFT provides it: All protons are just fluctuations in the same underlying proton field (or, if you could look more closely, the underlying quark fields).

But the explanatory power of QFT comes at a high mathematical cost.

“Quantum field theories are by far the most complicated objects in mathematics, to the point where mathematicians have no idea how to make sense of them,” said Tong. “Quantum field theory is mathematics that has not yet been invented by mathematicians.”

Too Much Infinity

What makes it so complicated for mathematicians? In a word, infinity.

When you measure a quantum field at a point, the result isn’t a few numbers like coordinates and temperature. Instead, it’s a matrix, which is an array of numbers. And not just any matrix — a big one, called an operator, with infinitely many columns and rows. This reflects how a quantum field envelops all the possibilities of a particle emerging from the field.

“There are infinitely many positions that a particle can have, and this leads to the fact that the matrix that describes the measurement of position, of momentum, also has to be infinite-dimensional,” said Kasia Rejzner of the University of York.
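One standard concrete instance of such an infinite-dimensional operator (not taken from the article) is the position operator of a quantum harmonic oscillator written in its energy basis, where it has nonzero entries only just off the diagonal. Any matrix a computer can build is a finite truncation of the true, infinite one:

```python
import math

def position_operator(n):
    """n-by-n truncation of the position operator x = (a + a_dagger)/sqrt(2)
    in the harmonic-oscillator energy basis (natural units). The true
    operator has infinitely many rows and columns; entry (i, i+1) is
    sqrt((i+1)/2), and every finite n only approximates it."""
    x = [[0.0] * n for _ in range(n)]
    for i in range(n - 1):
        x[i][i + 1] = x[i + 1][i] = math.sqrt((i + 1) / 2)
    return x

for row in position_operator(4):
    print([round(v, 3) for v in row])
```

Growing n adds rows and columns forever without ever reaching the real operator, which is the "infinitely many columns and rows" Rejzner describes.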

And when theories produce infinities, it calls their physical relevance into question, because infinity exists as a concept, not as anything experiments can ever measure. It also makes the theories hard to work with mathematically.

“We don’t like having a framework that spells out infinity. That’s why you start realizing you need a better mathematical understanding of what’s going on,” said Alejandra Castro, a physicist at the University of Amsterdam.

The problems with infinity get worse when physicists start thinking about how two quantum fields interact, as they might, for instance, when particle collisions are modeled at the Large Hadron Collider outside Geneva. In classical mechanics this type of calculation is easy: To model what happens when two billiard balls collide, just use the numbers specifying the momentum of each ball at the point of collision.

When two quantum fields interact, you’d like to do a similar thing: multiply the infinite-dimensional operator for one field by the infinite-dimensional operator for the other at exactly the point in space-time where they meet. But this calculation — multiplying two infinite-dimensional objects that are infinitely close together — is difficult.

“This is where things go terribly wrong,” said Rejzner.

Smashing Success

Physicists and mathematicians can’t calculate using infinities, but they have developed workarounds — ways of approximating quantities that dodge the problem. These workarounds yield approximate predictions, which are good enough, because experiments aren’t infinitely precise either.

“We can do experiments and measure things to 13 decimal places and they agree to all 13 decimal places. It’s the most astonishing thing in all of science,” said Tong.

One workaround starts by imagining that you have a quantum field in which nothing is happening. In this setting — called a “free” theory because it’s free of interactions — you don’t have to worry about multiplying infinite-dimensional matrices because nothing’s in motion and nothing ever collides. It’s a situation that’s easy to describe in full mathematical detail, though that description isn’t worth a whole lot.

“It’s totally boring, because you’ve described a lonely field with nothing to interact with, so it’s a bit of an academic exercise,” said Rejzner.

But you can make it more interesting. Physicists dial up the interactions, trying to maintain mathematical control of the picture as they make the interactions stronger.

This approach is called perturbative QFT, in the sense that you allow for small changes, or perturbations, in a free field. You can apply the perturbative perspective to quantum field theories that are similar to a free theory. It’s also extremely useful for verifying experiments. “You get amazing accuracy, amazing experimental agreement,” said Rejzner.

But if you keep making the interactions stronger, the perturbative approach eventually overheats. Instead of producing increasingly accurate calculations that approach the real physical universe, it becomes less and less accurate. This suggests that while the perturbation method is a useful guide for experiments, ultimately it’s not the right way to try and describe the universe: It’s practically useful, but theoretically shaky.

“We do not know how to add everything up and get something sensible,” said Gaiotto.


Another approximation scheme tries to sneak up on a full-fledged quantum field theory by other means. In theory, a quantum field contains infinitely fine-grained information. To cook up these fields, physicists start with a grid, or lattice, and restrict measurements to places where the lines of the lattice cross each other. So instead of being able to measure the quantum field everywhere, at first you can only measure it at select places a fixed distance apart.

From there, physicists enhance the resolution of the lattice, drawing the threads closer together to create a finer and finer weave. As it tightens, the number of points at which you can take measurements increases, approaching the idealized notion of a field where you can take measurements everywhere.

“The distance between the points becomes very small, and such a thing becomes a continuous field,” said Seiberg. In mathematical terms, they say the continuum quantum field is the limit of the tightening lattice.

Mathematicians are accustomed to working with limits and know how to establish that certain ones really exist. For example, they’ve proved that the limit of the infinite sum 1/2 + 1/4 + 1/8 + 1/16 + … is 1. Physicists would like to prove that quantum fields are the limit of this lattice procedure. They just don’t know how.
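That particular limit is easy to probe numerically: after k terms the partial sum falls short of 1 by exactly 1/2^k, the last term added, so the shortfall halves at every step.

```python
def partial_sum(terms):
    """Sum the first `terms` terms of 1/2 + 1/4 + 1/8 + ...
    Each partial sum equals 1 - 1/2**terms, so the sums approach 1
    as the number of terms grows: the limit is exactly 1."""
    return sum(1 / 2**k for k in range(1, terms + 1))

for n in (1, 2, 3, 10, 20):
    print(n, partial_sum(n))
```

What mathematicians actually prove is the analogous statement without any computation: the shortfall can be made smaller than any chosen tolerance. For quantum fields, nobody yet knows how to set up the lattice limit so that such an argument goes through.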

“It’s not so clear how to take that limit and what it means mathematically,” said Moore.

Physicists don’t doubt that the tightening lattice is moving toward the idealized notion of a quantum field. The close fit between the predictions of QFT and experimental results strongly suggests that’s the case.

“There is no question that all these limits really exist, because the success of quantum field theory has been really stunning,” said Seiberg. But having strong evidence that something is correct and proving conclusively that it is are two different things.

It’s a degree of imprecision that’s out of step with the other great physical theories that QFT aspires to supersede. Isaac Newton’s laws of motion, quantum mechanics, Albert Einstein’s theories of special and general relativity — they’re all just pieces of the bigger story QFT wants to tell, but unlike QFT, they can all be written down in exact mathematical terms.

“Quantum field theory emerged as an almost universal language of physical phenomena, but it’s in bad math shape,” said Dijkgraaf. And for some physicists, that’s a reason for pause.

“If the full house is resting on this core concept that itself isn’t understood in a mathematical way, why are you so confident this is describing the world? That sharpens the whole issue,” said Dijkgraaf.

Outside Agitator

Even in this incomplete state, QFT has prompted a number of important mathematical discoveries. The general pattern of interaction has been that physicists using QFT stumble onto surprising calculations that mathematicians then try to explain.

“It’s an idea-generating machine,” said Tong.

At a basic level, physical phenomena have a tight relationship with geometry. To take a simple example, if you set a ball in motion on a smooth surface, its trajectory will illuminate the shortest path between any two points, a property known as a geodesic. In this way, physical phenomena can detect geometric features of a shape.

Now replace the billiard ball with an electron. The electron exists probabilistically everywhere on a surface. By studying the quantum field that captures those probabilities, you can learn something about the overall nature of that surface (or manifold, to use the mathematicians’ term), like how many holes it has. That’s a fundamental question that mathematicians working in geometry, and the related field of topology, want to answer.

“One particle even sitting there, doing nothing, will start to know about the topology of a manifold,” said Tong.

In the late 1970s, physicists and mathematicians began applying this perspective to solve basic questions in geometry. By the early 1990s, Seiberg and his collaborator Edward Witten figured out how to use it to create a new mathematical tool — now called the Seiberg-Witten invariants — that turns quantum phenomena into an index for purely mathematical traits of a shape: Count the number of times quantum particles behave in a certain way, and you’ve effectively counted the number of holes in a shape.

“Witten showed that quantum field theory gives completely unexpected but completely precise insights into geometrical questions, making intractable problems soluble,” said Graeme Segal, a mathematician at the University of Oxford.

Another example of this exchange also occurred in the early 1990s, when physicists were doing calculations related to string theory. They performed them in two different geometric spaces based on fundamentally different mathematical rules and kept producing long sets of numbers that matched each other exactly. Mathematicians picked up the thread and elaborated it into a whole new field of inquiry, called mirror symmetry, that investigates the concurrence — and many others like it.

“Physics would come up with these amazing predictions, and mathematicians would try to prove them by our own means,” said Ben-Zvi. “The predictions were strange and wonderful, and they turned out to be pretty much always correct.”

But while QFT has been successful at generating leads for mathematics to follow, its core ideas still exist almost entirely outside of mathematics. Quantum field theories are not objects that mathematicians understand well enough to use the way they can use polynomials, groups, manifolds and other pillars of the discipline (many of which also originated in physics).

For physicists, this distant relationship with math is a sign that there’s a lot more they need to understand about the theory they birthed. “Every other idea that’s been used in physics over the past centuries had its natural place in mathematics,” said Seiberg. “This is clearly not the case with quantum field theory.”

I like to say the physicists don’t necessarily know everything, but the physics does.

David Ben-Zvi, the University of Texas, Austin

And for mathematicians, it seems as if the relationship between QFT and math should be deeper than the occasional interaction. That’s because quantum field theories contain many symmetries, or underlying structures, that dictate how points in different parts of a field relate to each other. These symmetries have a physical significance — they embody how quantities like energy are conserved as quantum fields evolve over time. But they’re also mathematically interesting objects in their own right.

“A mathematician might care about a certain symmetry, and we can put it in a physical context. It creates this beautiful bridge between these two fields,” said Castro.

Mathematicians already use symmetries and other aspects of geometry to investigate everything from solutions to different types of equations to the distribution of prime numbers. Often, geometry encodes answers to questions about numbers. QFT offers mathematicians a rich new type of geometric object to play with — if they can get their hands on it directly, there’s no telling what they’ll be able to do.

“We’re to some extent playing with QFT,” said Dan Freed, a mathematician at the University of Texas, Austin. “We’ve been using QFT as an outside stimulus, but it would be nice if it were an inside stimulus.”

Make Way for QFT

Mathematics does not admit new subjects lightly. Many basic concepts went through long trials before they settled into their proper, canonical places in the field.

Take the real numbers — all the infinitely many tick marks on the number line. It took math nearly 2,000 years of practice to agree on a way of defining them. Finally, in the 1850s, mathematicians settled on a precise three-word statement describing the real numbers as a “complete ordered field.” They’re complete because they contain no gaps, they’re ordered because there’s always a way of determining whether one real number is greater or less than another, and they form a “field,” which to mathematicians means they follow the rules of arithmetic.

“Those three words are historically hard fought,” said Freed.

In order to turn QFT into an inside stimulus — a tool they can use for their own purposes — mathematicians would like to give the same treatment to QFT they gave to the real numbers: a sharp list of characteristics that any specific quantum field theory needs to satisfy.

Kevin Costello of the Perimeter Institute is creating a framework that may eventually put quantum field theory on rigorous mathematical grounds.
Credit: Gabriela Secara/Perimeter Institute

A lot of the work of translating parts of QFT into mathematics has come from a mathematician named Kevin Costello at the Perimeter Institute. In 2016 he coauthored a textbook that puts perturbative QFT on firm mathematical footing, including formalizing how to work with the infinite quantities that crop up as you increase the number of interactions. The work follows an earlier effort from the 2000s called algebraic quantum field theory that sought similar ends, and which Rejzner reviewed in a 2016 book. So now, while perturbative QFT still doesn’t really describe the universe, mathematicians know how to deal with the physically nonsensical infinities it produces.

“His contributions are extremely ingenious and insightful. He put [perturbative] theory in a nice new framework that is suitable for rigorous mathematics,” said Moore.

Costello explains he wrote the book out of a desire to make perturbative quantum field theory more coherent. “I just found certain physicists’ methods unmotivated and ad hoc. I wanted something more self-contained that a mathematician could go work with,” he said.

By specifying exactly how perturbation theory works, Costello has created a basis upon which physicists and mathematicians can construct novel quantum field theories that satisfy the dictates of his perturbation approach. It’s been quickly embraced by others in the field.

“He certainly has a lot of young people working in that framework. [His book] has had its influence,” said Freed.

Costello has also been working on defining just what a quantum field theory is. In stripped-down form, a quantum field theory requires a geometric space in which you can make observations at every point, combined with correlation functions that express how observations at different points relate to each other. Costello’s work describes the properties a collection of correlation functions needs to have in order to serve as a workable basis for a quantum field theory.
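Schematically (the notation here is a generic textbook convention, not necessarily Costello’s), the raw data of such a theory is a space $X$ together with expectation values:

```latex
% Correlation functions: for observables O_i measured at points x_i of a space X,
% the theory assigns a number to each configuration of measurements.
\[
  \langle \mathcal{O}_1(x_1)\, \mathcal{O}_2(x_2) \cdots \mathcal{O}_n(x_n) \rangle,
  \qquad x_1, \dots, x_n \in X.
\]
```

The content of a definition like Costello’s is then the list of consistency conditions — symmetry under relabeling the points, controlled behavior as points approach one another, and so on — that such a family of functions must satisfy in order to count as a workable quantum field theory.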

The most familiar quantum field theories, like the Standard Model, contain additional features that may not be present in all quantum field theories. Quantum field theories that lack these features likely describe other, still undiscovered properties that could help physicists explain physical phenomena the Standard Model can’t account for. If your idea of a quantum field theory is fixed too closely to the versions we already know about, you’ll have a hard time even envisioning the other, necessary possibilities.

“There is a big lamppost under which you can find theories of fields [like the Standard Model], and around it is a big darkness of [quantum field theories] we don’t know how to define, but we know they’re there,” said Gaiotto.

Costello has illuminated some of that dark space with his definitions of quantum fields. From these definitions, he’s discovered two surprising new quantum field theories. Neither describes our four-dimensional universe, but they do satisfy the core demands of a geometric space equipped with correlation functions. Their discovery through pure thought is similar to how the first shapes you might discover are ones present in the physical world, but once you have a general definition of a shape, you can think your way to examples with no physical relevance at all.

And if mathematics can determine the full space of possibilities for quantum field theories — all the many different possibilities for satisfying a general definition involving correlation functions — physicists can use that to find their way to the specific theories that explain the important physical questions they care most about.

“I want to know the space of all QFTs because I want to know what quantum gravity is,” said Castro.

A Multi-Generational Challenge

There’s a long way to go. So far, all of the quantum field theories that have been described in full mathematical terms rely on various simplifications, which make them easier to work with mathematically.

One way to simplify the problem, going back decades, is to study simpler two-dimensional QFTs rather than four-dimensional ones. A team in France recently nailed down all the mathematical details of a prominent two-dimensional QFT.

Other simplifications assume quantum fields are symmetrical in ways that don’t match physical reality, but that make them more tractable from a mathematical perspective. These include “supersymmetric” and “topological” QFTs.

The next, and much more difficult, step will be to remove the crutches and provide a mathematical description of a quantum field theory that better suits the physical world physicists most want to describe: the four-dimensional, continuous universe in which all interactions are possible at once.

“This is [a] very embarrassing thing that we don’t have a single quantum field theory we can describe in four dimensions, nonperturbatively,” said Rejzner. “It’s a hard problem, and apparently it needs more than one or two generations of mathematicians and physicists to solve it.”

But that doesn’t stop mathematicians and physicists from eyeing it greedily. For mathematicians, QFT is as rich a type of object as they could hope for. Defining the characteristic properties shared by all quantum field theories will almost certainly require merging two of the pillars of mathematics: analysis, which explains how to control infinities, and geometry, which provides a language for talking about symmetry.

“It’s a fascinating problem just in math itself, because it combines two great ideas,” said Dijkgraaf.

If mathematicians can understand QFT, there’s no telling what mathematical discoveries await in its unlocking. Mathematicians defined the characteristic properties of other objects, like manifolds and groups, long ago, and those objects now permeate virtually every corner of mathematics. When they were first defined, it would have been impossible to anticipate all their mathematical ramifications. QFT holds at least as much promise for math.

“I like to say the physicists don’t necessarily know everything, but the physics does,” said Ben-Zvi. “If you ask it the right questions, it already has the phenomena mathematicians are looking for.”

And for physicists, a complete mathematical description of QFT is the flip side of their field’s overriding goal: a complete description of physical reality.

“I feel there is one intellectual structure that covers all of it, and maybe it will encompass all of physics,” said Seiberg.

Now mathematicians just have to uncover it.

Math, Music and Imagination


Math can be experienced as play much as music is—just what’s needed to enlarge the tribe of creative problem solvers in mathematics and other human disciplines

Marcus Miller on sax.

Like most New Yorkers, I tend to work late. My typical evening involves leaving the stage shortly after midnight and then preparing some problems in number theory or combinatorics until about 3 or 4 AM. I am a jazz musician and mathematician. My skill set allows me to interpret musical experience through the language of mathematical structure and creative problem solving.

My practice involves using ideas of mathematical transformations on melodies, rhythms, and harmony. My compositions are developed using relationships between sound frequencies. But, to me, the notion that math and music are the same thing is both terribly poetic and also too reductionist to be useful. Still I believe the two disciplines are connected by an uncanny similarity in the roles creativity and imagination play in both.
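As a concrete illustration of what “mathematical transformations on melodies” can mean — a toy example of my own construction, not a transcription of Miller’s actual practice — here are three classic operations on a melody encoded as MIDI pitch numbers (middle C = 60):

```python
# Melodies as lists of MIDI pitch numbers; each transformation below is a
# simple mathematical operation that composers have used for centuries.

def transpose(melody, interval):
    """Shift every pitch by a fixed interval (addition)."""
    return [p + interval for p in melody]

def invert(melody, axis):
    """Reflect every pitch about an axis pitch (reflection)."""
    return [2 * axis - p for p in melody]

def retrograde(melody):
    """Play the melody backwards (reversal)."""
    return list(reversed(melody))

# Opening pitches of "Frère Jacques" in C major: C D E C
theme = [60, 62, 64, 60]
print(transpose(theme, 7))   # up a perfect fifth -> [67, 69, 71, 67]
print(invert(theme, 60))     # mirrored about middle C -> [60, 58, 56, 60]
print(retrograde(theme))     # reversed -> [60, 64, 62, 60]
```

Transposition is addition, inversion is reflection, retrograde is reversal; chaining and combining such operations is exactly the kind of structured play the essay goes on to describe.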

I started learning music at nine years old and worked as a musician through my teenage years, but opted to attend a university instead of a conservatory on the advice of my music mentors who encouraged me to learn more about the world. At school, I became enamored of math because of the allure of the elegant theoretical worlds mathematicians built. Upon graduation, however, I decided to spend my twenties back on stage rather than in a graduate school library. Still, I continued reading math texts, as well as tutoring high school and college students, whenever I wasn’t touring around the world.

The uses of math in music are legion. You can find math in the construction of modern harmony and counterpoint, in the development of rhythms, and in the proportioning of arrangements. What I would like to see further explored are the commonalities between the subjective experiences of doing math and music. Although the lifestyles of mathematicians and musicians might seem worlds apart, at least for me the “thought work” behind them is more closely related than you might think: the magic of engaging with math and music fundamentally changes the way you imagine and create.


Much of this “thought work” can be summarized as first creating, in the senses (and perhaps on a whiteboard or an instrument), a representation of an idea, and then imagining it transformed in creative and useful ways. Developing the representation is thus a form of self-discovery, while transforming it is a kind of play. Can you spot the inference to be made here? It deeply informs my life, and I would love to see it become more present in our culture:

Math can be experienced as play in much the same way music is.

Let me explain. To improvise or compose one must learn the technique of the instrument and the harmonic and rhythmic language of music. All the while, the fun comes in imagining and experimenting with the technical and linguistic components one has incorporated, in order to invoke a sensation, express an emotion, or tell a story. As the mastery of both the instrument and the underlying language expands, the mind becomes more sensitive to different ideas while the body becomes more competent at putting those ideas into practice. The process thus expands naturally from creative absorption to transformation, and eventually to execution.

Math can work similarly. A student must become adept with numbers and other symbols, various rules of algebra and calculus to manipulate symbols, and several functions. This is the language of mathematics; its grammar, and its technique. Mathematical problems can be viewed as structured opportunities to play with what is already known in order to discover what is not.

Through this process, a mathematician begins to develop a sense of the nature of mathematical ideas and their logical interrelationships, thus becoming sensitive to new ideas while becoming better equipped to manipulate them internally. To think of math as just formulas memorized through rote learning and mechanical thoughtless symbol shunting tragically misses the point.

As with music, everyone incorporates the underlying language, grammar and technique in their senses differently, and thus comes to their own individual understanding that leads them to express ideas in their own unique way. Contrary to the trope of the socially dysfunctional lone genius, mathematicians collaborate for most of their work, which makes them in some sense much like musicians. Expressing our internal worlds through pictures, words, and symbols and sharing them with one another, riffing off of each other’s ideas is how much of modern mathematics is done.

What if the world understood math in this way? What if we educated with the idea of playing with numbers in order to master arithmetic the same way improvising musicians are taught to play with musical notes to learn their scales? What if we honored the unique way that people understand and taught from that space rather than by rote? What if we refined people’s logical aesthetics to the point that mathematics felt more personal, more artsy, and the profound experiences of mathematical “beauty,” “elegance” or “risk” weren’t reserved for an intellectual elite?

Math as self-discovery, math as play. These two ideas may seem foreign at first, but I am convinced this change in paradigm is exactly what’s needed in order to enlarge the tribe of creative problem solvers in mathematics and many other human disciplines—equipping them to “jam” on the world’s toughest challenges.

The exact age when girls lose interest in science and math


Teenage girls can be a fickle bunch, especially when it comes to their interest in science, technology, engineering and math.

A new survey commissioned by Microsoft found that young girls in Europe become interested in so-called STEM subjects around the age of 11 and then quickly lose interest when they’re 15.

“Conformity to social expectations, gender stereotypes, gender roles and lack of role models continue to channel girls’ career choices away from STEM fields,” said psychology professor Martin Bauer of the London School of Economics, who helped coordinate the survey of 11,500 girls across 12 European countries.

The survey also found that girls’ interest in humanities subjects drops around the same age but then rebounds sharply. Interest in STEM subjects does not recover.

“This means that governments, teachers and parents only have four or five years to nurture girls’ passion before they turn their backs on these areas, potentially for good,” Microsoft said.

A new Microsoft-commissioned survey shows European girls lose interest in STEM subjects around the age of 15.

Microsoft admitted it doesn’t have a comprehensive explanation for why 15-year-old girls lose interest in science and math. But it has uncovered some strategies to keep them engaged:

Promote female role models in STEM subjects: It’s much easier for girls to imagine a career in STEM subjects if they see successful examples.

Microsoft also found that girls are more likely to pursue a career in this area if they think men and women will be treated equally in the workforce.

“Perceived inequality [in the workplace] is actually putting them off further STEM studies and careers,” Microsoft said.

Six in 10 girls admitted they’d feel more confident pursuing a STEM career if they knew men and women were already equally employed in these fields.

Offer hands-on STEM exercises: These experiences, both inside and outside the classroom, can bring the subject to life. About four in 10 girls say they don’t get enough practical experiences.

Microsoft said it’s also important to show girls how the material can be applied in real-life situations, giving the topics more relevance in their lives.

More mentors: Having teachers who mentor and encourage girls in these subjects can have even more of an impact than parental encouragement.

It also helps if this teacher is female.

Top tech titans have been the target of criticism for years for their male-dominated working environments. High profile accusations of sexism and harassment are not uncommon.

Microsoft’s own 2016 global workplace report shows that just 26% of its employees are female and less than 18% of its engineers are women.

The company acknowledged in a report accompanying the survey that its success depends on the diverse skills and experiences of its employees.

“A diverse and inclusive workforce will yield better products and solutions for our customers, and better experiences for our employees,” it said. “When we encourage girls to pursue science and technology, we double our potential to solve problems.”

Legendary physicist Freeman Dyson talks about math, nuclear rockets, and astounding things about the universe 


Mathematician and physicist Freeman Dyson has had a career as varied as it has been successful. A former professor of physics at the Institute for Advanced Study in Princeton, he has worked on the unification of the three rival formulations of quantum electrodynamics, including the one invented by Richard Feynman, as well as on nuclear reactors, solid-state physics, ferromagnetism, astrophysics, biology, and the pursuit of useful and elegant math problems. One of his ideas, the Dyson Sphere, was featured in a “Star Trek” episode. Today, Dyson frequently writes about science and technology’s relationship to ethics and social issues. Business Insider sat down with him and talked about math, war, the human brain, the education system, and the Orion Project.

This interview has been edited for clarity and length.

Elena Holodny: Who has most inspired you in either math or science?

Freeman Dyson: Oh, definitely [Nobel Prize-winning physicist Richard] Feynman. Dick Feynman … he has now become famous, to my great joy, because when I knew him he was completely unknown. But I recognized him as being something special. He was only for a short time at Cornell when I was a student and he was a young professor. So I didn’t work with him, but I just sat at his feet, literally, and listened to him talk. He was a clown, of course, and also a genius. It was a good combination.


Holodny: What does it mean to you to be a good mathematician-slash-scientist versus a great one?

Dyson: I would say it’s just like any other art – mathematics is really an art, not a science. You could say science also is an art. So I would say the difference is something you can’t really describe – you can only recognize. You hear somebody playing the violin, and it was Fritz Kreisler or it was somebody else, and you can tell the difference.

It is so in almost every art. We just don’t understand why it is that there are just a few people who are just completely off the scale and the rest of them are just mediocre. And we don’t know why. But I say it’s certainly true of mathematics.

Holodny: What are your thoughts on math as an absolute versus as a way to measure things?

Dyson: Well, both, of course, are real. That’s the beauty of it. You have this world of mathematics, which is very real and which contains all kinds of wonderful stuff. And then we also have the world of nature, which is real, too.

That’s, of course, the beautiful thing about science – that it’s all about things we don’t understand, not just the things we do understand.

And that, by some miracle, the language that nature speaks is the same language that we invented for mathematics. That’s just an amazing piece of luck, which we don’t understand.

Holodny: It’s interesting that in some fields – for example, in economics – that math models do not perfectly reflect what’s happening in the real world in the same way that they do in physics.

Dyson: Yes, that’s another mystery. That’s, of course, the beautiful thing about science – that it’s all about things we don’t understand, not just the things we do understand.

In fact, there’s a wonderful essay that I was just reading by [German mathematician David] Hilbert. When he was quite an old man, he gave a wonderful talk in Königsberg, in about 1930, about the relation between physics and mathematics, essentially. Only what is quite amazing is that he talked also about genetics – and with an expert knowledge. I mean, I was amazed. That’s Hilbert, the very purest of mathematicians, and he understood all about the genetics of fruit flies and how you could axiomatize genetics of fruit flies and deduce the existence of a linear structure for heredity. And, of course, just 10 years before DNA was identified. But Hilbert really understood it, though nobody was listening.

Holodny: Could you talk about your experience working on the Orion Project?

Dyson: Well, that was of course a great adventure. It was just good luck. Again, everything in my life was luck. The key to having an interesting life is to always say “yes” to anything crazy. Orion was obviously crazy. So I said “yes” and had a great time.

The key to having an interesting life is to always say ‘yes’ to anything crazy.

The idea was to explore the universe with a spaceship driven by atomic bombs. And so the double objective was to be scooting around very fast in space, which would’ve been great, and also getting rid of the bombs, which was also great. It was the only really good way of getting rid of bombs. And, unfortunately, of course it never happened, but we really believed in it at the time.

The project started in 1958 – just at the same time as the Apollo project to go to the moon. So we were competing with Wernher von Braun. Von Braun won, of course, but he was friendly to us. [Laughs] After he had won, he was helpful to us.

But after the first two years or so, it was no longer really practical. Then it was just a theoretical program to understand what you could do. But still it was interesting. So I came back after two years and continued living here, in Princeton. The project went on for another five years, but it was no longer people really expecting to fly.


Holodny: You participated in World War II as an analyst. What was that like?

Dyson: I was lucky. I was protected. Because of [Henry] Moseley. Moseley was a British scientist. He was a very, very brilliant young man who discovered the connection between chemistry and X-rays, that you can identify chemical elements just by looking at the X-rays.

That was Moseley’s Law, which he discovered in 1913 when he was just a young kid. And he definitely would’ve had the Nobel Prize.

However, in 1914, war broke out. In England, that was a volunteer war, and all the young people volunteered, including Moseley. And he went to Gallipoli and got killed.

So that was a huge tragedy for English science. And the English government then decided before World War II that this time the scientists were going to be kept alive. It was government policy that if you were a bright, young scientist, you were not allowed to get killed […] I thought it was terribly unfair. I was one of the beneficiaries, so I was given a safe job. Meanwhile, my friends were getting killed. So I had a very bad feeling about the whole situation.

But anyway, I was sent to the headquarters of the bomber command to work as a scientist, collecting information about the bombing of Germany. And there, it was exactly the same as it’s been in Afghanistan in the last few years. It’s amazing – it’s exactly the same mistakes are made over and over again. It’s the same situation, essentially.


There was one of the generals in Afghanistan who wrote a paper, which was leaked by somebody, describing what happened in Afghanistan – and it’s exactly what happened to us in World War II.

They had this huge system of information gathering in Afghanistan. They had satellites. They had drones flying over all the time, and people on the ground collecting information. This was all then collected and sent to some place in Virginia, where there were thousands of expert analysts looking at all this information.

So you had this whole apparatus gathering intelligence – all one way, just going from Afghanistan to Virginia. But nothing ever came back. This was all so secret. It wasn’t allowed for the information to get back to the soldiers, who might have used it.

And exactly the same thing happened to us. I was one of these analysts, sitting at the command headquarters in England. The bombing was disastrously costly, and we were losing bombers at a horrible rate – something like 40,000 young men were killed just on the bombing. And we were supposed to find out how they were being killed, in order to do something about it.

We never understood what was going on. Nothing that we ever discovered ever went back to the crews who may have done something with it. The whole thing was a disaster. I was, of course, completely aware of this. So from my point of view, that was a horrible time. And the bombing did very little to help the war. It killed a lot of people, but that was about it. […]

It’s a horrible business, of course. I mean, the killing, what’s going on now in Syria is … inexcusable from any point of view. They certainly don’t need to be told how war is bad. I don’t know what can be done in Syria, but, clearly, our being there is not helping.

Anyway, that’s what I learned from World War II. Things are always more complicated than most people believe.

Holodny: Although we’ve been talking about war, you are generally optimistic about the future. You once said, “We just came down from the trees rather recently, and it’s astonishing how well we can do.” How do you maintain that optimism?

Dyson: Well, I think we’re doing pretty well. It’s clear the media, of course, always gives you the bad news. And people who rely on the media, like Mr. Trump, think that everything is a disaster. [Laughs] The media always tries to make everything into a disaster, but it’s mostly rubbish. It’s a point of fact that we’re doing extremely well.

The thing that makes me most optimistic is China and India. … They’re the places where things are enormously better now than they were 50 years ago.

The thing that makes me most optimistic is China and India – both of them doing well. It’s amazing how much progress there’s been in China, and also India. Those are the places that really matter – they’re half of the world’s population. They’re the places where things are enormously better now than they were 50 years ago. And I don’t see anything that’s going to stop that.

People who travel in China tell me that the mood there is still very upbeat, because their media is different from our media. Chinese media emphasize how well things are going and suppress the bad news and publish the good news. Of course, we do just the opposite. […]

The point of fact is, just in simple ways, you can see how much better things have gotten. I mean, when I was a child, I lived in England, and England was just amazingly polluted. We didn’t use that word. We just said it was it all covered with soot. [Laughs] If you went to London for a day, in the evening the color of your shirt was black. So much soot in the air. And of course there were no fish in the Thames. The whole place was filthy and everybody was burning coal.

Anyhow, it didn’t take very long to clean that up. If you go to London now, not everything is beautiful, but it’s amazingly better than it was. And the Thames is certainly a lot better: There are fish in the Thames.

Holodny: With only three eyes?

Dyson: Actually, they’re very healthy! In so many ways, nature, in fact, has been preserved. Of course, the English countryside is completely artificial. It was naturally a forest; they chopped down the trees and made it into what it is now: really a beautiful country. And even New Jersey is not so bad.

Holodny: You won the Templeton Prize for your contributions to science and its relation to other disciplines such as religion and ethics. I was wondering if you had any thoughts on the interplay of science and religion?

Dyson: Yes, because I don’t believe in it. I think they should be separate. Of course, it’s a personal question. Some of my friends like to keep them together, but I certainly like to keep them separate. For me, science is just a bunch of tools – it’s like playing the violin. I just enjoy calculating, and it’s an instrument I know how to play. It’s almost an athletic performance, in a way. I was just watching the Olympics, and that’s how I feel when proving a theorem.

For me, science is just a bunch of tools – it’s like playing the violin. I just enjoy calculating, and it’s an instrument I know how to play.

Anyway, religion is totally different. In religion, you’re supposed to be somehow in touch with something deep and full of mysteries. Anyhow, to me, that’s something quite separate.

Holodny: Do you think that there’s a way they could complement each other? Or are they just in two completely different lanes?

Dyson: Well, they are, of course, two different ways of looking at the universe; and it’s the same universe with two different windows. I like to use the metaphor “windows.”

The science window gives you a view of the world, and the religion window gives you a totally different view. You can’t look at both of them at the same time, but they’re both true. So that’s sort of my personal arrangement, but, of course, other people are quite different.

Holodny: The brain’s another interesting thing. On the one hand, it’s an organ, and that’s just plain old biology. But on the other hand, the thoughts are something else.


Dyson: Well, what I have been thinking about – this doesn’t answer your question – but I think that the artificial-intelligence people are making a lot of noise recently, claiming that artificial intelligence is making huge progress and we’re going to be outstripped by the machines. They found out how to play Go, which, of course, is a big step.

But, in my view, this sort of – this whole field is based on a misconception. I think the brain is analog, whereas the machines are digital. They really are different. So I think that what the machines can do, of course, is wonderful, but it’s not the same as what the brain can do.

The brain, being analog, is able to grasp images so much better. The brain is just designed for comparing images and some patterns – patterns in space and patterns in time – which we do amazingly well. Computers can do it, too, but not in anything like the same kind of flexibility.

So, anyway, that’s sort of my view about the brain. That we won’t really understand the brain until we can make models of it which are analog rather than digital, which nobody seems to be trying very much.

Holodny: In your opinion, what’s the biggest misconception about mathematics?

Dyson: I think the biggest misconception is that everybody has to learn it. That seems to be a complete mistake. All the time worrying about pushing the children and getting them to be mathematically literate and all that stuff. It’s terribly hard on the kids. It’s also hard on the teachers. And I think it’s totally useless.

To me, mathematics is like playing the violin. Some people can do it – others can’t. If you don’t have it, then there’s no point in pretending. So I think that’s the fundamental mistake. Because I’m prejudiced about education altogether. I think it’s terribly overrated.

Holodny: Yeah, I noticed you don’t have a Ph.D. Are you not into the Ph.D. system?

Dyson: Oh, very much against it. I’ve been fighting it unsuccessfully all my life.

Holodny: Any reason in particular?

Dyson: Well, I think it actually is very destructive. I’m now retired, but when I was a professor here, my real job was to be a psychiatric nurse. There were all these young people who came to the institute, and my job was to be there so they could cry on my shoulder and tell me what a hard time they were having. And it was a very tough situation for these young people. They come here. They have one or two years and they’re supposed to do something brilliant. They’re under terrible pressure – not from us, but from them.


So, actually, I’ve had three of them who I would say were just casualties who I’m responsible for. One of them killed himself, and two of them ended up in mental institutions. And I should’ve been able to take care of them, but I didn’t. I blame the Ph.D. system for these tragedies. And it really does destroy people. If they weren’t under that kind of pressure, they could all have been happy people doing useful stuff. Anyhow, so that’s my diatribe. But I really have seen that happen.

And also, of course, it wastes a tremendous amount of time – especially for women, it’s particularly badly timed.

If they’re doing a Ph.D., they have a conflict between raising a family or finishing the degree, which is just at the worst time – between the ages of 25 to 30 or whatever it is. It ruins the five years of their lives.

And I see the difference in the business world. My daughter happens to be a businesswoman, so I meet a lot of her young friends.

The life there is so much easier for women. They start a company when they’re 20; they go bust when they’re 22. [Laughs] Meanwhile, they have a kid, and nobody condemns them for going bust. If you’re in the business world, that’s what’s expected: You should go bust and then start again on something else. So it’s a much more relaxed kind of a culture. It’s also competitive, but not in such a vicious way. I think the academic world is actually much more destructive of young people.

[The Ph.D. system] was designed for a job in academics. And it works really well if you really want to be an academic, and the system actually works quite well. So for people who have the gift and like to go spend their lives as scholars, it’s fine. But the trouble is that it’s become a kind of a meal ticket – you can’t get a job if you don’t have a Ph.D. So all sorts of people go into it who are quite unsuited to it. […]

Anyway, so, I’m happy that I’ve raised six kids, and not one of them is a Ph.D.

Holodny: So then, what would you say to a young person who is interested in math and science?

Dyson: Try it out, I would say. The fact is some people have it and others don’t. So find out if you’re really good. And if you are, that’s wonderful, and if you’re not, then find something else. Make it an experiment.

I think it’s a big mistake to decide too soon what you’re going to do with your life.

Holodny: Is there anything else that’s been on your mind?

Dyson: Well, I do have one other thing on my mind … but it’s not relevant to this talk. Do you know the name Sarant? Or the name of Staros? It’s the same person!


Anyhow, Alfred Sarant was a good friend of mine in the old days when I was a student at Cornell and Alfred was a young professor of engineering.

In those days, it was just after the war. We were all sort of still with the wartime ethic of shared hardships, and everybody helped everybody else. So he built a house, and we all helped. I remember putting up the roof on the house. It was a great time; we all enjoyed it. And he was there with his wife and two kids.

Next door, there was another professor called Bruce Dayton with his wife, Carol. They also had a house and two kids. So it was a friendly community, and they were all living happily.

Anyway, one day, the wife of Alfred Sarant woke up and she found her kids were in bed in the morning, but the husband had disappeared. In the same morning, Bruce Dayton woke up and found his kids were still there, but his wife had disappeared.

Holodny: Oh boy.

Dyson: Anyhow, it was a big deal. It was a huge search for these missing people and nobody ever found a trace of them. They just disappeared from the face of the earth.

Anyway, so it’s now, whatever it is, 65 years later. I met a lady in California who is the wife of a mathematician who emigrated from Russia. So he’s now living happily in California – he’s a very good mathematician – and his wife is there with him. And I happened to meet these people at lunch just recently. I started talking to the wife, and it all became a little bit strange. She was talking about her life – and it turns out she’s the daughter of the missing pair!

Anyhow, so, she’s doing fine. And it turns out, her father and mother actually did fine in Russia. So he changed his name to Staros, and they became Russian. He became, in fact, a leading computer expert in Russia. He was a personal friend of Khrushchev, rose very high in the Soviet system. So it was, on the whole, a story with a happy ending. They had three more kids in Russia who are still there.

And, of course, they were spies. They were tipped off – they were actually working with the Rosenbergs who were executed. They got tipped off just in time. So they left and did pretty well.

Holodny: What’s the most astounding thing about the universe to you?

Dyson: Almost everything about the universe is astounding. I don’t know how you would measure astounding-ness… I think the most amazing thing is how gifted we are – as you were saying at the beginning, that we are only monkeys who came down from the trees just recently. We have these amazing gifts of music and mathematics and painting and Olympic running. I mean, we’re the animal that is best of all the animals at long-distance running. Why? It is quite amazing. Superfluous gifts you don’t really need to survive.

Of course, long-distance running has to do with the fact that we’re hunters. There’s a book about that, called “Why We Run,” by Bernd Heinrich. He’s a wonderful guy. He’s a German who came to live in this country. He came to this country with no money. He wanted to study the wildlife. He really loves the wildlife; that’s his passion. He now writes books about the wildlife. […]

But in order to get an education in the United States, he had to be an athlete. So he took up long-distance running, and actually he had the world record for 100 miles. He’s a real long-distance runner, an amazing runner. And he also wrote this book about running. Really remarkable character. […]

He’s a professor at the University of Vermont and he lives in Maine. He wrote a wonderful book called “Ravens in Winter,” describing how ravens organize their lives. They live on big animals that die in the snow. And so you have enough food for 100 ravens, suddenly, and it’s only there for a short time.

So how do they deal with that? Well, the answer is they have a very good communication system. They gather all their friends and relations from miles and miles away – as far as 100 miles – and they come flying and feast on this animal that’s dead. Many of these birds survive only by flying long distances, but they have to have a signaling system so they know where to go. Anyhow, it’s a very interesting study. […]

He came into our kitchen once and looked out of the window. And he immediately identified 14 species of birds – just outside our window. We only knew two of them!

The world is just – it’s wonderful when you look at all the detail. It’s just amazing. Why are there so many kinds of birds? And then you have to be a very special kind of person to know them. Nothing is boring if you look at it carefully.

I think that’s what I would say: It’s us that’s really amazing. As far as I can see, our concentration of different abilities in one species – there’s nothing I can see in Darwinian evolution that could’ve done that. So it seems to be a miracle of some sort.

If you ask Siri to divide zero by zero, she will emotionally destroy you


Siri has had it up to here with your constant attempts to confuse her artificial brain.

Users have picked up on a new Easter egg in the virtual assistant’s answer repertoire. If you’re feeling especially confident today, go ahead and ask: “Siri, what is zero divided by zero?”

“Imagine that you have zero cookies and you split them evenly among zero friends. How many cookies does each person get? See? It doesn’t make sense. And Cookie Monster is sad that there are no cookies, and you are sad that you have no friends.”

It makes you feel pretty bad when your iPhone tells you that you have no friends, doesn’t it? Don’t get too down, though. If you ask nicely, Siri will assure you that she is your friend.

Siri giveth and Siri taketh away.
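Siri's cookie logic is mathematically sound: zero divided by zero has no consistent answer. Programming languages refuse the question in their own ways, as this quick illustration in Python (my example, not Apple's) shows:

```python
# Integer 0/0: Python refuses outright with an exception.
try:
    print(0 / 0)
except ZeroDivisionError as err:
    print("Python says:", err)  # Python says: division by zero

# IEEE-754 floating point instead defines 0.0/0.0 as NaN ("not a
# number"). Python raises there too, but NaN itself is reachable:
import math
print(math.nan == math.nan)  # False: NaN isn't even equal to itself
```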

Watch the video: https://youtu.be/wBNJ0BH3Dgs

Mathematician claims breakthrough in complexity theory


For days, rumors about the biggest advance in years in so-called complexity theory have been lighting up the Internet. That’s only fitting, as the breakthrough involves comparing networks just like researchers’ webs of online connections. László Babai, a mathematician and computer scientist at the University of Chicago in Illinois, has developed a mathematical recipe or “algorithm” that supposedly can take two networks—no matter how big and tangled—and tell whether they are, in fact, the same, in far fewer steps than the previous best algorithm. Computer scientists are abuzz, as the task had been something of a poster child for hard-to-solve problems.

In the graph isomorphism problem, the challenge is to determine whether two apparently different graphs can be rearranged to be identical, as these two can.

“If this is correct it’s probably the theoretical computer science result of the decade,” says Scott Aaronson, a computer scientist and blogger at the Massachusetts Institute of Technology in Cambridge.

Complexity theory is basically the study of what’s hard or easy to solve with a computer. In it, the key thing is how the number of steps it takes to solve a problem grows with the size of the input. Suppose, for example, that you want to determine whether a given number, say 983 or 105227, is prime—divisible only by 1 and itself. The number of computational steps in that calculation grows relatively slowly with the number of digits in the number. Generally, it grows with the number of digits, n, raised to a fixed power—something akin to n². Expressions like that are called polynomials, so the problem is said to be solvable in “polynomial time” and is in the complexity class “P.”
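To make the primality example concrete, here is a minimal sketch (my own illustration, not from the article) of a Fermat-style probabilistic primality test. Each call to Python's built-in `pow(a, n - 1, n)` uses fast modular exponentiation, taking a number of steps polynomial in the number of digits of n—the kind of scaling that puts primality testing in P (settled deterministically by the AKS algorithm in 2002).

```python
import random

def probably_prime(n, rounds=20):
    """Fermat test: False means definitely composite; True means
    almost certainly prime (rare pseudoprimes can slip through)."""
    if n < 2:
        return False
    if n in (2, 3):
        return True
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        # For prime n, a**(n-1) % n == 1 for every such a (Fermat's
        # little theorem); any other result proves n is composite.
        if pow(a, n - 1, n) != 1:
            return False
    return True

print(probably_prime(983))     # True: 983 is prime
print(probably_prime(105227))  # True: 105227 is prime too
```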

On the other hand, suppose you want to divide a given number such as 21,112,331 into all of its factors. So far, nobody has found a polynomial-time algorithm to do that. So factoring is thought to be harder and to reside in a broader class of problems known as NP, for “nondeterministic polynomial.” Crudely speaking, for the harder NP problems, the number of steps is thought to blow up exponentially with the number of digits in the input—growing as something like 2ⁿ, which gets bigger much faster than n². (For example, 100² is a mere 10,000; 2 to the 100th power is more than a million trillion trillion.)
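The gulf in that parenthetical is easy to check directly; this quick computation (mine, for illustration) shows how 2ⁿ dwarfs n² at n = 100:

```python
n = 100
print(n ** 2)  # 10000
print(2 ** n)  # 1267650600228229401496703205376, i.e. more than 10**30
```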

Now, Babai has shown that a problem that was thought to be on the harder NP end of the spectrum is closer to the easier P end, instead. Any network—whether it represents people spreading the flu or proteins interacting within an organism—can be depicted as a set of points, or nodes, connected by straight lines, or edges, that symbolize interactions between the nodes. Because the nodes can be plopped down any old which way, two graphs that have the same arrangement of connections can appear different (see figure), especially as the graph gets bigger and more complicated. In the “graph isomorphism problem,” the challenge is to determine whether two graphs are really the same or not. Babai has found a new algorithm to solve that problem, as he announced today.
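For intuition, here is a deliberately naive graph-isomorphism check (my sketch—not Babai's algorithm): try every relabeling of one graph's nodes and see whether any reproduces the other graph's edges. With n nodes there are n! relabelings, which is exactly the kind of exhaustive search a clever algorithm must avoid.

```python
from itertools import permutations

def isomorphic(nodes, edges1, edges2):
    """Brute force: test all len(nodes)! relabelings of graph 1."""
    target = {frozenset(e) for e in edges2}
    for perm in permutations(nodes):
        relabel = dict(zip(nodes, perm))
        if {frozenset((relabel[a], relabel[b])) for a, b in edges1} == target:
            return True
    return False

# A triangle with a tail, drawn two different ways:
print(isomorphic([1, 2, 3, 4],
                 [(1, 2), (2, 3), (1, 3), (3, 4)],
                 [(4, 1), (1, 2), (4, 2), (2, 3)]))  # True
```

Even at 20 nodes, 20! is about 2.4 quintillion relabelings, so this approach collapses almost immediately as graphs grow.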

For the previous best method, invented in 1983 by Eugene Luks, a mathematician and computer scientist at the University of Oregon in Eugene, the number of steps grows almost exponentially with the number of nodes in the networks to be compared. In Babai’s algorithm, the number of steps grows only slightly faster than polynomial time. That may sound like comparing shades of gray, but for experts that qualitative difference is thrilling. “Assuming this result holds, it would be a gem of [theoretical computer science], and an incredible thing to witness in real time,” one reader wrote on Aaronson’s blog days before Babai’s talk. “Super exciting!” wrote a reader on another blog.

Ironically, even though networks and graphs are everywhere you look nowadays, the new algorithm isn’t likely to have broad practical applications, Aaronson says. That’s because current algorithms already can quickly solve the graph isomorphism problem for the vast majority of graphs. If it holds up, the new algorithm simply proves that the tough cases that stymie the current algorithms can also be solved efficiently, Aaronson explains. And the new algorithm also does not touch on the biggest question in complexity theory: whether the class NP is really different from the class P. Researchers generally assume that’s true, and if they’re wrong, many things—such as Internet cryptography techniques—will be incredibly vulnerable. But there’s no proof that NP does not equal P.

The real advance in the new work is to shift a key problem from the hard category to the easy one, Aaronson says. The graph isomorphism problem had been something of an oddball, he says: It was thought to be hard yet known to have certain technical properties often associated with easier problems. Now, it may just be that the problem isn’t that hard after all.

Of course, other researchers still have to check Babai’s work. He is presenting the algorithm in two talks at the university, one today and one on 24 November. “You still need to see the details,” Aaronson says. “It’s still possible that someone of the stature of Babai could make a mistake.”