They underpin our economy, all mass communication, and at this point, even most of our government. But make no mistake: scientists are quickly learning that our traditional computers—from your laptop to our most advanced supercomputers—are intrinsically slow and wasteful. It's a fact: certain everyday problems we ask computers to solve are becoming exponentially—almost absurdly—harder and more time-consuming as more data gets thrown into the mix. And it's an unavoidable problem, because it's rooted in how all our computers work.

For some computer scientists, the solution lies in building quantum computers—devices that take advantage of the inexplicable weirdness of atomic-level physics. The only downside? Quantum computers require cold, carefully tended environments that are beyond our current technological capabilities. But Massimiliano Di Ventra, a physicist and computer scientist at the University of California, San Diego, has an entirely different solution. He and a team of his colleagues have just designed and built the first brain-like computer prototype that bypasses certain structural limits of our modern electronics. Called the memcomputer, it's the first computer to store and process information simultaneously. It's announced today in the journal *Science Advances*.

According to Di Ventra, despite his new technology's futuristic promise, "memcomputers can be built with standard technology and operate at room temperature. This puts them on a completely different level of simplicity and cost in manufacturing compared to quantum computers."

### The fault in our computers

In short, a big problem with modern computers is that they store data and solve problems with it in two entirely different areas: the memory and the central processing unit (CPU). And all that shuffling back and forth takes its toll, says Di Ventra. "To make a quick comparison: *our own brain* expends about 20 watts to perform 10^16 operations per second," he says, while a supercomputer would require 10 million times more power to do the same number of operations. "A big chunk of that power is wasted in the back and forth transfer of information between the CPU and the memory," says Di Ventra.

Di Ventra's memcomputer sprang out of an easy-to-understand thought experiment from the 1970s. What if, like our brains, a computer stored data in the exact same place it crunched the numbers? And better yet, what if the actual *process* of crunching data was used *as* memory?

This type of memory-crunching computer (hence: *mem*computer) would sidestep the costly data shuffle. Furthermore, mathematicians have actually proven that this two-for-one process would also allow memcomputers to solve certain fantastically complex problems in a single step.

To build his memcomputer, Di Ventra and his colleagues had to physically rebuild and reprogram a computer from its most basic components. Rather than classical silicon transistors (the building blocks that combine to form all electronics), at the core of Di Ventra's machine are what he calls memprocessors. Di Ventra's simple prototype uses six of them.

Here's how they work. A classical transistor's job basically boils down to one thing: either letting current through or not, depending on what it's been told to do. A memprocessor does this exact same job, but it also physically changes some of its properties ("such as its [electrical] resistance," says Di Ventra) depending on *how much* current is trying to move through. Even when the memprocessor loses power, it stores that change. In this way, while functioning fully as a classical, data-crunching processor, a memprocessor can also be coded to store information in its resistance at the same time. No more back and forth.
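To make that idea concrete, here is a toy sketch of a resistor-like element whose resistance shifts with the charge that flows through it, and keeps that shift afterward. The class and its update rule are invented purely for illustration; this is not Di Ventra's actual circuit.

```python
# Toy sketch of a memory-keeping circuit element. The class name and the
# update rule are hypothetical illustrations, not the real memprocessor.

class ToyMemprocessor:
    """A switch whose resistance changes with the charge that has flowed
    through it, and which keeps that resistance when powered off."""

    def __init__(self, resistance=1.0):
        self.resistance = resistance  # this value persists: it IS the memory

    def apply_voltage(self, volts, seconds=1.0):
        """Let current through; the charge that flows nudges the
        resistance, so the act of computing also stores information."""
        current = volts / self.resistance   # Ohm's law
        charge = current * seconds
        self.resistance += 0.1 * charge     # toy state update
        return current

m = ToyMemprocessor()
i1 = m.apply_voltage(1.0)  # first pulse flows freely
i2 = m.apply_voltage(1.0)  # same pulse, but less current gets through now:
assert i2 < i1             # the element "remembered" the first pulse
```

The point of the sketch is only the last line: the second identical pulse behaves differently because the element carries a trace of the first one.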

### How it works

So what exactly are these problems that our computers struggle with, but large memcomputers could solve in a flash? It may sound pretty esoteric, but it's actually pretty simple to grasp. According to Di Ventra, many important computer problems can be boiled down to a basic skeleton like this one: *If I have a giant set of numbers, how many of them add up to a specific number, like 10?* "These problems appear in many things we do nowadays, such as machine learning, robotics, scheduling, and optimization," says Di Ventra.
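The brute-force version of that skeleton question looks something like this (a minimal Python sketch, with a made-up example set):

```python
# Brute-force search for groups of numbers that add up to a target.
# The example set is made up; the approach is the naive one a classical
# computer is stuck with: try every combination, one at a time.
from itertools import combinations

def subsets_summing_to(numbers, target):
    """Try every subset of the numbers. The number of subsets doubles
    with every number added, which is why this gets hard so fast."""
    hits = []
    for size in range(1, len(numbers) + 1):
        for combo in combinations(numbers, size):
            if sum(combo) == target:
                hits.append(combo)
    return hits

print(subsets_summing_to([1, 3, 4, 6, 7], 10))
# → [(3, 7), (4, 6), (1, 3, 6)]
```

Five numbers are checked almost instantly; the trouble, as described below, is what happens when the set grows.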

Here's what a classical computer will do when faced with that question. It'll grab two numbers in the number-set from its memory, run them over to its processor to see if they add up, and shimmy backwards to slot the answer right back into its memory. The computer does this same exercise over and over and over again, until it's checked every single pairing. Now, that's a quick process if you have only, say, a dozen numbers. But it's an endless slog if you have ten million. Each of the ten million numbers is compared ten million times, for a total of 100 trillion back-and-forth runs. A computing ultra-marathon.

Di Ventra explains that this very sort of problem, which could take our most advanced computers decades, could take a memcomputer only seconds. That's because rather than hassling with a back-and-forth data shuffle, the internal architecture of the memcomputer essentially sets up a giant maze for electricity to run through. To oversimplify what's going on: in the maze, you can imagine the entrance is a single number and all the possible exits are every other number it could be combined with. The maze is also constructed so that electric current can only jolt through to correct combinations (combinations that add up to 10, for example). In a single blast the memcomputer will set up a maze, run electricity through it, and store which numbers combined with the first number to add up to 10.

Instead of taking 100 *trillion* back-and-forth runs, a memcomputer will run 10 *million* mazes. Imagine each maze and each back-and-forth run takes a sluggish 1 second. Our memcomputer is done in 116 days, yet more than 3 million years later a classic computer is still number-crunching.
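Those timescales follow from simple arithmetic, which a few lines of Python can check (assuming, as in the thought experiment, ten million numbers and a sluggish one second per operation):

```python
# Back-of-envelope check of the timescales in the thought experiment,
# assuming one second per operation (per pair check, or per maze run).

n = 10_000_000                # ten million numbers

pair_checks = n * n           # every number compared against every number
maze_runs = n                 # one maze per starting number

SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = 365 * SECONDS_PER_DAY

print(f"memcomputer: {maze_runs / SECONDS_PER_DAY:.0f} days")       # ~116 days
print(f"classical:   {pair_checks / SECONDS_PER_YEAR:,.0f} years")  # ~3.2 million years
```

Ten million seconds really is about 116 days, while ten million times ten million seconds stretches past three million years.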

### Data auto-correct

Di Ventra argues that in the foreseeable future, memcomputers will probably be used only to answer the specific questions that they're so good at. They're not going to replace your laptop and smartphone, and that's not just because we'd have to scrap the last half-century of computer code and start fresh.

There are still kinks to work out. Di Ventra's prototype—made of six memprocessors—suffers from one major issue: data loss. That's because his memcomputer stores data in the form of frequencies; spread across a large enough maze, those frequencies can become weak, attenuated, and indistinguishable from the static noise of the room-temperature computer. But Di Ventra says the problem could be bandaged with error-correcting codes, or fixed entirely by using an altogether different form of data storage.

Still, Di Ventra emphasizes that this is only the very first incarnation of memcomputing technology. And despite the data-loss problems, it has already shown that "problems that are notoriously difficult to solve with current computers" can indeed be solved by memcomputers with incredible efficiency.