It’s been six decades since Ivan Sutherland created Sketchpad, a software system that foreshadowed the future of interactive and graphics computing. In the 1970s, he played a role in the computer industry’s push to build a new type of microchip with hundreds of thousands of circuits, which became the basis of today’s semiconductor industry.
Now Dr. Sutherland, who is 84, believes the United States is failing at a crucial time to consider alternative chip-making technologies that would allow the country to regain the lead in building the most advanced computers.
By relying on supercool electronic circuits that switch without electrical resistance and as a result do not generate excess heat at higher speeds, computer designers will be able to bypass the biggest technological barrier to faster machines, he claims.
“The nation that best seizes the opportunity for superconducting digital circuits will enjoy computing superiority for decades to come,” he and a colleague recently wrote in an essay circulated among technologists and government officials.
Dr. Sutherland’s insights are important in part because decades ago he was instrumental in creating today’s dominant approach to making computer chips.
In the 1970s, Dr. Sutherland, who was chairman of the computer science department at the California Institute of Technology, and his brother Bert Sutherland, then a research manager at a division of Xerox called the Palo Alto Research Center, introduced the computer scientist Lynn Conway to the physicist Carver Mead.
They pioneered a design based on a type of transistor known as complementary metal-oxide semiconductor, or CMOS, which was invented in the United States. It made it possible to create the microchips used by personal computers, video games and a vast range of business, consumer and military products.
Now Dr. Sutherland argues that an alternative technology that predates CMOS, and has had many false starts, should be given another look. Superconducting electronics was pioneered at the Massachusetts Institute of Technology in the 1950s, pursued by IBM in the 1970s, and then largely abandoned. At one point it even made a strange international detour before returning to the United States.
In 1987, Mikhail Gorbachev, the last Soviet leader, read an article in the Soviet newspaper Pravda describing a surprising advance in low-temperature computing made by Fujitsu, the Japanese microelectronics giant.
Mr. Gorbachev was intrigued. Wasn’t this an area, he wanted to know, where the Soviet Union could excel? The task of giving a five-minute briefing to the Soviet Politburo eventually fell to Konstantin Likharev, a young professor of physics at Moscow State University.
However, when he read the article, Dr. Likharev realized that the Pravda reporter had misread the original report and claimed that Fujitsu’s superconducting memory chip was five orders of magnitude faster than it actually was.
Dr. Likharev explained the mistake, but he noted that the field still had promise.
That set off a chain of events in which Dr. Likharev’s small laboratory received several million dollars in research funding, which allowed him to build a small team of researchers and, eventually, after the fall of the Berlin Wall, to move to the United States. Dr. Likharev took a physics position at Stony Brook University in New York and helped found Hypres, a superconducting electronics company that still exists.
The story might have ended there. But the elusive technology appears to be gaining momentum once again as the cost of modern chipmaking becomes enormous. A new semiconductor factory costs $10 billion to $20 billion and takes up to five years to complete.
Dr. Sutherland argues that the United States should consider training a generation of young engineers capable of thinking outside the box, rather than continuing to pour money into an increasingly expensive technology with diminishing returns.
Superconductor-based computing systems, where electrical resistance in the switches and wires drops to zero, can solve the cooling challenge that increasingly plagues the world’s data centers.
The manufacture of CMOS chips is dominated by Taiwanese and South Korean companies. The United States now plans to spend nearly a third of a trillion dollars in private and public money in an effort to rebuild the nation’s chip industry and regain its global dominance.
Dr. Sutherland is joined by others in the industry who believe CMOS manufacturing is hitting fundamental limits that make the cost of progress prohibitive.
“I think we can say with some certainty that we need to radically change the way we design computers because we are really approaching the limits of what is possible with our current silicon-based technology,” said Jonathan Koomey, a specialist in large-scale computer energy requirements.
As it has shrunk transistors to widths of just hundreds or thousands of atoms, the semiconductor industry has been increasingly beset by a variety of technical challenges.
Modern microprocessor chips also suffer from what engineers describe as “dark silicon.” If all the billions of transistors on a modern microprocessor were used at once, the heat they generate would melt the chip. As a result, entire sections of modern chips are shut off and only some of the transistors are working at any one time, making them much less efficient.
Dr. Sutherland said the United States should consider alternative technologies for national security reasons. The advantages of a superconducting computer technology could first be useful in the highly competitive market for cellular base stations, the specialized computers in the phone towers that process wireless signals, he suggested. China has become a dominant force in the market for current 5G technology, but the next generation of 6G chips would benefit from both the extreme speed and significantly lower power requirements of superconducting processors, he said.
Other industry executives agree. “Ivan is right that the power problem is the big problem,” said John L. Hennessy, an electrical engineer who is Alphabet’s chairman and a former Stanford president. He said there were only two ways to solve the problem: either by gaining efficiency through new designs, which is unlikely for general-purpose computers, or by creating a new technology that is not bound by existing rules.
One such opportunity may be to create new computer designs that mimic the human brain, which is a marvel of low-power computing efficiency. Research into artificial intelligence in a field known as neuromorphic computing has so far relied on conventional silicon fabrication.
“There really is the potential to create the equivalent of the human brain with superconducting technology,” said Elie Track, chief technology officer of Hypres, the superconducting company. Compared to quantum computing technology, which is still in early experimental stages, “this is something that can be done now, but unfortunately the funding agencies have not paid attention to it,” he said.
The time for superconducting computing may not yet have arrived, in part because every time the CMOS world seems about to hit a final hurdle, clever engineering has overcome it.
In 2019, a team of researchers at MIT led by Max Shulaker announced that it had built a microprocessor from carbon nanotubes that promised 10 times the energy efficiency of today’s silicon chips. Dr. Shulaker is working with Analog Devices, a semiconductor maker in Wilmington, Mass., to commercialize a hybrid version of the technology.
“More and more, I believe you can’t beat silicon,” he said. “It’s a moving target, and it’s really good at what it does.”
But as silicon approaches atomic limits, alternative approaches look promising once again. Mark Horowitz, a Stanford computer scientist who has helped start several Silicon Valley companies, said he was not ready to discount Dr. Sutherland’s passion for superconducting electronics.
“People who have changed the course of history are always a little crazy, you know, but sometimes they are right,” he said.