A Tech Industry Pioneer Sees a Way for the US To Lead in Advanced Chips (nytimes.com)
Ivan Sutherland played a key role in foundational computer technologies. Now he sees a path for America to claim the mantle in "superconducting" chips. From a report: It has been six decades since Ivan Sutherland created Sketchpad, a software system that foretold the future of interactive and graphical computing. In the 1970s, he played a role in rallying the computer industry to build a new type of microchip with hundreds of thousands of circuits that would become the foundation of today's semiconductor industry. Now Dr. Sutherland, who is 84, believes the United States is failing at a crucial time to consider alternative chip-making technologies that would allow the country to reclaim the lead in building the most advanced computers.
By relying on supercooled electronic circuits that switch without electrical resistance and as a consequence generate no excess heat at higher speeds, computer designers will be able to circumvent the greatest technological barrier to faster machines, he claims. "The nation that best seizes the superconducting digital circuit opportunity will enjoy computing superiority for decades to come," he and a colleague recently wrote in an essay that circulated among technologists and government officials. Dr. Sutherland's insights are significant partly because decades ago he was instrumental in helping to create today's dominant approach to making computer chips.
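For scale on the heat problem: conventional CMOS dynamic power grows linearly with clock frequency, while a zero-resistance circuit dissipates no Joule heat (I^2 * R with R = 0) regardless of switching speed. A back-of-envelope sketch, with purely illustrative numbers that do not come from the article:

import math  # not strictly needed; kept for easy extension

alpha = 0.1   # activity factor: fraction of gates switching per cycle (illustrative)
C = 1e-9      # total switched capacitance in farads (illustrative)
V = 1.0       # supply voltage in volts

# CMOS dynamic power: P = alpha * C * V^2 * f, so heat scales with clock rate.
for f in (1e9, 3e9, 5e9):  # clock frequency in Hz
    print(f"{f/1e9:.0f} GHz: {alpha * C * V**2 * f:.2f} W")
# Prints 0.10 W, 0.30 W, 0.50 W: the faster the clock, the hotter the chip.
# A superconducting circuit has R = 0, so the equivalent resistive term vanishes.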
In the 1970s, Dr. Sutherland, who was chairman of the computer science department at the California Institute of Technology, and his brother Bert Sutherland, then a research manager at a division of Xerox called the Palo Alto Research Center, introduced the computer scientist Lynn Conway to the physicist Carver Mead. They pioneered a design based on a type of transistor, known as complementary metal-oxide semiconductor, or CMOS, which was invented in the United States. It made it possible to manufacture the microchips used by personal computers, video games and the vast array of business, consumer and military products. Now Dr. Sutherland is arguing that an alternative technology that predates CMOS, and has had many false starts, should be given another look. Superconducting electronics was pioneered at the Massachusetts Institute of Technology in the 1950s and then pursued by IBM in the 1970s before being largely abandoned. At one point, it even made an odd international detour before returning to the United States.
I don't mean to be mean but (Score:4, Insightful)
The guy did something groundbreaking 60 f'ing years ago. That's like SIXTY years, man! I respect that, but at the same time, I wonder why he should be considered an authority on anything 60 years later. It's like asking Edison his opinion on magnetic tape biasing: why would he have anything of value to say about that, even if he invented sound recording?
Re:I don't mean to be mean but (Score:4, Insightful)
He may not be up on all the details of high tech, but he is probably quite wise as to how high-tech executives and policymakers are brain damaged, having watched their idiocies repeated again and again for 60 years. The gist of his essay seems to be that companies and policymakers have extreme tunnel vision and only pursue the one thing that is hot this week.
Re:I don't mean to be mean but (Score:5, Insightful)
Perhaps because he has kept his ear to the ground in the chip industry and has connections there. Just because one is old doesn't make them wrong or always biased toward old stuff. Youth are biased by inexperience. Often it takes triangulation from multiple generations to get a clearer picture.
I've seen a lot of programming and software engineering fads come and go, so I'm naturally skeptical of new buzzwords. I don't say "it's bad"; I just ask that clear benefits be demonstrated before reworking everything and tossing time-proven tools. The younglings often resent that, viewing me as a geezer stuck in the past. But I've been right more often than they have, because I've developed a horse sense for BS, asking lots of probing questions (like an analyst should).
Many just want to pad their resumes with buzzwords to move to better-paying jobs rather than caring about actual utility and project fit.
Re: (Score:2)
It's not about age, it's about being able to say something relevant in a field where you yourself did something relevant decades ago: just because you did doesn't necessarily mean you still can.
If he's kept up to date, then fair enough I guess.
Re: (Score:1)
Well, one reason is that he might be more insightful than someone like you who has never done anything groundbreaking.
Re: (Score:1)
You might want to reconsider picking Edison for your example.
In 1915, at the age of 68, Edison made a speech in which he argued that, given the geopolitical climate, the U.S. was too dependent on overseas technologies and industries and was falling behind in emerging fields that we need to stay competitive in. He made specific proposals on what to do about it. In response he was appointed a founding member of the U.S. Naval Consulting Board. This was basically the DARPA of its day and had major influences on naval research and development.
How onerous is refrigeration? (Score:3)
Certainly, researching this, even at a modest loss, would be a more tantalizing investment than "the metaverse" or whatever blockchain fad folks last chased.
Re: (Score:1)
Will there be a rush to build server farms near north-facing mountain slopes and in Alaska?
Photonic computing (Score:2)
https://www.photonics.com/Arti... [photonics.com]
I hope they are not lacking in funding.
Re: (Score:1)
I've been hearing about photonics for a long time. Sounds great. Unfortunately, it feels like fusion: always five years away.
Re: (Score:2)
It gives the same advantages of near-instant speed and low heat. The key breakthrough needed is a photonic transistor, and a Google search turns up recent progress on exactly that front.
There is one other problem beside the photonic equivalent of a transistor. A memory cell is also required, and last I saw there wasn't a lot of progress on that front. DRAM uses capacitance to store a charge until it needs to be read, periodically refreshing it as needed to maintain the charge in the face of leakage. There is no photonic equivalent of a capacitor.
Storing photons DRAM-style is technically possible. A material with an extremely low speed of light through it could be fed photons, and periodically refreshed in much the way DRAM is.
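To make the refresh point concrete, here is a toy model (illustrative constants, not real device parameters) of a leaky storage cell that holds its bit only because it is periodically read and rewritten, which is exactly the mechanism a photonic memory would also need some equivalent of:

LEAK_PER_TICK = 0.05    # fraction of stored charge lost each tick (illustrative)
READ_THRESHOLD = 0.5    # above this, the cell reads as a 1
REFRESH_INTERVAL = 8    # ticks between refresh passes

class Cell:
    def __init__(self, bit):
        self.charge = 1.0 if bit else 0.0

    def tick(self):
        self.charge *= (1.0 - LEAK_PER_TICK)  # capacitor leakage

    def read(self):
        return 1 if self.charge > READ_THRESHOLD else 0

    def refresh(self):
        # Read the still-recoverable value and rewrite it at full
        # strength: the essence of a DRAM refresh cycle.
        self.charge = 1.0 if self.read() else 0.0

cell = Cell(1)
for t in range(1, 41):
    cell.tick()
    if t % REFRESH_INTERVAL == 0:
        cell.refresh()
    assert cell.read() == 1  # refresh keeps the bit alive indefinitely

Without the refresh calls, the charge decays below the read threshold after about 14 ticks in this model and the bit is silently lost.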
Not clear there are enough advantages (Score:4, Interesting)
The algorithms for reversible computing are not easy to generate. Interfacing room-temperature electronics to SQUID systems is difficult, because the natural voltages are very different. Refrigeration systems are somewhat large and complex. Modern SQUIDs are a lot larger than CMOS gates. None of these is an absolute show-stopper, but superconducting computing technology is very far from being competitive with conventional computing at the moment.
So... maybe, but it's not the way I'd bet.
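On the first point, for the curious: reversible logic replaces gates like AND, which destroy information, with bijective gates such as the Toffoli (controlled-controlled-NOT) gate. A minimal sketch of the idea (illustrative code, not any real superconducting toolchain):

# Toffoli gate: flips the target bit c only when both controls a and b
# are 1. The mapping from (a, b, c) to outputs is a bijection, so no
# information is destroyed and the gate is its own inverse -- the core
# property of reversible logic.

def toffoli(a, b, c):
    return a, b, c ^ (a & b)

# AND can be embedded reversibly: with c = 0, the target output is
# a AND b, but the inputs are preserved alongside it as "garbage" bits
# that a reversible program must later uncompute rather than erase.
for a in (0, 1):
    for b in (0, 1):
        out = toffoli(a, b, 0)
        assert out[2] == (a & b)
        # Applying the gate twice restores the original inputs.
        assert toffoli(*out) == (a, b, 0)

The awkwardness the parent mentions is visible even here: every reversible AND drags its inputs along as extra bits, and real algorithms must carefully uncompute that garbage instead of just overwriting it.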
Re: (Score:2)
Richard Feynman came to the conclusion that quantum computers could, in principle, spend zero energy on computation. The only energy expense is to input the data and extract the result. I believe all QC algorithms are reversible. His Lectures on Computation have the details...
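The physics behind that conclusion is Landauer's principle: only irreversible operations, i.e. erasing bits, carry a mandatory thermodynamic cost of kT ln 2 per bit, while reversible operations (including unitary quantum gates) have no such floor. A quick back-of-envelope for scale, assuming room temperature:

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI)
T = 300.0           # room temperature, kelvin

# Landauer limit: minimum energy dissipated per irreversible bit erasure.
E_bit = k_B * T * math.log(2)
print(f"{E_bit:.3e} J per erased bit")  # ~2.871e-21 J

# Reversible computation erases nothing, so this floor does not apply;
# that is the sense in which zero-energy computation is possible in principle.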
The future (Score:3)
Ivan Sutherland played a key role in foundational computer technologies. Now he sees a path for America to claim the mantle in "superconducting" chips.
Perhaps Dr. Sutherland is anticipating a time when room-temperature superconductors can be made to order. Superconductivity is apparently a quantum effect in its known manifestations, and there are theories of a possible third mechanism. Theoretical physics is famously stuck in a rut right now, with the notable exception of the quantum mechanics required to describe how modern semiconductors are fabricated and function. Perhaps some real progress in superconductors is possible. It helps that they don't require outrageously huge machines for experiments.
Superconductors are theorized to speed up computing substantially, but precisely how is still very hand-wavy. It may take a good long while to work out the practical uses of superconductors in computing, long enough that spending some time on it sooner rather than later is probably warranted. YBa2Cu3O7 and Bi2Sr2Ca2Cu3O10 function at liquid nitrogen temperatures, which are routinely achieved industrially and in labs the world over. Experimentation and development are possible now, without requiring outrageously low temperatures. Lessons learned about fabrication probably won't transfer to eventual new materials, but lessons learned about operation should.
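For scale: the cuprates named above have critical temperatures comfortably above liquid nitrogen's 77 K boiling point, while the niobium typically used in Josephson-junction logic does not. A tiny sketch using well-known approximate textbook values:

LN2_BOILING_K = 77.4  # liquid nitrogen boiling point at 1 atm, kelvin

# Approximate critical temperatures, kelvin.
materials = {
    "Nb (conventional Josephson-junction logic)": 9.3,
    "YBa2Cu3O7": 92.0,
    "Bi2Sr2Ca2Cu3O10": 108.0,
}
for name, tc in materials.items():
    coolant = "liquid nitrogen is enough" if tc > LN2_BOILING_K else "needs liquid helium"
    print(f"{name}: Tc ~ {tc:.0f} K ({coolant})")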