Traditional silicon-based computing, for decades governed by Moore’s Law, is reaching its physical limits.
Computing is at a turning point. This isn’t a dead end, but a catalyst for a new wave of approaches such as quantum, optical, and neuromorphic computing. The driving force behind this transition is the critical need for energy efficiency: while conventional silicon raises growing concerns about power consumption, these new systems offer a path to a more sustainable future.
Research labs are at the forefront of this shift. The ETH Zurich-PSI Quantum Computing Hub, a joint effort with the Paul Scherrer Institute, is developing quantum computers based on ion traps and superconducting components. Quantum computing encodes information in qubits and leverages phenomena like superposition and entanglement to solve certain classes of problems far faster than classical machines.
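As a concrete illustration, the entanglement that quantum computing exploits can be simulated numerically on an ordinary laptop. The short Python sketch below is our own toy example, not tied to any hardware mentioned here; it prepares a two-qubit Bell state, in which measuring one qubit determines the outcome of the other:

```python
import numpy as np

# Single-qubit basis state |0> as a vector.
zero = np.array([1.0, 0.0])

# Hadamard gate: puts a qubit into an equal superposition of |0> and |1>.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

# CNOT gate: flips the second qubit only when the first is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# Start in |00>, apply H to the first qubit, then entangle with CNOT.
state = np.kron(H @ zero, zero)
bell = CNOT @ state

# Measurement probabilities: half |00>, half |11>, never |01> or |10>.
probs = bell ** 2
print(probs)
```

The two qubits are perfectly correlated even though each one, taken alone, is completely random; correlations of this kind are one resource quantum algorithms draw on.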
Fujitsu is also developing a 10,000+ qubit quantum computer targeted for completion by 2030, and Amazon Braket has added new quantum processors from companies such as IQM to its cloud service, making them more accessible to researchers and businesses.
Optical and neuromorphic computing are at a much earlier stage of development than quantum computing, which is already attracting significant commercial investment and cloud-based services. Even so, the GESDA Science Breakthrough Radar® projects that meaningful applications will emerge within the next five years, impacting fields such as materials science, chemistry, and pharmaceuticals. Optical computing takes a different approach, using light instead of electrical currents to process information.
A major challenge for optical computing has been the lack of a suitable memory component. A recent study published in Nature Photonics showcased a new method for photonic “in-memory computing”: the researchers created memory cells that combine high speed, low energy consumption, and non-volatility in a single platform, moving optical computing from theoretical concept toward practical reality.
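To see why in-memory computing is attractive, consider a toy numerical model (ours, not taken from the Nature Photonics study): if each memory cell stores a weight as its optical transmission, a matrix-vector multiplication happens in place as light passes through the array, with no separate fetch-compute-store cycle:

```python
import numpy as np

# Toy model of photonic in-memory matrix-vector multiplication.
# Weights are stored as the transmission of non-volatile memory cells
# (values between 0 and 1); multiplication is attenuation of the light,
# and accumulation is each photodetector summing the arriving power.
weights = np.array([[0.9, 0.1],
                    [0.3, 0.7]])   # transmission of each memory cell
inputs = np.array([1.0, 0.5])      # optical input power per waveguide

# The whole multiply-accumulate occurs where the data is stored.
outputs = weights @ inputs
print(outputs)  # ≈ [0.95, 0.65]
```

The same pattern, multiply-accumulate performed where the data sits, is what makes this approach promising for the matrix-heavy workloads of machine learning.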
Inspired by the human brain, neuromorphic computing mimics biological processes for greater efficiency. British-based computer scientist Andrew Adamatzky even explores using organisms like fungi as a form of “slow computer” for environmental sensing.
A company called FinalSpark has developed a remote platform, which it calls “wetware computing,” that lets researchers run experiments on living neurons. The platform aims to harness the remarkable energy efficiency of biological neurons for computation, with potential energy savings of over a million times compared to digital processors. This adds a compelling, almost sci-fi element to the discussion of unconventional computing.
University of Manchester researchers have developed nanofluidic memristors that mimic the memory functions of the human brain, exhibiting both short-term and long-term memory. This is a significant step toward devices that learn and adapt in a truly brain-like fashion.
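The short- and long-term memory behavior reported for these memristors can be caricatured in a few lines of code. This is a deliberately simplified model with invented parameters, not the researchers’ device physics: each stimulus pulse raises a volatile internal state that decays between pulses, and repeated stimulation consolidates it into a persistent state:

```python
# Toy model (assumed parameters) of short- vs long-term memory in a
# memristor-like device. "short_term" decays between pulses; once the
# combined state crosses a threshold it is consolidated into
# "long_term", which no longer decays.
def stimulate(pulses, gap_decay=0.5, boost=0.3, threshold=0.5):
    short_term, long_term = 0.0, 0.0
    for _ in range(pulses):
        short_term = short_term * gap_decay + boost  # pulse, then decay
        if short_term + long_term > threshold:       # consolidation
            long_term = short_term + long_term
            short_term = 0.0
    return short_term, long_term

# A few pulses leave only a fading short-term trace; sustained
# stimulation builds a durable long-term one.
print(stimulate(2))
print(stimulate(10))
```

Two pulses leave a purely volatile trace, while ten pulses produce a consolidated state, loosely mirroring how repetition turns short-term memories into long-term ones.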
Practical applications and global impact
These new technologies are paving the way for real-world applications. An article published in Nature on August 4 demonstrates that 2D materials are a foundational technology for in-sensor computing.
The approach combines sensing, memory, and processing on a single platform, enabling smarter, more responsive technologies where computing is seamlessly integrated into the sensors themselves.
These advancements could lead to breakthroughs in healthcare, such as a portable neuromorphic device for rapid disease diagnosis, or quantum computers that accelerate drug discovery by simulating molecular interactions at unprecedented speeds. In agriculture, farmers could use sensors with built-in computing to optimize crop yields.
A recent trend is the development of neuromorphic chips that perform real-time, on-the-fly learning in devices such as robots or sensors with ultra-low power consumption, making smart devices more autonomous and reducing the need for constant cloud connectivity.
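The basic unit most of these neuromorphic chips implement in hardware is a spiking neuron. Below is a minimal sketch of a leaky integrate-and-fire neuron with illustrative constants (not drawn from any particular chip); the key efficiency idea is that energy is spent only when a spike actually fires:

```python
# Minimal leaky integrate-and-fire neuron. The membrane voltage
# integrates the input each step, leaks charge over time, and emits a
# spike (then resets) when it crosses a threshold.
def simulate(input_current, steps=100, leak=0.95, threshold=1.0):
    v = 0.0
    spikes = []
    for t in range(steps):
        v = v * leak + input_current  # integrate with leak
        if v >= threshold:            # fire and reset
            spikes.append(t)
            v = 0.0
    return spikes

# A stronger input drives a higher spike rate; a weak one fires rarely.
print(len(simulate(0.10)), len(simulate(0.30)))
```

Because information is carried by sparse spikes rather than a continuous clock, such circuits can sit almost idle between events, which is where the ultra-low power consumption comes from.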
The potential to solve complex problems is vast, as highlighted by NASA’s “Beyond the Algorithm Challenge,” which sought innovative solutions using unconventional computing for Earth science problems like rapid flood analysis. Ultimately, these developments could lead to breakthroughs in artificial intelligence that change our daily lives.
Ethical and security challenges
The technological shift also brings challenges. The power of quantum computing could render today’s standard encryption obsolete, demanding a proactive effort to develop new cryptographic methods.
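A toy example makes the stakes concrete. RSA-style encryption, sketched below with deliberately tiny textbook primes, is secure only as long as factoring the public modulus is hard; Shor’s algorithm on a large, fault-tolerant quantum computer removes exactly that barrier, which is why post-quantum methods are being developed now:

```python
# Toy RSA with tiny primes (illustration only, utterly insecure).
p, q = 61, 53
n = p * q                 # public modulus: hard to factor at real sizes
phi = (p - 1) * (q - 1)   # computable only if you know p and q
e = 17                    # public exponent
d = pow(e, -1, phi)       # private exponent (modular inverse, Python 3.8+)

message = 42
ciphertext = pow(message, e, n)   # encrypt with the public key
decrypted = pow(ciphertext, d, n) # decrypt with the private key
print(decrypted == message)  # True

# An attacker who can factor n recovers phi, and from it the private
# key d. Shor's algorithm makes that factoring step efficient.
```

Everything in this scheme is public except p and q; a quantum computer that factors n at scale therefore breaks it outright, rather than merely weakening it.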
A recent paper published on arXiv specifically analyzes and compares the security risks of traditional sensor systems with emerging in-sensor computing systems, highlighting new potential attack scenarios and the need for new security frameworks.
There is also the potential for a wider digital divide, as access to these advanced technologies could be unevenly distributed. To navigate these complexities responsibly, a collaborative effort is essential.
New partnerships among governments, academic institutions, and private companies are needed to develop robust security frameworks and ensure the benefits of this new era are widely accessible. The ultimate goal is to build a future where these powerful innovations contribute to solving global challenges in a responsible way.
Where the science and diplomacy can take us
The 2024 GESDA Science Breakthrough Radar®, distilling the insights of 2,100 scientists from 87 countries, anticipates several upcoming breakthroughs in computing. Key Radar references:
→ 1.2 Quantum computing — When information is stored in the physical states of objects whose behavior is governed by the laws of quantum physics, new and powerful forms of information processing become possible. Suitably realised and managed, this creates novel computational capabilities that have the potential to significantly impact sectors as diverse as finance, materials science, cryptography and drug discovery.
→ 1.2.1 Quantum hardware development — Because quantum computing is most useful at scale, it is important that the qubits are available in large numbers, connected to each other in low-noise (and thus low-error) configurations.
→ 1.2.2 Quantum algorithms design — Software for quantum computing is evolving along with the hardware. At present, there are relatively few algorithms that will make the quantum computing era truly revolutionary. But as hardware becomes more accessible, co-design between hardware and algorithm designers is likely to change that.
→ 1.2.3 Quantum error correction and noise mitigation — The single most difficult challenge facing quantum computing is the problem of noise. External input of any kind, whether it comes from heat, mechanical vibrations, electrical or magnetic interference from surrounding circuits, or any other source, has the potential to cause “decoherence”, which alters the quantum state held in the qubits, instigating errors or even unrecoverable halts in the computation.
→ 1.2.4 Near-term applications of quantum computing — For a number of years now, quantum-computing researchers have spoken of computing with “noisy, intermediate-scale quantum” (NISQ) processors. These processors are not able to perform applications such as those using Shor’s algorithm, which requires large-scale, fault-tolerant, error-corrected quantum computers. However, there may be near-term applications where small, noise-affected machines can still perform useful tasks, especially given the growing number of noise-mitigation solutions.
→ 1.3 Unconventional computing — The silicon transistor is one of the most far-reaching inventions in human history. However, the exponential improvements in computing performance predicted by Moore’s Law have begun to falter as the technology hits fundamental physical limits. This has sparked renewed interest in alternative computing technologies that could provide workarounds to these constraints.
→ 1.3.1 Neuromorphic computing — Neuromorphic computing seeks to replicate aspects of the structure and function of biological neural networks in electronics. The aim is to develop machines that will ultimately display the same capabilities as the human brain, including its incredible energy efficiency.
→ 1.3.2 Organoid intelligence — Rather than trying to create software and hardware that mimics the way the brain works, an emerging field of research seeks to coax nature’s most powerful computing technology — biological neural networks — into carrying out computations.
→ 1.3.3 Cellular computing — Many biological processes take a molecular input, carry out some process using molecular or cellular “machinery,” and output a different set of molecules. This observation has seeded a field in which researchers attempt to modify these processes to perform useful computing-like routines. So far, most work uses synthetic biology to build “genetic circuits” equivalent to logic circuits in conventional computers.
→ 1.3.4 Optical computing — Almost all of the modern world’s information-processing tasks are powered by electrons. But scientists have long considered whether the photon — the quantum particle of light — could be a more promising candidate. Optical systems are not subject to electrical resistance and they can transmit data across multiple frequencies in parallel, massively boosting energy efficiency and data flows.