Data centers’ power consumption is rapidly increasing.
The International Energy Agency (IEA) forecasts that electricity demand from data centers, AI, and cryptocurrency could double from 2022 levels by 2026. By then, it predicts, those three sectors combined may use as much electricity in a year as Japan.
Companies such as Nvidia, whose processors power the majority of today’s AI applications, are developing more energy-efficient hardware.
Perhaps a more sustainable approach would be to construct computers with a radically new design, one that uses less power.
Companies pursuing this idea are taking their cue from the brain, an organ that accomplishes far more than a conventional computer while consuming a small fraction of the power.
Neuromorphic computing uses electronic components that act as artificial neurons and synapses, connected in a way that mimics the brain’s electrical network.
The approach is not new; researchers have been working to perfect it since the 1980s. But the energy demands of the AI revolution are adding urgency to efforts to bring the emerging technology into real-world use.
Existing neuromorphic systems and platforms are used mostly for research, though proponents argue they could deliver significant gains in energy efficiency. Many have commercial ambitions for the technology, including hardware giants IBM and Intel.
A number of smaller companies are also active in the field. “The opportunity is there waiting for the company that can figure this out,” says Dan Hutcheson, an analyst at TechInsights. The prize, he suggests, is large enough that whoever succeeds could rival Nvidia.
In May, SpiNNcloud Systems, a spin-off from the Dresden University of Technology, announced that it would begin selling neuromorphic supercomputers and started taking pre-orders. According to co-CEO Hector Gonzalez, neuromorphic supercomputers have now reached the point of commercialization.
Tony Kenyon, professor of nanoelectronic and nanophotonic materials at University College London, who works in the field, calls it a significant development.
“Although there isn’t yet a perfect use case for neuromorphic computing, there are many domains where the technology can improve performance and energy efficiency, and I have no doubt that it will gain widespread acceptance as it develops,” he remarks.
Neuromorphic computing covers a spectrum of designs, and current technology remains far from being able to simulate the human brain in its entirety. Even so, all of these designs depart from conventional computing in a few fundamental architectural ways.
First, neuromorphic computers do not keep memory and processing in separate modules, as traditional machines do. Instead, both operations happen in the same place on a single chip. According to Prof. Kenyon, removing that middleman and eliminating the data transfers between the two dramatically improves processing speed while cutting power consumption.
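As a loose illustration of the idea, consider a compute-in-memory crossbar of the kind used in some neuromorphic and in-memory designs: the stored weights are the memory, and a vector-matrix multiply happens where they sit, so no weight data is shuttled to a separate processor. This is only a sketch under that assumption; the class, the numbers, and the plain-Python arithmetic are invented for illustration, not any vendor’s hardware or API.

```python
# Illustrative sketch of compute-in-memory (assumed crossbar model).
# In a real analogue crossbar the physics of the array performs the
# multiply-accumulate; here plain Python stands in for that behaviour.

class Crossbar:
    def __init__(self, weights):
        # Weights live in the array itself: memory and compute share a location.
        self.weights = weights

    def apply(self, inputs):
        # Each output column accumulates input * weight in place; nothing is
        # fetched into a separate processor before the arithmetic happens.
        cols = len(self.weights[0])
        return [sum(inputs[r] * self.weights[r][c] for r in range(len(inputs)))
                for c in range(cols)]

xbar = Crossbar([[0.2, 0.8],
                 [0.5, 0.1],
                 [0.9, 0.4]])
print(xbar.apply([1.0, 0.0, 0.5]))  # -> [0.65, 1.0]
```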
Another hallmark is an event-driven approach to computing. In a traditional computer, every part of the system is constantly powered and available to communicate with every other part; in a neuromorphic system, activation can be far sparser. The artificial neurons and synapses fire only when they have something to communicate, much as many of the neurons and synapses in our own brains do. Doing work only when there is data to process reduces power consumption further, as the sketch below illustrates.
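Here is a minimal sketch of that event-driven style, assuming a simple leaky integrate-and-fire neuron; the parameter values and the pure-Python form are illustrative assumptions, not any chip’s real programming model.

```python
# Event-driven, leaky integrate-and-fire neuron (illustrative sketch).
# Time steps with no input carry no entry at all, mirroring the sparse,
# only-work-when-there-is-data character described above.

def lif_neuron(input_events, threshold=1.0, leak=0.9):
    """Yield the time steps at which the neuron spikes."""
    potential = 0.0
    for t in range(max(input_events) + 1):
        potential *= leak                      # membrane potential decays each step
        potential += input_events.get(t, 0.0)  # integrate an incoming event, if any
        if potential >= threshold:             # fire only when the threshold is crossed
            yield t
            potential = 0.0                    # reset after the spike

# Sparse input: the neuron does meaningful work at only three time steps.
print(list(lif_neuron({0: 0.6, 1: 0.6, 7: 1.2})))  # -> [1, 7]
```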
Neuromorphic computing can also be analogue. Whereas today’s computers encode data in binary digits, analogue designs, a historically significant approach to computing, use continuous signals, and lend themselves to analysing data that arrives from the outside world.
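To give a flavour of how a continuous, outside-world signal becomes sparse events, here is a toy send-on-change (delta) encoder. The threshold and the sine-wave stand-in for a sensor reading are invented for illustration; real analogue front ends, such as event cameras, implement the same idea in circuitry.

```python
# Toy delta ("send-on-change") encoder: a continuous signal is reported
# only when it moves by at least `threshold`, turning dense samples into
# sparse UP/DOWN events. All values here are illustrative assumptions.

import math

def delta_encode(samples, threshold=0.3):
    """Return (index, +1 or -1) events for threshold-sized moves."""
    events = []
    reference = samples[0]
    for i, value in enumerate(samples[1:], start=1):
        while value - reference >= threshold:   # signal rose enough: UP event
            events.append((i, +1))
            reference += threshold
        while reference - value >= threshold:   # signal fell enough: DOWN event
            events.append((i, -1))
            reference -= threshold
    return events

signal = [math.sin(t / 4) for t in range(32)]   # stand-in for a sensor reading
events = delta_encode(signal)
print(f"{len(signal)} samples -> {len(events)} events")
```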
Most commercially minded neuromorphic efforts are digital, however, simply because digital is easier to engineer.
Two main types of commercial application are being pursued.
One, which SpiNNcloud is targeting, is a more energy-efficient, higher-performance platform for AI applications, including image and video analysis, speech recognition, and the large language models behind chatbots such as ChatGPT.
The other is “edge computing” applications, in which data is processed in real time on power-constrained connected devices rather than in the cloud. Driverless cars, robots, mobile phones, and wearable technology are all potential uses.
But there are still technical hurdles to overcome. Developing the software needed to run the devices has long been seen as a major obstacle to progress in neuromorphic computing. Having the hardware is only half the battle: programming it may mean starting from scratch with a model that is completely alien to conventional computing.
“These devices have enormous potential… the problem is how do you make them work,” says Mr. Hutcheson, who reckons it will be at least ten years, and possibly longer, before the benefits of neuromorphic computing are truly felt.
Cost is another hurdle. Developing completely new chips is expensive, Prof. Kenyon points out, whether they use silicon, as the commercially focused efforts do, or alternative materials.
Mike Davies, head of Intel’s neuromorphic computing lab, says the company’s work is making “rapid progress” (Image: Intel)
Intel’s prototype neuromorphic chip is called Loihi 2. In April, the company announced that it had combined 1,152 of them to build Hala Point, a large neuromorphic research system with over 1.15 billion artificial neurons and 128 billion artificial synapses.
Intel says the system is the biggest of its kind, with a neuron capacity roughly equivalent to that of an owl’s brain. For now, Hala Point remains an early-stage Intel research project.
“But Hala Point is showing that there’s some real viability here for applications to use AI,” says Mike Davies, head of Intel’s neuromorphic computing unit.
According to him, Hala Point, which is about the size of a microwave oven, is “commercially relevant” and is seeing “rapid progress” in terms of software development.
IBM’s latest flagship brain-inspired chip is called NorthPole. Unveiled last year, it is an evolution of the company’s earlier TrueNorth prototype chip. Tests show it to be faster, more space-efficient, and more energy-efficient than any chip on the market today, says Dharmendra Modha, IBM’s head of brain-inspired computing, and his team is now working to show that individual chips can be combined into a larger system. The next chapter, he says, will be a path to market. One of NorthPole’s key innovations, Dr. Modha notes, is that it was co-designed with its software, so the architecture’s full capabilities can be exploited from the outset.
Smaller neuromorphic startups include Innatera, BrainChip, and SynSense.
IBM says its NorthPole chip outperforms rival processors on speed and energy efficiency (Image: IBM)
SpiNNcloud’s supercomputer commercializes neuromorphic computing research carried out at TU Dresden and the University of Manchester as part of the European Union’s Human Brain Project. That work has already produced two neuromorphic supercomputers for research: the University of Manchester’s SpiNNaker1 machine, which houses more than one billion neurons and has been running since 2018, and TU Dresden’s second-generation SpiNNaker2 machine, now being configured, which can simulate at least five billion neurons. Mr. Gonzalez says the commercial systems SpiNNcloud sells can scale higher still, to at least 10 billion neurons.
According to Prof. Kenyon, the future of computing will involve a combination of conventional, neuromorphic, and quantum platforms working alongside one another.