The largest AI machine in the world

Tech Podcast: The largest AI machine in the world

At TU Dresden, Prof. Dr. Christian Mayr and his colleagues are researching neuromorphic chips. But this is already more than pure research: the first investors are lining up, and SpiNNaker 2 is in great demand. Germany appears to be a leader in this field. But what are neuromorphic chips, and what are they needed for - five questions for the expert.

Let's start with a bit of definition work. What are neuromorphic chips?

Mayr: Research in this area has been going on for about 30 years. It was started back then by Carver Mead, one of the founding figures of VLSI integration. Mead looked at how the brain actually processes information. In that respect he followed a path parallel to mathematicians like von Neumann, who started out with conventional computer technology and then became fascinated by the brain. And Mead was interested in, let's say, circuitry: how do I build a circuit that really behaves like neurons and synapses at a very detailed level? So it is different from, let's say, the very abstracted AI networks that we have nowadays.

So we're talking more about electrical engineering than IT?

Mayr: Our profession is more electrical engineering, definitely! But nowadays it's all very, very heterogeneous. Neuromorphic circuit technology comes, as the name says, from circuit technology, so definitely electrical-engineering circuit design. If I look at the brain and see that there are 10^14 synapses and 10^10 neurons, then there is an enormous number of organisational levels that can be identified. That means if I only reproduce this lowest level in transistors, if I only deal with circuit technology, I am actually leaving out ten or twelve of these levels of abstraction in the brain. So I inevitably have to deal with the IT side, with the algorithms that run on top of it. This brings us back to the current wave of deep neural networks and classical AI. The brain does many things that current AI cannot do, and those are things one would like to take over at various levels, from circuitry to algorithms.

Why are neuromorphic chips important?

Mayr: If you look at deep neural networks at the moment, they are very, very inefficient.

In what way?

Mayr: In that, for tasks the brain handles, a deep neural network needs dozens of server racks drawing several hundred kilowatts, compared to the roughly 30 watts my brain consumes to solve similar or even harder tasks. What the brain relies on is sparsity. It tries to reduce every task it is given to the absolute minimum necessary. That is not frame-based, as in a deep neural network. If I have a video stream running where every frame is propagated through the deep neural network, then 99 percent of that is essentially pointless processing. On typical problems like a LeNet or similar, you can easily compress that a hundred to one and lose practically nothing in terms of the network's performance. And you have to take that up to the algorithmic level. There are various levels where you can do this. First, in the feedforward path, by compressing the information. In the feedback path, I can then incorporate various attention mechanisms that compress the information once again, under the control of the algorithm or the processing, where the algorithm says: this is exactly what interests me right now to solve some problem, and the rest can be left out completely. So across all of information processing, we have to get away from this model where we really store everything and file it away somehow in order to be able to look at it again. Instead, to solve a task, only a small fraction of the information a sensor actually records is necessary. And that applies to practically every sensor: whether it is visual, a motion sensor on a robot, or an audio signal, there are always large redundancies.
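To make the frame-based versus event-based contrast concrete, here is a minimal Python sketch. It is not from the interview: the frame size, sensor-noise level and change threshold are illustrative assumptions. It counts how many values a frame-based pipeline would process for a nearly static video stream, and how many "events" an event-based pipeline would forward instead.

```python
import numpy as np

# Illustrative sketch: frame-based vs. event-based processing of a video stream.
# Frame size, noise level and threshold are assumptions, not SpiNNaker 2 specifics.

rng = np.random.default_rng(0)
base = rng.random((64, 64))                             # a static scene
frames = base + rng.normal(0.0, 0.01, (100, 64, 64))    # 100 nearly identical noisy frames

THRESHOLD = 0.05                                        # assumed per-pixel change threshold

dense_values = frames.size                              # frame-based: every pixel of every frame
diffs = np.abs(np.diff(frames, axis=0))                 # event-based: per-pixel change between frames
events = int((diffs > THRESHOLD).sum())                 # only significant changes become "events"

print(f"frame-based values processed: {dense_values}")
print(f"event-based events generated: {events}")
print(f"compression ratio: {dense_values / max(events, 1):.0f}:1")
```

On a mostly static scene the event count collapses, which is the kind of hundred-to-one compression Mayr describes; event-based sensors and attention mechanisms apply the same idea in hardware and in the algorithm.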

Where does Germany stand with this technology - midfield or even Champions League?

Mayr: Well, I can give you an example using our SpiNNaker 2. When we do AI processing, the system can hold 10^14 parameters and iterate over them in real time. That means we can also do 10^14 MAC operations per millisecond, so we can iterate in real time over these huge models. The largest AI networks in use at the moment have around 200 billion parameters, so 2 x 10^11. That means SpiNNaker 2 is one of the largest AI machines commercially available at the moment.
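As a rough sanity check on those figures (the numbers are the ones quoted above; the variable names are my own):

```python
# Back-of-envelope check of the quoted figures.
params_capacity = 1e14      # parameters the system can hold
macs_per_ms = 1e14          # MAC operations per millisecond
largest_model = 2e11        # ~200 billion parameters in today's largest networks

macs_per_second = macs_per_ms * 1_000
print(f"throughput: {macs_per_second:.1e} MACs/s")                          # 1.0e+17 MACs/s
print(f"one pass over the full capacity: {params_capacity / macs_per_ms:.0f} ms")
print(f"headroom over a 2e11-parameter model: {params_capacity / largest_model:.0f}x")
```

At 10^14 MAC operations per millisecond, one full pass over 10^14 parameters takes about a millisecond, which is what "iterating in real time" amounts to here, and the capacity leaves roughly 500x headroom over a 2 x 10^11-parameter network.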

In the podcast conversation, Prof. Dr. Christian Mayr explains which tasks still need to be solved, how he assesses Elon Musk's Neuralink, and what role the algorithms play.

You can find this and other episodes about the factory of the future in our tech podcast channel "Industry rethought" on all popular platforms, or you can subscribe directly here via Podigee.

Contact for the Bosch Rexroth Tech Podcast: Susanne Noll

Please feel free to contact us!