AI Magnets – Magnets can help AI get closer to the efficiency of the human brain

AI Magnets – Researchers have developed a process to use magnetics with brain-like networks to program and teach devices such as personal robots, self-driving cars and drones to better generalize about different objects.
Artificial intelligence (AI) is a branch of computer science that aims to understand the essence of intelligence and to build new intelligent machines. Active research areas of AI include robotics, speech recognition, image recognition, natural language processing and expert systems. "AI magnets" is therefore not limited to the magnetic components inside AI products; the term can also describe magnets whose R&D involved AI algorithms.

Magnets can help AI get closer to the efficiency of the human brain

Computers and artificial intelligence continue to usher in major changes in the way people shop. It is relatively easy to train a robot’s brain to create a shopping list, but what about ensuring that the robotic shopper can easily tell the difference between the thousands of products in the store?

Materials can be regarded as a fundamental substance of human life and development, and new materials technology has become one of the major indicators of a new worldwide technological revolution. For thousands of years, materials research relied purely on intuition, observation and experience; only in the past few centuries did it acquire physical models expressed as mathematical laws, most notably the laws of thermodynamics. For many scientific problems, however, solving these complex models takes enormous manual effort and time. The invention of the computer and the development of computing technology opened the way to simulating such models, and density functional theory and molecular dynamics were rapidly adopted in this era. Materials science has thus passed through three stages, namely experience, theoretical modeling and simulation, and is now gradually entering a stage in which data drives scientific discovery. With the continuing adoption of high-throughput experimentation, characterization and calculation, discovering knowledge from abundant data will become the main mode of future materials research; this data-driven approach has been called the fourth paradigm of materials science.

The central objective of materials science is to crack the relationship between a material's processing, structure, properties and performance. One important application of data-driven methods is performance prediction: a trained model can estimate the performance of an unknown material without experiments or theoretical calculations and, more importantly and more challengingly, can guide material development and design. From the data-driven point of view, material design is essentially an optimization problem: search for the composition, structure and processing that maximize performance. Performance is influenced by many factors, including chemical composition, physical properties, microstructure and processing technique, and it does not depend linearly on them. Compared with the traditional R&D model, the data-driven approach can therefore offer significant economic benefits.
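As a toy illustration of this optimization view, one can fit a surrogate model to measured performance data and then search the model for the best predicted recipe. Everything below (the two input variables, the synthetic "performance" target, the quadratic surrogate) is invented for illustration and is not from any real magnet dataset:

```python
import numpy as np

# Hypothetical training data: each row is (composition fraction, sintering
# temperature in C); the target is a synthetic "performance" that peaks at
# composition 0.6 and 1000 C (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform([0.0, 900.0], [1.0, 1100.0], size=(50, 2))
y = -(X[:, 0] - 0.6) ** 2 - ((X[:, 1] - 1000.0) / 100.0) ** 2

# Fit a quadratic surrogate model by least squares.
def features(X):
    return np.column_stack([np.ones(len(X)), X, X ** 2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# "Inverse design": grid-search the surrogate for the best predicted recipe.
comp = np.linspace(0.0, 1.0, 101)
temp = np.linspace(900.0, 1100.0, 101)
grid = np.array([(c, t) for c in comp for t in temp])
pred = features(grid) @ coef
best = grid[np.argmax(pred)]
print(best)  # should land near composition 0.6 and 1000 C
```

In practice the surrogate would be a nonlinear model (the performance is not linear in the factors, as noted above) and the search would use a global optimizer rather than a grid, but the structure of the problem is the same.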

Purdue University researchers and experts in brain-inspired computing think part of the answer may be found in magnets. The researchers have developed a process to use magnetics with brain-like networks to program and teach devices such as personal robots, self-driving cars and drones to better generalize about different objects.

“Our stochastic neural networks try to mimic certain activities of the human brain and compute through a connection of neurons and synapses,” said Kaushik Roy, Purdue’s Edward G. Tiedemann Jr. Distinguished Professor of Electrical and Computer Engineering. “This allows the computer brain to not only store information but also to generalize well about objects and then make inferences to perform better at distinguishing between objects.”

Roy presented the technology during the annual German Physical Sciences Conference earlier this month in Germany. The work also appeared in Frontiers in Neuroscience.

The switching dynamics of a nano-magnet are similar to the electrical dynamics of neurons. Magnetic tunnel junction devices show switching behavior, which is stochastic in nature.

The stochastic switching behavior is representative of the sigmoid activation of a neuron. Such magnetic tunnel junctions can also be used to store synaptic weights.
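A minimal software sketch of this idea: thresholding a random draw against a sigmoid switching probability behaves like a stochastic spiking neuron. The sigmoid form and the current scale `i0` are illustrative assumptions, not measured device parameters:

```python
import numpy as np

# Probability that the magnet switches within a pulse, modeled as a sigmoid
# of the input current (i0 is an arbitrary illustrative scale).
def switch_probability(i, i0=1.0):
    return 1.0 / (1.0 + np.exp(-i / i0))

# A stochastic neuron "fires" (the magnet switches) with that probability.
def stochastic_neuron(i, rng):
    return rng.random() < switch_probability(i)

rng = np.random.default_rng(42)
fires = [stochastic_neuron(2.0, rng) for _ in range(1000)]
rate = sum(fires) / 1000
print(rate)  # empirical firing rate, close to sigmoid(2.0) ~ 0.88
```

Averaged over many trials, the firing rate recovers the sigmoid, which is exactly the neuronal nonlinearity the MTJ provides in hardware.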

The Purdue group proposed a new stochastic training algorithm for synapses using spike timing dependent plasticity (STDP), termed Stochastic-STDP, which has been experimentally observed in the rat’s hippocampus. The inherent stochastic behavior of the magnet was used to switch the magnetization states stochastically based on the proposed algorithm for learning different object representations.
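The flavor of such a rule can be sketched as follows. This is an assumed, simplified form on binary synapses, not the exact Purdue algorithm; the time constant `tau` and ceiling `p_max` are invented constants:

```python
import numpy as np

# Probability of a stochastic synaptic switch, decaying with the gap between
# pre- and post-synaptic spike times (dt = t_post - t_pre, in ms).
def stdp_switch_prob(dt, tau=20.0, p_max=0.8):
    return p_max * np.exp(-abs(dt) / tau)

def update_synapse(weight, dt, rng):
    # The magnet switches state with STDP-dependent probability:
    # potentiate on causal pairs (dt > 0), depress on acausal ones.
    if rng.random() < stdp_switch_prob(dt):
        return 1 if dt > 0 else 0
    return weight  # otherwise the magnet keeps its state

rng = np.random.default_rng(0)
w = 0
# A burst of causal pre->post pairings potentiates the synapse with
# overwhelming probability.
for _ in range(20):
    w = update_synapse(w, dt=5.0, rng=rng)
print(w)
```

The key point the sketch captures is that learning happens through probabilistic switching events rather than through precise analog weight updates, which is what makes the magnet's intrinsic randomness useful instead of harmful.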

The trained synaptic weights, encoded deterministically in the magnetization state of the nano-magnets, are then used during inference. Advantageously, the use of high-energy-barrier magnets (30–40 kT, where k is the Boltzmann constant and T is the operating temperature) not only allows compact stochastic primitives, but also enables the same device to be used as a stable memory element meeting the data retention requirement. However, the barrier height of the nano-magnets used to perform sigmoid-like neuronal computations can be lowered to 20 kT for higher energy efficiency.

“The big advantage with the magnet technology we have developed is that it is very energy-efficient,” said Roy, who leads Purdue’s Center for Brain-inspired Computing Enabling Autonomous Intelligence. “We have created a simpler network that represents the neurons and synapses while compressing the amount of memory and energy needed to perform functions similar to brain computations.”

Roy said the brain-like networks have other uses in solving difficult problems as well, including combinatorial optimization problems such as the traveling salesman problem and graph coloring. The proposed stochastic devices can act as a "natural annealer," helping the algorithms move out of local minima.
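For intuition, the same principle can be demonstrated in software with plain simulated annealing on a one-dimensional cost that has a local minimum near x = -1 and its global minimum near x = 2. The cost function, noise scale and cooling schedule are all invented for illustration; in the hardware version, the magnet's stochastic switching supplies the random perturbations natively:

```python
import math
import random

# A cost with a local minimum near x = -1 and the global minimum near x = 2.
def cost(x):
    return (x + 1) ** 2 * (x - 2) ** 2 - x

random.seed(1)
x, temp = -1.0, 3.0          # start trapped in the local minimum
for step in range(5000):
    candidate = x + random.gauss(0.0, 0.5)
    delta = cost(candidate) - cost(x)
    # Accept uphill moves with Boltzmann probability: this is what lets the
    # search hop over the barrier and escape the local minimum.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        x = candidate
    temp *= 0.999            # gradually "cool" toward greedy descent
print(round(x, 2))           # settles near the global minimum around x = 2
```

A purely greedy search started at x = -1 would never leave the local basin; the noisy acceptance rule is the "annealer" that the stochastic magnets would provide physically.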

Their work aligns with Purdue’s Giant Leaps celebration, acknowledging the university’s global advancements in artificial intelligence as part of Purdue’s 150th anniversary. It is one of the four themes of the yearlong celebration’s Ideas Festival, designed to showcase Purdue as an intellectual center solving real-world issues.

Roy has worked with the Purdue Research Foundation Office of Technology Commercialization on patented technologies that are providing the basis for some of the research at C-BRIC. They are looking for partners to license the technology.

Machine learning is the study of algorithms and statistical models that computer systems use to perform a specific task without explicit instructions, relying instead on patterns and inference. As a core branch and research topic of AI, machine learning has already been applied by Chinese researchers to the R&D of permanent magnets, which is why we call them AI magnets.
