
An Extensive Analysis of Neuromorphic Computing
2024 International Conference on Advances in Computing Research on Science Engineering and Technology (ACROSET) | 979-8-3503-8880-0/24/$31.00 ©2024 IEEE | DOI: 10.1109/ACROSET62108.2024.10743880

Amit Vajpayee
Department of Computer Science and Engineering
Parul Institute of Engineering and Technology
Vadodara, India
[email protected]

Palak Preet Kaur
Department of Bigdata Analytics
Apex Institute of Technology, Chandigarh University
Mohali, India
[email protected]

Ankit Sharma
Department of Bigdata Analytics
Apex Institute of Technology, Chandigarh University
Mohali, India
[email protected]

Santosh Varshney
Department of Computer Science and Engineering
Acropolis Institute of Technology and Research
Indore, India
[email protected]

Abstract--Neuromorphic computing is a new paradigm that draws on the structure and function of the human brain and aims to revolutionize computing. The technology is designed to emulate the high speed, low power consumption, and massive parallelism of the brain. Neuromorphic systems offer great potential for artificial intelligence and big data by using specialized algorithms and hardware designs to simulate neural networks. Despite encouraging progress from technology companies such as Intel and IBM, the development of neuromorphic computing chips still faces major challenges. The approach promises significant benefits for artificial intelligence, machine learning, and big data. The technology is still in its infancy but has great potential to revolutionize intelligent computing. By mimicking the brain's performance and learning capacity, neuromorphic computing could pave the way for smarter, faster, and more flexible systems capable of complex data analysis and on-the-fly decision making.

Keywords- Neuromorphic Computing, Pattern Recognition, Architecture, Deep Learning, Artificial Neural Networks (ANNs)

I. INTRODUCTION

Neuromorphic computation is conceptualized using neuromorphology. The goal of neuromorphic networks is to simulate the human brain in a biologically accurate and cost-effective way. Neuromorphic computing promises to be cheaper than traditional computing while enabling more powerful AI that can learn in complex, changing environments such as real-world tasks. Neuromorphic computing uses networks modeled on biological systems to perform digital computation. The goal of neuromorphic computing is to create high-performance cognitive systems. Higher-level work that translates to biology involves timing accuracy and integrity, as well as operation in real-world settings. The scope of knowledge in biological systems is broad; our goal is to help understand and prepare complex information for decision making. Embedded systems are products created using computer technology. Most computers are no longer stand-alone machines but are built into larger devices such as injector controllers. Many new computers will be incorporated into a variety of devices in the future, from satellite tracking to jewelry. We believe that the transition from today's custom-built computing systems to general-purpose, off-the-shelf computing systems will open the neuromorphic computing market. These concepts cover a wide range of models, from full simulations of the brain at one end to systems that simulate the behavior of the brain (or part of the brain) without modeling the internal dynamics of neurons at the other end. There is no clear boundary between neuromorphic and non-neuromorphic systems, as a system need not replicate every aspect of brain activity to qualify. Another aspect of neuromorphic computing is the use of very large-scale integrated circuits (VLSI) to implement the models. This distinguishes neuromorphic systems from other neural network applications that use traditional computers, high-performance computing clusters, or GPUs to run simulations.

II. HISTORY

In the 1990s and early 2000s, Rodney Douglas and Misha Mahowald laid the foundation for exploring neuromorphic architectures and the role of hardware. Over the past decade, many companies and organizations have been working on neuromorphic computing, including IBM, owner of the TrueNorth chips.

The development of neuromorphic computing has been influenced by two main approaches. One side tries to track the physical structure of the brain, as in Boahen's neuromorphic circuits and the Neurogrid system at Stanford University. The other sets biological detail aside and tries to solve cognitive problems such as thinking, planning, and management through algorithms.

A. Key Features of Neuromorphic Computing

• Massive parallelism: Neuromorphic systems operate in a massively parallel manner, with many simple computational units operating


Authorized licensed use limited to: ANNA UNIVERSITY. Downloaded on January 09,2025 at 06:43:59 UTC from IEEE Xplore. Restrictions apply.
simultaneously, similar to the parallel processing that occurs in the brain's neural networks.
• Low power consumption: By mimicking the brain's energy-saving synaptic connections, neuromorphic systems can significantly reduce power consumption compared to modern computer systems.

B. Advantages

• Energy efficiency: Neuromorphic systems use less energy than traditional computing systems. This makes them ideal for applications where energy consumption is critical.
• Fault tolerance: Neuromorphic systems are resilient to local faults, which increases their reliability in many applications.
• Scalability: Neuromorphic systems are scalable; this is important for tackling increasingly complex tasks.
• Speed: Neuromorphic computers can perform complex calculations faster than standard von Neumann architectures.
• Pattern recognition: Neuromorphic computer systems are effective at recognizing patterns, making them suitable for tasks such as image and speech recognition.

C. Neuromorphic Hardware

• Neuromorphic chips and architecture: Neuromorphic chips are designed to mimic the neural architecture of the human brain. Unlike traditional von Neumann-based processors, neuromorphic chips mimic the brain's neural network topology, allowing more efficient and flexible computing. One design pattern in neuromorphic devices is to arrange synapses in dense, interlocking configurations with discrete logic blocks to accommodate learning at synapses and the integration and firing of somatic cells. However, one challenge with such systems is the fan-in/fan-out ratio of the crossbar switch.
• Memristors and neuromorphic systems: Memristors are non-volatile devices that are fast and energy efficient when reading and writing. They can be fabricated using a standard CMOS process and can perform analog arithmetic. These features make memristor-based computing architectures promising for the success of neuromorphic systems. Memristors offer excellent performance and are good candidates for creating efficient and effective biomimetic computing systems.
• Neuromorphic sensors and interfaces: Neuromorphic sensors are inspired by the workings of the brain and can play a major role in self-monitoring, neurorepair, and human-computer interfaces. Advanced sensing technologies developed using neuromorphic engineering can equip sensors with intelligent elements such as sensing, recognition, and decision-making, making them compact, instantaneous, adaptable, ultra-low-power biosensors, inspiring sensing systems and robotics.

III. APPLICATIONS OF NEUROMORPHIC COMPUTING

• Neuromorphic computing in artificial intelligence: Neuromorphic computing can mimic the structure and function of the human brain and holds great promise in the field of artificial intelligence (AI). Using artificial neurons and synapses to process information allows such systems to solve problems, recognize patterns, and make decisions faster and more efficiently than conventional computers.
• Neuromorphic computing in robots: Neuromorphic computing has been used in the field of robotics, creating systems that allow robots to continue learning new things after deployment. Neuromorphic hardware enables fast and energy-efficient artificial intelligence based on neural networks, ideal for solving robotic tasks.
• Neuromorphic computing in medicine: Neuromorphic computing has the potential to revolutionize diagnostics. Integration of neuromorphic computing, artificial intelligence, and machine learning (ML) into healthcare can improve diagnosis, treatment, and prevention of diseases.

IV. ALGORITHMIC NEUROMORPHIC COMPUTING

The main purpose of neuromorphic algorithms is to accelerate spiking neural networks (SNNs). SNNs are a significant departure from deep learning models in that they send data in the form of spikes ("1" or "0") rather than continuous values. These algorithms demonstrate the trade-off between processing time and memory. Each expression in these algorithms has a term (explicit or implicit), and the calculation is generally result-oriented.

V. LITERATURE REVIEW

In this article, we examine some storage technologies and how they improve the power efficiency of massive neuromorphic computer

networks. Large-scale learning methods, however, have not yet been used to implement such systems. We also describe the conditions and problems that arise on the path to success of broad learning processes that could be widely applied to a variety of tasks [1].

The most recent advancements in synaptic electronics are reviewed in this article. Basic concepts of learning and biological synaptic plasticity are described. The use of synaptic devices in neuromorphic or brain-like computation is covered, along with the characteristics and electrical properties of several types of synaptic devices. The performance metrics needed for widespread synaptic device use are described, and recent work on analog applications of synaptic characteristics is reviewed [2].

A third-generation neural network called the Spiking Neural Network (SNN) is modeled after the spike coding and information processing seen in the human brain. These networks, for instance, allow AI computers to function more effectively than the second-generation networks frequently employed in machine learning algorithms nowadays. We explore hardware representations, learning methods, and possible uses for SNN-based learning systems in this study [3].

Memristive devices promise compact, seamless memory, and some have already begun the process of industrialization. They also show plastic behavior and a multilayered structure, which makes them promising candidates for neuromorphic engineering of synapses. These findings imply that memristive devices can be used in neuromorphic systems in many ways; the underlying device physics must be understood and harnessed in order to use them [4].

The goal of the developing discipline of brain-inspired computing is to replicate the brain's success at processing sensor data. Challenges that must be met to implement such a system include the creation of a well-integrated system of connected devices, low power consumption and robustness, and neuromorphic computing techniques to implement learning in hardware. The combining method reconstructed the full model even with 50% of points missing. Simulation studies demonstrate robustness to input noise, changes in sensed data, and changes in device resistance [5].

Recent advances in neuroscience and nanoscale electrical engineering have led to increased interest in the brain and the use of nanoscale material storage as synaptic elements in electronic devices. Using phase-change cells to enable Hebbian learning, the synaptic network can store presented patterns and recall relationships as well as patterns, much as the brain does [6].

This article illustrates recent advances in online learning algorithms for artificial neurons and neuromorphic platforms using analog non-volatile memory (aNVM) to run algorithms efficiently, focusing on their use as electrical synapses. Design considerations such as the needs of different devices, multiple states, power consumption, connection phase, cabling power, fan-in/fan-out, and IR loss are presented, along with issues in using NVM synapses. Future research directions and integration issues are noted. Algorithms based on spiking neural networks should enable energy-saving real-time learning, but device variations during the learning loop can affect learning. Our analysis shows that wiring is more important in terms of power considerations, especially in large systems [7].

The multi-level functionality of metal oxide resistive switching memory has been investigated to assess its potential for use as a synaptic device. TiN/HfOx/AlOx/Pt resistive switching cells have been developed. By changing the amplitude of the programming voltage during the pulse, a multi-level resistive state is achieved. Using the pulse amplitude to gradually vary resistance and demonstrate timing-dependent plasticity rules shows the potential of metal oxide memories as electronic synaptic devices for neuromorphic computer systems [8].

This article discusses the potential of brain-inspired neuromorphic computing to create efficient electronic devices for data processing. It introduces the problems and theories of neuromorphic physics, including algorithm and hardware design, developments in miniature devices, and the creation of large networks. The human brain is a dynamic, reconfigurable, complex system and exhibits fascinating phenomena such as energy or entropy minimization, phase transitions and criticality, self-oscillations, and stochastic resonances. Hopfield networks, Boltzmann machines, and spin systems are the best-known examples, but many other systems based on non-linear dynamics have been proposed for computation. The article presents current and surprising results from both approaches. The brain differs from computers both in its algorithms and in its physical implementation [9].

Neuromorphic computing encompasses a variety of computing techniques inspired by neurobiology. It started in the 1980s and was influenced by advances in VLSI technology. Although the brain and the computer operate as similar information processing and control systems, their operating principles are completely different. Although neuroscience has made significant advances in the past century, the fundamentals of information processing in the brain are still far from understood [10].

Neuromorphic computing is a branch of computing that aims to simulate the neural networks and synaptic connections of the human brain to better process information. The concept behind neuromorphic computing began with the work of Caltech's Carver Mead in the late 1980s, and the size and functionality of neuromorphic devices have grown steadily since then. Despite similarities in their roles as information processing and control systems, brains are better than computers at handling new, complex, and ambiguous tasks such as facial recognition and other artificial intelligence workloads. The ability to use brain-inspired

computing extends beyond computers with stored programs [11].

This review explores various comparative studies analyzing hardware implementations, focusing on aspects such as efficiency, scalability, and performance. These studies explore the use of memristors, spintronics, and other novel materials to create efficient neuromorphic circuits. As the field evolves, it might open the door to the creation of smarter, faster, and more energy-efficient machines capable of tackling complex tasks previously unimaginable.
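Several of the memristor studies compared below rely on crossbar arrays performing analog vector-matrix multiplication: row voltages encode the input vector, each device's conductance encodes a weight, and the current collected on each column is a dot product. A minimal sketch of that computation follows; all voltage and conductance values are illustrative, not taken from any device in the studies.

```python
def crossbar_output(voltages, conductances):
    """Column currents of an ideal crossbar: I_j = sum_i V_i * G[i][j].

    Ohm's law gives each device's current V_i * G[i][j]; Kirchhoff's
    current law sums those currents along each column, so every column
    computes one dot product in a single analog step.
    """
    n_cols = len(conductances[0])
    return [sum(v * row[j] for v, row in zip(voltages, conductances))
            for j in range(n_cols)]

# Illustrative values: 3 input rows, 2 output columns.
V = [0.2, 0.0, 0.1]              # applied row voltages (volts)
G = [[1.0, 2.0],                 # device conductances (arbitrary units)
     [0.5, 1.0],
     [2.0, 0.0]]
print(crossbar_output(V, G))     # → [0.4, 0.4]
```

The appeal reported in these studies is that one analog read replaces the O(rows x columns) multiply-accumulate loop a digital processor would execute, which is where the power-efficiency claims originate.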

TABLE I. COMPARISON TABLE

• "A Comparative Study of Hardware Implementations for Spiking Neural Networks" — Liu, S., et al. (2022). Focus: efficiency and scalability. Method: hardware comparison (memristors, CMOS, etc.). Key findings: analyzes power efficiency and scalability of different hardware platforms for SNNs.
• "A Survey of Hardware Architectures for Implementing Neural Networks" — Liu, Z., et al. (2020). Focus: performance and scalability. Method: hardware comparison (GPUs, FPGAs, neuromorphic chips). Key findings: compares performance and scalability of various hardware platforms for implementing neural networks.
• "Neuromorphic Computing with Non-Volatile Memory" — Zhang, W., et al. (2018). Focus: efficiency and performance. Method: hardware and material exploration. Key findings: investigates the use of non-volatile memory technologies for efficient neuromorphic computing.
• "Learning and Classification with Memristor-Based Neuromorphic Circuits" — Prezioso, M., et al. (2015). Focus: accuracy and learning. Method: hardware implementation. Key findings: demonstrates memristor-based neuromorphic circuits for learning and classification tasks.
• "Event-Driven Neuromorphic Computing with Analog Spintronics" — Liu, Z., et al. (2019). Focus: efficiency and speed. Method: hardware exploration. Key findings: explores spintronics-based neuromorphic systems for low-power and high-speed computing.
• "Neural Coding in Spiking Neural Networks: A Comparative Study for Robust Neuromorphic Systems" (2021). Accuracy/efficiency: varies by coding scheme, high especially with TTFS coding. Key findings: compares the effects and effectiveness of four significant neural coding schemes.
• "The Looming Need for Benchmarking Hardware for Deep Learning" — Jouppi, N., et al. (2017). Focus: performance and efficiency comparison. Method: benchmarking different hardware. Key findings: emphasizes the need for standardized benchmarks to compare performance and efficiency of hardware for deep learning.
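Section IV notes that SNNs exchange binary spikes ("1" or "0") rather than continuous activations. The basic unit of such networks, the leaky integrate-and-fire neuron, can be sketched in a few lines; the leak factor and threshold below are illustrative constants, not values from any system discussed in this review.

```python
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Leaky integrate-and-fire neuron over a sequence of input currents.

    At each time step the membrane potential decays by `leak`, then
    accumulates the input. Crossing `threshold` emits a spike (1) and
    resets the potential; otherwise the neuron stays silent (0).
    """
    v = 0.0
    spikes = []
    for current in inputs:
        v = v * leak + current
        if v >= threshold:
            spikes.append(1)   # spike event
            v = 0.0            # reset after firing
        else:
            spikes.append(0)   # no event: nothing to transmit
    return spikes

print(lif_neuron([0.6, 0.6, 0.0, 0.3, 0.9]))   # → [0, 1, 0, 0, 1]
```

Because such a neuron produces output, and hence consumes communication energy, only when it spikes, event-driven hardware running many of these units in parallel is the basis of the low-power behavior cited throughout the review.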

VI. CHALLENGES AND FUTURE SCOPE

Neuromorphic computing, while promising, faces many challenges and limitations:

• Complexity: The complexity of neuromorphic systems can make them difficult to implement.
• Lack of standards: Neuromorphic computing is a new field and there are no standards for measuring its performance.
• Limited applications: Applications of neuromorphic computing are still under investigation, and it may not be suitable for all types of computing work.
• High cost: Neuromorphic devices can be expensive to create and use.
• Accuracy and precision: Neuromorphic systems have lower accuracy and precision compared to conventional neural networks.

VII. CONCLUSION

Neuromorphic computing is a potentially revolutionary form of computing that draws inspiration from the architecture and operations of the human brain. This review provides an overview of neuromorphic computing, discussing its history, development, algorithms, concepts, applications, and future directions. Inspired by the remarkable capacities of the human brain, the field represents a fundamental paradigm shift in artificial intelligence, machine learning, and data processing. This review explores the fundamental principles of the technology, its potential benefits, and ongoing efforts to overcome current limitations, and it highlights the important benefits promised by neuromorphic systems. Their low-power operation is particularly suitable for perception and learning tasks in resource-constrained settings. In summary, neuromorphic computing is at the forefront of a technological revolution. Inspired by the human brain, this field holds

great promise for the future of artificial intelligence and
computing. By solving current problems and supporting
ongoing research, we can pave the way to develop smarter,
faster, and more efficient machines and create future life
through innovation.

REFERENCES
[1] D. Kuzum, S. Yu, and H. P. Wong, "Synaptic electronics: materials,
devices, and applications," Nanotechnology, 2 September 2013.
[Online]. Available: https://doi.org/10.1088/0957-
4484/24/38/384003.
[2] G. W. Burr et al., "Neuromorphic computing using non-volatile
memory," Advances in Physics: X, vol. 2, pp. 124-189, 2017.
[3] S. Yu, "Neuro-Inspired Computing With Emerging Nonvolatile
Memorys," Proceedings of the IEEE, vol. 106, pp. 260-285, 2018.
[4] S. B. Eryilmaz et al., "Neuromorphic architectures with electronic
synapses," 2016 17th International Symposium on Quality
Electronic Design (ISQED), pp. 118-123, 2016.
[5] S. Yu et al., "An Electronic Synapse Device Based on Metal Oxide
Resistive Switching Memory for Neuromorphic Computation,"
IEEE Transactions on Electron Devices, vol. 58, pp. 2729-2737,
2011.
[6] S. Gupta, R. Saxena, A. Bansal, K. Saluja, A. Vajpayee, and Shikha,
"A case study on the classification of brain tumour by deep learning
using convolutional neural network," AIP Conf. Proc. 2782, 020027,
2023. [Online]. Available: https://doi.org/10.1063/5.0154417.
[7] P. Gahelot, P. K. Sarangi, M. Saxena, J. Jha, A. Vajpayee, and A. K.
Sahoo, "Hog Features Based Handwritten Bengali Numerals
Recognition Using SVM Classifier: A Comparison with Hopfield
Implementation," 2022 IEEE (CCET), Bhopal, India, 2022, pp. 1-6.
[Online]. Available: doi: 10.1109/CCET56606.2022.10080015.
[8] R. Bhandari, A. Vajpayee, R. Kumar, and D. Sihag, "Design and
Analysis of Novel searching pattern in motion estimation used for
video compression," 2023 14th (ICCCNT), Delhi, India, 2023, pp.
1-5. [Online]. Available: doi:
10.1109/ICCCNT56998.2023.10306376.
[9] S. Liu, Y. Li, S. Zeng, L. Pan, J. Wang, and X. Hu, "A Comparative
Study of Hardware Implementations for Spiking Neural Networks,"
In Proceedings of the 59th Annual Design Automation Conference,
pp. 1-6. [Online]. Available: doi: 10.1145/3585032.3585222.
[10] Y. Boureau, A. Habibi, C. Jeribi, and Y. Bengio, "A Survey of
Neuromorphic Computing and Neural Network Hardware
Acceleration," Proceedings of the IEEE, vol. 108, no. 11, pp. 1821-
1846, 2020. [Online]. Available: doi:
10.1109/JPROC.2017.2753909.
[11] A. Shrestha and A. Poudel, "A Review of Spiking Neural Networks
for Deep Learning," Computers, vol. 7, no. 7, p. 41, 2018. [Online].
Available: doi: 10.3390/computers7070041.
[12] Z. Liu, Y. Cai, Y. Luo, X. Wang, L. Yang, S. Li, and J. Sun, "A
Survey of Hardware Architectures for Implementing Neural
Networks," Journal of Systems Architecture, vol. 108, p. 110432,
2020. [Online]. Available: doi: 10.1016/j.sysarc.2020.110432.
[13] W. Zhang, R. Li, N. Xu, Z. Li, Z. Liu, S. Yu, and Y. Wang,
"Neuromorphic Computing with Non-Volatile Memory," China
Technological Sciences, vol. 61, no. 8, pp. 1143-1155, 2017.
[Online]. Available: doi: 10.1007/s11431-017-9226-6.
[14] M. Prezioso, F. Bayat, K. K. Likharev, and D. B. Strukov, "Learning
and Classification with Memristor-Based Neuromorphic Circuits,"
Nature Communications, vol. 6, p. 8102, 2015. [Online]. Available:
doi: 10.1038/ncomms8102.
[15] Z. Liu, W. Li, Y. Sun, Y. Zhou, F. Pan, J. Hao, and Wan, "Event-
Driven Neuromorphic Computing with Analog Spintronics," Nature
Nanotechnology, vol. 14, no. 3, pp. 287-292, 2019. [Online].
Available: doi: 10.1038/s41565-018-0382-2.
[16] H. Li, Y. Chen, Z. Xu, W. Zhao, S. Yao, Y. Wang, and H. Yang, "A
Comparative Survey of Approximate Computing Techniques for
Neuromorphic Systems," Frontiers in Computer Science, vol. 14, no.
3, pp. 321-340, 2020. [Online]. Available: doi:
10.3389/fcomp.2020.00321.
[17] N. Jouppi, C. Young, N. Patil, D. Patterson, M. Ggemm, and M. R.
Hagerty, "The Looming Need for Benchmarking Hardware for Deep
Learning," Communications of the ACM, vol. 60, no. 6, pp. 54-66,
2017. [Online]. Available: doi: 10.1145/3093333.

