2. OVERVIEW
• What is neuromorphic computing?
• How does it work?
• Biological neuron vs. artificial neuron
• Impact on solving today's problems
• Comparison between traditional AI and neuromorphic computing
• Future predictions / applications
• Limitations
3. WHAT IS NEUROMORPHIC COMPUTING?
• Neuromorphic computing is based on the principles of neural networks and aims to develop hardware and software systems that mimic the architecture and function of the human brain.
• The field is inspired by the human brain and neural network principles, and has recently gained prominence due to advances in Artificial Intelligence.
4. HOW DOES IT WORK?
• Neuromorphic computing circuits combine digital and analog components to emulate the way biological neurons and synapses interact with one another.
• They employ a distinctive, high-performance architecture.
• The spiking neural network (SNN) is the prevalent model implemented in neuromorphic hardware (see the sketch below).
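Since spiking neurons communicate through discrete spikes rather than continuous activations, a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic building block of an SNN, illustrates the idea. This is written in Python, and the parameter values (threshold, leak, reset) are illustrative assumptions for demonstration; they do not correspond to any specific neuromorphic chip.

def simulate_lif(input_current, threshold=1.0, leak=0.9, v_reset=0.0):
    """Sketch of a leaky integrate-and-fire neuron.

    Integrates input current over time, leaks charge each step, and emits
    a spike (1) whenever the membrane potential crosses the threshold,
    then resets. Parameter values are illustrative assumptions.
    """
    v = v_reset                      # membrane potential
    spikes = []
    for current in input_current:
        v = leak * v + current       # leaky integration of the input
        if v >= threshold:           # threshold crossing -> emit a spike
            spikes.append(1)
            v = v_reset              # reset the potential after spiking
        else:
            spikes.append(0)
    return spikes

if __name__ == "__main__":
    # A constant weak input: charge accumulates until the neuron fires,
    # so the output spike train is periodic.
    drive = [0.3] * 20
    print(simulate_lif(drive))

With this constant input, the potential builds up over a few steps and the neuron fires periodically; information is carried by the timing and rate of such spikes rather than by continuous activation values, which is what lets neuromorphic hardware stay idle, and save power, between spikes.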
6. IMPACT ON SOLVING TODAY'S PROBLEMS
• Neuromorphic systems can revolutionize AI computing by allowing AI algorithms to run at the edge instead of in the cloud, thanks to their:
  • smaller size and
  • low power consumption
• They can also adapt to their environment.
7. TRADITIONAL AI VS NEUROMORPHIC COMPUTING
TRADITIONAL AI
• Uses a von Neumann architecture
• Learns from labelled data
• Runs on conventional computer systems
NEUROMORPHIC COMPUTING
• Uses a non-von Neumann architecture
• Learns from unsupervised data
• Systems closely mimic the human brain
8. APPLICATIONS
• Microcombs
• Chip manufacturing
• Driverless cars
• Drones
• Robots
• Smart home devices
• Natural language, speech and image processing
• Data analytics
9. BENEFITS
• A powerful tool for solving problems that are currently beyond our grasp
• Enhances the learning capabilities of devices, making them smarter and more efficient
• Mimics the neuron spiking behaviour of biological nervous systems
10. LIMITATIONS
• Limited accessibility for non-experts
• Ethical and legal issues surrounding sentient machines
• Constraints imposed by our current understanding of human cognition
• Limited accuracy
• Lack of standardized benchmarks
• Underdeveloped software and algorithms