History of Microcontrollers
The 8051 microcontroller was developed by Intel in 1980. The 8051 is one of the most popular 8-bit microcontrollers and combines an instruction set that allows tight coding of small, particularly I/O, applications with enough power and a large enough program space that it can be programmed in the C language.
The last decade has seen an exciting evolution in the capabilities of microprocessors. The development of 16- and 32-bit microprocessors contributed to the growth of powerful personal computers that are used in all walks of life. Because of their processing power and speed, these 16- and 32-bit microprocessors have also found their way into the design of standalone products, such as electronic instruments, which require sophisticated control capability. In the evolution of microprocessor capability, instead of focusing upon larger word widths and address spaces, the present emphasis is upon exceedingly fast real-time control. The development of microcontrollers has focused upon integrating the facilities needed to support fast control into a single chip.
Intel introduced the standard 8-bit 8048 microcontroller in 1976, and the same company has continued to drive the evolution of single-chip microcontrollers. In 1980, Intel introduced the 8051 microcontroller, with higher performance than the 8048. With the advantages of the 8051, microcontroller applications grew rapidly, and the 8-bit 8051 family quickly gained the position of the second-generation world-standard microcontroller.
Because of advances in semiconductor technology, it became possible to integrate more than 100,000 transistors onto a single silicon chip. Intel made use of this advanced process technology and developed a new generation of single-chip 16-bit microcontrollers called the MCS-96 (8096 family). The 8096 family offers the highest level of system integration ever achieved on a single-chip microcontroller, with 120,000 transistors. The 8096 microcontroller has a 16-bit CPU, 8K bytes of program memory, 232 bytes of data memory, and both analog and digital I/O features.
The Motorola microcontroller family was first introduced to the market in 1978 and is built on the same pattern as the 6800 microprocessor. Even though the 6801 microcontroller family was designed along the lines of the 6800 microprocessor, its design and instruction set were modified to suit control applications.
The 6801 microcontroller family includes on-chip input/output ports, an asynchronous serial communication device, and 16-bit timer modules. The 6801, 6803, 6805, and 6811 microcontrollers are available from Motorola. The 6811 microcontroller family has different versions with ROM, RAM, EPROM, and EEPROM; these versions are denoted by suffix characters and numbers.
Although it’s not essential that you understand how microcontrollers developed to the point where they
are today, it’s an interesting story, which can help you understand where an AVR microcontroller fits into
the overall hierarchy of information technology (IT) and electronics products. More important, by having
such an understanding you can make better choices and decisions about when and where to use a
microcontroller, in preference to other alternatives. If you open up a CD player or a VCR from the 1980s
(perhaps you have one in the attic, or in your garage, I know I do!) you will find that they are absolutely
stuffed with circuit boards, and that each circuit board is densely populated with integrated circuits (chips)
and components that made the thing work. By contrast, open up a DVD player made in the last few years
and you are likely to find quite a lot of empty space, and just one quite small circuit board that contains
perhaps two or three quite large chips and a handful of other components. Yet, it’s probable that the
modern device offers far better quality and robustness. It will certainly offer massively more product
features and options than its 1980s predecessor.
Until the mid-1980s, most electronic products were still built using extremely intricate and clever combinatorial logic circuits, implemented with an awful lot of chips! Starting in the early 1980s, a minority of manufacturers started to build microprocessors into their products in order to reduce chip count, which brought down manufacturing costs and thus reduced end-user prices. The earliest 8-bit
microprocessors such as the Intel 8080 or the Zilog Z80 first appeared toward the late 1970s and were a
significant advance on what had gone before. Engineers and designers soon realized that once you put a
microprocessor into a device, you could not only make it do much more, but you could also update it much
more cheaply if defects or flaws in the original design came to light. Many product defects could now be
addressed by using semiskilled labor to plug in a replacement firmware ROM (read-only memory) (this
was in the days before programmable flash memory) rather than having to use skilled labor to expensively
rework or replace thousands of complete circuit boards. As the 1980s wore on, more and more products
had a microprocessor at their core. Even though microprocessors were a huge improvement on what they
replaced, they weren’t a complete magic bullet for bringing down costs and complexity of product design.
The problem was that, to make a microprocessor do anything useful, it had to be surrounded by a large
number of additional chips for input/output (I/O), and it usually needed other support chips too, such as
real-time clock chips and address decoders. By the 1990s, improved silicon processing and chip
manufacturing techniques resulted in the ability to put ever more circuitry on one chip. One of the ways
this was used was to augment the microprocessor chip with additional functions and features that had
previously been implemented by separate external chips. To differentiate these new super-microchips
from their simpler forebears, these came to be called microcontrollers.
Some examples of functions that moved from being external chips to being part of the microcontroller are:
• Serial ports to enable the subsystem to talk to a desktop computer or other RS232 port-equipped devices.
• Timers to enable the microcontroller to have an accurate time reference on chip and to carry out events
at accurate preset intervals. These timers also enabled microcontrollers to generate music and sounds,
since interval accuracy could be assured.
• Serial digital channels to enable microcontrollers to chat with one another, over just two linking wires.
• Analog-to-digital converters allowing a microcontroller system to sense analog signals and store or
process them as digital data.
• Digital-to-analog converters that allowed microcontrollers to interface with external devices, like motors,
that need a continuously variable voltage.
• Input ports for sensing on/off states of things in the outside world.