What are the differences in detail between ATmega 16 and 8051 microcontroller, which of these two is better?

Both are microcontrollers, so fundamentally they serve the same purpose. What sets them apart is how, and for what, you want to use them.

The 8051 is a very good microcontroller for learning mainstream embedded systems (or at least what they teach in colleges). It is powerful enough to run most of your projects, and nowadays there are advanced variants as well (e.g. the S8051XC3).

But comparing the original 8051 (or, even better, the AT89C51RD2, which is optimized for faster code execution in X2 mode) with the ATmega16: the ATmega16 is all about high-speed prototyping, getting your project running in the simplest and fastest way with the lowest possible part count.

The ATmega16 wins on many counts:

  • With the ATmega16 you can get your project up and running in the least possible time with a minimum part count, whereas with the 8051 you will, right from the start, have to deal with pull-up resistors and other external components the 8051 needs just to run properly in the first place.
  • The ATmega16 has a RISC instruction set, with most instructions executing in a single clock cycle, giving faster code execution, while the 8051 uses a slower CISC design whose instructions take multiple machine cycles to execute.
  • The ATmega16 has loads of on-chip peripherals: timers (both 8-bit and 16-bit), an 8-channel 10-bit ADC, an I2C (TWI) bus, an SPI bus, a UART interface, and a watchdog timer, whereas the original 8051 has only two timers plus a UART, and even better variants like the RD2 add only SPI.
  • The ATmega16 has more code memory and RAM than the 8051.
  • The ATmega16 is simple to program, and the supporting programming hardware is also easy to learn and use.
And it is not just the ATmega16: the whole AVR series scores well above the 8051 family. (Look, I am not here to take sides; there does exist an 8051 variant, the C8051F120, that beats the ATmega16 hands down in both speed and peripheral count.)

The ATmega16 hardware differs in many respects from the 8051, but in some cases it's an easy replacement. Some of the earlier AVRs, the AT90S8515 for example, were pin compatible with the 8051, and for them it was a drop-in replacement, the only differences being the polarity of the RESET and that PORT0 (PORTA on the AVR) wasn't open collector. It is not difficult to convert software. C programs can be recompiled with few changes, and assembler programs can be translated line by line once you have a little knowledge of the architectural differences.

One big difference is that the AVR is much faster. It executes most instructions in a single clock cycle, as against 12 for a standard 8051 or 6 for one of the high-speed variants. If you're converting an existing project, it's really important to take this into account, or all the timing will be wrong. Further, AVRs have an internal calibrated clock option, so in many cases you don't need a crystal and you gain two extra port pins.

The 8051 can address external memory, and also execute from it. Some AVRs can address external RAM (the mega16 is not one of them), but no AVR can execute code from external memory. AVRs mostly have a lot more internal RAM than the 8051, and a real stack and stack pointer. They have 32 registers, each of which behaves like the 8051's accumulator, i.e. it can be the destination of an arithmetic operation. They have three 16-bit pointer registers. A lot of this doesn't matter if you program in C, because you're abstracted from the hardware, but you will certainly notice the edge in code size and performance.

Oh, and you can get an in-circuit emulator for about $50. Does one even exist for the 8051?


If you have any questions or doubts, feel free to ask here.