
History of Computer: Charles Babbage, Early History, Different Generations, and Brief History

Nikita Parmar

Updated on 18th August, 2023 , 9 min read

History of Computer Overview

Prehistoric humans invented the earliest counting instruments, using sticks, stones, and bones to count. As human intellect and technology advanced, more capable computing devices were produced. While we now use computers for both work and entertainment, the computer was originally designed for a very different purpose. Computers are commonly classified into five generations, each generation bringing smaller, more powerful, more reliable, and less expensive systems.

What is a Computer?

It is essential to understand the history of computers while studying the many elements of computing: it helps us comprehend the evolution and advancement of technology across time, and it is also a topic covered in competitive and banking exams. That history is commonly traced to Charles Babbage, who designed the Analytical Engine, a type of general-purpose computer.

Who Invented the Computer?

Charles Babbage, an English mathematician and inventor, is credited with developing the first automatic digital computer. Babbage drew up plans for the Analytical Engine in the mid-1830s.


Read more about Who Invented Computer?

History of Computer

The term 'computer' has an intriguing origin. It was first used in the early 17th century to describe a person who computed, that is, someone who performed calculations, and it kept that meaning into the twentieth century, when women were employed as human computers to carry out all kinds of calculations. By the late nineteenth century, the term was also being applied to machines that performed computations. Today it generally denotes programmable digital devices that run on electricity.


Early History of Computer

Machines have been employed for computation for thousands of years, since the early development of humanity. Charles Babbage, often called the father of the computer, began creating the first mechanical computer in 1822. Then, in 1833, he designed the Analytical Engine, a general-purpose computer that included an ALU, basic flow-control concepts, and the notion of integrated memory. More than a century later, we got the first general-purpose electronic computer, known as the ENIAC (Electronic Numerical Integrator and Computer), created by John W. Mauchly and J. Presper Eckert. As technology advanced, computers became smaller and faster at processing information; Adam Osborne and EPSON presented the first laptops to us in 1981.


Read more about the First Computer in the World

History of Computer: Different Generations

The stages in the development of modern computers are referred to as the generations of computers. We are now using computers of the fifth generation. The following summarizes the key aspects of these five computer generations:

1st Generation (1946-1959)

The initial generation of computers was sluggish, large, and costly. Vacuum tubes were employed as the primary components of the CPU and memory. These computers relied heavily on batch operating systems and punch cards, with paper tape used for input and magnetic tape for output.

Examples: EDVAC (Electronic Discrete Variable Automatic Computer), ENIAC (Electronic Numerical Integrator and Computer), UNIVAC I (Universal Automatic Computer), IBM-650, IBM-701

2nd Generation (1959-1965)

The second generation was the transistor era. Transistors were inexpensive, small, and required less power, making second-generation computers quicker than their first-generation predecessors. Magnetic cores were employed as primary memory, while magnetic disks and tapes served as secondary storage. These computers used assembly language and high-level programming languages such as COBOL and FORTRAN, along with batch processing and multiprogramming operating systems.

Examples: IBM 1620, IBM 7094, CDC 1604, CDC 3600, UNIVAC 1108

3rd Generation (1965-1971)

Third-generation computers employed integrated circuits (ICs) instead of transistors. A single IC can contain a large number of transistors, which increased computing power while decreasing cost. These machines used remote processing, time-sharing, and multiprogramming operating systems, along with high-level programming languages such as FORTRAN II to IV, COBOL, PASCAL, PL/1, and ALGOL-68.

Examples: IBM-360 series, Honeywell-6000 series, PDP (Programmed Data Processor) series, IBM-370/168, TDC-316

4th Generation (1971-1980)

The fourth generation of computers featured very large-scale integration (VLSI) circuits: chips containing thousands of transistors and other circuit elements. These chips made fourth-generation computers more compact, powerful, quick, and economical. These computers used real-time, time-sharing, and distributed operating systems.

Examples: DEC 10, STAR 1000, PDP 11, CRAY-1 (supercomputer), CRAY X-MP (supercomputer)

5th Generation (1980-present)

VLSI technology was superseded by ULSI (Ultra Large-Scale Integration) in fifth-generation computers, enabling the manufacture of microprocessor chips with ten million or more electronic components. This generation introduced parallel processing hardware and AI (Artificial Intelligence) software, and uses programming languages such as C, C++, Java, and .NET.

Examples: desktops, laptops, notebooks, ultrabooks, Chromebooks


Read more about When was the Computer Invented?

Brief History of Computer

The computer's creators had to recognize that their invention was more than just a number cruncher or a calculator, and they had to deal with all the challenges of conceiving such a machine, executing the concept, and actually manufacturing it. The history of computers is the story of how these problems were solved.

History of Computer: 19th Century

1801: In 1801, a weaver and merchant from France named Joseph Marie Jacquard invented a loom that used perforated wooden cards to mechanically weave fabric motifs.

1822: In 1822, mathematician Charles Babbage began work on a steam-powered calculating engine capable of computing tables of numbers. The "Difference Engine" project ultimately failed because the manufacturing technology of the time could not produce its parts to the required precision.

1843: Ada Lovelace, often regarded as the first computer programmer, provides a step-by-step explanation of how Babbage's Analytical Engine could be used to compute Bernoulli numbers.

1890: Inventor Herman Hollerith developed a punch card system to help tabulate the 1890 U.S. census. He went on to start the company that would become IBM.


History of Computer: 20th Century 

1930: Vannevar Bush devised and built the first large-scale automated general-purpose mechanical analog computer, the Differential Analyzer.

1936: Alan Turing devised the Turing machine, a universal machine that could calculate anything that could be computed.

1939: Bill Hewlett and David Packard founded Hewlett-Packard in a garage in Palo Alto, California, in 1939.

1941: Konrad Zuse, a German inventor and engineer, finished his Z3 machine, the world's first working programmable, automatic digital computer, in 1941. The machine, however, was destroyed during a World War II bombing raid on Berlin. Around the same time, J.V. Atanasoff and graduate student Clifford Berry built a computer that could solve 29 equations simultaneously; it was the first machine to store data in its main memory.

1945: John Mauchly and J. Presper Eckert of the University of Pennsylvania completed the ENIAC (Electronic Numerical Integrator and Computer) in 1945. It was Turing-complete and, through reprogramming, capable of solving "a vast class of numerical problems," earning it the moniker "Grandfather of Computers."

1946: Work began on the UNIVAC I (Universal Automatic Computer), the first general-purpose electronic digital computer designed in the United States for business purposes; it was delivered in 1951.

1949: A team at the University of Cambridge develops the Electronic Delay Storage Automatic Calculator (EDSAC), which is the "first practical stored-program computer."

1950: In Washington, DC, the Standards Eastern Automatic Computer (SEAC) was constructed, making it the first stored-program computer in the United States.


History of Computer: Late 20th Century

1953: Grace Hopper, a computer scientist, develops an early compiler-based computer language that allowed a computer user to give instructions to the computer in English-like words rather than numbers; her work later led to COBOL (Common Business-Oriented Language).

1954: The FORTRAN programming language, an abbreviation of FORmula TRANslation, was developed in 1954 by John Backus and a team of IBM programmers. That year, IBM also introduced the IBM 650.

1958: Jack Kilby and Robert Noyce independently invented the integrated circuit, commonly known as the computer chip.

1962: The Atlas computer makes its debut in 1962. It was the fastest computer in the world at the time, and it pioneered the concept of "virtual memory."

1964: Douglas Engelbart demonstrates a prototype of the modern computer, featuring a mouse and a graphical user interface (GUI), in 1964.

1969: Bell Labs developers led by Ken Thompson and Dennis Ritchie unveiled UNIX, an operating system that was later rewritten in the C language, making it portable across different computer platforms.

1970: Intel unveils the Intel 1103, the first commercially available Dynamic Random-Access Memory (DRAM) chip.

1971: Alan Shugart and a team of IBM engineers developed the floppy disk in 1971. In the same year, Xerox created the first laser printer, which not only went on to make billions of dollars but also signaled the start of a new era in computer printing.

1973: Robert Metcalfe, a member of Xerox's research department, invented Ethernet in 1973, which is used to link multiple computers and other devices.

1976: Steve Jobs and Steve Wozniak establish Apple Computer, introducing the world to the Apple I, the first computer with a single circuit board.

1977: Jobs and Wozniak introduced the Apple II at the first West Coast Computer Faire in 1977. It featured color graphics and used a cassette drive for storage.

1978: VisiCalc, the first computerized spreadsheet program, is released in 1978.

1979: WordStar, a word processing program from MicroPro International, is published in 1979.

1983: The CD-ROM, which could hold 550 megabytes of pre-recorded data, was released in 1983. This year also saw the introduction of the Gavilan SC, the first portable computer with a flip-form design and the first to be sold as a "laptop."

1984: Apple introduced the Macintosh with a Super Bowl XVIII ad in 1984. It was priced at $2,500.

1985: Microsoft releases Windows, which allows for multitasking through a graphical user interface. The C++ programming language is also released.

1990: Tim Berners-Lee, an English programmer and scientist, developed HyperText Markup Language, better known as HTML, in 1990. He also coined the term "World Wide Web," which encompassed the first browser, a server, HTML, and URLs.

1996: Sergey Brin and Larry Page developed the Google search engine at Stanford University in 1996.

1998: Apple debuts the iMac, an all-in-one Macintosh desktop computer, in 1998. These $1,300 PCs had a 4GB hard disk, 32 MB of RAM, a CD-ROM drive, and a 15-inch display.

1999: Wi-Fi, whose name is popularly (though unofficially) glossed as "wireless fidelity," is introduced in 1999, with an initial range of up to 300 feet.


History of Computer: 21st Century

2000: In the year 2000, the USB flash drive was first released. For data storage, flash drives were faster and offered greater capacity than the alternative storage media of the time.

2001: Apple introduces Mac OS X, subsequently renamed OS X and then just macOS, as the replacement for the traditional Mac Operating System.

2004: Facebook originated in 2004 as a social networking website.

2005: Google purchases Android, a Linux-based mobile phone operating system.

2006: In 2006, Apple released the MacBook Pro. The Pro was the first dual-core, Intel-based mobile computer from the firm.

2009: Microsoft introduced Windows 7 in 2009.

2011: Google releases the Chromebook, which runs Google Chrome OS, in 2011.

2014: The world's smallest computer, the University of Michigan Micro Mote (M3), was built in 2014.

2015: Apple Watch is introduced in 2015. Microsoft also introduced Windows 10.

2016: The world's first reprogrammable quantum computer is constructed in 2016.

