Computers as we know them today trace back to the 19th century and a mathematician named Charles Babbage, whose difference and analytical engines made automatic computing possible. It was his designs that laid the groundwork for modern computers. That said, the history of computing spans more than 2,500 years, but more on that later.
Throughout history, each generation has advanced computers in ways we might not have imagined, steadily increasing the technology's importance in our lives and businesses. What we have today wouldn't be possible without the people who came before us. Advancements in computer technology have changed lives, given industries a competitive edge, and enhanced our daily activities.
Enter Generation #1
Between 1943 and 1958, the first commercial computers took off, a milestone being the delivery of the UNIVAC I (Universal Automatic Computer I) to the U.S. Census Bureau in 1951.
UNIVAC I | IMAGE CREDIT: WIKIPEDIA
The major differentiator between this generation and its successors is that vacuum tubes were used as internal computer components. The tubes were about 5 – 10 centimeters long, so computers had to occupy far more space. The result was huge machines that cost a great deal to obtain, and to maintain whenever tubes failed.
Generation #2 (1959 – 1964)
Between 1959 and 1964, computers adopted the transistor, invented at Bell Labs in 1947. Transistors could perform many of the same tasks as vacuum tubes while occupying far less space, thereby reducing the size of computers. Not only did they shrink computers, they were also much faster, more reliable, and needed less electricity.
This generation also saw the development of symbolic, or assembly, language. It allowed programmers to specify instructions in words, which the computer then translated into something it understood: binary code, a series of 0s and 1s.
This in turn fostered the development of higher-level languages like FORTRAN and COBOL.
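To make the idea of translating word-based instructions into binary concrete, here is a minimal sketch of an assembler in Python. The mnemonics and opcodes (`LOAD`, `ADD`, `STORE`) are an invented toy instruction set for illustration only, not any historical machine's:

```python
# Toy instruction set: a hypothetical mapping of word-based mnemonics
# to 4-bit binary opcodes, illustrating what an assembler does.
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

def assemble(line):
    """Translate one symbolic instruction into a binary string."""
    mnemonic, operand = line.split()
    # 4-bit opcode followed by a 4-bit operand address
    return OPCODES[mnemonic] + format(int(operand), "04b")

program = ["LOAD 2", "ADD 3", "STORE 4"]
machine_code = [assemble(line) for line in program]
print(machine_code)  # ['00010010', '00100011', '00110100']
```

Real second-generation assemblers worked on the same principle, just with each machine's own opcodes and word sizes.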
Generation #3 (1965 – 1970)
The integrated circuit (IC), invented in the late 1950s, came into widespread use during this period. It allowed engineers to pack a complete circuit with hundreds of components onto a single silicon chip 2 or 3 mm square.
This made computers much smaller, more powerful, and affordable to the general public. Computers also became even faster, because a smaller machine meant electrical currents had shorter distances to travel.
Also, for the first time, computers could run multiple programs at the same time, enabling multitasking.
1971 – Present Day
Naturally, computers continued to evolve.
Existing technologies improved in power and reliability. During the 1970s, large-scale integration made it possible to place the equivalent of thousands of integrated circuits on a single silicon chip.
In the past, most computers were serial devices with a single processor. Today, parallel computers, which contain multiple processors, reign supreme.
Of course, the history of computers can't be summed up in a single article without supporting details like the following infographic. Enjoy!
Thank you for reading!