
Brief History of Computers Summary

So what makes a computer? For companies, it could be a specific application or feature that helps them accomplish their goals and reach their customers. For teens, it could be a device with good speakers and programs for writing papers. For parents, it's a place to shop online and the like. And for little children, it just needs some entertaining games stored inside this technological wonder for them to have fun. The history of computers is complex and chock-full of setbacks, miraculous discoveries, and determined people ready to create a source of information that everyone across the world could reach at any time. Like all things, it begins with an idea.

The very first computers didn't operate electronically, nor by signals of any shape or form. Beginning around 400 BCE, the Chinese used the abacus, an ancient adding machine you can sometimes still find in a child's playroom, where you slide little beads along horizontal rods to work the fundamentals of math. These early computing machines are not considered "computers" in the sense of electronic computing, but they are technological devices all the same. Maybe you've heard of Leonardo da Vinci, the artist behind the Mona Lisa?

Well, some of his inventions involving gears began to bring the idea of a computing machine to fruition. Sciencing.com reports: "The invention of the first generation computer, the 'vacuum tube,' in 1904 kicked off a revolution in computers. A vacuum tube is a tube that has had all air and gas removed, making it perfect for controlling electrical circuits." A vacuum tube is a large device with a transparent covering resembling a lightbulb, encasing several pieces of complicated machinery. When the vacuum tube was switched on, electric currents would determine what kind of calculations it performed, at a moderate pace. This enabled humans to use faster machines to solve their problems, but a machine is not the only denotation of the word "computer."

Starting in the early seventeenth century, the term "computer" was coined for people whose job was to solve complex equations for others. Wikipedia defines a human computer as "'one who computes': a person performing mathematical calculations, before electronic computers became commercially available." This was fairly normal; in the movie Hidden Figures, the main character, Katherine Johnson, performs convoluted mathematical feats for the government in order to surpass the Russians in the 1950s Space Race. The mention of the Space Race leads to the conclusion that computers have played a key role in many historical landmarks.

Now that early human computers and abacuses had had their long exposure to the world, the 1800s changed things yet again. Charles Babbage, an English mathematician, configured the parts of a computer that modern computers still use today, according to Easy Science for Kids. This was a big step, as such continuity in computer components would help improve computers for the better in the future. And after the vacuum tube was built, its successors, the transistor, integrated circuit, and microprocessor, made things even easier.

A second-generation computer transistor, according to 101 Computing, is "an electronic component with three pins. Basically, a transistor is a switch (between two of the pins: the collector and the emitter) that is operated by having a small current in the third pin called the base." Well, what does that mean? It means that a transistor is basically an upgraded, souped-up vacuum tube that spawned new capabilities. But both transistors and vacuum tubes still took up space, and size needed to shrink to keep pace with function.
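To make the "switch" idea concrete, here is a minimal toy model in Python. It is only an illustration of the behavior 101 Computing describes, not a real electronics simulation: current may flow between the collector and emitter only while a small current is applied to the base.

```python
# Toy model of a transistor acting as a switch: the collector-emitter
# path conducts only while a small current flows into the base pin.
class Transistor:
    def __init__(self):
        self.base_current = 0.0  # amps flowing into the base pin

    def apply_base_current(self, amps: float) -> None:
        self.base_current = amps

    def conducts(self) -> bool:
        # Simplified rule: any positive base current closes the switch.
        return self.base_current > 0


switch = Transistor()
print(switch.conducts())          # False: no base current, switch is open
switch.apply_base_current(0.001)  # a small current at the base...
print(switch.conducts())          # True: ...closes the collector-emitter switch
```

Billions of such on/off switches, packed together, are what let a computer represent and manipulate binary data.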

This brings history into the third generation of computers: the creation of the integrated circuit, a small black chip capable of compounding electronic equations and recognizing currents. The brainchild of scientists Jack Kilby and Robert Noyce, the integrated circuit, or "IC" as it is widely known, consisted of transistors and processors all converged onto that one tiny chip. Sparkfun.com says that the integrated circuit is a combination of resistors, transistors, and capacitors on the chip. But wait, it gets even smaller!

That's when the fourth-generation microprocessor came into play. The microprocessor, more commonly known as the CPU (central processing unit), can perform these calculations in a tiny amount of time within a tiny amount of space.

And after that? Well, it only skyrockets from there. Following the development of the transistor and microprocessor, a new generation of computers came onto the scene. These new, faster processors were able to compute at paces otherwise unfathomable and pushed "human computers" to the brink of extinction. This era lasted until around 1972, when a new invention changed computers universally.

The internet. It's no secret that almost everyone relies on the internet at one point or another, and it has an impact on the way almost all computers operate today. In 1973, the idea for such an online service was created by Bob Kahn and Vint Cerf. According to History.com, "…scientists Robert Kahn and Vinton Cerf developed Transmission Control Protocol and Internet Protocol, or TCP/IP, a communications model that set standards for how data could be transmitted between multiple networks." This basic idea of a universal connection between all computers, previously a crazy theory, now became a reality. While the World Wide Web, invented by Tim Berners-Lee, gave birth to the grassroots of the web, it was more and more small components that brought the internet to the esteemed position it holds today.
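A small sketch can show what TCP, half of Kahn and Cerf's TCP/IP model, standardized: reliable, ordered delivery of bytes between two endpoints. This is an illustrative example using Python's standard socket module, with a made-up message and a local loopback connection standing in for two machines on a network:

```python
# Minimal TCP exchange using Python's standard library. TCP/IP handles
# packetizing, routing, and reliable in-order delivery underneath.
import socket

HOST = "127.0.0.1"  # loopback: both "machines" live on this computer

# Server side: listen for one connection.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, 0))          # port 0: let the OS pick a free port
port = server.getsockname()[1]
server.listen(1)

# Client side: connect over TCP and send a message.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect((HOST, port))
client.sendall(b"hello, internet")

# Server accepts the connection and echoes the bytes back.
conn, _ = server.accept()
data = conn.recv(1024)
conn.sendall(data)

print(client.recv(1024))  # b'hello, internet'

conn.close()
client.close()
server.close()
```

The point is not the echo itself but the shared standard: because both ends speak TCP/IP, any two networked computers can exchange data this way, which is exactly the universality the essay describes.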

So where next? Now that we've passed the fifth generation of computers, one may think we've reached the pinnacle of technological advancement. But the thing is, there's still a lot farther we can go. Artificial intelligence and robots are only the tip of the iceberg. We've only scratched the surface of computers' capabilities; there's a lot further to go.

