The Future of Computers

Could you imagine a world without cell phones, calculators, or laptops? We rely on these forms of computers in our daily lives without realizing what life would be like without them. It would take hours to solve complex math problems, and the only way to communicate with others would be by sending a letter, which could take weeks to receive a reply. In addition, we would not have the Internet, as the invention of computers made the Internet possible.

While it may seem hard to imagine, this was how life was before the computer was invented. By definition, a computer is "a device for processing, storing, and displaying information" ("Computer," 2019). This definition is broad, and under it computers range from calculators to televisions. The invention of computers has automated many manual tasks in our daily lives and has freed up time for more important things.

Not only have computers made our daily lives more efficient, but they have also helped our society and world grow and advance. What separates today's world from the world of five hundred years ago is the range of new technologies that make such advances possible. The most recent example is the computer, which has enabled a great amount of progress in fields such as IT, manufacturing, and medicine. Much of this progress is only possible because of computers. For instance, the fields of information technology and cybersecurity would not exist without computers, the Internet, and Wi-Fi.

Another example is the medical field, which has seen many recent advancements. Not only can researchers use computers to find cures faster, but specialized computers such as X-ray and MRI machines also help save lives and aid the recovery of injured patients. While these are just a few examples of how computers act as a catalyst for advances in prominent fields, they show how much computers contribute to the progress of our society and world.

Computers have been around for some time, as the first form of one was invented in 1642 by Blaise Pascal. Pascal, a French scientist, mathematician, and philosopher, saw the need for a quicker way to solve complex math problems. His father, a tax supervisor, had to work through calculations that sometimes took hours, and Pascal wanted to fix this. He therefore created the first automatic calculator, which could perform basic addition and subtraction. This calculator, an early form of the computer, worked by using large wheels linked together with gears. The first wheel represented the ones place and the second the tens. The calculator worked by moving the wheels: when the first wheel completed a full rotation, it advanced the second wheel by one position (Gunter, 2019).
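To make the carry idea concrete, here is a minimal Python sketch (an illustration only, not Pascal's actual mechanism) of linked decimal wheels, where pushing a wheel past 9 nudges the next wheel forward:

```python
# A minimal sketch of linked decimal wheels: each wheel holds one digit,
# and advancing a wheel past 9 "carries" one step into the next wheel.

class WheelCalculator:
    def __init__(self, num_wheels=6):
        # Wheel 0 is the ones place, wheel 1 the tens, and so on.
        self.wheels = [0] * num_wheels

    def advance(self, wheel, steps):
        """Rotate one wheel forward, carrying into the next wheel as needed."""
        total = self.wheels[wheel] + steps
        self.wheels[wheel] = total % 10
        carry = total // 10
        if carry and wheel + 1 < len(self.wheels):
            self.advance(wheel + 1, carry)  # a full rotation nudges the next wheel

    def add(self, number):
        # Feed each digit of the number into its matching wheel.
        for place, digit in enumerate(reversed(str(number))):
            self.advance(place, int(digit))

    def value(self):
        return int("".join(str(d) for d in reversed(self.wheels)))

calc = WheelCalculator()
calc.add(758)
calc.add(486)
print(calc.value())  # 1244
```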

However, progress was slow: the next important contribution to the computer came 159 years later, in 1801, when Joseph Marie Jacquard built an automatic loom. Jacquard, a French weaver, noticed that the weaving process had a major flaw. It consisted of workers moving thread on looms to create complex patterns, and when a worker made a mistake, it sometimes took two to three hours to correct. Jacquard set out to change this, which led to his creation of the Jacquard loom, a machine that automated the weaving process and reduced the room for error.

This loom worked by using long belts of punched cards. These cards controlled and manipulated the needles on the loom. Where there was a hole in a card, a needle passed through; where there was no hole, the needle was blocked and could not pass through. Each punched card had a different pattern of holes, so weavers could change the order of the punched cards to create and design complex patterns (Gunter, 2019).
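As a rough illustration, the following Python sketch (hypothetical, not the loom's real control hardware) shows the core idea: each card is a row of hole/no-hole positions, and reordering the cards changes the woven pattern without changing the machine:

```python
# A minimal sketch of the punched-card idea: 'X' marks a punched hole,
# '.' marks solid card. A hole lets the matching needle lift its thread.

cards = [
    "X.X.X.X.",
    ".X.X.X.X",
    "XX..XX..",
    "..XX..XX",
]

def weave(cards):
    for row, card in enumerate(cards):
        # A hole ('X') means the needle passes through and raises its thread;
        # solid card ('.') blocks the needle, so that thread stays down.
        raised = ["#" if punch == "X" else " " for punch in card]
        print(f"row {row}: {''.join(raised)}")

# Reordering or swapping cards produces a different pattern.
weave(cards)
```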

The next major advance came in 1822, when Charles Babbage, an English mathematician, built a small, hand-cranked computer that could compile and print mathematical tables. He presented his idea to the British government and received 17,000 pounds to build a full-scale version of the machine, which the Royal Navy would use to calculate navigational tables.

However, Babbage decided to build a better machine, called the Analytical Engine. This machine would be powered by steam, would store information on punched cards, and would be able to solve most types of arithmetic problems ("Charles Babbage," 2019). While Babbage was never able to finish building the Analytical Engine, it is considered the grandfather of the modern computer, as it had all the basic components of modern computers.

The computer was revolutionary, as it changed the way people lived their lives. Inventing a new model of the computer was very hard, and only the most inventive and creative people could do it. The first of these was Joseph-Marie Jacquard, who, as explained above, built the first automatic loom. Jacquard was born in 1752 in Lyons, France. Both of his parents worked in the silk industry; however, Jacquard was not interested in following in his parents' footsteps and working in the textile industry.

It was for this reason that he became an apprentice to a bookbinder, a job he found interesting. Jacquard continued as an apprentice until his parents died, leaving him the family home. He also received a small amount of money, but he lost all of it in entrepreneurial ventures that returned little or no profit. Jacquard's career was at a standstill, as he had little money and was still only an apprentice. It was at this point that he went back to where his parents had worked: the textile industry.

Jacquard had little knowledge of factory work, but he quickly realized that the weaving process was slow and that a single mistake took hours to fix. This was when he drafted the idea for the automatic loom described above, which would automate the weaving process. Jacquard finished building his machine in 1801 and, three years later, took it to Paris, where he was awarded a gold medal and a patent for it ("Joseph-Marie Jacquard," 2019).

The next significant contribution to the computer came from Howard Aiken, who was born on March 8, 1900, in New Jersey. Aiken was raised in Indianapolis, Indiana, and after eighth grade he had to find a job, as his family had few resources and little money. He attended high school during the day and worked at night as a switchboard operator for the Indianapolis Light and Heat Company. While working this job, Aiken discovered a love for machines and computers. From 1919 to 1923, he attended the University of Wisconsin and earned a bachelor's degree in science.

Over the next 12 years, he worked as a professor at the University of Miami, and in 1939 he finally received his Ph.D. in physics. As a physicist, Aiken had to spend many hours doing long calculations. This was when he began thinking about improving calculating machines to make such work easier for physicists ("Howard Aiken," 2019).

In 1937, Aiken wrote a 22-page paper in which he proposed the design for a machine that could solve these complex calculations in little time. The machine worked using punched cards and telephone relays. Aiken went to IBM for funding, and IBM's president, Thomas Watson, gave him a proposal: IBM would build the computer and Aiken would supervise it. In addition, the U.S. Navy would provide additional backing; it agreed to back Aiken's machine because it could help aim long-range weapons by calculating their trajectories in minutes. The machine was completed in 1944 and named the Mark 1 Calculator ("Howard Aiken," 2019). Today, the Mark 1 may seem clumsy and slow, but it was among the most powerful computers of its time.

There are many uses for the computer in today's world, including in business, in industry, and for personal use. However, there is also one especially important use: education. Computers are becoming more common in classrooms and learning environments, as they help students connect their knowledge to the real world and make learning fun. In addition, they allow for more personalized and interactive learning.

They are also used when schools receive fewer funds and resources. Numerous schools have increased class sizes and dropped courses to cope with shrinking budgets. To keep the quality of learning the same, many schools have turned to computers for help. For example, the KIPP Empower Academy in Los Angeles had to cut a whole classroom due to a tighter budget, going from five classes of 20 students to four classes of 28 students. The academy's principal, Mike Kerr, was worried about the quality of education his students were receiving, so he turned to computers for help.

The academy installed computers in all the classrooms and introduced a new system for the students to learn: one group of students works on the computers while another works with a teacher, and the two groups later switch. In this way, the students feel like they are in a small class, and the academy did not have to hire additional teachers (Abramson, 2019). This example highlights a use of computers that is becoming more and more popular today.

Currently, scientists are working on a new kind of computer: the nanocomputer. Nanocomputers are part of nanotechnology, an emerging field that is shaping the future of computers. Nanotechnology is the technology of devices that range from one to a few hundred nanometers in size; a nanometer is one-billionth of a meter. Recently, nanotechnology has been making big headlines and gaining popularity. Computer technology has advanced a great deal because engineers keep packing more and more electronic components onto the small silicon chips that power computers.

While this has worked so far, there will come a point when engineers can no longer increase computing power by shrinking electrical components. This is because silicon, the material the chips are made of, has physical limits on the number of components it can hold. In addition, silicon chips become more expensive the more compact they are ("Big Advances in Tiny Computers," 2019).

However, there is a way out of this problem: nanocomputers. Nanocomputers would be built from components at the scale of individual molecules. A nanocomputer could fit billions or even trillions of electrical components into the space that one of today's chips takes up; by comparison, today's chips fit only about ten million components. Nanotechnology is not just hopes and dreams; breakthroughs are happening right now.
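To see why molecular-scale components change the picture so dramatically, here is an illustrative back-of-the-envelope calculation in Python; the chip area and device footprint are assumed round numbers for the sake of the example, not measured figures:

```python
# Illustrative calculation (assumed sizes, not real measurements) of how many
# molecular-scale devices could fit in the area of a conventional chip.

die_side_cm = 1.0                          # assume a 1 cm x 1 cm chip area
die_area_nm2 = (die_side_cm * 1e7) ** 2    # 1 cm = 10^7 nm, so area in nm^2

device_footprint_nm2 = 10 * 10             # assume each device needs ~10 nm x 10 nm

nano_devices = die_area_nm2 / device_footprint_nm2
conventional_devices = 10_000_000          # the "ten million components" figure from the text

print(f"molecular-scale devices per die: {nano_devices:.0e}")   # ~1e12, i.e. about a trillion
print(f"improvement over a 10-million-component chip: {nano_devices / conventional_devices:.0f}x")
```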

Large-scale research into nanotechnology took off in 2000, when U.S. President Bill Clinton launched the National Nanotechnology Initiative, which provided a big increase in funding for nanotechnology research. In 2001, a team of scientists was able to link molecule-sized devices together into a miniaturized circuit, the first major breakthrough in nanocomputers and nanotechnology. New diodes and transistors were created at the molecular level, allowing nanocomputers to become more powerful and compact ("Big Advances in Tiny Computers," 2019).

Also in 2001, a Dutch team of scientists created the first single-electron transistor, a device switched by a single electron and a huge breakthrough in nanotechnology. Another example of a nano-transistor came from Bell Labs, which created a field-effect transistor 50,000 times smaller than the width of a human hair.

Harvard University also developed a nanocomputer that could solve simple addition problems. Nanotechnology has many uses, such as making computers smaller and more portable, and it already appears in products we interact with every day, including sunscreens, water bottles, and gasoline. Nanotechnology is a very popular field that is giving us a glimpse of what the future of computers will look like ("Big Advances in Tiny Computers," 2019).

Computers are a big part of today's world and have a major impact on our society, particularly on employment. While computers greatly increase productivity, they can also increase unemployment, because many factory jobs are replaced by machines that can do the work faster and with fewer errors. IBM's computer-chip manufacturing plant in Fishkill, New York, is a clear example of computers and robots taking over factory jobs. The plant has three hundred robotic tools and more computing power than NASA uses to launch space shuttles.

These computers are extremely effective: the plant produces tens of millions of chips a year, each with circuitry 800 times thinner than a human hair. Other computer-chip manufacturing plants have about 400 employees, while this plant has only 100, and those employees mostly keep their distance from the work, needed only to ensure that the computers and machines keep running. The machines are also very reliable: when a huge snowstorm hit Fishkill one winter and sent all the workers home, the machines kept working and the plant produced its usual number of chips.

Perry Hartswick, a senior program manager at the plant, says, "The productivity increases for IBM are amazing." These productivity improvements are both good and bad for the economy. They are good because they help American businesses stay competitive with other countries and help hold down inflation even as demand increases. However, machines taking over jobs also has negative effects: it raises unemployment and holds back job growth in our economy.

For example, more than 44,000 jobs were recently eliminated as machines took their place. Many Americans argue that a limit should be placed on the number of machines a company can use, so that a company could not replace all of its jobs with machines. While using machines and computers can increase productivity and meet demand, it also raises unemployment and limits job growth in our economy.

As mentioned above, computers and machines greatly increase productivity. In the 1980s, for example, computers accounted for a third of the total productivity growth of the entire economy. Additionally, the world market for computers and information technology grew at twice the rate of gross domestic product between 1987 and 1994. This shows that the demand for computers, and the computer revolution itself, is growing on a global scale.
