Memory is the essential instrument by which human history is created. Memories of the past make future actions possible; past experiences potentially lead to better decisions for the future. Memory is the indispensable thread that ties the past and the possible future to the present moment, where decisions (and history) are made. In considering history, it is not difficult to draw an analogy to the digital world, where memory is the capacity to store and process information, and a user's history can inform how systems respond.
Unlike (political) human history, the history of the digital world appears to be unfolding at lightning pace. Indeed, from the past century to the present day, the development and growth of Information Technology (IT) have been extraordinarily dynamic, and the generation, gathering, and dissemination of information have continued to accelerate. The rise of this information era has also brought with it new anxieties and new hopes for the future, from fears of a robot singularity to hopes for expanded human longevity.
If the developed world is, indeed, at an inflection point in the development of IT, it would be wise to take stock of the past and try to understand this recent history. In doing so, it becomes clear that the meaning, growth, and development of Information Technology correlate with the arrival of new knowledge and innovations. This should hardly be surprising; as Paul (2010) observed, new technologies are the products of past technologies that have undergone innovative modification.
The history of IT development is relevant in myriad ways. Modern technology could hardly exist without past information technologies. Innovators who devise capable information technologies usually refer to past ones to perfect their work. While history is important for understanding the origins of humanity, it is also important for propelling the growth and development of IT. According to the United Nations Public Administration Network (2010), enormous development of information technology took place in the 20th century. Historically, the term Information Technology referred merely to the collection, processing, storage, and dissemination of information rather than to computation and the use of communication technologies built on advanced information systems.
The meaning of information technology has since expanded tremendously with advances in technology. Computer and telecommunication technologies began to spread widely in the 1970s. Even so, modern technologists have consistently recognized the importance of referring to past technologies when innovating new ones, particularly in the communication sector. The growing complexity of information technology has come along with more powerful technologies: the use of optics, advances in microelectronics, and deeper integration of telecommunications.
Moreover, an understanding of older information technologies has facilitated the convergence of telecommunication technologies. Each country has its own history of IT development, and the history of IT advancement can be viewed locally or globally depending on its relevance to the society in question. The computerization of communication systems, the development of networking technology, and the improvement of communication systems each have diverse histories (Mahoney, n.d.). Nonetheless, the general development of information systems has improved year after year, and the transformation of traditional communication methods into modern information processing and dissemination methods has proceeded far more rapidly in the modern world than in earlier times.
The evolution of information technology is incremental, like the development and improvement of other fields (Kaur, n.d.). Some of the best technologies used in the information sector today are good precisely because they rely on past information technologies. Notably, information technology is a subject of continual improvement, if not outright advancement, and a refined information technology is typically built out of an older one. As mentioned earlier, much of present technology originated in the past. The founders of various technologies remain recognized to this day because their original ideas are still used by information technology experts in the contemporary world.
Importantly, doing away with past information technologies is quite unrealistic. As humans strive to learn more about technology, past technologies do not lose their value in the course of IT development. Some IT research extends back to past technologies, investigating them to derive optimal outcomes that are helpful to the current community. Kallinikos (2011) notes that Information and Communication Technology (ICT) is based not only on the present and the future but also on the past. Modern information technologists admit to drawing on past research many times as a way of furthering their innovations: maintaining uniqueness, enhancing efficiency, and testing the workability of a technological idea.
Studying the history of Information Technology (IT) is important for several reasons: it enhances technology optimization and perfection, facilitates a deeper understanding of various technologies, improves the development of information technology and computing, promotes the adoption of technology in organizations, and encourages both reactive and proactive responses to IT changes, all with the aim of improving the contemporary technological world.
The history of information technology plays a vital role in the optimization and perfection of information systems in various settings. Old technologies are quite significant in innovating new ones; innovation builds on the past. For example, every existing technology eventually reveals a weakness, and understanding such weaknesses is rarely simple without reference to the origin or earlier iterations of the technology in question. Paul (2010) notes that the detection of technological flaws in information systems is a way of enhancing their perfection and efficiency.
What's more, every update of a technology includes improvements intended to make the final product better than its predecessor. Generally speaking, redundancy is a sign of inefficiency in technology; the main objective of every technologist is to create a technology that is more productive and efficient than its alternatives (Mahoney, n.d.). In earlier times, people accepted inefficient technologies more readily because they had no alternatives. The level of innovation in the information sector today is relatively high, which explains why optimization is a major factor in the introduction of new technologies.
The quest for perfection in technology is never-ending. In IT, perfection means accuracy, clarity, and timeliness in encoding and decoding information so as to ensure effective communication between senders and receivers. Limits in the underlying physical technology can constrain this perfection: research conducted by Paul (2010) found that the first mobile phones did not deliver information as clearly as modern phones; although a network signal was available, the devices could not receive it well. This example highlights both the weaknesses of old technologies and the importance of perfecting them to improve IT, for technologies are bound to have weaknesses.