Type: Process Essays
Sample donated: June Wong
Last updated: October 30, 2019
The Second Industrial Revolution and International Revolutions

Historical Development of a Digital Computer

The second industrial revolution marked a stage of large-scale change across the globe in terms of industrialization. It began in Western Europe, with modern digital computer technology becoming widely used to drive innovative progress in industry and other sectors. Digital computers replaced the analog computers used in the preceding period, and thus the digital computer marked the beginning of the information age. A digital computer accepts inputs from human operators and provides output results to human consumers. The technology offers a large memory used for data storage, and distinct operations are carried out on the stored data.
The digital computer has a controlling element that changes the order of operations based on information stored in the memory. Peripheral devices fitted to the computer allow information to be entered from external sources and allow operational results to be sent out. The first computing device to be designed was the abacus, followed much later by the Colossus, until the second industrial revolution, when John Vincent Atanasoff invented the digital computers that are presently utilized. This paper will outline the various refinements made to the computer from its creation to the present period. As earlier noted, the first computer, termed the abacus, emerged about five thousand years ago in Asia Minor, and it is still used in that region to the present day. The abacus enabled the user to make various computations using a system of sliding beads arranged on a rack, and the technology was used by early merchants in making trade transactions.
However, some people argue that the abacus is not a computer but rather a calculating tool controlled by humans in a way that provides answers to problems. It spread to Europe, and a long period of about twelve centuries passed before the next advancement in computing was made. Increased government funding for computer development projects hastened the pace of advancement. By 1941, Konrad Zuse had developed a computer known as the Z3 for designing airplanes and missiles (Ifrah, 2001). In 1943, the British built a powerful computer known as the Colossus to decrypt Germany's messages during the war. It was the first electronic programmable computer in the world, created by Tommy Flowers, William Chandler, and Sidney Broadhurst, among other British engineers. The computer began working in December 1943 and was first put into operation at Bletchley Park in early 1944.
The Colossus used vacuum tubes to perform calculations. However, the Colossus had a limited effect on subsequent computer development for two reasons. The first is that the Colossus was not a general-purpose computer, because it was invented solely for interpreting secret messages. The other reason is that the machine was kept secret until long after World War II. Meanwhile, Konrad Zuse, a civil engineer and computer pioneer in Germany, had designed a machine that preceded the Colossus: the Z3, the first program-controlled, Turing-complete computer (Yost, 2005). The Z3 started its operations in May 1941. Zuse also created the S2 computing machine, which became the first process-controlled computer, before producing the Z4, which became the first commercial computer.
While Konrad Zuse was working on the Z4 computer, he realized that programming in machine code was too cumbersome, so he decided to work on designing a high-level programming language. The Colossus was digital and electronic but only partially programmable, and this limitation led to the creation of another computer known as the Electronic Numerical Integrator and Computer (ENIAC). The ENIAC was completed in 1945 by John Mauchly and J. Presper Eckert, together with a team of engineers. The ENIAC was a general-purpose computer that could be reprogrammed to solve a diverse range of computing problems.
It was developed out of the need to calculate artillery firing tables for laboratory research in the United States (Sternberg & Preiss, 2005). It was effective and able to run mathematical sequences, but it was unable to read them from a tape. Many scientists and industrialists were excited by its mathematical power and general-purpose programmability. Another class of computer, the analog computer, was also developed; it was far more advanced than the abacus.
Analog computers became widely used in military applications during the First World War. These machines used continuously variable quantities, such as electrical, mechanical, and hydraulic values, to solve the required problems. Mechanical analog computers, which were made in significant numbers, became vital in World War II, particularly in gunfire control. Electronic analog computers became practical once they were improved with transistors, and thus they became commonly used in science and industry. Analog computers were developed in different varieties, although they were viewed as highly complicated. With time, they became more effective, as accurate values could be obtained by feeding their results into digital computers and applying an iterative process to achieve the desired precision. Owing to this, the analog computer became significant in real-time applications and general computation. During the second industrial revolution, John Vincent Atanasoff designed the first electronic digital computer, beginning in 1937.
This modern form of the computer was developed in the mid-twentieth century, around 1940 to 1945. It became economical and was extensively adopted worldwide. The digital revolution transformed technology that had previously been analog into a digital format. The digital computer became a major landmark of the second industrial revolution because data was easy to access and there was no loss of information. Digital computers combined programmability with automated calculation: the first programmable computer was invented by Konrad Zuse, and the first electronic one by Tommy Flowers (Norman, 2005). The invention of digital computers also brought gains in efficiency, since they used less power and were more convenient. In the 1980s, the digital computer became familiar in developed countries, and many people bought computers for home use.
In addition, many businesses and big commercial enterprises became dependent on digital computers. Computer knowledge became necessary in the same period, and many people who possessed adequate computer skills were readily given jobs (Spector, 2008). In the 1990s, the World Wide Web was released on the internet and entered the mainstream, as many businesses listed their websites in advertisements. By 1999, almost every nation was connected to the internet.
By 2005, the internet had reached billions of people, and television broadcasts began to be transformed from analog to digital signals. By the end of 2010, the spread of the internet was being hastened by mobile devices, which were expected to exceed personal computers in internet connectivity. In conclusion, the first computer was the abacus, followed by the Colossus, until the second industrial revolution, when Atanasoff invented the digital computer that is used up to the present. The abacus emerged in Asia Minor and was later followed by the Colossus, which influenced computer development.
The Colossus was partially programmable and electronic, and together with the Z3 it led to the creation of the ENIAC, which was fully programmable and electronic. The analog computer was also widely used, but it was replaced by the digital computer, which remains in use to the present day owing to its effectiveness and convenience.

References

Cassedy, P. (2004). Computer technology. San Diego, CA: Lucent Books.

Ifrah, G. (2001). The universal history of computing: From the abacus to the quantum computer. New York, NY: John Wiley.

Norman, J. M. (2005). From Gutenberg to the Internet: A sourcebook on the history of information technology. Novato, CA: Historyofscience.com.

Spector, J. M. (2008). Handbook of research on educational communications and technology. New York, NY: Lawrence Erlbaum Associates.

Sternberg, R. J., & Preiss, D. (2005). Intelligence and technology: The impact of tools on the nature and development of human abilities. Mahwah, NJ: Lawrence Erlbaum Associates.

Yost, J. R. (2005). The computer industry. Westport, CT: Greenwood Press.