Probably the most sensible thing to do is to start at the beginning.
Many events and occurrences have shaped the IT industry into the creature it is. Its history is filled with myriad names, brands, ideas, and concepts, all of which have played their part in making IT one of the most dynamic industries in the world. The history of computers is an interesting subject; indeed, entire books have been written about it. Although understanding the history of computers may not help you secure a career in IT, it makes for an interesting read and is a useful source of background information. Included in Appendix E are some suggestions for further reading.
To convey the entire history of computing in a few paragraphs is difficult indeed. So, rather than try to cover all of the bases, the following is a summary of some of the more significant happenings that have influenced the evolution of the computer industry over the past 50 or so years.
In the Beginning
The roots of digital computing can be traced back to 1946, when the world's first general-purpose digital computer, ENIAC, was put into service. This event, in digital terms at least, marked the beginning of the information age. In terms of size and complexity, ENIAC was a monster, occupying an entire room and requiring a team of engineers and scientists to operate it. Despite its impressive size and appearance, it had less processing power than today's hand-held electronic organizers.
As well as being regarded as the world's first real digital computer, ENIAC's era helped give the term bug its place in computing lore. In 1947, engineers working on another early machine, the Harvard Mark II, found that a moth had shorted out the system, causing it to stop. The word those early engineers adopted to describe the fault is still used today to refer to a problem with a computer system or program.
Around the same time, another significant event occurred: the invention of the transistor at Bell Labs in 1947. The transistor was the first step toward the development of microprocessors. It was a very significant invention, as today's microprocessors have literally millions of individual transistor switches in them.
The formative years at the end of the 1940s served as the foundation of the computer and IT industry as we now know it. Those early technical pioneers sowed the seeds of discovery that would shape the world. As the saying goes, the rest is history.
The Territory of the Mainframe
Realizing the obvious potential of computers, the inventors of ENIAC continued their work into the 1950s, designing and subsequently manufacturing a more powerful machine called the UNIVAC. The UNIVAC is regarded as the first commercially available digital computer.
It was also during the 1950s that IBM, which had made a name for itself producing calculating machines, made its first foray into digital computing with a system called the Type 701 EDPM. The 701 was the first machine in a series that would propel IBM into the mainframe computer market and make the company one of the most significant and well-known names in the computer industry.
The Decade of Downsizing
The problem was that although they performed tasks useful to many businesses, mainframe computers were incredibly expensive to buy and subsequently run. What was needed was a computer that was smaller and more affordable than a mainframe. The answer was the minicomputer, originally manufactured by the Digital Equipment Corporation. These new minicomputers, though still affordable only to good-sized businesses, were considerably cheaper to buy than mainframes and had lower operating costs.
Before computers could get even smaller, the computer industry needed to develop a way to make the processing elements of the systems smaller and even more powerful. The solution to that problem arrived when Robert Noyce teamed up with fellow engineer Gordon Moore and started a company called Intel.
Intel did not actually start out as a microprocessor company; initially, the company designed and manufactured memory chips. It wasn't until a Japanese calculator company asked Intel to produce a set of chips for a programmable calculator that one of its engineers hit on the idea of an all-purpose processor. By 1971, Intel was selling its first fully functioning microprocessor, the 4004, for the tidy sum of $200.
Though the development of the microprocessor was a major event of the early 1970s, the 1960s also saw many other important developments and achievements for the computer industry. Perhaps one of the most significant was the advent of computer networking. It was in the late 1960s that work began on the ARPAnet, the network that would eventually become today's Internet.
Also significant in the 1960s was the development of the Unix operating system, which went on to become the operating system of choice on the Internet. Unix, though much upgraded and updated since, is still used by many large companies today. The current trend toward Linux also owes much to Unix: Linux is built to conform to POSIX, a set of standards originally derived from the Unix operating system.
Today, technology is everywhere. If the Y2K bug scare did nothing else, it served to remind us of just how reliant we have become on technology and the industry that supports it. Without computers, modern-day life would look very different; indeed, it would take a very creative mind to imagine life without them. The milk carton you used this morning was almost certainly designed on a computer, and this article was written on one. Look around you now and think for a moment about how technology affects you.