Can you imagine your life without a computer?
Think about all of the things you wouldn’t be able to do: send an email, shop online, or find the answer to a question instantly.
And that’s just the tip of the iceberg. We’ve come a long way from the very first computer, and even the first smartphone. But how much do you really know about their history and evolution? From floppy disks to cloud security, the Acorn to the Macintosh, let’s explore how far we’ve come.
While today we use computers for both work and play, the computer was actually created for an entirely different purpose. In 1880, the population of the United States had grown so large that it took seven years to formulate the results of the U.S. Census.
So, the government looked for a faster way to get the job done, which is why punch-card computers were invented that took up an entire room.
While that’s how the story starts, it’s certainly not where it ends. Let’s explore everything that happened leading up to that, in between, and after.
1801: In France, weaver and merchant Joseph Marie Jacquard creates a loom that uses wooden punch cards to automate the design of woven fabrics. Early computers would use similar punch cards.
1822: Thanks to funding from the English government, mathematician Charles Babbage invents a steam-driven calculating machine that was able to compute tables of numbers.
1890: Inventor Herman Hollerith designs a punch card system to tabulate the 1890 U.S. census. It took him three years to create, and it saved the government $5 million. He would eventually go on to establish a company that would become IBM.
A census clerk tabulates data using the Hollerith machine
1936: Alan Turing develops the idea of a universal machine, later called the Turing machine, capable of computing anything that is computable. The concept of the modern computer is based on his idea.
1937: A professor of physics and mathematics at Iowa State University, J.V. Atanasoff, attempts to build the first computer without cams, belts, gears, or shafts.
1939: Bill Hewlett and David Packard found Hewlett-Packard in a garage in Palo Alto, California. Their first project, the HP 200A Audio Oscillator, would rapidly become a popular piece of test equipment for engineers.
In fact, Walt Disney Pictures would order eight to test recording equipment and speaker systems for 12 specially equipped theaters that showed Fantasia in 1940.
David Packard and Bill Hewlett in 1964
Source: PA Daily Post
Also in 1939, Bell Telephone Laboratories completes The Complex Number Calculator, designed by George Stibitz.
1941: Professor of physics and mathematics at Iowa State University J.V. Atanasoff and graduate student Clifford Berry design a computer that can solve 29 equations simultaneously. This is the first time a computer is able to house data within its own memory.
That same year, German engineer Konrad Zuse creates the Z3 computer, which used 2,300 relays, performed floating-point binary arithmetic, and had a 22-bit word length. This computer was eventually destroyed in a bombing raid in Berlin in 1943.
Additionally in 1941, Alan Turing and Harold Keen built the British Bombe, which decrypted Nazi ENIGMA-based military communications during World War II.
1943: John Mauchly and J. Presper Eckert, professors at the University of Pennsylvania, build the Electronic Numerical Integrator and Computer (ENIAC). Considered the grandfather of digital computers, it was made up of 18,000 vacuum tubes and filled a 20-foot by 40-foot room.
Tip: A vacuum tube was a device that controlled the flow of electric current.
ENIAC technician changing the tube
Source: Science Photo Library
Also in 1943, the U.S. Army asked Bell Laboratories to design a machine to assist in testing their M-9 director, a type of computer that aims large guns at their targets. George Stibitz recommended a delay-based calculator for the project. This resulted in the Relay Interpolator, later known as the Bell Labs Model II.
1944: British engineer Tommy Flowers designs the Colossus, created to break the complex code used by the Nazis in World War II. A total of ten were delivered, each using roughly 2,500 vacuum tubes. These machines reduced the time it took to break the code from weeks to hours, and historians believe they greatly shortened the war by revealing the intentions and plans of the Nazis.
That same year, Harvard physics professor Howard Aiken designed and built the Harvard Mark I, a room-sized, relay-based calculator.
1945: Mathematician John von Neumann writes The First Draft of a Report on the EDVAC. This paper broke down the architecture of a stored-program computer.
1946: Mauchly and Eckert left the University of Pennsylvania and obtained funding from the Census Bureau to build the UNIVAC. This would become the first commercial computer for business and government use.
That same year, Will F. Jenkins publishes the science fiction short story A Logic Named Joe, which detailed a world where computers, called Logics, interconnect into a worldwide network. When a Logic malfunctions, it gives out secret information about forbidden topics.
1947: Walter Brattain, William Shockley, and John Bardeen of Bell Laboratories invent the transistor, discovering a way to make an electric switch out of solid materials rather than vacuum tubes.
1948: Frederick Williams, Geoff Toothill, and Tom Kilburn, researchers at the University of Manchester, develop the Small-Scale Experimental Machine. It was built to test new memory technology, which became the first high-speed electronic random access memory for computers, and it became the first digital, electronic, stored-program computer to successfully run a program.
1950: Built in Washington, DC, the Standards Eastern Automatic Computer (SEAC) was created, becoming the first stored program computer completed in the United States. It was a test-bed for evaluating components and systems, in addition to setting computer standards.
1953: Computer scientist Grace Hopper develops the first computer language compiler, work that eventually led to COBOL, a language that allowed a computer user to give the computer instructions with English-like words instead of numbers. In 1997, a study estimated that over 200 billion lines of COBOL code were still in use.
That same year, businessman Thomas Johnson Watson Jr. creates the IBM 701 EDPM, which is used to help the United Nations keep tabs on Korea during the Korean War.
1954: The FORTRAN programming language is developed by John Backus and a team of programmers at IBM.
Additionally, IBM creates the 650, which was the first mass-produced computer, selling 450 in just one year.
1958: Jack Kilby and Robert Noyce invent the integrated circuit, which is what we now call the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.
1962: IBM announces the 1311 Disk Storage Drive, the first disk drive made with a removable disk pack. Each pack weighed 10 pounds, held six disks, and had a capacity of 2 million characters.
Also in 1962, the Atlas computer makes its debut, thanks to Manchester University, Ferranti Computers, and Plessey. At the time, it was the fastest computer in the world and introduced the idea of “virtual memory”.
1964: Douglas Engelbart introduces a prototype for the modern computer that includes a mouse and a graphical user interface (GUI). This begins the evolution from computers being exclusively for scientists and mathematicians to being accessible to the general public.
Additionally, IBM introduced SABRE, a reservation system developed with American Airlines. The program officially launched four years later, and Sabre would go on to create Travelocity. It used telephone lines to link 2,000 terminals in 65 cities, delivering data on any flight in under three seconds.
1968: Stanley Kubrick’s 2001: A Space Odyssey hits theaters. This cult classic tells the story of the HAL 9000 computer as it malfunctions during a spaceship’s trip to Jupiter to investigate a mysterious signal. The HAL 9000, which controlled the entire ship, went rogue, killed the crew, and had to be shut down by the only surviving crew member. The film’s fictional computer demonstrated voice and visual recognition, human-computer interaction, speech synthesis, and other advanced technologies.
1969: Developers at Bell Labs unveil UNIX, an operating system that addressed compatibility issues within programs; it was later rewritten in the C programming language.
Source: Nokia Bell Labs
1970: Intel introduces the world to the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.
1971: Alan Shugart and a team of IBM engineers invent the floppy disk, allowing data to be shared among computers.
That same year, Xerox introduced the world to the first laser printer, which not only generated billions of dollars but also launched a new era in computer printing.
Also, email begins to grow in popularity as it expands to computer networks.
1973: Robert Metcalfe, research employee at Xerox, develops Ethernet, connecting multiple computers and hardware.
1974: Personal computers are officially on the market! The first of the bunch included the Scelbi, the Mark-8, the Altair 8800, the IBM 5100, and Radio Shack's TRS-80.
1975: In January, Popular Electronics magazine features the Altair 8800 as the world’s first minicomputer kit. Paul Allen and Bill Gates offer to write software for the Altair using the BASIC language. The software was a success, and that same year the pair founded their own software company: Microsoft.
1976: Steve Jobs and Steve Wozniak start Apple Computer and introduce the world to the Apple I, the first computer with a single circuit board.
Also in 1976, Queen Elizabeth II sends out her first email from the Royal Signals and Radar Establishment to demonstrate networking technology.
1977: Jobs and Wozniak unveil the Apple II at the first West Coast Computer Faire. It boasts color graphics and an audio cassette drive for storage. Millions were sold between 1977 and 1993, making it one of the longest-lived lines of personal computers.
1978: The first computers were installed in the White House during the Carter administration. The White House staff was given terminals to access the shared Hewlett-Packard HP3000.
Also, the first computerized spreadsheet program, VisiCalc, is introduced.
Additionally, the LaserDisc is introduced by MCA and Philips. The first title sold in North America was the movie Jaws.
1979: MicroPro International unveils WordStar, a word processing program.
1981: Not to be outdone by Apple, IBM releases its first personal computer, the Acorn, with an Intel chip, two floppy disk drives, and an available color monitor.
Source: Florida History Network
1982: Instead of going with its annual tradition of naming a “Man of the Year”, Time Magazine does something a little different and names the computer its “Machine of the Year”. A senior writer noted in the article, “Computers were once regarded as distant, ominous abstractions, like Big Brother. In 1982, they truly became personalized, brought down to scale, so that people could hold, prod and play with them."
1983: The CD-ROM hits the market, able to hold 550 megabytes of pre-recorded data. That same year, many computer companies worked to set a standard for these disks so they could be used interchangeably to access a wide variety of information.
Later that year, Microsoft introduced Word, which was originally called Multi-Tool Word.
1984: Apple launches the Macintosh, introduced during a Super Bowl XVIII commercial. The Macintosh was the first successful mouse-driven computer with a graphical user interface. It sold for $2,500.
1985: Microsoft announces Windows, which allowed for multi-tasking with a graphical user interface.
That same year, a small Massachusetts computer manufacturer registered the first dot com domain name, Symbolics.com.
Also, the programming language C++ is published and is said to make programming “more enjoyable” for the serious programmer.
1986: Originally Lucasfilm’s computer graphics division, Pixar worked to create computer-animated portions of popular films, like Star Trek II: The Wrath of Khan. Steve Jobs purchased the division in 1986 for $10 million and renamed it Pixar. It was bought by Disney in 2006.
1990: English programmer and physicist Tim Berners-Lee develops HyperText Markup Language, also known as HTML. He also built the first browser, called WorldWideWeb; his system featured a server, HTML documents, and URLs.
1991: Apple releases the PowerBook series of laptops, which included a built-in trackball, internal floppy disk drive, and palm rests. The line was discontinued in 2006.
1993: In an attempt to enter the handheld computer market, Apple releases the Newton. Billed as a “personal digital assistant”, it never performed the way Apple CEO John Sculley had hoped, and it was discontinued in 1998.
Source: The Register
Also that year, Steven Spielberg’s Jurassic Park hits theaters, showcasing cutting-edge computer animation, in addition to animatronics and puppetry.
1995: IBM releases the ThinkPad 701C, officially known as the Track Write, with an expanding full-sized keyboard composed of three interlocking pieces.
Additionally, the format for the Digital Video Disc (DVD) is introduced, featuring a huge increase in storage space over the compact disc (CD).
Also that year, Microsoft’s Windows 95 operating system was launched. To spread the word, a $300 million promotional campaign was rolled out, featuring TV commercials that used “Start Me Up” by the Rolling Stones and a 30-minute video starring Matthew Perry and Jennifer Aniston. It was installed on more computers than any other operating system.
Source: Tech Digest TV
1996: Sergey Brin and Larry Page develop Google at Stanford University.
That same year, Palm Inc., founded by Ed Colligan, Donna Dubinsky, and Jeff Hawkins, creates the personal digital assistant called the PalmPilot.
Also in 1996 came the introduction of the Sony VAIO series. This desktop computer featured a 3D interface on top of the Windows 95 operating system as a way to attract new users. The line was discontinued in 2014.
1997: Microsoft invests $150 million in Apple, ending Apple’s court case against Microsoft, in which Apple alleged that Microsoft copied the “look and feel” of its operating system.
1998: Apple releases the iMac, a range of all-in-one Macintosh desktop computers. Selling for $1,300, these computers included a 4GB hard drive, 32MB of RAM, a CD-ROM drive, and a 15-inch monitor.
Source: Start Ups Venture Capital
1999: The term Wi-Fi becomes part of the computing language as users begin connecting without wires. Without missing a beat, Apple creates its “AirPort” Wi-Fi router and builds connectivity into Macs.
2000: In Japan, carrier J-Phone (later part of SoftBank) introduces the first camera phone, the Sharp J-SH04. The camera had a maximum resolution of 0.11 megapixels, a 256-color display, and photos could be shared wirelessly. It was such a hit that a flip-phone version was released just a month later.
Also in 2000, the USB flash drive is introduced. Used for data storage, they were faster and had a greater amount of storage space than other storage media options. Plus, they couldn’t be scratched like CDs.
2001: Apple introduces the Mac OS X operating system. Not to be outdone, Microsoft unveiled Windows XP soon after.
Also, the first Apple Stores open in Tysons Corner, Virginia, and Glendale, California. Apple also releases iTunes, which allowed users to import music from CDs, organize it in the program, and mix it with other songs to burn a custom CD.
2003: Apple releases the iTunes Music Store, giving users the ability to purchase songs within the program. In less than a week after its debut, over 1 million songs were downloaded.
Also in 2003, the Blu-ray optical disc is released as the successor of the DVD.
And, who can forget the popular social networking site MySpace, founded in 2003? By 2005, it had more than 100 million users.
2004: The first challenger of Microsoft’s Internet Explorer came in the form of Mozilla’s Firefox 1.0. That same year, Facebook launched as a social networking site.
Source: Business Insider
2005: YouTube, the popular video-sharing service, is founded by Jawed Karim, Steve Chen, and Chad Hurley. Later that year, Google acquired the mobile phone operating system Android.
2006: Apple unveiled the MacBook Pro, making it their first Intel-based, dual-core mobile computer.
That same year at the World Economic Forum in Davos, Switzerland, the United Nations Development Program announced it was creating a program to deliver technology and resources to schools in developing countries. The project became the One Laptop per Child Consortium, founded by Nicholas Negroponte, the founder of MIT’s Media Lab. By 2011, over 2.4 million laptops had been shipped.
And, we can’t forget to mention the launch of Amazon Web Services, including Amazon Elastic Compute Cloud (EC2) and Amazon Simple Storage Service (S3). EC2 made it possible for users to use the cloud to scale server capacity quickly and efficiently. S3 was a cloud-based file hosting service that charged users monthly for the amount of data they stored.
2007: Apple released the first iPhone, bringing many computer functions to the palm of our hands. It featured a combination of a web browser, a music player, and a cell phone -- all in one. Users could also download additional functionality in the form of “apps”. The full-touchscreen smartphone allowed for GPS navigation, texting, a built-in calendar, a high-definition camera, and weather reports.
Also in 2007, Amazon released the Kindle, one of the first electronic reading systems to gain a large following among consumers.
And, Dropbox was founded by Arash Ferdowsi and Drew Houston as a way for users to have convenient storage and access to their files on a cloud-based service.
2008: Apple releases the MacBook Air, its first ultraportable: a thin, lightweight laptop with a high-capacity battery. To slim it down, Apple replaced the traditional hard drive with a solid-state drive, making it the first mass-marketed computer to do so.
2009: Microsoft launched Windows 7.
2010: Apple releases the iPad, officially breaking into the dormant tablet computer category. The new gadget offered many of the iPhone’s features, plus a 9.7-inch screen, minus the phone.
2011: Google releases the Chromebook, a laptop that runs on Google Chrome OS.
Also in 2011, the Nest Learning Thermostat emerges as one of the first Internet of Things devices, allowing remote access to a user’s home thermostat via smartphone or tablet. It also sent monthly power consumption reports to help customers save on energy bills.
Source: Innovation Fan Girl
In Apple news, co-founder Steve Jobs passed away on October 5. The brand also announced that the iPhone 4S would feature Siri, a voice-activated personal assistant.
2012: On October 4, Facebook hits 1 billion users. Earlier that year, it acquired the image-sharing social networking application Instagram.
Also in 2012, the Raspberry Pi, a credit-card-sized single-board computer, is released, weighing only 45 grams.
2014: The University of Michigan Micro Mote (M3), the smallest computer in the world, is created. Three types were made available: two that measured temperature or pressure, and one that could take images.
Additionally, the Apple Pay mobile payment system is introduced.
2015: Apple releases the Apple Watch, which incorporated Apple’s iOS operating system and sensors for environmental and health monitoring. Almost a million units were sold on the day of its release.
This release was followed closely by Microsoft announcing Windows 10.
2016: The first reprogrammable quantum computer is created.
2019: Apple announces iPadOS, the iPad's very own operating system, to better support the device as it becomes more like a computer and less like a mobile device.
No one can say for certain what awaits us in the world of computers. One thing is for sure: given the pace of tech, the growing need for cyber security, and our constant hunger for the next big thing, computers aren’t going anywhere.
If anything, they’re only going to become a bigger part of our daily lives.
Still curious about all things computers? Expand your knowledge even further with this roundup of cyber security terms! Or, go in a related direction and read all about the history of artificial intelligence.
Mara is a Senior Content Marketing Specialist at G2. In her spare time, she's typically at the gym polishing off a run, reading a book from her overcrowded bookshelf, or right in the middle of a Netflix binge. Obsessions include the Chicago Cubs, Harry Potter, and all of the Italian food imaginable. (she/her/hers)