A CDE Definition
Following is a brief summary of the generations of computers based on their hardware and software architecture.
First Generation - Vacuum Tubes
In the late 1940s and early 1950s, computers such as the EDSAC and UNIVAC I used vacuum tubes for their digital logic and mercury delay lines for storage. See early memories, EDSAC and UNIVAC I.
Second Generation - Transistors
In the late 1950s, transistors replaced vacuum tubes, and magnetic cores became the standard memory (IBM 1401, Honeywell 800). Size was reduced, and reliability was significantly improved. See IBM 1401 and Honeywell.
Third Generation - Integrated Circuits
In the mid-1960s, computers used the first integrated circuits (IBM 360, CDC 6400) and ran the first operating systems and database management systems. Although most processing was still batch oriented, using punch cards and magnetic tapes, online systems were being developed. This was the era of mainframes and minicomputers, essentially large centralized computers and small departmental computers. See punch card, System/360 and Control Data.
Fourth Generation - Microprocessors
The mid-to-late 1970s spawned the microprocessor and the personal computer, introducing distributed processing and office automation. Word processing, query languages, report writers and spreadsheets put large numbers of people in touch with a computer for the first time. See query language and report writer.
Fifth Generation - The Future
The 21st century ushered in the fifth generation, which increasingly delivers various forms of artificial intelligence (AI). More sophisticated search and natural language recognition are features that users recognize, but software that improves its functionality by learning on its own will change just about everything in the tech world in the future. See AI, machine learning, deep learning, neural network, computer vision, virtual assistant and natural language recognition.