
HR and IT timeline

Early tools, such as macro assemblers and interpreters, are created in the 1950s.
By Mia Andric, Brainstorm special editions editor
Johannesburg, 11 Jun 2007
1940s: First computer users write machine code by hand.

1950s: Early tools, such as macro assemblers and interpreters, are created and widely used to improve productivity and quality; the first generation of optimising compilers appears.

1960s: Second-generation tools, such as optimising compilers and inspections, are used to improve productivity and quality. The concept of software engineering is widely discussed. The first really big (1 000-programmer) projects take place. Commercial mainframes and custom software for big business become widely available. The influential 1968 Nato Conference on Software Engineering is held.

1970s: Collaborative software tools, such as Unix and code repositories, come into use. Minicomputers arrive and small business software begins its rise in this decade.

1980s: Personal computers and personal workstations become common; commensurate rise of consumer software.

1990s: Object-oriented programming and agile processes such as extreme programming gain mainstream acceptance. Computer memory capacity skyrockets and prices drop drastically. These new technologies allow software to grow more complex. The Web and handheld computers make software even more widely available.

2000s: Managed code and interpreted platforms such as Java, .NET, Ruby, Python and PHP make writing software easier than ever before. Offshore outsourcing changes the nature and focus of software engineering careers.

Compiled by Mia Andric. Sources: History of the Internet, Christos JP Moschovitis, Hilary Poole, Tami Schuyler, Theresa M Senft; Hobbes' Internet Timeline, Robert Hobbes Zakon; Computer History; HowStuffWorks; Whatis; Wikipedia; History of Computers; Computer.org; IT Management.
