Hardware and software, the two key components of computing, exist side by side; improvement in one drives improvement in the other. Compared with hardware, improving software is less costly and can be the result of individual effort. There are many different types of software: application software, systems software, application development tools, and combinations of these, such as database management systems (DBMS). Hardware is important, but in a very real sense the history of information technology is the history of software.

Before stored-program digital computers

Origins of computer science

Charles Babbage is credited with inventing the first mechanical computer, the Analytical Engine.
 
Ada Lovelace is credited with writing the first algorithm intended for processing on a computer.

The first piece of software was arguably created by Ada Lovelace (1815-1852) in the 19th century, for the planned Analytical Engine. At Charles Babbage's request, she translated from the French a summary of Babbage's ideas written by Luigi Federico Menabrea, and added a set of "notes" of her own. The notes ended up being three times longer than Menabrea's original. They included a way for the Analytical Engine to calculate Bernoulli numbers, which is regarded as the world's first computer program.[1] However, it was never executed.

The first theory about software - prior to the creation of computers as we know them today - was proposed by Alan Turing in his 1936 essay On Computable Numbers, with an Application to the Entscheidungsproblem (decision problem).[2] The Turing machine, as described in that paper, was a theoretical construct, not a physical device. At its heart is an infinitely long piece of paper tape, comprising an infinite number of boxes, within which mathematical symbols and numbers could be written, read and erased. Any mathematical calculation, no matter how complex, could be performed by a series of simple actions based on those symbols.[3]

This eventually led to the creation of the twin academic fields of computer science and software engineering, which both study software and its creation. Computer science is more theoretical (Turing's essay is an example of computer science), whereas software engineering is focused on more practical concerns. Turing’s “symbols” were in essence computer functions (add, subtract, multiply, etc.), and his concept of any complex operation being able to be reduced to a series of simple sequential operations is the essence of computer programming. [4]
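
To make the tape-and-symbols model concrete, the sketch below simulates a very small Turing machine. It is written in Python purely for illustration (Turing published a mathematical description, not code), and the rule table is a hypothetical example, not taken from Turing's paper: it simply inverts every binary digit on the tape and halts when it reaches a blank.

  # A minimal sketch of a Turing machine (Python used purely for
  # illustration; the rule table below is a hypothetical example).
  # The tape is a dict so it can grow without bound in either direction,
  # approximating the infinitely long tape of the theoretical construct.

  BLANK = ' '

  def run(rules, input_symbols, state='scan', max_steps=10000):
      tape = dict(enumerate(input_symbols))   # position -> symbol
      head = 0
      for _ in range(max_steps):
          if state == 'halt':
              break
          symbol = tape.get(head, BLANK)
          # Each rule says: what to write, which way to move, next state.
          write, move, state = rules[(state, symbol)]
          tape[head] = write
          head += 1 if move == 'R' else -1
      return ''.join(tape[i] for i in sorted(tape)).strip()

  # Hypothetical rule table: invert every bit, halt on the first blank.
  invert_bits = {
      ('scan', '0'): ('1', 'R', 'scan'),
      ('scan', '1'): ('0', 'R', 'scan'),
      ('scan', BLANK): (BLANK, 'R', 'halt'),
  }

  print(run(invert_bits, '10110'))   # prints 01001

The essential point survives even in this toy: the machine itself is fixed and simple, and everything it does is determined by the table of rules - the software.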

However, prior to 1946, software as we now understand it - programs stored in the memory of digital computers - did not yet exist. The very first electronic computing devices were instead rewired in order to "reprogram" them - see History of computing hardware.

Early days of computer software (1945-1979)

The first software was loaded onto computers - sometimes literally - using various relatively laborious mechanisms, including flicking switches (as with the Manchester Baby), and punching holes at predefined positions in cards and loading these punched cards into a computer. With such methods, if a mistake was made, the whole program might have to be loaded again from the beginning.

The first true electronic computer was the ENIAC, which ran its first program in November 1945. Programming was performed by setting switches and knobs, which told different parts of the machine (known as accumulators) which mathematical function to perform. ENIAC operators had to plug accumulators together in the proper order, and preparing a program to run could take a month or more.[5]

Early machines such as ENIAC, UNIVAC, and SAGE all ran software, but there was no software industry as we understand it today. Early commercial machines were programmed mechanically, or by the use of machine language. In the early days there was little understanding of the distinction between hardware and software. That was to change with the development of the first programming languages.

Programming Languages and Operating Systems

The term “software” did not come into use until 1958. It is probable that it was coined by Princeton University professor John W. Tukey in an article in The American Mathematical Monthly in January of that year.[6]

Grace Murray Hopper joined Eckert and Mauchly's fledgling UNIVAC company in 1951 to develop an instruction code for the machine. She devised the term "automatic programming" to describe her work,[7] and became a tireless proselytiser for the concept. Her work led directly to the development of FORTRAN ("FORmula TRANslator"), the world's first true computer language. FORTRAN was developed in the mid-1950s by an IBM development team led by a young researcher named John Backus; the specification was drafted in 1954, and the first FORTRAN compiler was delivered in 1957.

FORTRAN enabled people to program computers using simple English-like instructions and mathematical formulas. It led to a number of other languages, the most successful of which was COBOL (Common Business-Oriented Language), initially developed by the US government's Conference on Data Systems Languages (CODASYL) and strongly promoted by Grace Hopper. COBOL and FORTRAN dominated programming until the late 1970s. Other languages, such as ALGOL (Algorithmic Language), PL/I (Programming Language One), RPG (Report Program Generator) and BASIC (Beginner's All-purpose Symbolic Instruction Code), also became popular, inspired by the success of FORTRAN and COBOL.

These languages became known as 3GLs (third-generation programming languages), so called because they were an evolution from the first and second generations of computer language - machine code and assembly language. Some people wrote parts of applications in those more efficient but much more cryptic languages, but the English-like syntax of the 3GLs made them easier to learn and much more popular.
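
To illustrate the difference, the sketch below is written in modern Python rather than period FORTRAN or assembly language, and the calculation (one root of a quadratic with made-up coefficients) is purely hypothetical. The first version mimics the one-primitive-operation-per-step style that low-level languages forced on programmers; the second expresses the same arithmetic as a single formula, which is essentially what "FORmula TRANslation" offered.

  import math

  # Low-level style: one primitive operation per step, with the
  # programmer keeping track of every intermediate value.
  a, b, c = 2.0, 5.0, 3.0
  t1 = b * b            # b squared
  t2 = 4.0 * a * c      # 4ac
  t3 = t1 - t2          # discriminant
  t4 = math.sqrt(t3)
  t5 = -b + t4
  t6 = 2.0 * a
  root = t5 / t6

  # Third-generation style: the same calculation written as a formula,
  # which a compiler translates into the sequence of steps above.
  root = (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)

  print(root)   # -1.0 for these example coefficients

In a 3GL the compiler, not the programmer, is responsible for turning the formula into the long sequence of simple machine steps.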

Early computers lacked not only programming languages but also operating systems. Every function had to be separately programmed, and in the early days there was no distinction between systems and applications software. Programming languages such as FORTRAN and COBOL greatly improved general programming, but the task of handling machine-specific functions such as the control of peripherals was still left to individual programmers.

Most of the innovative early work on what we now call operating systems was done by individual users.[8] The first true operating system is generally agreed to be MAD (Michigan Algorithmic Decoder), developed at the University of Michigan in 1959. MAD was based on the ALGOL 3GL and was designed to handle the various details of running a computer that were so tedious to code separately.[9] But the concept of the operating system was still largely unknown until the momentous development of IBM’s S/360.

Bundling of software with hardware and its legal issues

Later, software was sold to multiple customers by being bundled with the hardware by original equipment manufacturers (OEMs) such as Data General, Digital Equipment and IBM. When a customer bought a minicomputer, at that time the smallest computer on the market, it did not come with pre-installed software; the software had to be installed by engineers employed by the OEM.[citation needed] Most companies[clarification needed] carried their software on the books at a value of zero dollars, unable to claim it as an asset (this is similar to the financing of popular music in those days).[citation needed]

This bundling attracted the attention of US antitrust regulators, who sued IBM for improper "tying" in 1969, alleging that it was an antitrust violation that customers who wanted its software also had to buy or lease its hardware. Although the case was dropped by the US Justice Department after many years of attrition as "without merit", IBM started selling software separately anyway. This began the age of commercial software.

Very quickly, commercial software started to be pirated, and commercial software producers were very unhappy about this. Bill Gates, co-founder of Microsoft, was an early moraliser against software piracy with his famous Open Letter to Hobbyists in 1976.

Data General also encountered legal problems related to bundling, although in this case it was due to a civil suit from a would-be competitor. When Data General introduced the Data General Nova, a company called Digidyne wanted to use its RDOS operating system on its own hardware clone. Data General refused to license its software (which was hard to do, since it was on the books as a free asset) and claimed its "bundling rights". The US Supreme Court set a precedent in Digidyne v. Data General in 1985 by letting stand a Ninth Circuit ruling that restricting the license to DG hardware only was an illegal tying arrangement, and Data General was eventually forced into licensing the operating system.[10] Unable to sustain the loss from lawyers' fees, Data General ended up being taken over by EMC Corporation.

The rather absurd legal precedent in Digidyne v. Data General regarding bundling has never been applied to Apple, which might never have been as profitable as it is today had it been forced to license its Macintosh operating systems to competitors (although it did voluntarily do so for a while).

Unix (1970s-)

Unix was an early operating system which became popular and very influential, and still exists today. The most popular variant of Unix today is Mac OS X, while Linux is closely related to Unix.

The PC Software Industry

The first spreadsheet program, called VisiCalc, was created by Dan Bricklin and released for the Apple II in November 1979.[11] Bricklin's professors had described to him the large blackboards, divided into rows and columns, that were used for production planning in large companies.[12] With the release of Mitch Kapor's Lotus 1-2-3 for the IBM PC, the spreadsheet became a standard microcomputer application.

The first word processor was IBM’s MT/ST (Magnetic Tape/Selectric Typewriter), released in 1964, which fitted a magnetic tape drive to an IBM Selectric electric typewriter. Word processors evolved from typewriters rather than computers, but with the advent of the microcomputer the two technologies merged.[13]

The first word processing program for microcomputers was Electric Pencil, developed for the MITS Altair by Michael Shrayer in 1976. It was very rudimentary. The first to be commercially successful was WordStar, developed by Seymour Rubinstein and Rob Barnaby and released in 1979.[14] WordStar used a number of cryptic commands, but it had all the power of a dedicated word processor. By the time the IBM PC was released in 1981, PCs and word processing machines had all but converged in technology and appearance. It took some time before word processing software caught up with dedicated machines in functionality, but it won the battle on price-performance immediately. All the dedicated word processing companies were out of business by 1990.

Pre-internet source code sharing

Before the Internet - and indeed in the period after the Internet was created, but before it came into widespread use by the public - computer programming enthusiasts had to find other ways to share their efforts with each other, and with potentially interested computer users who were not themselves programmers. Such sharing techniques included the distribution of tapes, such as the DECUS tapes, and later electronic bulletin board systems. However, a particularly popular and mainstream early technique involved computer magazines.

Source code listings in computer magazines

Tiny BASIC was published as a type-in program in Dr. Dobb's Journal in 1975, and developed collaboratively (in effect, an early example of open source software, although that particular term was not to be coined until two decades later).

It was an inconvenient and slow process to manually type in source code from a computer magazine, and a single mistyped - or worse, misprinted - character could render the program inoperable, yet people still did so (optical character recognition technology to scan in the listings and obviate the need for typing was not yet available at the time).

However, even with the widespread use of cartridges and cassette tapes in the 1980s for distribution of commercial software, free programs (such as simple educational programs for the purpose of teaching programming techniques) were still often printed, because it was cheaper than manufacturing and attaching cassette tapes to each copy of a magazine. Many of today's IT professionals who were children at the time had a lifelong interest in computing in general or programming in particular sparked by such first encounters with source code.

However, eventually a combination of four factors brought this practice of printing complete source code listings of entire programs in computer magazines to an end:

  • programs started to become very large
  • floppy discs started to be used for distributing software, and then came down in price
  • more and more people started to use computers - computing became a mass market phenomenon, and most ordinary people were far less likely to want to spend hours typing in listings than the earlier enthusiasts
  • partly as a consequence of all of the above factors, computer magazines started to attach free cassette tapes, and free floppy discs, with free or trial versions of software on them, to their covers

1980s-present

Just like the auto industry, the software industry has grown from a few visionaries operating (figuratively or literally) out of their garage with prototypes. Steve Jobs and Bill Gates were the Henry Ford and Louis Chevrolet of their times[citation needed], capitalizing on ideas already commonly known before they started in the business. A pivotal moment in computing history was the publication in the 1980s of the specifications for the IBM Personal Computer by IBM employee Philip Don Estridge, which quickly led to the dominance of the PC in the worldwide desktop and later laptop markets - a dominance which continues to this day. Today his move would be seen as a type of crowd-sourcing.

Free and open source software

Recent developments

The software industry has changed forever, and more in the last five years than in the previous twenty. As hardware becomes ubiquitous and commoditised, software will become more pervasive and more important. It is now nearly 70 years since ENIAC’s first switches were flicked and the first program ran. Sixty years from now, today’s software will look as primitive as ENIAC’s does to us.

App stores

Applications for mobile devices (cellphones and tablets) have come to be known as "apps" in recent years. Apple chose to funnel iPhone and iPad app sales through their App Store, and thus both vet apps and take a cut of every paid app sold. Apple do not allow apps which could be used to circumvent their App Store (e.g. virtual machines such as the Java or Flash virtual machines).

The Android platform, by contrast, has multiple app stores available for it, and users can generally select which to use (although Google Play requires a compatible or rooted device).

This move was replicated for desktop operating systems with the Ubuntu One Software Center (for Ubuntu) and the Mac App Store (for Mac OS X). Both of these platforms remain, as they have always been, non-exclusive: they allow applications to be installed from outside the app store, and indeed from other app stores.

The explosive rise in popularity of apps, for the iPhone in particular but also for Android, led to a kind of "gold rush", with some hopeful programmers dedicating a significant amount of time to creating apps in the hope of striking it rich. As in real gold rushes, not all of these hopeful entrepreneurs were successful.

How software has affected hardware

As more and more programs enter the realm of firmware, and the hardware itself becomes smaller, cheaper and faster as predicted by Moore's law, an increasing number of computing functions first carried out by software have joined the ranks of hardware, as for example with graphics processing units. (However, the change has sometimes gone the other way for cost or other reasons, as for example with softmodems and microcode.)

Most hardware companies today have more software programmers on the payroll than hardware designers[citation needed], since software tools have automated many tasks of printed circuit board engineers.

See also

References

  1. ^ Philipson, G. (2013). "A Short History of Software" (ch. 2). Management, Labour Process and Software Development: Reality Bites. p. 13.
  2. ^ Hally, Mike (2005). Electronic Brains: Stories from the Dawn of the Computer Age. London: British Broadcasting Corporation and Granta Books. p. 79. ISBN 1-86207-663-4.
  3. ^ Hodges, A. (1983; Unwin Paperbacks edition 1985). Alan Turing: The Enigma of Intelligence. London: Unwin Paperbacks. p. 100.
  4. ^ Philipson, G. (2013). "A Short History of Software" (ch. 2). Management, Labour Process and Software Development: Reality Bites. p. 13.
  5. ^ McCartney, S. (1999). ENIAC: The Triumphs and Tragedies of the World's First Computer. New York: Walker and Company.
  6. ^ Peterson, I. (2000). "Software's Origin". <www.maa.org/mathland/mathtrek_7_31_00.html>
  7. ^ Campbell-Kelly, M. and Aspray, W. (1996). Computer: A History of the Information Machine. New York: HarperCollins. p. 187.
  8. ^ Ceruzzi, P.E. (1999). A History of Modern Computing. Cambridge, MA: MIT Press. p. 96.
  9. ^ Ceruzzi, P.E. (1999). A History of Modern Computing. Cambridge, MA: MIT Press. p. 98.
  10. ^ "Tying Arrangements and the Computer Industry: Digidyne Corp. vs. Data General". JSTOR 1372482.
  11. ^ Freiberger, P. and Swaine, M. (1984). Fire in the Valley: The Making of the Personal Computer. Berkeley, CA: Osborne/McGraw-Hill. p. 229.
  12. ^ Cringely, R.X. (1992). Accidental Empires. London: Viking. p. 65.
  13. ^ Kunde, B. (1996). "A Brief History of Word Processing (Through 1986)". Online: <www.stanford.edu/~bkunde/fb-press/articles/wdprhist.html>
  14. ^ Kunde, B. (1996). "A Brief History of Word Processing (Through 1986)". Online: <www.stanford.edu/~bkunde/fb-press/articles/wdprhist.html>
