The Creation of the UNIX* Operating System

After three decades of use, the UNIX* computer operating system from Bell Labs is still regarded as one of the most powerful, versatile, and flexible operating systems (OS) in the computer world. Its popularity is due to many factors, including its ability to run on a wide variety of machines, from micros to supercomputers, and its portability -- all of which led to its adoption by many manufacturers.

Like another legendary creature whose name also ends in 'x,' UNIX rose from the ashes of a multi-organizational effort in the early 1960s to develop a dependable timesharing operating system.

The joint effort was not successful, but a few survivors from Bell Labs tried again, and what followed was a system that offers its users a work environment that has been described as "of unusual simplicity, power, and elegance...."

The system also fostered a distinctive approach to software design -- solving a problem by interconnecting simpler tools, rather than creating large monolithic application programs.

Its development and evolution led to a new philosophy of computing, and it has been a never-ending source of both challenges and joy to programmers around the world.

Before Multics there was chaos, and afterwards, too

Computer systems didn't talk to each other in the early days of computing. Even the various computer lines made by the same company often needed interpreters. And forget any interoperability of systems by different vendors!

In addition, operating systems very often performed only limited tasks, and only on the machines for which they were written. If a business upgraded to a bigger, more powerful computer, the old operating system probably wouldn't work on the new computer, and often the company's data had to be entered -- again -- into the new machine.

In 1965, seeking to develop a convenient, interactive, usable computer system that could support many users, a group of computer scientists from Bell Labs and GE joined an effort underway at MIT on what was called the Multics (Multiplexed Information and Computing Service) mainframe timesharing system.

Over time, hope was replaced by frustration as the group effort initially failed to produce an economically useful system. Bell Labs withdrew from the effort in 1969, but a small band of users at the Bell Labs Computing Science Research Center in Murray Hill -- Ken Thompson, Dennis Ritchie, Doug McIlroy, and J. F. Ossanna -- continued to seek the Holy Grail.

From Multics to something else

Actually, Multics worked, and eventually became a product, but not initially on the scale its developers wanted. "Even though Multics could not then support many users, it could support us, albeit at exorbitant cost," Ritchie explained. "We didn't want to lose the pleasant niche we occupied. What we wanted to preserve was not just a good environment in which to do programming, but a system around which a fellowship could form."1

"During 1969, we began trying to find an alternative to Multics," Ritchie said. "Throughout 1969, we lobbied for a medium-scale machine for which we promised to write an operating system. Our proposals were never clearly and finally turned down, but they were never accepted, either.

"Eventually, we presented an exquisitely complicated proposal involving outright purchase, third-party lease, and equipment trade-in, all designed to minimize financial outlay," Ritchie said. But the proposal was rejected. "Rumor soon had it that Bill Baker, the vice president of Research, exclaimed 'Bell Labs just doesn't do business this way!'"

Ritchie conceded, "Actually, it is perfectly obvious in retrospect -- and it should have been at the time -- that we were asking the Labs to spend too much money on too few people with too vague a plan." He also noted that buying a new machine might lead to another expensive Multics project, which management wanted to avoid, or developing another computer center, a responsibility which Research wanted to avoid.

In the beginning: botched acronyms

The origins of UNIX can be traced back, somewhat fuzzily, to the early spring of 1969 during an informal discussion of just what the researchers wanted a computer operating system to do.

Thompson, once it was obvious that Multics was going away, decided to satisfy two urges: to write an operating system of his own, and to create an environment in which to do future work. "Dennis, (Rudd) Canaday and myself were just discussing these ideas of the general nature, of keeping the files out of each other's hair, and the nitty-gritty of expanding, of the real implementation: where you put the block address ...", Thompson explained.2

At the end of the discussion, Canaday picked up the phone, dialed into a Bell Labs dictation service, and read in his notes. "The next day these notes came back," Thompson said, "and all the acronyms were butchered, like 'inode' and 'eyen.'"

Butchered or not, the notes became the basis for UNIX. Each researcher received a copy of the notes, "...and they became the working document for the file system," Thompson said.

The famous PDP-7 comes to the rescue

While mulling over the problems of operating systems in 1969, Thompson in his spare time developed a computer game called "Space Travel." The game simulated the motion of the planets in the solar system. A player could cruise between the planets, enjoy the scenery, and even land the ship on the planets and moons.

The game, first written on Multics and then transliterated into Fortran for the GECOS operating system, ran on a GE 635 computer. The game's display was jerky and hard to control because the player had to type commands to control the ship. Also, it cost about $75 in CPU time on the big GE 635, a cost that hardly endeared it to management.

"It did not take long, therefore, for Thompson to find a little-used PDP-7 computer with an excellent display terminal," Ritchie explained. "He and I rewrote 'Space Travel' to run on this machine." Their effort included a floating-point arithmetic package, the pointwise specification of the graphics characters for the display, and a debugging subsystem that continuously displayed the contents of typed-in locations in the corner of the screen.

"All this was written in assembly language for a cross-assembler that ran under GECOS and produced paper tapes to be carried to the PDP-7," Ritchie said. "'Space Travel,' though it made a very attractive game, served mainly as an introduction to the clumsy technology of preparing programs for the PDP-7."

"It was the natural candidate as the place to put the file system," Thompson said. "When we hacked out this design, this rough design of the file system on the dictation machine that day in Canaday's office, I went off and implemented it on the PDP-7."

The UNIX system begins to take shape

What Thompson had on the PDP-7 was a system, but not really an operating system.

So during the summer of 1969, Thompson began implementing the paper file system, which Ritchie liked to refer to as the "chalk file system," since it emerged from countless discussions at chalkboards in the Computing Science Research Center.

"I allocated a week each to the operating system, the shell, the editor, and the assembler to reproduce itself...", Thompson explained.

He first worked out the requirements for an operating system, in particular the notion of processes. Then he developed a small set of user-level utilities: the means to copy, print, delete and edit files. And he developed a command interpreter, or shell.

It looked like an operating system, almost

The initial implementation "was totally rewritten in a form that looked like an operating system," Thompson said, "with tools that were sort of known, you know, assembler, editor, shell." The system, he said, "...if not maintaining itself, was right on the verge of maintaining itself, totally severing the GECOS connection."

He was referring to the fact that up to this point, all the programs were written using GECOS and transferred to the PDP-7 by paper tape. But once an assembler was completed, the system was able to support itself.

As Ritchie summed up the effort, "Although it was not until well into 1970 that Brian Kernighan suggested the name 'UNIX,' in a somewhat treacherous pun on 'Multics,' the operating system we know today was born."

Porting UNIX for its first commercial application

It soon became obvious that the PDP-7 machine, which the UNIX group didn't own, was becoming obsolete. In 1970, they proposed buying a PDP-11 for about $65,000. Two research department heads, Doug McIlroy and Lee McMahon, realized the benefits of the new operating system and supported the proposal. The PDP-11 arrived at the end of the summer, but the machine was so new that no disk was available for it, so the effort to port UNIX didn't begin until December.

[ Dennis Ritchie and Ken Thompson at the PDP-11 ]

Dennis Ritchie (standing) and Ken Thompson begin porting UNIX to the PDP-11 via two Teletype 33 terminals.

"During the protracted arrival of the hardware," Ritchie said, "the increasing usefulness of the PDP-7 UNIX made it appropriate to justify creating PDP-11 UNIX as a development tool, to be used in writing a more special-purpose system, text processing."

The first potential customer was the Bell Labs Patent Department, which was evaluating a commercial system to prepare patent applications. In developing UNIX to support text processing, the Computing Science Research Center supported three Patent Department typists who spent the day busily typing, editing, and formatting patent applications.

Ritchie said, "The experiment was trying, but successful. Not only did the Patent Department adopt UNIX, and thus become the first of many groups at the Laboratories to ratify our work, but we acquired sufficient credibility to convince our own management to acquire one of the first PDP 11/45 systems made.

"The rest," Ritchie said, "is history."

From B language to NB to C

The first version of UNIX was written in assembly language, but Thompson's intention was that it would be written in a high-level language.

Thompson first tried in 1971 to use Fortran on the PDP-7, but gave up after the first day. Then he wrote a very simple language he called B, which he got going on the PDP-7. It worked, but there were problems. First, because the implementation was interpreted, it was always going to be slow. Second, the basic notions of B, which was based on the word-oriented BCPL, just were not right for a byte-oriented machine like the new PDP-11.

Ritchie used the PDP-11 to add types to B, which for a while was called NB for "New B," and then he started to write a compiler for it. "So that the first phase of C was really these two phases in short succession of, first, some language changes from B, really, adding the type structure without too much change in the syntax; and doing the compiler," Ritchie said.

"The second phase was slower," he said of rewriting UNIX in C. Thompson started in the summer of 1972 but had two problems: figuring out how to run the basic co-routines, that is, how to switch control from one process to another; and the difficulty in getting the proper data structure, since the original version of C did not have structures.

"The combination of the things caused Ken to give up over the summer," Ritchie said. "Over the year, I added structures and probably made the compiler code somewhat better -- better code -- and so over the next summer, that was when we made the concerted effort and actually did redo the whole operating system in C."
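The gap Ritchie filled can be illustrated with a short modern-C sketch. The struct below is hypothetical -- its fields are loosely modeled on a UNIX inode, not copied from any historical kernel -- but it shows the kind of record that was awkward to express in C before structures were added:

```c
#include <string.h>

/* Hypothetical sketch: a simplified per-file record in the style of a
   UNIX inode. Without structures, these related fields would have had
   to be kept in separate parallel variables or raw word arrays. */
struct inode {
    unsigned short mode;     /* file type and permission bits */
    unsigned short nlink;    /* number of directory entries pointing here */
    unsigned long  size;     /* file size in bytes */
    unsigned long  addr[8];  /* block addresses of the file's data */
};

/* With structures, a whole record can be initialized, copied, and
   passed around as a single unit. */
struct inode make_empty_inode(unsigned short mode) {
    struct inode ip;
    memset(&ip, 0, sizeof ip);  /* zero every field, including addr[] */
    ip.mode = mode;
    ip.nlink = 1;               /* a new file starts with one link */
    return ip;
}
```

Being able to describe kernel tables this way is part of what made rewriting the operating system in C tractable once structures existed.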

Connecting streams like a garden hose

Another innovation of UNIX was the development of pipes, which gave programmers the ability to string together a number of processes for a specific output.

Doug McIlroy, then a department head in the Computing Science Research Center, is credited with the concept of pipes at Bell Labs, and Thompson gets the credit for actually doing it.

McIlroy had been working on macros in the late 1950s, and was always theorizing to anyone who would listen about linking macros together to eliminate the need to make a series of discrete commands to obtain an end result.

"If you think about macros," McIlroy explained, "they mainly involve switching data streams. I mean, you're taking input and you suddenly come to a macro call, and that says, 'Stop taking input from here and go take it from there.'

"Somewhere at that time I talked of a macro as a 'switchyard for data streams,' and there's a paper hanging in Brian Kernighan's office, which he dredged up from somewhere, where I talked about screwing together streams like a garden hose. So this idea had been banging around in my head for a long time."

Back to the chalkboards to work out the syntax

While Thompson and Ritchie were at the chalkboard sketching out a file system, McIlroy was at his own chalkboard trying to sketch out how to connect processes together and to work out a prefix notation language to do it.

It wasn't easy. "It's very easy to say 'cat into grep into...,' or 'who into cat into grep,'" McIlroy explained. "But there are all these side parameters that these commands have; they don't just have input and output arguments, but they have all these options."

"Syntactically, it was not clear how to stick the options into this chain of things written in prefix notation, cat of grep of who [i.e. cat(grep(who))]," he said. "Syntactic blinders: I didn't see how to do it."

"I'm going to do it," and so he did

Although stymied, McIlroy didn't drop the idea. "And over a period from 1970 to 1972, I'd from time to time say, 'How about making something like this?', and I'd put up another proposal, another proposal, another proposal. And one day I came up with a syntax for the shell that went along with the piping, and Ken said, 'I'm going to do it!'"

"He was tired of hearing this stuff," McIlroy explained. "He didn't do exactly what I had proposed for the pipe system call. He invented a slightly better one that finally got changed once more to what we have today. He did use my clumsy syntax."

"Thompson saw that file arguments weren't going to fit with this scheme of things and he went in and changed all those programs in the same night. I don't know how...and the next morning we had this orgy of one-liners."

"He put pipes into UNIX, he put this notation into shell, all in one night," McIlroy said in wonder.
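A rough idea of what the pipe system call provides can be sketched in modern C. This is not Thompson's implementation, just a minimal illustration of the plumbing -- a pipe, a fork, and dup2 -- by which a shell makes one process's standard output feed another's standard input:

```c
#include <string.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

/* Send "hello\n" through a pipe from a child process and read it back
   in the parent -- the same plumbing a shell sets up between the two
   sides of "who | grep". Returns the number of bytes read into buf. */
ssize_t pipe_roundtrip(char *buf, size_t len) {
    int fd[2];
    if (pipe(fd) < 0)               /* fd[0] = read end, fd[1] = write end */
        return -1;
    if (fork() == 0) {
        /* child: plays the upstream command */
        close(fd[0]);
        dup2(fd[1], STDOUT_FILENO); /* child's stdout now feeds the pipe */
        close(fd[1]);
        write(STDOUT_FILENO, "hello\n", 6);
        _exit(0);
    }
    /* parent: plays the downstream command, keeping only the read end */
    close(fd[1]);
    ssize_t n = read(fd[0], buf, len);
    close(fd[0]);
    wait(NULL);                     /* reap the child */
    return n;
}
```

In a real shell, each side would also exec a program after the dup2, so the commands themselves never know a pipe is involved.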

Creating a programming philosophy from pipes and a tool box

As technically neat as the accomplishment was, when Thompson created pipes, he also put something else into UNIX -- a philosophy.

As McIlroy described it, "the philosophy that everyone started to put forth was 'Write programs that do one thing and do it well. Write programs to work together. Write programs that handle text streams, because that is a universal interface.'"

"All of these ideas, which add up to the tool approach, might have been there in an unformed way prior to pipes, but they really came in afterwards," he said.

Kernighan agreed. He noted that while input/output redirection predates pipes, the development of pipes led to the concept of tools -- software programs that would be in a "tool box," available when you need them.

He noted that working without pipes was somewhat analogous to working with Roman numerals instead of Arabic numerals. "It's not that you can't do arithmetic," he said, "but it is a bear."

"I remember the preposterous syntax, the ">,>" or whatever the syntax that everyone came up with, and then all of a sudden there was the vertical bar ( | ) and just everything clicked at that point," he said. The bar was the syntax that made pipes work: who | cat | grep.

"And that's, I think, when we started to think consciously about tools, because then you could compose things together....compose them at the keyboard and get 'em right every time."

'What you think is going on, is going on'

Novices to the system could experiment, linking different commands together for what they thought should be the output. And very often their "pipes" worked the first time!

[ Joe Condon and Ken Thompson with computer and chess board ]

Joe Condon, left, and Ken Thompson work on a chess playing program they developed, called 'Belle.'

When Joe Condon, the owner of the PDP-7 that Thompson first used for UNIX, started using UNIX himself, he asked a co-worker how to do a certain function. "'What do you think is the reasonable thing to do, Joe?'" he was asked in return.

"That was a very interesting clue to the philosophy of how UNIX worked," Condon later said. "The system worked in a way which is easy to understand. It wasn't a complex function, hidden in a bunch of rules and verbiage and what not."

"Cognitive engineering" is what Condon called it, "...that the black box should be simple enough such that when you form the model of what's going on in the black box, that's in fact what is going on in the black box."

The manual even warned of bugs

Manuals are often an afterthought, something cobbled together after the product is made. The UNIX manual, in contrast, reflected the philosophy of UNIX in design and content. It even told you where the bugs were.

[ Sandy Fraser with TIU board ]

Sandy Fraser.

The manual style initially was set by Ritchie, but McIlroy soon took over its compilation as a labor of love. "The fact that there was a manual, that he (McIlroy) insisted on a high standard for the manual, meant that he insisted on a high standard for every one of the programs that was documented," explained Sandy Fraser.

Fraser, then a member of technical staff in the Computing Science Research Center, said that before a program got into the manual, it often had to be rewritten to meet the manual's standards. "And then add to all that, it's probably the first manual that ever had a section with bugs in it. That's a level of honesty you don't find."

"Cleaning something up so you can talk about it is quite typical of UNIX," McIlroy said. "Every time another edition of the manual would be made, there would be a flurry of activity. When you wrote down the 'uglies,' you'd say, 'We can't put this in print,' and you'd take out a feature or put features in to make them easier to talk about."

Fraser summed up the approach of the UNIX developers: "I think the level of intellectual honesty that was present in that activity is rare."

The manual warned that the checkdoc troff document checker failed to expand external sourcing of documents.

Sharing UNIX with the rest of the world

In 1976-77, Ken Thompson took a six-month sabbatical from Bell Labs to teach as a visiting professor in the Computer Science Department at the University of California-Berkeley (UCB). What he taught, of course, was the UNIX system. While there, he also developed much of what eventually became Version 7.

The system was an instant hit, and the word spread quickly throughout the academic community.

When Thompson returned to Bell Labs, students and professors at Berkeley continued to enhance UNIX. Eventually, many of these enhancements were incorporated into what became known as Berkeley Software Distribution (BSD) Version 4.2, which many other universities also bought.

From about 1975, UNIX had been distributed via academic licenses, which were relatively inexpensive, as well as government and commercial licenses. UCB became important in spreading the word about UNIX when it established a Computer Systems Research Group (CSRG), originally under the direction of Robert Fabry. The CSRG obtained a grant from DARPA to support a version of UNIX for DARPA contractors, which were mostly academic and military organizations, along with some commercial firms. Ritchie recalled, "The contractors got the UNIX licenses from Bell Labs, but they got the BSD software from Berkeley."

The CSRG did much of the real work in making the TCP/IP protocols, which are the foundations of the Internet, accessible with their BSD distributions. The expansion of UNIX into academic environments also was aided by the fact that the Digital VAX machine was at a price that academic departments could afford.

In addition, UNIX played a key role in the early days of the Internet, since most of the VAX computers supporting it ran UNIX.

Business gets the word

As UNIX spread throughout the academic world, businesses eventually became aware of UNIX from their newly hired programmers who had used it in college.

Soon a new business opportunity developed -- writing programs to run on UNIX for commercial use. What made UNIX popular for business applications was its timesharing, multitasking capability, permitting many people to use the mini- or mainframe; its portability across different vendors' machines; and its e-mail capability.

In 1984, AT&T divested itself of its local Bell telephone companies, and also created an independent subsidiary, AT&T Computer Systems. The creation of the subsidiary enabled the communications giant to enter the computer business. The new subsidiary marketed a number of computer products, including the UNIX operating system. Its software flagship was UNIX System V, which ran on AT&T's 3B series of computers.

As the versions of UNIX grew in number, the UNIX System Group (USG), which had been formed in the 1970s as a support organization for the internal Bell System use of UNIX, was reorganized as the UNIX Software Operation (USO) in 1989. Two of the earliest applications of UNIX for telecommunications operations support systems were the Centralized Automatic Reporting on Trunks (CAROT) system and the Loop Management Operations System (LMOS), two Bell Labs efforts that monitored the health of transmission facilities in the Bell System. Other systems to quickly follow were the SARTS (Switched Access Remote Test System) and CMS (Circuit Maintenance System). In Bell Labs, the UNIX System Laboratories (USL) also was organized in 1989.

The USO made several UNIX distributions of its own to academia and to some commercial and government users, stemming from the development of the Programmer's Workbench (PWB) system. In 1990, the USO was merged with UNIX System Laboratories, and USL became an AT&T subsidiary.

A number of other computer manufacturing companies also sold UNIX computers. For example, both Sun Microsystems and SGI developed UNIX workstations, and Hewlett-Packard, NCR and IBM also sold UNIX computers.

At Berkeley, as elsewhere, computer scientists started "improving" UNIX, adding new features and applications, revising code, trying to push this versatile operating system to its limits. Over time, a number of versions of UNIX were floating around, including Bell Labs official versions.

Early versions of the UNIX* system

Version              Year released   Applications
Version 6            1975            Universities
Version 7            1978            Universities and commercial; the basis for System V
System III           1981            Commercial
System V, Release 1  1983            Commercial
System V, Release 2  1984            Commercial; enhancements and performance improvements
Version 8            1985            Universities
Version 9            1986            Universities
Version 10           1989            Universities

In addition, there were several university versions, the most important being the Berkeley BSD distributions.

The UNIX wars

AT&T entered into an alliance with Sun Microsystems to bring the best features from the many versions of UNIX into a single unified system. While many applauded this decision, one group of UNIX licensees expressed the fear that Sun would have a commercial advantage over the rest of the licensees.

The concerned group in 1988 formed a special interest group, the Open Software Foundation (OSF), to lobby for an "open" UNIX within the UNIX community. Soon several large companies -- who at the time were promoting their own proprietary operating systems in competition with UNIX -- also joined the OSF.

In response, AT&T and a second group of licensees formed their own group, UNIX International. The technical issues soon took a back seat to what can be charitably described as competitive maneuverings, and the trade press dubbed the ensuing controversy the "UNIX wars."

When efforts failed to bring the two groups together, each group brought out its own version of an "open" UNIX. Media wags soon noted the dispute could be viewed two ways: positively, since the number of UNIX versions was now reduced to two; or negatively, since there now were two more versions of UNIX to add to the pile.

UNIX moves on

In 1991, AT&T "spun off" USL and UNIX when it sold shares to 11 other companies. The following year, the Novell Corporation signed a letter of intent to purchase USL and UNIX. The transaction was completed in 1993.

In 1995, AT&T announced that it was planning to divest itself of many of its equipment manufacturing operations, which eventually became the core of the new Lucent Technologies; it also announced that it would sell its NCR computer subsidiary and get out of the computer business.

The same year, Novell sold its entire UNIX business to the Santa Cruz Operation.

During all these changes, the Bell Labs Computing Science Research Center continued its development efforts on UNIX, eventually producing Versions 8, 9, and 10.

Where UNIX stands today

The successes of UNIX are intertwined with C, the first general-purpose programming language to combine the efficiency of assembly language with high-level abstract expressiveness. Like UNIX itself, C programs can move essentially without change from machine to machine, eliminating the need for expensive, error-prone software rewrites.

UNIX-based systems are sold today by a number of companies. The systems include Solaris* from Sun Microsystems, HP-UX* from Hewlett-Packard, AIX* from IBM, and Tru64 UNIX* from Compaq. In addition there are many freely available UNIX and UNIX-compatible implementations, such as Linux, FreeBSD and NetBSD.

UNIX is the operating system of most large Internet servers, businesses and universities, and a major part of academic and industrial research in operating systems is based on UNIX. Most commercial software is written in C or C++, a direct descendant of C that was also developed at Bell Labs, or more recently Java, a C++ descendant developed at Sun Microsystems.

It still remains a phenomenon

Much of the progress of computer hardware, software and networks during the last quarter century can be traced to the influence the UNIX system had on the computer industry. It embodies visionary ideas -- deliberate generality and openness -- that continue to be a strong force today. Many of its approaches and notations have influenced the entire span of subsequent operating systems.

"Thirty years after its creation, UNIX still remains a phenomenon," Ritchie marveled.


* Product names are the trademarks or registered trademarks of their respective companies.