© The text and graphics associated with this page are, unless otherwise noted, the copyright of the IEEE Computer Society. Any copying of these materials, including electronic reproduction, without the express written consent of the IEEE is prohibited.
In the 50 years since the unveiling of the ENIAC, the history of computing and the history of the Society have been intertwined. In this presentation we will look at the two threads in the hope of recognizing our history, its pioneers, and its contributions.
But first a little pre-history.
Calculation was a need from the early days when it was necessary to account to others for individual or group actions, particularly in relation to maintaining inventories (of flocks of sheep) or reconciling finances. Early man counted by matching one set of objects with another set (stones and sheep). The operations of addition and subtraction were simply the operations of adding or subtracting groups of objects to or from the sack of counting stones or pebbles. Early counting tables, named abaci, not only formalized this counting method but also introduced the concept of positional notation that we use today. The next logical step was the first "personal calculator" -- the abacus -- which used the same concept of one set of objects standing in for objects in another set, together with the concept of a single object standing for a collection of objects -- positional notation. This one-for-one correspondence continued for many centuries, even up through the many years when early calculators used the placement of holes in a dial to signify a count -- as in a rotary dial telephone. Although these machines often had the number symbols engraved alongside the dial holes, the user did not have to know the relationship between the symbols and their numeric values.
Only when counting and arithmetic became more abstract, and groups of different sizes were given symbolic representations so that the results could be written on a "storage medium" such as papyrus or clay, did calculation become a process of symbol manipulation.
The bits and pieces of a computer (including the software) came together over many centuries, with many people each adding a small contribution. One contribution that was not recognized for many years was that of Mukhammad ibn Musa Al'Khowarizmi, a Tashkent cleric who in the ninth century developed the concept of a written process to be followed to achieve some goal, and published a book on the subject that gave it its modern name -- algorithm.
1612: John Napier made the first printed use of the decimal point (after it had been invented in the Netherlands by ????), invented logarithms, and devised several machines for multiplication. Best known of his machines was the "bones", an aid to multiplication, though perhaps the chessboard calculator was the most ingenious and least known!
1622: William Oughtred created the slide rule (originally circular), based on Napier's logarithms, which was to be the primary calculator of engineers through the 19th and early 20th centuries. With a common accuracy of only three digits, the slide rule, an analog device, provided sufficient precision for most work, but was not suited to situations where exactness was needed, such as accounting. (The picture here is of an 1880's cylindrical slide rule, effectively 200 inches long and capable of four digit accuracy.)
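To see how logarithms make the slide rule possible, here is a small sketch in Python (purely illustrative, not part of the historical record): two numbers are multiplied by adding their logarithms, and rounding the logarithms to a few digits mimics the limited precision of reading an analog scale.

import math

def slide_rule_multiply(a, b, digits=3):
    # Multiply the way a slide rule does: add the logarithms.
    # Rounding the logs to a few digits imitates the limited
    # precision of reading lengths off an analog scale.
    log_sum = round(math.log10(a), digits) + round(math.log10(b), digits)
    return 10 ** log_sum

print(slide_rule_multiply(3.14159, 2.71828))   # about 8.53, good to roughly three digits
print(3.14159 * 2.71828)                       # 8.5397..., the exact product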
1623: William Schickard described a machine that combined the concept of Napier's bones (in a cylindrical form) with a simple adder, allowing the user to complete the multiplication of multi-digit numbers more easily. However, no original copies of Schickard's machine have been found, and thus the credit for the first adder with automatic carry is often given to Blaise Pascal.
1642: Blaise Pascal created an adding machine with automatic carries from one position to the next. The son of a merchant, Pascal devised a machine that contained several dials that could be turned with the aid of a stylus. Addition was achieved by the underlying gears turning as each digit was dialled in, the cumulative total being displayed in a window above the "keyboard". While several models were completed, Pascal's machine (often called the "Pascalene") was more likely to be found in the living room of its owner as a conversation piece than in the work room. (IEEE Center also has a picture #460)
1673: Using a stepped cylindrical gear, Gottfried Leibniz built a calculator capable of multiplication, in which a number was repeatedly and automatically added into an accumulator. (PICTURE -- ??)
1801: In France, Joseph-Marie Jacquard invented an automatic loom using punched cards to control the patterns in the fabrics. The introduction of these looms caused riots against the replacement of people by machines. (IEEE Center images # 972, 453)
1822: Charles Babbage recognized that the most common of calculating aids -- the mathematical, celestial, and navigation tables -- were full of errors, errors that led to the loss of ships. While studying at Cambridge University he suggested that it ought to be possible to compute the table entries using a steam engine. This desire became the theme of his life, and he began to design the Difference Engine for the purpose of computing the entries in navigation and other tables. Later he applied to the British Government for assistance, and received what may have been the first government grant for computer research -- an event that would be repeated a hundred years later in the US to help build the ENIAC at the University of Pennsylvania.
1833: Ten years later Charles Babbage had second thoughts about the Difference Engine, realizing that it was a special-purpose machine capable of only a single operation. Abandoning this line of work temporarily, he designed the Analytical Engine, which had the basic components of a modern computer and has led to Babbage being described as the "Father of the Computer". Like so many programmers of today, Babbage did not do a good job of documentation, and his ideas were not widely accepted for the simple lack of communication.
1842: Ada Augusta King, Countess of Lovelace, translated Menabrea's pamphlet on the Analytical Engine, adding her own notes, and became the world's first programmer.
1847-49: Charles Babbage returned to his plans for the Difference Engine and completed 21 drawings for the construction of the second version, but still did not complete its manufacture himself. In 1991, on the occasion of the bicentenary of Babbage's birth, the Science Museum in Kensington, England, built a copy from those drawings, finding only a small number of very obvious errors. To overcome the suggestion that Babbage was unable to complete his machine because the technology of the era was insufficient, the Museum carefully used only techniques available in the mid-1800's, and the copy operated correctly. After Babbage's death his son, Henry Prevost, built several copies of the simple arithmetic unit of the Difference Engine and sent them to various places around the world, including Harvard University, to ensure their preservation. In October 1995 one of those copies was sold by Christie's, auctioneers, in London, on behalf of descendants of Charles Babbage in New Zealand, to the Powerhouse Museum in Sydney, Australia for approximately $200,000.
1854: George Boole described his system for symbolic and logical reasoning that would later become the basis of computer design.
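To illustrate the later connection between Boole's algebra and computer design (an illustration added here, not part of the original narrative), the following Python sketch builds a one-bit half adder from nothing more than the Boolean operations XOR and AND -- the kind of construction that turns symbolic logic into arithmetic hardware.

def half_adder(a: bool, b: bool):
    # The sum bit is a XOR b; the carry bit is a AND b.
    return a ^ b, a & b

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{int(a)} + {int(b)} -> carry {int(c)}, sum {int(s)}")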
1884: The American Institute of Electrical Engineers (AIEE) was founded, the first of the organizations that would eventually merge to form the IEEE in 1963.
1890: The increasing population of the US, and the demands of Congress to ask more questions in each census, were making the processing of census data a longer and longer task. It was anticipated that the 1890 census data would not be processed before the 1900 census was due unless something was done to improve the processing methodology. Herman Hollerith (IEEE Center image # 1594) won the competition for the delivery of data processing equipment to assist in the processing of the data from the 1890 US Census, and went on to assist in census processing for many countries around the world. The company he founded, the Tabulating Machine Company, eventually became one of the three that composed the Computing-Tabulating-Recording (C-T-R) company in 1914, which was renamed IBM in 1924. The Hollerith machines were the first to appear on a magazine cover.
1912: The Institute of Radio Engineers (IRE) was founded -- the second organization that would eventually merge to form the IEEE in 1963.
1925: Babbage's and Hollerith's digital computing methods were rarely used in scientific computation, though analog devices such as the slide rule were in wide use, especially in engineering calculations. Vannevar Bush, MIT, built a large-scale differential analyzer with the additional capabilities of integration and differentiation. Funded by the Rockefeller Foundation, the differential analyzer was perhaps the largest computational device in the world in 1930.
Digital computing came to the fore again in the 1930's, when a number of scientists recognized that the technology had reached the stage where the necessary components of a computer were available. Each, in his turn, had to conceive (or perhaps "reconceive", not being aware of the prior work of Babbage) of the structure of a computer. While we can identify the precise dates on which at least four pioneers recognized the capabilities of the technology, a hundred years from now our descendants will see this as a single instant in time in which independent researchers simultaneously brought forth the computer.
1935-38: Konrad Zuse, in Berlin, Germany, developed his Z-1, a binary mechanical computer, in his parents' living room. He continued with the Z-2 in 1938 with the help of Helmut Schreyer. During World War II he applied to the German Government for assistance in building his machines, but he was turned down on the basis that it would take longer to complete his work than the government expected the war to last. He fled to Hinterstein at the end of the war and then to Switzerland, where he reconstructed his Z-4 machine at the ETH in Zurich and founded a computer company that was eventually absorbed into the Siemens Corporation. Recently the Deutsches Museum in Munich, Germany, reconstructed the Z-1 machine as the central core of its computer exhibition. Zuse's machines were unknown outside Germany until well after the war, and while they may have precedence chronologically, they had little impact on the overall development of the industry.
1936-39: John Vincent Atanasoff, with Clifford Berry, developed the machine we now call the ABC -- the Atanasoff-Berry Computer -- at Iowa State College, USA, as a special purpose machine for the solution of sets of linear equations in physics. Perhaps the earliest example of an electronic calculator, the ABC introduced primary concepts that would appear later in "modern computers" -- the electronic arithmetic unit and the regenerative, cyclic memory.
1937: Working apart from the practical technology of the era, Alan Turing developed the idea of a "Universal Machine" capable of executing any describable algorithm, and formed the basis for the concept of "computability". Perhaps more importantly, Turing's ideas differed from those of others, who were solving arithmetic problems, in introducing the concept of "symbol processing".
Also in the US two other people were considering the problems of computation: Howard Aiken (IEEE Center image #328) at Harvard University, whose work would come to fruition in 1944, and George Stibitz (IEEE Center image #1591) at Bell Telephone Laboratories who was looking at the use of telephone relays in doing arithmetic. He first constructed a relay driven arithmetic unit in 1937 (which he later called the Model-K since it was built on the Kitchen table) and from that small start built a number of relay machines that were in use during World War II.
1939: One of the major computational problems at Bell Telephone Laboratories was in the domain of complex numbers. Stibitz' first full-scale electromagnetic relay calculator solved this problem and was named the Complex Number Calculator (later the Bell Labs Model 1). A year later this machine was the first to be used remotely over telephone lines, setting the stage for the linking of computers and communication systems, time-sharing, and eventually networking. A teletype was installed in a hallway outside the meeting rooms for the annual American Mathematical Society conference at Dartmouth College, and connected to the Complex Calculator in New York. Among the people who took the opportunity to try out the system were Norbert Wiener and John Mauchly. (Need picture)
The need for computation during World War II was heightened by the rapid development of ordnance devices to counter increasingly sophisticated means of attack such as the aircraft. Stibitz extended his relay machines to include tracking and aiming devices to be attached to anti-aircraft guns, but the major deficiency was the availability of "firing tables" for field and naval artillery. Thus the early US calculating devices were, like Babbage's Difference Engine, designed to produce tables, not to carry out one-time computations for the solution of scientific (or military) problems. (picture of guns??)
1944: The first large scale, automatic, general purpose, electromechanical calculator was the Harvard Mark I (also known as the IBM Automatic Sequence Controlled Calculator [ASCC]) (IEEE Center image #16), conceived by Howard Aiken in the late 1930's and implemented by Messrs. Hamilton, Lake, and Durfee of IBM. The machine, sponsored by the US Navy, was intended to compute the elements of mathematical and navigation tables -- the same purpose as Babbage intended for the Difference Engine. Aiken dedicated his early reports to Babbage, having been made aware of the piece of the Difference Engine at Harvard in 1937. The ASCC was not a stored program machine but instead was driven by a paper tape containing the instructions.
Grace Murray Hopper went to work for Aiken at Harvard in June 1944 and became the third programmer on the Mark I. The two who preceded her, then called "coders", were Ensigns Robert Campbell and Richard Bloch.
1940-44: Across the Atlantic, a major need in supporting the war effort was to decrypt the intercepted messages of the German forces, encrypted in the early years using the German-designed ENIGMA. A team at Bletchley Park, halfway between Oxford and Cambridge Universities, and including Alan Turing, built a series of machines culminating in 1943 with Colossus. (IEEE Center image #491) The Colossus Mark I was delivered by the Post Office Research Station, under the leadership of Tommy Flowers (seen on the right here with Sir Harry Hinsley, also a leader in the Bletchley Park activities, and more recently a documenter of their activities), in December 1943 and became operational in 1944, decrypting messages to assist in the planning for D-Day later that year. Further machines were delivered in time for the landings in Normandy and played a significant part in the defeat of Nazi Germany. The existence of Colossus was a secret until 1970, and the algorithms of decryption are still a secret in 1995. Turing and others had only a small influence on British computer development after the war. A copy of Colossus is being reconstructed at the Museum that now exists at Bletchley Park in England. In the US a similar program, using the technology transferred from Bletchley Park, was undertaken at the United States Naval Computing Machine Laboratory (USNCML) in Dayton, Ohio, and later at the Wisconsin Avenue headquarters of what is now known as the National Security Agency (NSA). Besides contributing to the breaking of German codes, the USNCML also worked on Japanese codes. After the war members of this group of engineers founded Engineering Research Associates (ERA) in Minneapolis.
1943: Work on ENIAC was started in 1943 under the guidance of John Brainerd, Dean of the Moore School of Electrical Engineering at the University of Pennsylvania, with John Mauchly and J. Presper Eckert responsible for its implementation. The US Army liaison, on behalf of the Aberdeen Proving Ground (Ballistic Research Laboratory), was Herman Goldstine. (Photo shows Eckert on left and Goldstine on right, holding an arithmetic unit from the ENIAC) (IEEE Center image #923)
30 June 1945: John von Neumann wrote the "First Draft of a Report on the EDVAC" that set the stage for the architectural design of several generations of computers; the report never got past the draft stage, and his co-authors (though obviously not his co-writers) were never properly named. The architectural style became known as the "von Neumann architecture", and the source of the concept of the "stored program" became a matter of controversy. Eckert and Mauchly claimed that they had those thoughts before von Neumann joined the work in progress at the University of Pennsylvania. Konrad Zuse claimed in later years that he too had those thoughts in the 1930's. (Cover from Annals? or picture of vN)
14 February 1946: ENIAC was unveiled in Philadelphia. ENIAC still represented a stepping stone towards the true computer, for, unlike Babbage, Eckert and Mauchly completed the construction even though they knew the machine was not the ultimate in state-of-the-art technology. ENIAC was programmed by rewiring the interconnections between the various components and included the capability of parallel computation. ENIAC was later to be modified into a stored program machine, but not before other machines had staked the claim to be the first true computer.
1946 was the year in which the first computer meeting took place, with the University of Pennsylvania organizing the first of a series of "summer meetings" where scientists from around the world learned about ENIAC and the plans for EDVAC. Among the attendees was Maurice Wilkes from the University of Cambridge, who would return to England to build the EDSAC.
Later that year Eckert and Mauchly, in a patent dispute with the University of Pennsylvania, left the University to establish the first computer company -- Electronic Control Corp. -- with a plan to build the Universal Automatic Computer (UNIVAC). After many crises they built the BINAC for Northrop Aviation, and were taken over by Remington-Rand before the UNIVAC was completed. (picture needed) At the same time Engineering Research Associates (ERA) was incorporated in Minneapolis and applied its knowledge of computing devices to create a line of computers; later ERA was also assimilated into Remington-Rand.
That same year the AIEE Committee on Large-Scale Computing Devices was formed, with Charles Concordia as its chair (May/June 1946-49); this is the committee from which, after the 1963 merger, the IEEE Computer Society would eventually grow. (need picture)
William Shockley, John Bardeen, and Walter Brattain (IEEE Center image #s 1599, 1347, 307) invented the "transfer resistance" device, later to be known as the transistor, which would revolutionize the computer and give it a reliability that could not be achieved with vacuum tubes.
The work on a stored program computer was ongoing in at least four locations -- at the University of Pennsylvania on the construction of EDVAC, with John von Neumann on the Institute for Advanced Study (IAS) machine in Princeton, with Maurice Wilkes at Cambridge University, and at the University of Manchester. Douglas Hartree had visited various locations in the US and had returned to England to convince his colleagues, Freddy Williams and Tom Kilburn, to build a computer. Max Newman, one of the leaders of the Bletchley Park activity, had created the Royal Society Computing Laboratory at Manchester and was looking for a means to build a computer. On June 21, 1948 their prototype machine, the "Baby", was operated for the first time; the world truly moved from the domain of calculators to the domain of computers. Williams, Kilburn, and Newman went on to build a full scale machine they designated the Manchester Mark I. The Ferranti Corporation took the design and began a line of computers that became one of the major components of the British computer industry.
T.J. Watson Sr., miffed at Howard Aiken over the lack of recognition given IBM at the dedication of the Automatic Sequence Controlled Calculator [ASCC] (Harvard Mark I) and unnerved by the success of ENIAC, ordered the building of the Selective Sequence Electronic Calculator (SSEC) for IBM. (IEEE Center image # 20) Though not a stored program computer, the SSEC was IBM's first step from total dedication to punched card tabulators towards the world of computers. The publicity pictures of the SSEC were modified to exclude the columns in the machine room at the Madison Avenue offices of IBM, after Watson expressed regret that they existed!
CS initiates standards activity
Just a year after the Manchester Baby became the first operating stored program machine in the world, the first large scale, fully functional, stored-program electronic digital computer was developed by Maurice Wilkes and the staff of the Mathematical Laboratory at Cambridge University. It was named EDSAC (Electronic Delay Storage Automatic Computer); its primary storage was a set of mercury baths (delay lines) through which acoustic pulses, generated and regenerated, represented the bits of data. Wilkes had been an attendee at the 1946 summer school at the University of Pennsylvania and had come home with the basic plans for a machine in his mind. (need picture of EDSAC)
Back in the US the National Bureau of Standards (NBS) began work on two machines. The Bureau had been made responsible for managing the contract for the delivery of the UNIVAC to the Census Bureau, but recognized that it needed computational facilities for its own work. Not having an overwhelming budget, the Bureau decided to emulate the National Physical Laboratory (NPL), its UK equivalent, and build its own machines, to be placed in its east and west coast centers. Sam Alexander took charge of the development of the Standards Eastern Automatic Computer (SEAC), while Harry Huskey (builder of the Pilot ACE at NPL) led the development of the Standards Western Automatic Computer (SWAC).
In the fifties, the IRE Professional Group on Electronic Computers (PGEC) became an organization with many elements of the present Computer Society, notably excepting the technical and education committees. Conferences were the most significant early activity, but publications grew rapidly, with some 1,800 editorial pages generated during the decade. At the end of the fifties, the PGEC was the largest professional group in the IRE. It had 19 chapters across the US and 8,874 members, including 8,129 full members, 679 student members, and 66 affiliates.
After World War II and his work at Bletchley Park, Alan Turing joined the staff of the National Physical Laboratory at Teddington, England, with plans to build his own computer. His design for the Automatic Computing Engine (ACE) was completed in 1947, but the director of the Laboratory gave the task of construction to the Physics rather than the Mathematics department, where Turing resided, and consequently Turing left NPL to take up a position with his war-time boss, Max Newman, at the University of Manchester. A prototype machine based on Turing's plans, named the Pilot ACE, was designed by Harry Huskey in 1948 and completed in 1951. The full scale version was completed several years later by the Department of Scientific and Industrial Research.
Jay Forrester, Bob Everett and others at MIT began work on a simulator for the Air Force in late 1946, but changed their minds about the use of analog techniques, deciding instead to use digital processing to produce the first real-time processing computer -- the Whirlwind. This work is also well known for the development of core memory. The basic concept for core memory had been patented by An Wang, Harvard University, in 1949, but his technique involved using the cores on single wires to form delay lines. The Whirlwind Project conceived the technique of stringing the cores onto a matrix of wires and thus producing a random access memory.
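As a rough sketch of the idea (illustrative only, and no reflection of Whirlwind's actual circuitry), the following Python model treats a core plane as bits addressed by a row wire and a column wire, so that any bit can be reached directly instead of waiting for it to circulate through a delay line.

class CorePlane:
    # Toy model of a core-memory plane: each bit sits at the crossing
    # of one row wire and one column wire, so it is addressed directly.
    def __init__(self, rows, cols):
        self.bits = [[0] * cols for _ in range(rows)]

    def write(self, row, col, value):
        self.bits[row][col] = 1 if value else 0

    def read(self, row, col):
        return self.bits[row][col]

plane = CorePlane(32, 32)      # a 32 x 32 plane holds 1,024 bits
plane.write(5, 12, 1)
print(plane.read(5, 12))       # -> 1, retrieved at random, not scanned for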
After five years of work and several different instantiations of the first computer company established by Eckert and Mauchly, the UNIVAC computer was delivered to the Census Bureau, just in time to begin work on the decennial census. The project had run somewhat over budget, and the Remington-Rand Corporation hoped to produce a sufficient number of copies to recover its losses on the 1946 fixed-price contract with the government. Eventually XX copies would be built and delivered to a wide variety of both government and commercial users. (IEEE Center image # 21)
Maurice Wilkes had realized quickly after the completion of the work on EDSAC at Cambridge University that "a good part of the remainder of [his] life was going to be spent in finding errors in ... programs." With Stanley Gill and David Wheeler he developed the concept of subroutines in programs to create re-usable modules; together they produced the first textbook on "The Preparation of Programs for an Electronic Digital Computer", (Addison-Wesley Publ. Co., New York, 1951). The formalized concept of software development (not to be named for a decade) had its beginning. (need picture -- Cover of book?)
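A minimal sketch of the idea that Wilkes, Wheeler, and Gill formalized, shown here in Python for convenience: a routine written and debugged once is called from many programs rather than copied into each of them.

import math

def root_mean_square(values):
    # A reusable "subroutine": written once, debugged once, called anywhere.
    return math.sqrt(sum(v * v for v in values) / len(values))

# Two different "programs" reuse the same routine instead of duplicating it.
print(root_mean_square([3.0, 4.0]))        # e.g. a signal-processing program
print(root_mean_square([1.0, 2.0, 2.0]))   # e.g. a statistics program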
The third of Howard Aiken's machines, the Mark III, was delivered to the Naval Surface Weapons Center, Dahlgren, Virginia, in March 1951. The Mark III was notable for being the first full scale machine to include drum memory, even though Aiken still insisted on keeping the data and the instructions on separate (and slightly different sized) drums. Time magazine featured the Mark III on its cover in a painting by Artzybasheff -- the first time a computer had appeared there. The painting is now at Harvard University.
CS sponsors 1st Joint - PGEC formed
Grace Hopper, by now an employee of Remington-Rand and working on the UNIVAC, took up the concept of reusable software in her 1952 paper entitled "The Education of a Computer" (Proc. ACM Conference, reprinted Ann. Hist. Comp., Vol. 9, No. 3-4, pp. 271-281), in which she described the techniques by which a computer could be used to select (or compile) pre-written code segments, to be assembled into programs corresponding to codes written in a high level language -- thus describing the concept of a compiler and the general concept of language translation. For the next forty years Hopper was to champion the development of easier ways of solving problems, taking no notice of the doubters who said that "it can't be done". The idea of "automatic programming" had been born. (picture of original paper? -- I have copy)
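A toy sketch of the idea in "The Education of a Computer", written in Python for illustration: the "compiler" looks up pre-written code segments by the codes appearing in a simple high-level program and strings them together. The three-letter codes and the segments below are invented for the example; they are not Hopper's notation.

# Library of pre-written, already-debugged code segments, keyed by operation code.
SEGMENTS = {
    "RDX": "LOAD INPUT INTO REGISTER A",
    "SQR": "MULTIPLY REGISTER A BY ITSELF",
    "PRT": "PRINT REGISTER A",
}

def compile_program(source_codes):
    # Assemble a program by selecting the stored segment for each high-level code.
    return [SEGMENTS[code] for code in source_codes]

# A "high-level" program: read a number, square it, print the result.
for line in compile_program(["RDX", "SQR", "PRT"]):
    print(line)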
By the end of 1952 UNIVAC had become the common name for a computer, just as Hoover and Xerox became synonyms for vacuum cleaners and paper copiers, fueled in part by the use of UNIVAC in the CBS presidential election night television news program. Using a dummy console in the studio, the returns were processed on a machine in the Philadelphia plant of Remington-Rand. With only 5% of the returns counted, the UNIVAC predicted a landslide victory for Eisenhower, but in spite of Charles Collingwood's repeated requests to "UNIVAC, tell us what you think", it was not until after midnight on the East Coast of the US that CBS admitted that it had not believed the predictions and had withheld the results of the programs run on UNIVAC. Election nights on television would never be the same again, and UNIVAC was established as the premier computer. (PICTURE -- believe we have a picture of Cronkite and Collingwood alongside the UNIVAC?)
In 1952 John von Neumann also completed his version of the successor to the ENIAC at the Institute for Advanced Study in Princeton.
CS begins Transactions on Computers
In the midst of the first "police action" on the part of the United Nations in Korea, IBM took the opportunity to contribute to the war effort by providing a "Defense Calculator" that was in fact their first true entry into the computer business. The IBM "Type 701 EDPM" was built as a result of the conviction of T.J. Watson, Jr. that IBM had to take a step into this field and his convincing his father that computers would not immediately destroy the card processing business. The 700 series of machines, including the 704, 709, and eventually the 7090 and 7094, dominated the large mainframe market for the next decade, and brought IBM from computer obscurity to first place in that same time period.
While many universities in the US and other countries were building their own computers, the Cambridge University EDSAC was the first to be commercialized. With the foresight of the company probably least expected to have a strong interest in computers, J. Lyons & Company, Ltd., a purveyor of confectionery and operator of "corner teahouses" throughout Great Britain, took the EDSAC design and adapted it for its own business applications. Called LEO (Lyons Electronic Office), it came to the attention of many other companies with the same kind of business processing needs, turning an in-house development project into a new computer company. LEO Computers, Ltd. was eventually purchased by the English Electric Company, and together they became part of International Computers Ltd. (ICL), the major builder of British computers through the 1970s. (need picture)
1st Special Issue on Computers of IRE Proceedings
Since the 1930s IBM had built a series of calculators in the 600 series that contributed to the versatility of the card processing equipment that was their major product. The early IBM computers (701 and 702) were incompatible with the punched card processing equipment, but the IBM Type 650 EDPM, a natural extension of the 600 series, used the same card processing peripherals, making it an upward step for many existing IBM customers. A decimal, drum memory machine, the 650 was the first computer to be mass produced, though IBM never expected to lease 1,000 of them in the first year after its announcement. For many universities it was to be their first computer; its attractiveness was considerably enhanced by the availability of a 60% educational discount, conditional on the institution teaching certain computer-related courses.
Following the example set by Grace Hopper, and a successful implementation of a digital code interpreter for the IBM 701 named Speedcoding, John Backus proposed the development of a programming language that would allow users to express their problems in commonly understood mathematical formulae -- later to be named FORTRAN. Assembling a team of researchers from IBM and customer sites, Backus always believed that it would take them six months to complete the task; whenever anyone asked when the system would be ready he responded "in six months"!
While John von Neumann was working on the IAS machine, several parallel projects were underway at other institutions to build copies. To assure conformity, the Princeton group took extensive photographs of the construction stages of the IAS machine and shipped them, with notes, to the other builders. At Los Alamos National Laboratory Nick Metropolis built the MANIAC, the University of Illinois built the ILLIAC, and at the Rand Corporation Willis Ware built the JOHNNIAC. In March 1954 the JOHNNIAC was unveiled and operated by Keith Uncapher, later to be the first chair of the newly formed IEEE Computer Group, itself later renamed the Computer Society. In 1994 Willis Ware was presented with the IEEE Computer Society Pioneer Award for his work on the JOHNNIAC. The newly formed National Science Foundation (NSF) awarded John von Neumann a grant to continue his work on computing, the first of a long line of grants by the Foundation in support of computing.
Less than ten years after the unveiling of ENIAC, the idea of large scale computing epitomized by ENIAC had given way to the concept of "supercomputing". IBM began work on its contribution to the national effort by producing a machine that was promised to be 100 times faster than the fastest machine of the day. This machine was to stretch the state-of-the-art and thus was named STRETCH. When STRETCH was eventually delivered in 1960 (??), the quoted price had to be reduced since the speed target was not reached. That same year IBM introduced the 704, whose principal architect was Gene Amdahl, who would go on to establish his own computer company. The 704 had the distinction of being the first commercial machine with floating-point hardware, and was capable of operating at approximately 5 kFLOPS. (Picture of 704 needed)
The number of computer customers had grown to the point where it was appropriate to form the first users groups, to exchange experiences and programs and at the same time to present a uniform face to the manufacturers. SHARE (not an acronym, but often given the interpretation "Society to Help Alleviate Redundant Effort") was created for the users of (large) IBM machines, and USE for UNIVAC users. Computing was no longer to be shaped only by the computer companies.
Sperry-Rand, the successor to Remington-Rand but still maintaining the UNIVAC Division, took up the challenge to create a supercomputer on behalf of Lawrence Livermore National Laboratory (LLNL), to be named LARC (Livermore Automatic Research Computer). Work also began in the United Kingdom on a supercomputer project: Atlas, a joint venture between the University of Manchester and Ferranti Ltd., with Tom Kilburn as the principal architect.
Not forgetting that the purpose of the computer was to solve problems, John McCarthy and Marvin Minsky organized a conference at Dartmouth College, with assistance from the Rockefeller Foundation, on the concept of Artificial Intelligence. From this meeting the promises of AI grew, but were not to be achieved for several years.
The early computers had small internal memories and slow external memories, relying primarily on magnetic tape. While internal memories had been upgraded to magnetic drums and then to core memory, the next logical step for external storage was the disk memory, with movable read/write heads providing a semi-random access capability and a storage capacity akin to that of magnetic tape. The IBM 305 RAMAC was the first disk memory system. (I have lots of slides of disk drives but not the RAMAC)
After three years of work Backus and his colleagues delivered the first FORTRAN compiler for the IBM 704, and almost immediately the first error message was encountered -- a missing comma in a computed GO TO statement. The unmarked 2,000-card deck was received at Westinghouse in Pittsburgh by Herbert Bright, who deduced that it was the long-expected compiler and created the first user program -- complete with an error. The world of programming languages had taken a large step upward, from a domain in which only specially trained programmers could complete a project to a domain in which those with problems could express their own solutions.
The invention of the transistor in the latter half of the 1940's opened the modern electronics age of employing "electrons in solids," leaving obsolete the vacuum tube electronics utilizing "electrons in vacuum" then in its heyday. In 1958, Jack St. Clair Kilby conceived and proved his idea of integrating a transistor with resistors and capacitors on a single semiconductor chip -- the monolithic integrated circuit (IC). His idea of a monolithic IC, together with the planar technology of Dr. Jean Hoerni and Robert Noyce's idea of "junction isolation" for planar interconnections, underpins the great progress of today's semiconductor IC and the microelectronics based upon it. The technology has allowed the innovation of numerous applications in computers and communications, which have changed our lifestyles dramatically. (IEEE Center images #33, 1595, 1091, 243)
The development that started with the Whirlwind project became a reality in 1958 with the installation of the SAGE air defense system at McGuire AFB in NJ. The first effective computerized air defense system was operational for the north-eastern US.
The recently founded Control Data Corporation, under the leadership of William Norris, made its contribution to the supercomputer market with the fully transistorized CDC 1604; Seymour Cray was the chief architect.
Meanwhile, continuing his work towards the development of Artificial Intelligence, John McCarthy developed the concepts of the programming language LISP for manipulating strings of symbols -- a non-numeric processing language. Later, students changed the meaning of LISP, which stands for LISt Processing, into "Lots of Idiotic, Silly Parentheses". (PICTURE??)
While there was a movement towards supercomputers in many companies, IBM announced the availability of two desk-size machines for the small user -- the IBM 1401 for the business user and the IBM 1620 for the scientist. The 1401 became the most popular business data processing machine, and for small universities and colleges the 1620 became the first computer experience for many students. (I have pictures of the 1620) Both machines introduced a character-oriented core memory of 20-40K bytes in which "word" boundaries could be defined by the programmer to provide "unlimited precision". Both machines were supported by an arithmetic unit that used decimal table look-up instead of binary adders. Initially IBM had intended to name the 1620 the CADET, but when this was translated into "Can't Add, Doesn't Even Try" the name was dropped.
After several years of work the General Electric Corporation delivered 32 ERMA (Electronic Recording Machine -- Accounting) computing systems to the Bank of America in California, to rescue the banking industry from being overwhelmed by the rapidly increasing number of checks being written by an ever increasing clientele. Based on a design by SRI, the ERMA system employed Magnetic Ink Character Recognition (MICR) as the means to capture data from the checks and introduced a check handling system that was not daunted by documents that were not in pristine condition. The banking industry had become automated, opening the way for new ways of banking, including the ATM and electronic personal banking. On the other hand, ERMA was a high water mark in the history of computer manufacture at GE, which, with the exception of developing a profitable line of machines for NCR (the NCR 304), never really achieved the status that might have been expected of such a financial giant.
1st IFIP Congress - CS represents USA
PGEC services in the early sixties were much the same as in the late fifties, although the number of conferences and transactions pages continued to increase. However, in 1961, the PGEC leadership began to consider creating technical committees. These committees would provide more forums for special interests and, at the same time, reduce the chance of these interests forming separate IRE groups and segmenting the field. In May 1962, the first of these committees, a logic and switching theory committee, was approved to operate jointly with the AIEE committee already in operation.
Concurrently, merger plans were proceeding between the IRE and AIEE. The IRE-AIEE merger into the Institute of Electrical and Electronics Engineers began at the headquarters level in 1963. The PGEC then became the Professional Technical Group on Electronic Computers, and very shortly thereafter, the Computer Group. In early 1963, the group began operating with an Administrative Committee that included a mix of people from the PGEC and the AIEE Computing Devices Committee. The final merger was completed in April 1964.
A major step was taken in July 1966 with the first issue of the bimonthly Computer Group News, which included group and industry news, applied and tutorial articles, a guide to computer literature, and a repository of computer articles. Repository materials were available to the profession for a nominal charge.
Computer Group News opened the door for many magazines in the society, as well as in the IEEE. But it was also significant in another way. With the publication of its own magazine, the Computer Group employed and managed its own small full-time staff in the Los Angeles area for publications support and other administrative activities. The Computer Group was the first IEEE group to employ its own staff, and this was a major factor in the growth of the society.
In 1968, IEEE Transactions on Computers became a monthly publication. The number of published periodical pages grew to almost 9,700 pages in the transactions and about 640 in the Computer Group News. Membership grew to 16,862, including 4,200 students and 158 affiliates. The decade closed with 41 chapters.
Since 1952 Grace Murray Hopper had been developing a series of programming languages that increasingly used natural language-like phrases to express the operations of business data processing; FLOWMATIC was the last of these. Others had also taken on the challenge, including IBM, which had produced a language named COMMERCIAL TRANSLATOR. From these bases an industry-wide team -- the Conference on Data Systems Languages (CODASYL) -- led by Joe Wegstein of NBS (now NIST) developed a new language in a very short time and created the first standardized business computer programming language, COBOL (Common Business Oriented Language). For the next 20 years more programs were written in COBOL than in any other single language. That same year the second of the mathematical languages, ALGOL 60, was developed, also by a committee. Although not widely implemented, ALGOL became the conceptual basis of many programming languages thereafter.
1960 marked the point at which the first generation of computers (vacuum tube driven) gave way to the second generation, built with transistors.
The work on integrated circuits by Jack Kilby and Robert Noyce came to fruition in 1961, when the first commercially available integrated circuits were delivered by the Fairchild Corporation. The patent for the silicon based IC had been granted to Robert Noyce, starting a long contention over IC patent rights between Kilby's germanium version and Noyce's silicon version. (PICTURE -- ??) From this date forward computers would incorporate ICs instead of individual transistors or other components.
While operating systems (originally called monitors or supervisors) had been developed in the late 1950s as a means of improving the throughput of computers, users were frustrated by their lack of intimacy with the computer. To solve this problem and return control of the computer to the hands of the user, Fernando Corbató, MIT, produced CTSS (Compatible Time Sharing System) for the IBM 7090/94, the first effective time-sharing system and, coincidentally, the first means of remote access to a computer since Stibitz' demonstration in 1940. (Picture of Corbato?)
In Great Britain the Atlas computer at the University of Manchester became operational; it was the first machine to use virtual memory and paging, its instruction execution was pipelined, and it contained separate fixed- and floating-point arithmetic units, capable of approximately 200 kFLOPS. (PICTURE??)
By 1963 the standardization of the elements of the industry was becoming prevalent, and among the first standards was a code for information interchange (ASCII). For the first time there was a common means for computers to interchange information, but it would take almost 15 years before this became commonplace.
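A quick Python illustration of what such a common code buys (the text string here is just an example): two machines that both agree on ASCII exchange the same text as the same sequence of numeric codes, and either side can reconstruct the characters from the numbers.

message = "IEEE Computer Society"

# Encode each character as its ASCII code -- the numbers a conforming
# machine would transmit or store for this text.
codes = [ord(ch) for ch in message]
print(codes[:4])                                 # [73, 69, 69, 69]

# A receiving machine that also uses ASCII recovers the identical text.
print("".join(chr(code) for code in codes))      # IEEE Computer Society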
1963 was the year in which the IRE and AIEE merged to form the IEEE, the process of examining the various components to consolidate activities began, and the opportunities for new projects were considered. From the AIEE Large Scale Computing Committee and the IRE Professional Group on Electronic Computers would come a new group.
Starting in 1959 Douglas Engelbart launched the SRI Augmentation Research Center to pioneer the modern interactive working environment. NLS (oN-Line System) was built during the mid-1960's to develop and experiment with software and hardware tools that would make people more productive using computers. NLS was an exploratory vehicle for research into the "knowledge worker/organization." Among the original ideas developed and implemented in NLS were the first hypertext system, outline processor, and video conferencing. In 1964 he developed the "mouse," to be followed by two-dimensional editing, the concept of windows, cross-file editing, uniform command syntax, remote procedure-call protocol, mixed text-graphic files, structured document files, idea processing, and many more developments. Like that of almost any pioneer, Engelbart's work was not recognized immediately; the mouse waited until the development of the personal computer, fifteen years later, to find its niche. Engelbart received the IEEE Computer Society Pioneer Award in 1992. (Let's find a picture of a mouse?? And Engelbart?)
By the mid-1960's the jet airliner (and Boeing) had revolutionized the airline travel business, but the reservation process, even though elemental reservation systems existed, was inadequate. In a period when remote access had been proven by the CTSS system, IBM produced the first large scale, on-line, real-time reservation tracking system, named SABRE, for American Airlines; it was soon to be copied by others. (PICTURE -- ??)
To many the world of computing changed radically on April 7, 1964, when IBM announced System/360, the first IBM family of compatible machines. While there was at least one other compatible family, at GE, the commitment to an upwards compatible family and the merging of the scientific and business lines of machines by IBM had a profound effect on the way many businesses thought about computers.
In the fall of 1964, the Dartmouth Time Sharing System became operational with BASIC as the principal language for student program development. Developed by John Kemeny (later president of Dartmouth and chairman of the commission that investigated the Three Mile Island accident) and Tom Kurtz, together with lots of help from undergraduates, BASIC was to become the "lingua playpen" of the young computer community. Both Kemeny and Kurtz were honored by the IEEE Computer Society as Pioneers. (need picture of Kurtz)
1964 was also the year in which the IRE Professional Group on Electronic Computers and the AIEE Committee on Large-Scale Computing Devices merged to form the Computer Group, later to be renamed the Computer Society, and Keith Uncapher served as the first chair (1964-65). (PICTURE??)
While some companies were developing bigger and faster machines, Digital Equipment Corporation introduced the PDP-8 in 1965, the first true minicomputer. The PDP-8 had a minuscule instruction set, a primitive micro-language, and excellent interface capability. Thus the PDP-8 came to be used extensively as a process control system, including interfacing to telephone lines for time-sharing systems. (PICTURE -- ??)
The success of CTSS at MIT was noted by J.C.R. Licklider, director of information processing research at ARPA, who believed that the technology had useful applications in the agency. He arranged to sponsor Project MAC (variously interpreted as "Machine Aided Cognition", "Minsky Against Corbató", and other names), which would take the next logical step in the development of time-sharing and produce a system known as Multics. Choosing a GE 600 series machine as the basis for the development, MIT was joined by GE and AT&T Bell Laboratories to produce a general-purpose, shared-memory, multiprocessing timesharing system.
A joint project between IBM and the SHARE users' group developed a new programming language, later named PL/I, with the intention of combining both scientific and business data processing, as had the System/360 machines. The language was also intended to be a high level system development language.
The need for computers in colleges and universities in support of science and engineering was noted in the "Rosser report" sponsored by the NSF. This report effectively terminated grant support for universities to build their own machines, and improved the support for universities to lease commercial machines.
By December 1966 the Computer Group membership had reached 11,000, a growth of 10% during the year.
Six years after Fairchild Corp. had delivered the first commercial integrated circuits, the third generation of computers began in 1967 with the delivery of the first machines using that technology.
John Hamblen, a faculty member at the University of Missouri, Rolla, produced the first annual survey on the use of computers in higher education, which became the guideline against which university administrations gauged their computing resources and activities. The NSF "Pierce report", a follow-on to the Rosser report, looked closely at the curricula of computer science programs throughout the US and provided the impetus for the development of curricula for the field.
In 1955, during the development of the programming language FORTRAN, Harlan Herrick had introduced the high level language equivalent of a "jump" instruction in the form of the "GO TO" statement. In 1968 Edsger Dijkstra laid the foundation stone in the march towards creating structure in the domain of programming by writing, not a scholarly paper on the subject, but a letter to the editor entitled "Go To Statement Considered Harmful" (Comm. ACM, March 1968). The movement to develop reliable software was underway. (PICTURE -- ??)
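To make the contrast concrete, here is a small, purely illustrative example in Python (which has no GO TO): the first version imitates jump-driven control flow with flags, while the structured version expresses the same search with a single loop and a clear exit on each path.

def find_negative_goto_style(values):
    # Jump-style control flow imitated with flags -- harder to follow and verify.
    i = 0
    found = None
    done = False
    while not done:
        if i >= len(values):
            done = True              # "GO TO" the exit
        elif values[i] < 0:
            found = values[i]
            done = True              # "GO TO" the exit from mid-loop
        else:
            i += 1                   # "GO TO" the top of the loop
    return found

def find_negative_structured(values):
    # Structured version: one loop, one obvious exit per path.
    for v in values:
        if v < 0:
            return v
    return None

print(find_negative_goto_style([3, 7, -2, 5]))   # -2
print(find_negative_structured([3, 7, -2, 5]))   # -2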
For the future, Arthur C. Clarke introduced HAL, the computer of the future, in the movie "2001: A Space Odyssey", basing the design on the artificial intelligence proposals of I.J. Good (a member of Bletchley Park) and Marvin Minsky. Supposedly HAL was a one-letter-shift cipher of IBM!
Work on the ARPAnet began. The concept of networking was by no means new in 1969; as early as Roman times there had been a network of roads that facilitated not only the rapid movement of troops but also the rapid interchange of information by messengers. During the Napoleonic and American Civil Wars various schemes were developed to distribute messages over networks of communication lines, primarily along lines of sight between prominent locations. (Can we get a picture from the book by Holzman??)
Disillusioned by the work on Multics and continuing problems with the GE 600 series machines, Bell Telephone Laboratories withdrew from Project MAC. Dennis Ritchie (IEEE image # 1593) and Ken Thompson (IEEE image #700) began work on their own operating system which, instead of being targeted at multiple users, would concentrate on the single user; in a play on the name Multics, it was named UNIX. In 1994 Ritchie and Thompson each received the IEEE Computer Society Pioneer Award.
In 1971, the Computer Group became the Computer Society. (The Computer Group promoted this name change to better represent the stature it and other IEEE groups had attained.) For the Computer Society, the seventies was a decade of significant growth in both the depth and breadth of services. Membership grew by a factor of over two-and-a-half.
The society's publication program grew rapidly. The Computer Group News, renamed Computer in 1972, became a monthly publication in 1973, and significantly increased its tutorial-oriented content. At the same time, IEEE Transactions on Computers was unbundled from it, making Computer the only publication received automatically with society membership. The subscriber base to the now optional transactions held up well, and the society learned it could expand its publications program outside the membership dues structure.
The society introduced the IEEE Transactions on Software Engineering in 1975, and the IEEE Transactions on Pattern Analysis and Machine Intelligence in January 1979. The decade saw the publication of more than 25,000 periodical pages: about 13,500 pages for the IEEE Transactions on Computers, about 4,100 pages for the IEEE Transactions on Software Engineering, over 400 pages for the IEEE Transactions on Pattern Analysis & Machine Intelligence, and over 8,000 editorial pages for Computer.
Late in the decade, the society formalized its non-periodical publications into the Computer Society Press. The operation mainly produced conference proceedings, tutorial texts, and reprints in the seventies.
Fourteen new technical committees were formed, making a total of 20 by the end of the period. The committees contributed significantly to growth in the number of specialty conferences and meetings. In the late seventies, the Computer Society was sponsoring or cosponsoring about 50 technical conferences, meetings, and symposia, many with ACM.
The society initiated the Education Committee in 1970, and produced the first model curriculum in 1976. The Distinguished Visitor Program began providing speakers to chapters in 1971.
The Computer Society was also the first society within IEEE to establish student branch chapters. This activity began in 1974 as an experiment and was subsequently adopted by the IEEE. Additionally, the society formalized and expanded its awards program in this decade.
The staff supporting the society's operations also grew. The position of executive secretary was created in 1971. By the end of the decade the Computer Society staff numbered 16 permanent employees: two in the executive secretary's home office in Silver Spring, Maryland, and 14 in the publishing group's rented space in Long Beach, California, plus several temporary part-time people in both locations. The needs and viability of the publishing organization grew to the extent that, late in the decade, the society started the process of acquiring its own building in Los Alamitos, California.
By the end of the seventies, Computer Society membership had grown to 43,930, including 7,833 students and 3,943 affiliates. There were now more than 100 chapters, including about 30 student branch chapters.
With the change of name of the Society magazine to Computer (picture of first cover??), the Society inaugurated a contest to find a logo.
Computer Society Magazine is named "COMPUTER"
The world of personal computing has its roots in 1971, with two important products -- the first commercially available microprocessor and the first floppy disk. The recently founded Intel Corporation produced the Intel 4004 (IEEE Center image #36), giving birth to a family of "processors on a chip". Ted Hoff designed the 4004 in response to a request from the Japanese company Busicom to create a chip for a calculator; Hoff decided that it would be easier to use a "computer on a chip" for this purpose than to custom develop a calculator chip. Marcian E. (Ted) Hoff received the IEEE Computer Society Pioneer Award in 1988. (IEEE Center image # 1596)
Alan Shugart, at IBM, put the 8 inch floppy (magnetic storage) diskette into its first regular use, primarily for the Memory Writer (??).
Ken Simon of Litton Industries submitted the winning entry in the Computer Society logo competition, and it was adopted by the Board of Governors at their mid-summer meeting. The winning design was a clever adaptation of the IEEE logo, replacing the two-element current and magnetic effect design with an interlaced pair of binary digits.
The first digital microcomputer available for personal use was the MITS (Micro Instrumentation and Telemetry Systems) 816. Though not equipped with a display or keyboard, the 816 was of considerable interest to the amateur enthusiast who was seeking the personal computer. (PICTURE??)
ARPAnet demonstrated at CS-ACM ICCC in Washington
Digital Computer Controlled flight - NASA F-8
Don Knuth promised to deliver a series of seven volumes on "The Art of Computer Programming"; the first three turned out to be the "bibles" of software development for many years, (PICTURE -- ??) containing many of the basic algorithms of the field that became known as "data structures" and many of the techniques of programming that would become the basis of "software engineering". In 1982 Knuth was honored with the Computer Society Pioneer Award.
While the concept of a wide area network had been effectively developed as part of the ARPANet project, the basis for the "local area network" was Ethernet, created at Xerox PARC by Robert Metcalfe. In some ways Metcalfe invented Ethernet three times: first in his doctoral dissertation, written while he was working at MIT's Project MAC, then at Xerox PARC, and then again at 3Com, a company he founded to exploit his invention.
Twenty-seven years after the unveiling of the ENIAC, and many years after the US Patent Office had issued the patent for the computer to Eckert and Mauchly, Judge Earl Larson of the US District Court in Minneapolis invalidated it. In a suit between Honeywell Information Systems and Sperry-Rand Corp. regarding the payment of royalties for the use of the concept of the computer, Larson found that Mauchly had derived his ideas for the computer from "one, John Vincent Atanasoff". (PICTURE -- ??) Neither Eckert nor Mauchly ever gave up their opposition to this finding, believing that the invention was truly their own. For many this was the first time they had heard of the work of Atanasoff, or of the ABC (the Atanasoff-Berry Computer).
Ethernet - LAN concept - Bob Metcalfe @ Xerox PARC
The March 1974 issue of "QST" magazine contained the first formally advertised personal computer -- the Scelbi ("SCientific, ELectronic, and BIological"), developed by Scelbi Computer Consulting of Milford, Connecticut. At almost the same time, Jonathan Titus produced a widely marketed personal computer kit, named the Mark-8. The world of personal computing was growing. Intel introduced the 8080 for the purpose of controlling traffic lights, but it was to find fame later as the processor for the Altair. (PICTURE -- ??)
Gary Kildall introduces CP/M, the first operating system to run (almost) independently of the underlying platform.
John Cocke designs first RISC machine for IBM Research. (PICTURE -- ??)
Zilog, Inc. is founded to compete with Intel in the production of micro-processors on a chip. (Z80)
President's address - Steve Yau - pictured on a motorcycle. (From Los Alamitos?)
Computer Networks & Communications (Computer 2/74): This issue discusses telecommunication turbulence and network evolution, data security in the computer communication environment and data communication standards; critical aspects then, as well as now, for the telecommunication environment.
By 1975 the market for the personal computer was demanding a product that did not require an electrical engineering background, and thus the first mass-produced and mass-marketed personal computer (available either as a kit or assembled) was welcomed with open arms. Developers Edward Roberts, William Yates, and Jim Bybee spent 1973-1974 developing the MITS Altair 8800. Priced at $375, it contained 256 bytes of memory (not 256 Kbytes) but had no keyboard, no display, and no auxiliary storage device. Later, Bill Gates and Paul Allen wrote their first product for the Altair -- a BASIC interpreter. The Altair was named after a planet in a "Star Trek" episode.
1975 was also the year in which IBM produced its first "personal computer", the 5100. The 5100 had been under development for two years, but its price and the software it supported did not endear it to the same community that had welcomed the Altair. While some units were used to support education, it never took off as the answer to "Computer Aided Instruction" (CAI). (PICTURE -- ?? we recently did an article in Annals)
Cray-1 begins the modern supercomputer trend: Seymour Cray, formerly the principal architect at CDC, started the trend toward modern supercomputers and computational architectures. A Cray machine was, and still is, the standard by which to judge supercomputer performance. There is a public domain version of the Cray operating system.
A year after the Altair was produced, Steve Jobs and Steve Wozniak produced the Apple II (IEEE image #17), which came assembled and complete with its own keyboard and monitor. It was an immediate success, priced within the reach of the enthusiast and supporting some basic software applications that showed its true usefulness. The Apple II (PICTURE of Apple II??) was quickly assimilated into schools and colleges and was the basis of many early "microprocessor" courses. That same year the Microsoft and Apple corporations were founded.
1977 saw the opening of the First West Coast Computer Faire in San Francisco where many attendees got their first looks at the Apple II (costing $1298) and the Commodore Pet ($795). That same year Radio Shack introduced the TRS-80 microcomputer, given the derogatory handle of "Trash-80" (PICTURES??).
First Computerland store opened in Morristown NJ, under the name Computershack.
Computers in Telephone switching (Proc IEEE 9/77)
Cryptography - Diffie & Hellman (Proc IEEE 3/79)
While most microprocessors had quickly been supported by a version of BASIC and some primitive games, VisiCalc, introduced by Daniel Bricklin and Bob Frankston, was a major breakthrough in application software for this class of machine. The first spreadsheet program, it set the standard for both the "look and feel" and the ease of use of later spreadsheet systems. To many users this was the opening of the age of the fourth generation of software support.
Following the 1978 release of VisiCalc and its unprecedented success, MicroPro International released WordStar in 1979, which, like VisiCalc, would set the standard for word processing systems. Microprocessors were beginning to be capable of doing useful work beyond the aspirations of the hobbyist.
Within the Computer Society, the growth of the seventies continued in every function, but with new dimensions and changing emphasis. This was the decade of new magazines, major standards activities, new education initiatives, international services, and a significant growth and refinement of staff services and facilities.
Within the society, the breadth of the profession and member interest in the more tutorial-oriented materials published in Computer prompted the creation of similar magazines in specialty areas. The society introduced IEEE Computer Graphics & Applications in January 1981, IEEE Micro in February 1981, both IEEE Design & Test and IEEE Software in February 1984, and IEEE Expert in the spring of 1986.
IEEE Transactions on Knowledge & Database Engineering was introduced in September 1989. IEEE Transactions on Software Engineering and IEEE Transactions on Pattern Analysis & Machine Intelligence moved from bimonthly to monthly publication in 1985 and 1989, respectively.
The society published more than 65,200 periodical editorial pages during the decade -- with over 33,400 pages in transactions and 31,800 pages in magazines, including 12,700 in Computer.
The number of technical committees continued to grow, mirroring the diversity in the computer industry. Fifteen new technical committees brought the total to 33 by the end of the decade. These committees were the primary sources of conferences and meetings. The society sponsored and cosponsored more than 50 conferences annually and cooperated, without financial involvement, with other organizations in dozens more. Interest in the more vertical or specialty conferences increased, relative to the broad conferences such as Compcon and Compsac. Several of the specialty conferences drew many more attendees than the broad-based conferences. The number of meetings held outside the US grew significantly, many of them sponsored by technical committees. In the eighties the society sponsored and cosponsored more than 90 conferences outside the US. CompEuro was initiated in 1987, cosponsored with IEEE's Region 8.
The technical committees began to support standards activities in a major way. The results were remarkable. At the end of the decade, 56 standards had been approved and 125 working groups were under way. These projects involved well over 5,000 people.
The growth in society services was clearly fueled by the industry's growth and by the many volunteer professionals who were motivated to provide the technical base for these services. But this growth simply would not have been possible without the staff support that developed during this period. The society brought in its first executive director in 1982, and the staff developed from 16 people at the beginning of the eighties into a highly professional operation of 94 people by the end of the decade.
The Computer Group staff operations had begun in the garages and basements of its first publisher and executive secretary. In early 1980 the West Coast publishing operation moved into its newly purchased building, and in 1985 the space was doubled with the purchase of the adjoining building. Also in 1985, the society purchased its current headquarters building in Washington, D.C. (PICTURE??), and extended its staff support overseas by opening an office in Brussels. The Brussels office was expanded in 1987. In 1988, an office was opened in Tokyo. These offices represent a major step in serving the society internationally.
Alan Shugart, having left IBM and founded his own company, Shugart Associates, continued his leadership in the development of storage devices by introducing the Winchester hard drive, thereby revolutionizing the storage capabilities of personal computers. No longer would personal computers be limited to tiny internal memories and slow external storage on cassette tapes or diskettes. The personal computer moved from being a microcomputer limited by its storage capabilities to one that could compete effectively with the power of many mainframe systems, and certainly with the majority of minicomputers.
After waiting for the competition to soften up the market, IBM entered the field in 1981 with the IBM "PC", supported by the DOS operating system, developed under an agreement that gave Microsoft all the profits in exchange for Microsoft having borne the development costs. Disregarding CP/M, which had been the choice for earlier machines, IBM chose to go in a radically different direction on the marketing assumption (which turned out to be correct) that purchasers of the PC were a different breed from those who were prepared to build their own system from a kit. Using a caricature of Charlie Chaplin as the user who was able to take the PC out of the box and immediately begin using it, IBM attracted a community of users who wanted the machine for its usefulness rather than its intrinsic engineering appeal. (PICTURE of the original PC??)
Planning to get ahead of the competition, Osborne Computer Corporation began marketing the first self-contained, portable microcomputer in 1981, complete with a monitor, disk drives, and carrying case -- the Osborne 1. Though initially successful, Osborne declared bankruptcy two years later. (PICTURE??)
That same year Commodore introduced the VIC-20, and quickly sold 1 million units!
Not long after IBM introduced the PC, Time magazine named the computer its "Machine of the Year", in place of the traditional "Man of the Year"! Never before (or since) had an inanimate object been so chosen. Alan Turing would have been proud of the object of his research!
By 1982 the computer had become a prime tool in the movie industry and Disney Studios completed a movie where the characters existed inside a computer -- Tron -- and where the special effects were computer generated. (PICTURE??)
Software development exploded with the introduction of the PC, with standard applications including not only spreadsheets and word processors but also graphics packages and communications systems. Games were also prolific. In 1983 Mitch Kapor introduced Lotus 1-2-3, which took over spreadsheet supremacy from VisiCalc. Kapor later went on to co-found the Electronic Frontier Foundation, originally intended to provide legal support for some hackers who had intruded into certain computer systems; the EFF has been described as the ACLU of the computer industry.
Starting in 1978, the US Department of Defense had begun development of a "modern" high-order programming language. Having recruited a committee of volunteer experts into the HOLWG ("holwig", the High Order Language Working Group), it distributed a series of documents of increasing detail and precision to the programming language community for criticism. The development process also involved a continually decreasing number of proposals, each taking advantage of the best features of the prior proposals, until finally the "Green" language (named for the color of the cover of its report, so that the identity of the actual proposer was unknown to the evaluators) was chosen as the winner and renamed "Ada" in honor of Ada Augusta King, Lady Lovelace, the mathematical companion of Charles Babbage. Among other innovations, the language introduced the rendezvous mechanism for interprocess communication and synchronization, but it was widely criticized for its complexity. (picture of Cover?)
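The rendezvous idea can be illustrated with a rough sketch -- given here in Python rather than Ada, purely as an analogy: a caller making an "entry call" blocks until a server task accepts it, the two exchange data while synchronized, and then both proceed independently. The Entry class and its methods below are hypothetical, invented only for illustration.

    # Hypothetical Python analogy of the Ada-style rendezvous (not actual Ada).
    import threading
    import queue

    class Entry:
        """Minimal stand-in for a task entry: call() blocks until accept() replies."""
        def __init__(self):
            self._calls = queue.Queue()

        def call(self, request):
            reply = queue.Queue(maxsize=1)
            self._calls.put((request, reply))   # queue the entry call
            return reply.get()                  # block until the rendezvous completes

        def accept(self, handler):
            request, reply = self._calls.get()  # block until a caller arrives
            reply.put(handler(request))         # do the work, then release the caller

    entry = Entry()

    def server():
        while True:
            entry.accept(lambda x: x * 2)       # accept one entry call at a time

    threading.Thread(target=server, daemon=True).start()
    print(entry.call(21))   # caller blocks during the rendezvous; prints 42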
Geometric Modeling (Computer Graphics 10/83): Computer-aided geometric modeling, once an arcane subject, has become a key to effective use of computers in science and engineering. Robotics and other complex design problems can be approached more easily using the techniques described in this issue of Computer Graphics.
The first portable computer, 65 B.C. (Micro 2/84): Antikythera is the island near which the device, measuring 16 x 32 x 9 cm, was found; it is probably the first known portable computing mechanism. Derek de Solla Price's article, "A History of Calculating Machines", appears in this issue of IEEE Micro.
Computer Society Headquarters on Mass Ave in DC: After many years in which the Society's office was in Harry Hayman's basement and other places, the Society purchased its own building in Washington, D.C. There is also a West Coast facility in Los Alamitos, California.
Starting from the 8088 chip (an 8086 variant) used in the IBM PC, Intel Corporation continually developed new chips to support the ever-increasing demand for processing power; in 1986 Intel released the 386 chip -- an intermediate stage between the 8086 of the late seventies and the Pentium of the early nineties. (PICTURE??)
At the other end of the computer family scale, the Cray X-MP with 4 processors achieved a processing speed of 713 MFLOPS (against a peak of 840) on the 1000x1000 LINPACK benchmark. In thirty years the supercomputer had achieved an improvement of five orders of magnitude (from 5 KFLOPS for the 1955 IBM STRETCH). If the motor car had seen the same degree of improvement over the past 100 years, it would consume one thimbleful of gas per hundred miles, travel at 3,000,000 miles per hour, and be cheaper to replace than to pay the parking fee downtown! The change in scale is most apparent in this picture of a ring counter, or shift register, from the ENIAC, which can easily be compared mentally with the registers in today's machines.
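The "five orders of magnitude" claim is easy to check against the figures quoted above (the 5 KFLOPS baseline is simply taken as stated in the text); a quick back-of-the-envelope calculation in Python:

    # Rough check of the speedup cited above; figures are those quoted in the text.
    import math

    early = 5e3      # ~5 KFLOPS for the mid-fifties machine cited above
    x_mp  = 713e6    # 713 MFLOPS on the 1000x1000 LINPACK run

    ratio = x_mp / early
    print(f"speedup: {ratio:,.0f}x (~{math.log10(ratio):.1f} orders of magnitude)")
    # prints: speedup: 142,600x (~5.2 orders of magnitude)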
Cray X-MP (4 processors)
CM-1 Connection Machine (65,536 single-bit processors): Conceived as a Ph.D. thesis by Danny Hillis, this hypercube computer is a massively parallel array of 65,536 1-bit processors, each with an ALU and 4 Kbytes of external memory. It is one of the largest commercial examples of parallelism in computing.
Computer Society opens European office in Brussels
As befits the largest transnational computer society in the world, the Computer Society opened an office in Brussels, Belgium, in order to serve the growing number of European members more effectively.
Computer Society opens Asian office in Tokyo
The Programmer's Apprentice (Computer 11/88)
Rapid Prototyping (Computer 5/89)
Scientific Visualization (Computer 8/89)
Software for Workstations (Computer Graphics 7/90)
EEG Monitoring for Epilepsy (Computer 9/92)
Exploiting Parallelism in Loops (Computer 2/94)
Visualization (Computer 7/94)
The impact of the information revolution on our society and our industry is immense. In our increasing desire to control our own destinies, we seek not only to understand our contemporary technology, but also to look to the past to recognize trends that will allow us to predict some elements of the future. Looking backward to discover parallels and analogies to modern technology can provide the basis for developing the standards by which we judge the viability and potential of a current or proposed activity. But we also feel a responsibility to preserve the achievements of our forebears through the establishment of archives and museums, with the expectation that the pleasure of discovery will easily outweigh the profitability of mere historical rumination.
The Computer Society and its predecessor committees in the AIEE and the IRE have been important players in the development of the field, providing a means for the free interchange of ideas, for the mutual enhancement of concepts, and for the development of a profession that provides leaders in our society, the nation, and the world. The National Academy of Sciences has recognized the computer and the programming language FORTRAN as among the inventions of the century, and our pioneers rank with those in other well-established fields in their recognition in the annual Kyoto Prize competition and in national awards throughout the world.