History of computing/Talk


I doubt very much now that the Harvard Mark I was fully programmable. Later models could switch conditionally from one paper roll to another, but since I don't believe it could rewind the paper rolls, no actual loops were possible and it wasn't Turing complete. But I'm not sure. Does anybody know details about the Mark I?

Oh, and I just read that the Manchester Mark I was actually the first functional von Neumann machine, even before EDVAC -- but of course based on EDVAC's ideas. --AxelBoldt


I believe you are correct about the IBM-Harvard Mark I. This machine, by the way, was not built at or by Harvard. It was built for Harvard (and the U.S. Navy) by IBM.

My understanding is that the first operational stored program computer was the "Manchester Baby Mark I", a test machine for the Williams-tube storage technology, not the Manchester Mark I itself. The EDSAC at Cambridge appears to have preceded the Manchester Mark I as the first "practical" stored program computer in operation.


Axel: An electromechanical computer necessarily uses some electronics. Thus "electro". But I understand what you mean about the electronics/electromechanical distinction.

Aiken directed the construction of the ASCC by IBM engineers at the IBM Endicott labs. Construction was completed in 1943. It was moved to Harvard, and operation began May 1944. [1]

As stated, the EDVAC was never completed--so all EDVAC-based computers were "before EDVAC". The "Baby" was the first machine based on the EDVAC design to get a program running. --The Cunctator

I gotta say, the Wiki method really works--this entry has gotten amazingly better in a very short period of time. It's still a little too discursive (some of the specificity would be better in stand-alone entries), but it's highly informative and readable. --The Cunctator


Not to disagree, but there's still a whole lot missing. No mention of Whirlwind, SAGE, PLATO, to give just a few examples.

Is that a disagreement or not? SAGE is mentioned in the history of networking... --The Cunctator

It seems like the end of the article is the original timeline that was visible at the top of this page, and it reads very much like a timeline. Wouldn't more of an overview and synthesis be appropriate, considering we have the (very good IMHO) other timeline?


I agree, particularly the latter part of the article has too many dates, names and details obscuring the general flow of progress. --AxelBoldt

I don't want to be argumentative, but I thought the new article didn't tell much of anything before WWII or after 1970, let alone flow of progress. The flight control system of the F14, while interesting, was hardly a landmark computer.

Yes, there was a fair amount where I just went in and pasted missing stuff from the old page. However, I feel it is more important to have date-filled placeholders than nothing at all. Now that some base data is there, anyone can go in and rewrite/rearrange it. By all means, feel free to edit as you see appropriate. The power of Wiki :-) --Alan Millar


Names, dates, and details are good things, but need to be pushed down into more detailed articles on more specific topics. At the same time, an overview/summary/synthesis needs to be presented at this level. But my guess is it's easier to do this bottom-up rather than top-down. In other words, collect all the detailed information first, then refactor into appropriate levels of detail.

Also, should this article cover software as well as hardware? -HWR

Of course, hardware w/o software is scrap metal. The question is whether it's tangible enough to produce records. --Yooden


Anyone can refactor (a basic design feature of Wiki), but only if there is some information to refactor, so I think the bottom-up approach is necessary.

But all the information is already on the Computing timeline page, so why repeat it here? I think this article should have a bird's eye view on Computing history, just outlining the developments, and not listing anecdotes such as ads bought by certain companies at certain sports games. --AxelBoldt

As to Swiss clocks: the essence of computing is not the addition and subtraction of numbers, although it grew out of it and is a necessary part of it. The essence of computing is the execution of a sequence of instructions, and in that respect modern computers have as much in common with Swiss clocks as the abacus. And no, I'm not recommending removing the reference to the abacus :-) --Alan Millar

Swiss clocks neither process information nor can be programmed. They are just fancy mechanical devices, like all mechanical clocks. I don't see any relation to the history of computing, except maybe that some early mechanical calculators used mechanisms similar to those of mechanical clocks (why Swiss?). Also, why are they mentioned in the paragraph about programmability? --AxelBoldt


What about music boxes? They're programmed to play tunes. -HWR

They have a single sequence, as do player pianos, and player pianos can even use a different paper roll to play a different tune. In that respect, the music box is mechanically a predecessor of the Jacquard loom. The Swiss clocks had multiple sequences of actions, where a main cog would activate other cogs to order different actions. The first GOSUB? :-) --Alan Millar

Actually, there are music boxes that play tunes from interchangeable discs. I don't know the chronology of this however.

BTW, is this article restricted to the history of DIGITAL computers? Analog computers don't generally execute sequences of instructions. -HWR


"IBM decided to enter the PC market ..., with the IBM XT" is not correct -- the XT was their second machine, with the hard drive.

That's correct--I'll change it. The first one was simply called the "IBM PC". Some mention of Compaq and the beginnings of the clone market in that era seems appropriate too. --LDC


I'm afraid this entry is getting too timeline-y...but I see that others are aware of that. Looks like we need to start thinking about some more subentries...anyone have any suggestions? --The Cunctator
Unfortunately the timeline here has many inaccuracies and omissions of historical importance: 1965: IBM System 360 (first OS); 1968 first mouse/window system demo; 1973: CP/M first micro OS; 1969 Intel 4004; 1977 Commodore Pet & TRS 80; 1978 Atari 400/800; 1979 Motorola 68000 32-bit CPU (w. 16-bit data and 24-bit address bus); 1981 Commodore Vic20 & IBM PC & Xerox Star (w. GUI/Mouse?/Ethernet?...); 1982 Commodore 64 with 64k RAM $600 & Timex Sinclair 2K RAM $99; 1983 1 million Commodore Vic20s and 1 million Apple IIs sold; 1985 Commodore Amiga with multitasking/Color? GUI/accelerated video/stereo sound/3.5" floppy $1200; 1988 7 million Commodore 64 and 128 computers sold.... --Jonathan--

Feel free to enter whatever you think is missing to Computing timeline, not to History of computing. --AxelBoldt


Ack! It's getting insanely more timeline-y! I'm thinking of paring. Please, everyone, notice Computing timeline. History of computing isn't supposed to list every computer, but to discuss the intellectual development of the engineering/science of computing. --The Cunctator
