Talk:History of computing


Wiki Education Foundation-supported course assignment

  This article is or was the subject of a Wiki Education Foundation-supported course assignment. Further details are available on the course page. Student editor(s): EFohsta-Lynch.

Above undated message substituted from Template:Dashboard.wikiedu.org assignment by PrimeBOT (talk) 23:36, 16 January 2022 (UTC)

Name of this article

Removed redirect on this talk page to related talk page Talk:History_of_computing_hardware. Tempshill 00:35, 4 Dec 2003 (UTC)

It sounds like the intention of this page is "History of computing methods" and I recommend it be renamed accordingly. "History of computing" will sound to 99% of our user base like it is the "History of computers" article. Tempshill 00:35, 4 Dec 2003 (UTC)

Second that; like it or not, one has to acknowledge the reception of a term in the real world. "H. of C. methods", or perhaps "H. of calculation" would be a much better title, I think. --Wernher 05:46, 17 Dec 2003 (UTC)
So 99% of our user base are laboring under an error. We should correct it. Michael Hardy 21:18, 17 Dec 2003 (UTC)
No offence, but I was anticipating this argument. It is of course a valid one ("ten million flies can't be wrong, etc" is always a dangerous path), and ideally an encyclopedia should strive to educate its readers, but I think we should rather be discussing the best article title (and, BTW, not all computer scientists/engineers are hopeless introverts with no extra-curricular interests and a general historical overview :-] ). --Wernher 21:35, 17 Dec 2003 (UTC)
Guys, the phrase "History of Computing" is generally used as a broader alternative to "the history of computer hardware" to include the history of CS, software, the computer business, computer use in science, administration, engineering, etc. Computing is what you do with a computer. So that includes the history of mathematical applications, but a whole lot of other stuff as well. For example, IEEE Annals of the History of Computing has been published for more than 25 years now. As someone said above, you're really trying to swim against the tide here. Blame the people who continued to call the things "computers" back in the 1950s even after they started to do more than just compute with them. There should be a page of the history of automatic mathematical calculation, but to avoid confusion call that one "History of Computation" or "History of Scientific Computation" or "History of Scientific and Technical Computation."
I agree that this page should be moved to "History of Computation" and "history of computing" should redirect to "History of computer hardware". I think this because I just got to this page wanting an article on the history of the use of the Computer. Tomgreeny 00:16, 26 September 2006 (UTC)

I find that I know next-to-nothing about this topic, and apparently neither do those who originated the history of computing hardware. Otherwise, I would have added a lot more information here. When were the algorithms taught in grammar school for addition and multiplication invented, and how was it done before that time? Archimedes and Euclid should both appear here -- the latter for Euclid's algorithm; the former for The Sand Reckoner. Probably Briggs should be mentioned here. And maybe Fibonacci for introducing Arabic numerals into Europe. Michael Hardy 21:44, 17 Dec 2003 (UTC)
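Euclid's algorithm, mentioned above, is simple enough to sketch in a few lines (a modern Python illustration, of course, not Euclid's own geometric formulation):

```python
def gcd(a, b):
    # Euclid's algorithm: the greatest common divisor is unchanged
    # when the larger number is replaced by its remainder on
    # division by the smaller number.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # → 21
```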

... and it should mention the existence and nature of human computers and their work in the time when it was taken for granted that the word computer referred to a person who computes. Michael Hardy 21:49, 17 Dec 2003 (UTC)

The following book is superb on this subject matter (although slightly Franco-centric regarding inventions):
  • Georges Ifrah (1999). The Universal History of Numbers : From Prehistory to the Invention of the Computer. John Wiley & Sons. ISBN 0471375683.
--Wernher

Add navigation box

Hi, I added a navigation box to several articles in this series (similar to those used in several other histories, including the History of the United States articles) to help tie the articles together. I hope this will help readers to see the thread between the history of computing overall in the sense that Michael Hardy means and the more narrow and common use senses. I made this article the top-level. I hope this is useful. Perhaps some of the computing timelines could also be part of the navbox, but I thought I'd give people a chance to discuss what might be appropriate before hauling in a whole slew of new articles into the navbox. --Lexor 11:39, 22 Dec 2003 (UTC)

Writing this article

So... who's interested in actually writing this article? Fredrik (talk) 20:57, 9 Jun 2004 (UTC)

I would be, if I knew anything. It was actually this article that caused me to realize that I don't. I know something of the history of mathematics, and I know that Euclid wrote about Euclid's algorithm, but I have no idea, e.g., how the ancient Greeks did multiplication, or when and how long division was introduced, etc. Michael Hardy 21:04, 9 Jun 2004 (UTC)
Unless someone else does it before then, I may make it my project for the weekend. I don't know a lot about the subject, but I can go to the library and study :-). Then someone can correct my mistakes. David Remahl 21:02, 9 Jun 2004 (UTC)

Possible directions for enhancement

Michael Hardy's program for this article could be realized in several directions, but it seems as if the directions are actually articles in themselves:

  • the arithmetic operations, once we have a positional number system, could characterize multiplication of M by N as repeated addition of M, N times; likewise, division of M by N as repeated subtraction of a trial subtrahend from M and its succeeding minuends, N times, until we arrive at the Egyptian fraction, bringing us to continued fractions etc., in the cornucopia that is mathematics, but that would serve to cover up the relationship of the arithmetic operations to computing with minutiae.
  • the algorithms, such as Newton's method, the Runge-Kutta methods, and dozens of similar specialized methods for each kind of mathematical structure, and again, we have a cornucopia to deal with.
    • the fact that there are usually limitations to the algorithms, meaning that they cannot be applied blindly, requiring judgement and the eventual resort to heuristics
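To make the first two bullets concrete, here is a minimal Python sketch of multiplication as repeated addition, division as repeated subtraction, and one of the named algorithms (Newton's method, applied here to square roots); illustrative modern notation only, not a claim about historical practice:

```python
def multiply(m, n):
    # M * N characterized as repeated addition of M, N times.
    total = 0
    for _ in range(n):
        total += m
    return total

def divide(m, n):
    # M / N by repeated subtraction: count how many times N can be
    # taken away from M; what remains is the remainder.
    quotient, remainder = 0, m
    while remainder >= n:
        remainder -= n
        quotient += 1
    return quotient, remainder

def newton_sqrt(a):
    # Newton's method for f(x) = x**2 - a:
    #   x_{k+1} = (x_k + a / x_k) / 2
    x = a if a >= 1 else 1.0
    for _ in range(60):  # quadratic convergence; 60 steps is ample
        x = (x + a / x) / 2
    return x
```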

So, does anyone have any suggestions? Ancheta Wis 00:10, 2 Aug 2004 (UTC)

I think this would be a good project for Wikipedia:WikiProject_Computer_science, which could use more members and advisors. Also Wikipedia:WikiProject_Mathematics. There is a lot of computer and math stuff that could use better formatting and referencing. I'm just a learner in these topics, but I think you need a sense of history to gain some perspective. CQ 21:10, 24 August 2005 (UTC)

Is anyone going to add how people managed to calculate trigonometric or logarithmic functions using analog/digital devices? That could be a good historical fact to add. Rtthb (talk) 17:47, 16 February 2018 (UTC)
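One concrete answer for the digital side: many calculators computed trigonometric functions with shift-and-add methods such as CORDIC, which needs only a small precomputed arctangent table, additions, and halvings. A floating-point sketch (real hardware used fixed-point integers; the function and parameter names here are illustrative):

```python
import math

def cordic_sin_cos(theta, iterations=40):
    # Rotation-mode CORDIC: rotate the vector (1, 0) toward angle
    # theta using only halvings (binary shifts) and additions.
    atan_table = [math.atan(2.0 ** -i) for i in range(iterations)]
    # Each un-normalized micro-rotation stretches the vector by
    # sqrt(1 + 2**-2i); the accumulated gain is divided out at the end.
    gain = 1.0
    for i in range(iterations):
        gain *= math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = 1.0, 0.0, theta
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0  # rotate toward the residual angle
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * atan_table[i]
    return y / gain, x / gain  # (sin(theta), cos(theta))
```

The iteration converges for angles up to about ±1.74 radians (the sum of the table's arctangents); larger arguments would first be range-reduced using symmetry.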

Some "definition"

The lead section starts the definition (this is an Encyclopedia, remember, where things are defined) with these words:

"The history of computing is longer than the history of computing hardware and modern computing technology..."

Has anyone ever read a more inept definition of a subject? --SciCorrector (talk) 22:53, 10 November 2009 (UTC)

I would be more concerned by the fact that it says that this article is longer than [The history of computing hardware], when the latter is actually five times longer.

Thaum1el (talk) 09:17, 8 January 2011 (UTC)

list of books

I started the section "books for further reading" and I duplicated the list in history of computing hardware. (Someone has kindly added to the list.) There are more books that I'd like to add, but I don't want us to have to maintain the list in more than one place. (There are probably other articles that could use this list too.) So would it be a good idea to make an article with just the list of books that can be linked from the other places? Or is there a better way to do it and eliminate the duplication? Bubba73 00:11, 15 Jun 2005 (UTC)

Mention of people

In the history of computing, there are key people such as John von Neumann, Grace Murray Hopper, and Donald Knuth, just to mention a few. Could we have a brief section in this article mentioning and linking to bios of people like them? MathStatWoman 23:10, 16 January 2006 (UTC)

Even more importantly, this article lacks mention of human computers and the rooms full of them each assigned parts of large computational tasks. Astronomy, geodesy, cartography, and navigation are, I'm sure, only a few of the areas that relied heavily on such activities. At the very least, there should be a link to the Wikipedia article "Human Computer." — Preceding unsigned comment added by Alpha cassiopeia (talkcontribs) 17:44, 2 August 2013 (UTC)

Excluding hardware is absurd

Excluding computing hardware is absurd, simply because computing hardware can be used for...computing. I'm fine with a general page that lists both human and hardware computing history (keeping the additional page focused on hardware), but this page doesn't break out any hardware history, and even the human history is very limited. Superm401 - Talk 22:21, 17 September 2007 (UTC)

Interwiki links

Most of the inter-wiki links to other languages point to articles about computer science, but this article is not about computer science. I'm removing the ones I can understand, but not all... 152.81.12.56 07:36, 12 November 2007 (UTC)

No wonder. Try taking some English Language lessons. You'll understand a lot more after that. --SciCorrector (talk) 22:56, 10 November 2009 (UTC)

Still unhappy with this article

As noted above, the title is misleading. Most sources consider the term "history of computing" to mean "the history of computer software and hardware." The misleading title has caused several serious problems:

  1. It appears at the top of the nav box for the history of computer software and hardware, suggesting that it is the main article on this subject. (It isn't.)
  2. The category "history of computing" uses the term in the standard way, but it uses this article as the lead article. This article doesn't even belong in that category, but rather in the history of mathematics.
  3. It makes external links to a number of sites which use the term in the standard way, rather than the quirky way this article does.
  4. Inter-language links (as noted above) are often wrong.
  5. Links into this article are often intended for history of computer hardware or history of computer science.

I have no problem with an article that discusses the history of mathematical algorithms used for "computing", but all the above problems need to be fixed. I will fix each of these if there are no objections. I notice that the discussion of the problem seems to have petered out about three years ago. ---- CharlesGillingham 18:05, 12 November 2007 (UTC)

  Done ---- CharlesGillingham (talk) 02:42, 7 December 2007 (UTC)

Computer programming in the punch card era

I have found the above article in the list of those that need wikifying, and I wondered whether people who edit here would have a view on whether it needs to be merged, renamed, rewritten or otherwise altered, or perhaps just brought into your article series. Itsmejudith (talk) 21:58, 3 March 2009 (UTC)

Concrete devices?

Is this article really suggesting computing started out with concrete?! This is either poor use of language, or the materials industry is unaware of the ancient use of concrete! — Preceding unsigned comment added by 86.140.26.42 (talk) 08:23, 9 January 2013 (UTC)

See Abstract object --TedColes (talk) 09:07, 9 January 2013 (UTC)

HTML text in the header

All history of computing pages are showing the following code in the header: <div class="noprint"> </div> <noinclude> </noinclude>

This is being caused by the command {{History of computing}}, but I couldn't figure out how to solve it. DSchiavini (talk) 08:49, 18 August 2014 (UTC)

The "command" in question is actually a template; in particular, it's Template:History of computing, which puts the box in the sidebar of pages that use it.
This edit, with the edit comment "Hide sidebar if printing as a book.", wrapped the template in an instance of the {{Hide in print}} template, i.e. Template:Hide in print. That editor's next edit replaced that template with an explicit inclusion of <div class="noprint"> in the template, perhaps because the {{Hide in print}} template, as its documentation page says, doesn't work. Guy Harris (talk) 17:00, 18 August 2014 (UTC)

External links modified

Hello fellow Wikipedians,

I have just added archive links to one external link on History of computing. Please take a moment to review my edit. If necessary, add {{cbignore}} after the link to keep me from modifying it. Alternatively, you can add {{nobots|deny=InternetArchiveBot}} to keep me off the page altogether. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true to let others know.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers. —cyberbot II (Talk to my owner: Online) 19:31, 18 October 2015 (UTC)


Looks like the IEEE computer history timeline https://www.computer.org/cms/Publications/timeline.pdf needs to be archived as well (or updated if it's moved - I had a quick look and couldn't see a new location). Cronky (talk) 10:22, 11 October 2023 (UTC)

Done. Guy Harris (talk) 09:37, 31 October 2023 (UTC)

Computer Hardware Added

I have added hardware history to the Digital Electronic Computers section, with relevant history from the 1970s-1990s. EFohsta-Lynch (talk) 06:16, 11 September 2017 (UTC)

What are the forms of computing and computers other than the "actual computing" and "actual computers" referred to in

Late 1980s and beginning in the early 1990s we see more advances with actual computers to aid with actual computing.

What are some examples of "non-actual computing" and "non-actual computers"? Guy Harris (talk) 06:41, 11 September 2017 (UTC)

External links modified

Hello fellow Wikipedians,

I have just modified 7 external links on History of computing. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

An editor has reviewed this edit and fixed any errors that were found.

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers. —InternetArchiveBot (Report bug) 02:07, 5 November 2017 (UTC)

Women in early computing

edit

The claim that "computing was a female-dominated field" is sourced to Hicks (2017). Supportive passages begin on page 1 ("In the 1940s, computer operation and programming was viewed as women's work") and continue to the end, since this is the thesis of the book. Detailed statistics on the growth of women's clerical work including computer work can be found on pages 8-9. The prevalence of women's labor even in computer manufacturing, and even before World War 2, is discussed on page 21. An inline citation to this work was already clearly present; this claim is continuous with the subsequent claim that the condition continued until the 1960s. Citations should be provided for contrary views, not vague claims about probabilities. — Preceding unsigned comment added by 67.189.38.90 (talk) 20:20, 21 March 2021 (UTC)

Thanks for providing a more precise reference. It still doesn't support the very broad claim that "computing was historically a female-dominated field". That's a claim about the field of computing as a whole, whose history begins several thousand years before the 1940s. We could of course add the sourced claim that in the 1940s, computer operation and programming was viewed as women's work, or something about computer manufacturing, as you mentioned. — Chrisahn (talk) 20:29, 21 March 2021 (UTC)
In this edit summary, you said "this claim appears explicitly as early as page 1 of the work already cited inline". That's false. Page 1 of the cited book starts with "In 1959, a computer operator embarked on an extremely hectic year ..." and ends with "Usually, more machinery and...". Nowhere between these sentences is there anything that would support the very broad claim "computing was historically a female-dominated field". — Chrisahn (talk) 20:44, 21 March 2021 (UTC)
What do you make of "as computing gained prominence, men displaced the thousands of women who had been pioneers in a feminized field of endeavor" (same page); "When the gender makeup of a field flips ... in early computing" (same page); "computing's masculinization" (p.2); "women lost out on computing" (p.2)? It takes a truly stubborn reading to say this claim is not present from the beginning of the book. 67.189.38.90 (talk) 21:20, 21 March 2021 (UTC)
It takes a truly creative reading to believe expressions like "feminized field of endeavor" or somewhat vague statements about the "gender makeup of a field" support an explicit and broad claim like "computing was historically a female-dominated field". — Chrisahn (talk) 21:32, 21 March 2021 (UTC)
If it helps you see how these phrases fit together, the back cover of the book does some synthesis: "As computing experienced a gender flip, becoming male-identified in the 1960s and 1970s..." Again, this is the thesis of the book, not a subtle interpretation of ambiguous phrases. 67.189.38.90 (talk) 21:40, 21 March 2021 (UTC)
The quote implies that computing was "female-identified" before the 1960s, but it doesn't make that claim explicitly. If that's the thesis of the book, it should be easy to find a clear, explicit statement to that effect, and adding it to the article would be fine. (Of course, "female-identified" is quite different from "female-dominated".) — Chrisahn (talk) 23:14, 21 March 2021 (UTC)
"Soon, women became synonymous with office machine operators" (9) "Their alignment with machine work in offices persisted through waves of equipment upgrades and eventually through the changeover from electromechanical to electronic systems" (9) "Office managers did not see anything unusual about the association of women with increasingly complex machines, because the association of women with automation was nearly a century old by this point." (9) "The 'total war' style of combat included conscription of women's labor, resulting in a wartime intelligence establishment that was overwhelmingly female" (11) [eg, the Colossus operators were ten-to-one women] "Most computer work in government was still done by women" (15) "That the workers in this field were disproportionately white is no more a coincidence than the fact that they were overwhelmingly women" (16) -- all this from the introduction alone. For sources, Hicks cites (p. 16, not getting into the bibliography) "Government records and the records of the nationalized industries ... complemented by the records of British computing companies, staff association and labor union records, and media from both trade and popular publications" 67.189.38.90 (talk) 23:34, 21 March 2021 (UTC)
Thanks for the quotes! We should try to condense these and other quotes into something we can add to the article. I'm a bit skeptical about the claim "the workers in this field [computing] ... were overwhelmingly women", since it's somewhat at odds with the other sources I mentioned below, which say there were about 30 to 50 percent female programmers in the US in the 1950s. Maybe there was a significant difference between the US and the UK? Or maybe that claim only applies to the 1940s? Does Programmed Inequality provide some numbers regarding these issues? — Chrisahn (talk) 00:02, 22 March 2021 (UTC)
Yes, Hicks focuses on the UK case. I skimmed through for some claims that can be pinned to dates (skipping over the well documented ENIAC and Colossus operators): In 1948 the Treasury creates the machine operator class of jobs in the Civil Service, which is a women's class - "The duties of the class will cover all work on calculating, punch card, and accounting machines and will range from the simplest to the most complex work" (p. 70) A 1962 Trade Union Congress internal memo said "The use of computers and other electronic devices has increased in large engineering companies and nationalized industries since 1958 ... Experience has shown that most work fed into these machines is prepared by female labor." (p.86) In 1955 the Civil Service officially instituted equal pay, but only in mixed-gender pay grades, hence not for the women-only machine operator class. (p. 93) "In the flux of the mid-1960s, actual and perceived computer labor shortages first helped women stay in the field, then eventually removed them from it, by inflating the status and pay for computer work out of the realm of 'women's work'" (p. 102) A quote from Office Methods and Machines, 1967: "Most 'operative' jobs are performed by female staff" (p. 112) "Nearly all photographs used to sell and showcase computers in the early 1960s pictured a conservatively dressed, plain-looking female workforce" (p. 113) "India's punch operator labor force was not as feminized as Britain's. Sometimes, punching jobs were reserved for men only ... The punch room at Air India in 1961 contained both men and women punchers in roughly equal proportions." (p.120) Paraphrasing unless in quote marks. — Preceding unsigned comment added by 67.189.38.90 (talk) 02:03, 22 March 2021 (UTC)
Thanks for your contributions in this edit. Very clear and balanced, and a good synopsis of the various quotes and sources we discussed here. Great job! Thanks! — Chrisahn (talk) 14:56, 22 March 2021 (UTC)
There is a sense in which it is true that computing is an ancient field, and the article partly adopts this framing. However, this section of the article adopts an equally if not more common focus on modern computing. See how "early decades of computing" is used in the block quotation following this paragraph - nothing about this is particularly ambiguous. That the claim including "until the 1960s" has not also been removed shows what a ridiculous and selective standard is being applied. — Preceding unsigned comment added by 67.189.38.90 (talk) 20:39, 21 March 2021 (UTC)
I agree that the claim including "until the 1960s" doesn't make sense without more details. I deleted it. — Chrisahn (talk) 20:48, 21 March 2021 (UTC)
Of course computing is an ancient field. See e.g. Computing#History: "The history of computing is longer than the history of computing hardware and modern computing technology and includes the history of methods intended for pen and paper or for chalk and slate, with or without the aid of tables. ... The earliest known tool for use in computation was the abacus, and it was thought to have been invented in Babylon circa 2400 BC." History of computing hardware#Early devices: "Devices have been used to aid computation for thousands of years..." Timeline of computing hardware before 1950#Prehistory–antiquity starts at c. 19,000 BC. — Chrisahn (talk) 20:52, 21 March 2021 (UTC)

Yes, people have performed computations since ancient times, and used various devices for the purpose. Nevertheless it is not ambiguous when "computing" is used to refer to modern computing. For instance, the founding statement of the Association for Computing Machinery restricted its scope to "the new machinery for computing, reasoning, and other handling of information," (emphasis mine) but it was not necessary to specify this in the name of the organization. Since the single word "modern" provides adequate context for both claims, I do not see why they shouldn't be restored with this small addition. 67.189.38.90 (talk) 21:21, 21 March 2021 (UTC)

Saying something like "modern computing" or "computing in the mid-20th century" instead of "computing" would be an improvement. But a broad claim like "computing in the mid-20th century was a female-dominated field" would still be unsupported by the given sources, and it would be wrong. (I guess if we asked historians of these areas the question "whose ideas dominated computing in the mid-20th century", they'd most often mention Alan Turing and John von Neumann.) We'd have to be more specific. — Chrisahn (talk) 21:29, 21 March 2021 (UTC)
It is, like the preceding and following (now the surviving) sentences, a claim about proportional representation in the workforce, not about leading intellectuals. It is supported from the first page of the already given source and in several specific contexts listed above among many others. Speculation about hypothetical experts is not a source. 67.189.38.90 (talk) 21:36, 21 March 2021 (UTC)
As far as I can tell, Programmed Inequality doesn't claim that the workforce in computing-related areas was "dominated" by women, and the book doesn't seem to provide data about gender proportions. I looked for other sources and found these: "In the 1950s, women comprised between 30 and 50 percent of programmers." [1] "In 1960 [...] the proportion of women in computing and mathematical professions (which are grouped together in federal government data) was 27 percent. ... Raytheon, where the programmer work force 'was about 50 percent men and 50 percent women' ..." [2] — Chrisahn (talk) 23:05, 21 March 2021 (UTC)
Same Atlantic article: "During the 1940s and 50s, it was primarily women, not men, who were developing code for the nation’s first computers, and the accompanying pay and prestige were both relatively low." Following through to the source on 30-50% (Ensmenger, The Computer Boys Take Over, which seems to articulate a similar thesis to Hicks but in the US private sector) we find "In the first textbooks on computing published in the United States, for example, John von Neumann and Herman Goldstine ... distinguished between the headwork of the (male) scientist or "planner," and the handiwork of the (largely female) "coder." (15) "The first computer programmers were not scientists or mathematicians; they were low-status, female clerical workers and desktop calculator operators" (32) "early software workers were gendered female" (38). The 30-50% figures come from p.237, in a section titled "Where did all the women go?," and are in a paragraph describing the decline of women's participation in the 60s, continuing to 23% in the 70s. 67.189.38.90 (talk) 00:00, 22 March 2021 (UTC)
From the NYT article, the source on the Raytheon stats adds “And it really amazed me that these men were programmers, because I thought it was women’s work!” 67.189.38.90 (talk) 00:49, 22 March 2021 (UTC)
A bit more anecdotal evidence: Margaret Hamilton says: "When I began as a programmer in 1959, and continuing until now, in every software organization and in every software project I have been involved in, there were always many more men than women programmers. There were many more men than women in our profession, and this is still the case. Regarding my own experiences, women were always in the minority and men were always in the majority." [3] — Chrisahn (talk) 23:26, 21 March 2021 (UTC)
"In the 1940s, computer operation and programming was viewed as women's work". Presumably this refers to the late 1940s, as the first programmable computers, in the modern sense, showed up in the late 1940's and early 1950's. I'm not sure what Prof. Abbate was referring to when she said that "[Women] made up the majority of the first computer programmers during World War II" - even ENIAC (the first programmers listed by its article were all women) was "first put to work for practical purposes on December 10, 1945", after Japan's surrender. Either she's referring to the setup of machines prior to ENIAC, such as the Harvard Mark I, or she's referring to work done after WW2. Guy Harris (talk) 22:16, 21 March 2021 (UTC)
The first claim is from Hicks, not Abbate; Hicks (2017) describes types of electromechanical machines in use immediately before the war, and the genders of their operators, such as on p. 21. These devices were programmed using punch cards. See tabulating machine for more information, including historical classification of these devices as computers. Hicks also describes the women operators of Colossus computer machines (digital, electronic, and programmable) which were used in the first half of the decade, before ENIAC, and may be what Abbate refers to. 67.189.38.90 (talk) 22:28, 21 March 2021 (UTC)
So "programming" in a broader sense, not just stored-program computers. The tabulating machines generally used punch cards for data but used plugboards for programming (which I did a small amount of at my high school); the skill of doing that may be closer to that of high-level hardware design (routing data from component to component, routing control pulses to trigger operations) than to programming a stored-program computer. Tabulating machine says "Many applications using unit record tabulators were migrated to computers such as the IBM 1401.", so it doesn't classify them as "computers" in the modern sense, but they're computing machines capable of some amount of decision making and changing the sequence of operations - and this article is "history of computing", not "history of stored-program computers", so they're relevant. Abbate's writing about Colossus and ENIAC; ENIAC was also originally partially plugboard-programmed (along with function tables made up from an array of switches). Guy Harris (talk) 01:12, 22 March 2021 (UTC)
Yep. What I meant about them being considered computers was in the language of the day, eg, the Computing-Tabulating-Recording Company. Still, Hicks (p.67) describes some larger machines with card-based programmability on top of data entry, for limited instructions on sorting and data manipulation. Endnotes sadly don't specify which machines this refers to. 67.189.38.90 (talk) 02:24, 22 March 2021 (UTC)
Unfortunately, I have the Apple Books versions of both Hicks and Abbate, so the page numbers are different. Could you give me a quote from what Hicks said on page 67 of your edition, or indicate what subsection it's in? I looked for "card" and "sorting", but the main thing I see is in the "Computer Labor before "Computers"" subsection - "The process of programming the electromechanical computers in punched card installations involved plugging up a board that either was attached to the machine or could be swapped in and out, allowing operators to quickly run programs they had set up on other boards."
BTW, are there any studies on women in computing in countries whose names don't begin with the word "United"? :-) Women in computing seems rather US/UK-centric. Guy Harris (talk) 06:00, 22 March 2021 (UTC)

With regards to the timeline of computing, why is Zuse's Z1 mischaracterized?

There seems to be an unnecessary amount of stress being placed on the development of purely electronic vacuum-tube computers. The Z1, while motor-driven and unreliable after extended use, featured programmability via punched film, and this was in 1936. And it could multiply and divide, which also isn't mentioned in the article, so there's that.[1]

The article seems to make little distinction between programmability via a labor-intensive process of manually setting the switches and plugs needed to run a program, and simply reading stored instructions from a binary punched medium.

Zuse's Z1 was way ahead in terms of several important features, and this article really doesn't want to talk about it.

With regards to the two other early modern computers which were ostensibly developed during the late 1930s: I am really struggling to find anything about Arthur Halsey Dickinson's 1939 machine (IBM) and Joseph Desch's 1939 NCR3566 (NCR). Does anyone here know what these computers even looked like, or have images they can host? I sure don't.

Even just a working citation to anything about either of these inventions and their creators would help. This is a pretty big hole in history if no one is able to locate any publicly available schematics or information here.

I mean, provided these inventions actually do exist, and weren't simply gambits to secure patent rights on possible future inventions, and were later mischaracterized in the compilation of history. It would be excellent if my hunch turns out to be nothing more than a product of simple ignorance, which hopefully someone will be able to remedy.

My biggest concern here is that neither Arthur Dickinson nor Joseph Desch has a Wikipedia article, which is a little strange, considering the language of the article on the history of computing appears to suggest that these two would logically share the honor of being the sire figure to modern-day computing, which may not be an accurate characterization. — Preceding unsigned comment added by 2601:14a:c100:d:595c:b157:e389:5d84 (talk) 08:31, 10 May 2022 (UTC)

References

  1. ^ Rojas, Raul. "The Z1: Architecture and Algorithms of Konrad Zuse's First Computer" (PDF). Freie Universität Berlin. Retrieved 10 May 2022.

History of computation

Might this be an alternative title for this article? TedColes (talk) 18:00, 19 June 2023 (UTC)

Inaccuracies

When people click on "History of computing before the 1960s", it only shows events up to 1950. This is an inaccuracy which made me lose about a minute of research time trying to find the history of computing from 1951 to 1960; I believe this should be fixed. The Slacking Gecko (talk) 15:59, 21 November 2024 (UTC)