Talk:Algorithm/Archive 1
This is an archive of past discussions about Algorithm. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Basic Definition of the Algorithm
I removed: "Some people restrict the definition of algorithm to procedures that eventually finish, while others also include procedures that run forever without stopping." Because, A) it's opinion. B) it's moronic.
- It is not opinion, it is fact. Some people do indeed restrict the term in this way. AxelBoldt 16:55 Jan 4, 2003 (UTC)
Haven't you ever heard of a "non-deterministic algorithm"? This has been used since the 1950s.
Imagine this algorithm:
 while 1:
     a = random integer from 0 to 10
     if a < 5: quit
That's a set of repeatable steps. Yes, the results will be different each time, but it's still an algorithm. This is too simple to have a debate on.
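For concreteness, here is the disputed procedure as a minimal runnable Python sketch (the function name is mine; whether it deserves to be called an algorithm is exactly the point under debate):

 import random

 def procedure():
     """Carry out the disputed procedure: draw until a value below 5 appears."""
     while True:
         a = random.randint(0, 10)  # random integer from 0 to 10, inclusive
         if a < 5:
             return a               # "quit"

It halts with probability 1, but no fixed bound on the number of iterations can be given in advance, which is what the finiteness requirement is about.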
- Lots of people will disagree that this is an algorithm. You can't implement it on a Turing Machine, for example. AxelBoldt 17:02 Jan 4, 2003 (UTC)
I agree that this is moronic, but I'm afraid AxelBoldt has a point about many people using it this way. This distinction even makes it into a footnote in the not that technical book Darwin's Dangerous Idea (p. 52). --Ryguasu 05:29 Jan 12, 2003 (UTC)
- I just want to add that non-deterministic, applied to algorithms, does not mean non-terminating. Non-deterministic processes are ones that arrive at the set of choices, from recurring multiple choices, straightaway. That is, behavior is as if they make the right choice every time. This overcomes a problem with deterministic processes where a number of very bad choices made first can lead to expenditure of extreme effort, and for some methods there is no perfect first choice. (The famous P = NP question is about a comparison of certain kinds of deterministic procedures and non-deterministic procedures for the same problem.) --Orcmid 01:45, 12 Mar 2004 (UTC)
- Hmm, I also want to add that if the random integer procedure uses some source of input not given at the start (say some source of thermal noise), the procedure doesn't qualify as an algorithm either, in the usual strong sense. That is, algorithms of the kind that are the usual focus of attention are ones where all inputs are known in advance and someone could sit down and produce exactly the same result from the given initial input and the procedure. For a human procedure, you could probably ask the person to throw dice and other things, but there is some question whether this would qualify, even though it can certainly be carried out. This strong sense of algorithm is not meant to rule out other (computational) procedures, and many computational procedures realized with computers are not algorithmic. There's no shame. Donald Knuth gives some additional categories in Chapter 1.1 of The Art of Computer Programming, and that is probably a perfectly usable separation. --Orcmid 01:45, 12 Mar 2004 (UTC)
- The random number values can just as well be considered an input for the algorithm, can they not? If you know all the random values, you can always reproduce the same result. -- intgr 00:56, 3 July 2006 (UTC)
- I believe it is important to mention that some people use the term algorithm for nondeterministic algorithms, some (like me) use it for nonterminating procedures, and some use it for procedures in which the data used is not all determined in advance (such as cooking recipes). I believe it is important to clarify that different contexts of use exist in which the presence or absence on these restrictions makes a crucial difference to understanding. Rp 6 July 2005 18:48 (UTC)
- To put an end (hopefully) once and for all to this debate, in The Art of Computer Programming vol. 1, Donald Knuth gives a list of five properties of an algorithm:
- (1) Finiteness. "An algorithm must always terminate after a finite number of steps."
- (2) Definiteness. "Each step of the algorithm must be precisely defined..."
- (3) Input. "An algorithm has zero or more inputs..."
- (4) Output. "An algorithm has one or more outputs..."
- (5) Effectiveness. At first Knuth states, somewhat confusingly: An algorithm's "operations must all be sufficiently basic that they can in principle be done exactly and in a finite length of time by someone using a pencil and paper." He later goes on to clarify that the "effectiveness" principle means that the algorithm requires no cleverness on the part of its executor.
- I would like to draw attention in particular to the Finiteness axiom. This should bring the issue to a close and inspire a revision of the article. It will not, since this is Wikipedia. However, since my reference comes straight from the horse's mouth (Knuth), it certainly deserves some mention on the page. There are, after all, various schools of thought (including most mathematicians and computer scientists) which believe that an algorithm must terminate. Thanks, Silly rabbit 08:01, 12 November 2005 (UTC)
Tragedy of the commons
- Okay, I haven't gotten any responses yet. So I'm guessing that everyone has either dismissed me, or has their own idea of an algorithm differing from the standard. Anyway, I don't wish to rely on the Knuth books as a primary reference. Here is what the U.S. National Institute of Standards and Technology (NIST) offers as the definition of an algorithm: [1]. In that definition, however, they make reference to a computable [2] set of steps. To summarize, an algorithm is a set of operations that can be performed by a Turing machine in a finite number of steps. The input is allowed to be random, but contrary to a prior poster's idea of nondeterministic algorithms, these too must terminate. In fact, I will state it categorically and without any hesitation:
- THE DEFINITION OF AN ALGORITHM REQUIRES THAT IT CAN BE COMPLETED IN A FINITE NUMBER OF STEPS, REGARDLESS OF THE INPUT DATA.
It isn't usually necessary to prove this mathematically in applications, but it really, truly, honestly is the very definition of an algorithm.
- Afterthought: all true wikipedists should go home and weep. This article won an award despite incredible inaccuracies. This is the very definition of a tragedy of the commons. Silly rabbit 01:13, 19 November 2005 (UTC)
- Inaccuracies aside, it's actually badly written and confusing for whole paragraphs; a tragedy indeed. --Cs01ab 12:12, 5 June 2006 (UTC)
Introductory Material
The introductory paragraphs contain this statement:
- Correctly performing an algorithm will not solve a problem if the algorithm is flawed, or not appropriate to the problem. For example, performing the potato salad algorithm will fail if there are no potatoes present, even if all the motions of preparing the salad are performed as if the potatoes were there.
I would like to introduce some related machinery here. My draft goes on too long so it might belong in something else or a subordinate topic. At the same time, I propose that it is useful to honor this in what is written, even if it is left tacit.
- Note: I notice that I have improved the separation of algorithm from procedures, but I have still left some ambiguity between a procedure as carried out, and the description of the procedure in a script, recipe, or other writing down of the steps to follow. That matters because the expression of the procedure is to be finite, though the carrying out of the procedure is a different story, and an algorithm is expressed only when the procedure always completes in a finite number of steps for all valid (well-defined and finitely-expressed) inputs. It is important to sharpen these distinctions because computer programs are expressions of procedures, and it is the procedure that may or may not be algorithmic. (As I read that, I wonder if it is worth pursuing, unless one can show that careful use of such hair-splitting language clarifies things enough. It needs to be developed further to see, with honoring of the common usage of these terms as much as possible.) --Orcmid 21:35, 17 Mar 2004 (UTC)
First, I would start by saying
- Correctly performing the procedure will not solve a problem if the procedure is flawed or the algorithm implemented using the procedure is not appropriate to the problem.
I don't have a problem with the hypothetical potato-salad algorithm, though I can see adding
- For example, if following the procedure given as "the potato salad algorithm" produces macaroni salad, we know that the procedure is defective (as an algorithm for potato salad). And if the procedure is impossible to follow, or doesn't reach a point of conclusion, we would question whether that procedure delivers an algorithm for anything, let alone potato salad.
Next, it is useful to point out, especially for the young reader, that
- We often speak of a particular procedure as realizing an algorithm without concern for what it accomplishes under what conditions, although the recognition of the correctness of an algorithm involves establishing that it satisfies a well-defined purpose with specific input conditions. In too many cases, the purpose of the algorithm is omitted from its name, and some notable algorithms are simply named for the people who codified them.
- Some well-known algorithms are Euclid's algorithm (for the greatest common divisor of two numbers), the Sieve of Eratosthenes (for finding prime numbers), Newton's method (for solutions of certain equations), Dijkstra's algorithm (for finding least-cost routes in a network), Warshall's algorithm (for problems to which Dijkstra's algorithm is also applicable), and many more. Other algorithms have been given names that reflect the procedure as well as the purpose: Bubble Sort, Quick Sort, Radix Sort and Merge Sort; Binary search and Hash Lookup. [more Wikipedia links here would be very cool]
- Some algorithms are so well-known and familiar that we don't recognize them as such. The procedure printed on the back of bank statements for balancing a checkbook is algorithmic. When a discrepancy arises, it is because of an error following the procedure or in our or the bank's records. The importance of the procedure's use in a valid algorithm for check-book balancing is that any discrepancy is a reliable clue that the problem is in the data or our following of the procedure. If we also have doubts about the validity of the procedure and wonder whether there is a "bug," trouble-shooting a discrepancy is now much more problematic. We often rely on algorithms, including ones realized via computer software, that we do not know how to derive or confirm ourselves. Normally, we take as confirmation having experiences consistent with the procedures being correct.
- The most well-known algorithms are the ones we learn as children. The procedures for arithmetic -- addition, subtraction, (long) multiplication, long division, and so on -- are all algorithmic. These highly practical algorithms are valuable because of their simplicity. We can count on them, and they provide numerous ways of checking each other's work and finding errors of calculation.
- A pocket calculator has comparable, though different, procedures for achieving the same arithmetic results. Because we cannot observe the calculator's procedure, and we use the calculator for operations that we might not know how to duplicate ourselves, it is extremely important to be able to trust that the calculator implements reliable procedures for confirmed algorithms and that it delivers the functions that it is specified to calculate.
- We can see that there may be many algorithms that satisfy the requirements of a given problem (the two approaches to Towers of Hanoi would be excellent here). Determining that one has successfully arrived at an algorithm for the desired problem and also determining how good that algorithm is in terms of simplicity and efficiency has led to a body of study known as Analysis of Algorithms.
OK, that's enough for now. I don't know where I would introduce random procedures (for testing primality, for example) as heuristics that can be far more efficient than known algorithms, and how to relate to that. To go deeper, we might also need to broach the notion of what can be done by not requiring an algorithmic approach to a problem, and recognizing that interactive behavior of computers may be algorithmic in the procedures that are applied although the overall interactive process is not itself algorithmic. I think these topics can be acknowledged, but not this close to the front. --Orcmid 23:19, 13 Mar 2004 (UTC) updated by Orcmid 21:20, 17 Mar 2004 (UTC)
Formalized Algorithms
Well, I went and read through some of the early material. There is a "Thus ..." that ends with "Turing-Complete system" that is off the mark in two regards. First, there is nothing to make "Thus" follow here. Second, the reference to "Turing-Complete" is very tricky, appeals to knowledge that we don't want to presume, and might even be misleading for those of us who think we know what that means.
- Oh. I see the problem. In this paragraph, computer programs are equated with algorithms, rather than with procedures. It is important to remember that not every procedure is an algorithm. Furthermore, not every computer program is even a procedure in the sense required before the satisfaction of the additional conditions can be considered. The particular conditions that a procedure must satisfy to be the procedure of an algorithm are stated well enough in the first paragraph of the article. I don't think we should mess with that. One might say that an algorithm consists of a certain form of procedure plus some other things. In addition, not every computer program is a procedure of the certain form required by the definition of algorithm! (I will leave that as a puzzle for the reader for now. Hint: The procedure that a universal Turing machine follows is not an algorithm. Well, that's accurate but not such a great hint. Consider ordinary computers and the impact of input-output operations and using new inputs as part of manipulation of the procedure itself. That's a better hint.) --Orcmid 01:30, 15 Mar 2004 (UTC)
There is some work called for here. For starters, it is not the case that every procedure embodied in a Turing machine is an algorithm. That is not an equivalence. It is the case, for a certain class of problems, there are algorithms for them, and there are Turing-machine representations of those algorithms. But not every Turing machine illustrates, represents, or embodies an algorithm. Maybe that wasn't meant, but it is too easy to read that claim into the text as written. --Orcmid 02:56, 12 Mar 2004 (UTC)
Vigenère Cipher
Where does the new cryptographic algorithm that is supposed to be unbreakable, and was developed by a faculty member at M.I.T. I believe, belong here? It works by keys picked up from some random source, like a satellite, that are processed in the encryption but never stored anywhere. The inventor has proved that it is unbreakable with current computational power, and no one contests this apparently. Do you know what I am referring to? RoseParks
Yes, but it isn't really an encryption algorithm itself so much as a novel means of key exchange. Most modern cryptography techniques are like that: everyone has known about one-time pads for a long time, and that they are unbreakable. Modern techniques like RSA and Diffie-Hellman are just ways to safely exchange keys, which can then be used as one-time pads. RSA and DH key exchange protocols can themselves be broken, while Professor Rabin's method theoretically cannot be, but it is not yet practical for other reasons.
Thank you. You may remove or do what you want with this. Perhaps other people have read about it and could use your explanation on these pages...maybe a page itself, or somewhere on these pages. RoseParks
Most of the discussion above is pretty confused. Rabin's method isn't theoretically unbreakable, RSA and Diffie-Hellman are essentially never used to exchange one-time pad keys, RSA is not normally thought of as a key-exchange protocol, properly applied RSA may in fact be impossible to break, etc. But I don't have the time to write a section on cryptography just yet. --Kragen
Standardized Pseudocode
I think it would be a great idea if we came to some sort of standard on writing pseudocode. I've used a sort of hybrid procedural style for the algorithms in Linear search and Binary search but I'm wondering if there's a better standard out there. Can we borrow a style from a textbook like Intro to Algorithms? (is this copyrighted or is style a public domain thing like an idea?). Only a fairly small set of control structures is needed -- something to define functions, if-statements, loops, list access, mathematical operators and probably a few more. Comments? Mark Jeays
- I do think we should try to do standard pseudocode. I have no idea what that pseudocode should be however. CLR's style is usually clear, but sometimes I find it confusing (often because I do not parse the
A <- B
assignment syntax properly). I don't think things like pseudocode style are copyrightable... We should make a pseudocode article that defines whatever we use (in addition to explaining what pseudocode is in general), then link to that from every pseudocode example. In addition, we could include examples in other languages (see bubble sort for example) by putting them in subpages like bubble sort/C++. --BlckKnght
Robertson, L.A., has released a book which attempts to standardise pseudocode by specifying pseudocode keywords (READ, GET, DOWHILE, etc.): "Simple Program Design - A Step by Step Approach", Nelson Thompson Publishing, 2000. The pseudocode examples are close enough to most high-level languages so as to be easy to translate to any language.
We definitely need code samples for all the algorithms; I think there should be one language (pseudocode or otherwise) on the main page, and implementations in other languages on subpages, as BlckKnght suggested above.
I think some executable language would be far preferable to pseudocode, for the following reasons:
- it's possible to test executable implementations of algorithms to see if they're buggy
- executable languages tend to be more rigorous than pseudocode; people writing in pseudocode tend to gloss over relevant details (like whether the range n..m includes n, m, both, or neither --- this is a huge difficulty with, for example, binary search)
My current favorite languages for this are Scheme, C, and Python.
Python is the most readable of the three; it reads like pseudocode itself. It's also the least standardized of the three, the most rapidly changing, and the one with the most "hidden" stuff going on behind the scenes. (arr.append(item) in Python may allocate new space for the array if the space it's in isn't big enough; that really screws with algorithmic complexity analysis.)
C is the most widely-used of the three, probably the one with the most complex and irregular syntax, the most standardized of the three, the least rapidly changing, and the one with the least "hidden" stuff going on behind the scenes. It's also the most verbose and the least readable for people who aren't familiar with the language or one derived from it. (Although, since it is so widely used, almost anyone who knows how to program is familiar with the language or one derived from it.)
Scheme is intermediate between C and Python, in my opinion, except that it is the least widely used.
For now, I'm going to add implementations in Python to the algorithm pages, and when I have time, I'll add implementations in C and Scheme on subpages.
-- Kragen
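As an illustration of the "reads like pseudocode" point above, here is a minimal linear search in Python (my own sketch, not necessarily what ends up on the Linear search page):

 def linear_search(items, target):
     """Return the index of target in items, or -1 if it is not present."""
     for index, value in enumerate(items):
         if value == target:
             return index
     return -1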
- Kragen, have a look at the pseudocode page, we thrashed out a "standard pseudocode" for Wikipedia a while ago (though feel free to make suggestions/improvements). The trouble with providing actual implementations of algorithms in real languages is that the trees start to get in the way of the forest. This is less of a problem in Python and Scheme, of course. --Robert Merkel
Hmm, I don't see a standard pseudocode section anywhere. I wonder where it went? I thought it would be here. -- Kragen
- I advise you to not pursue any endeavor to create a standard pseudocode. I created one that I pushed vociferously and it was rejected by the community, who claimed that a standard was undesirable and unnecessary. I still think they're all wrong, but nothing I can do about it. See User:Dcoetzee/Wikicode. Deco 02:02, 30 July 2005 (UTC)
Whoa. User_talk:Dcoetzee/Wikicode/Specification has quite a bit of discussion (about 15000 words' worth) that could use some summarization; it appears to include most of the usual programming-language holy wars. I still think it's a good idea to try to make algorithm descriptions more similar to one another, and executable, when that doesn't interfere with their clarity. But maybe Wikipedia isn't the right place to describe algorithms? -- Kragen
Computer-Science Bias
An algorithm is not a rough form of a computer program. This is an example (among many) of the distinct Computer-science bias of the Wikipedia.
Not every computer program is an algorithm either, at least according to some of the definitions of algorithm.
I don't have a reference handy, but I have seen algorithm defined to mean, roughly, a finite set of rules that is supposed to produce an answer in a finite number of steps. Therefore, infinite loops (which can occur in computer programs) cannot occur in algorithms according to this definition.
This is, by the way, one of the motivations for the study of the halting problem. How do you prove that a certain method is in fact an algorithm?
- Well, it is done one computational procedure at a time. I don't want to be facetious. I agree with the concern here, but we may be mixing too many things together. Maybe the lapse is one of carelessness on the part of computer scientists, though I think it is more of an overgeneralization by practitioners and in the teaching of programming, not so much hard-core computer science. Either way, I think there is a legitimate concern here.
- I am concerned with the substitution of computer codes for algorithmic definitions also, though there is nothing that says a computer code can't implement an algorithm. (Be shown to terminate, clearly consist of well-defined operations and rules for their progressive carrying-out, etc.)
- But one must be careful about taking a computer code as the expression of an algorithm. For example, there is more assumed in the Ruby illustration of binary search than meets the eye.
- So what is to be done here, in this section, and in the expression/illustration of algorithms in Wikipedia? I wouldn't recommend an overhaul, though maybe some tidiness can be introduced. For example, the binary search section currently makes it clear that it is using Ruby as a means for illustrating an algorithm. Considering that it makes the idea of binary search pretty accessible, I would not complain.
- Here in the section on algorithm, I would be more concerned. I also think that maybe procedure should be used in a lot of places, including binary search, and not over-use "algorithm" quite so much.
- I have given longer expression to my concern in discussions around the notion that programs are rarely algorithms. I also have an in-progress article on the question "Do Programs Teach Algorithms" that, as it happens, uses Binary Search (and Python) as a basis for discussion. --Orcmid 02:09, 12 Mar 2004 (UTC)
There is an important reason to preserve the strict notion of algorithm (and somehow allow the important informal notion of effective procedure to survive as well). It is not just historically valuable, it preserves important content. Many of the historical breakthrough results were produced using the strict notion of algorithm. When there is a relaxed use of the term, now, and those earlier results are repeated, they are implied to cover a wider territory than has ever been demonstrated. That leads to sloppy science and, potentially, outright error. I think we should be more prudent, but still find good ways to speak and write in an informal setting. --Orcmid 02:09, 12 Mar 2004 (UTC)
I think this article could do with some cleaning up to clarify the distinction between the more formal use of the term algorithm (a finite set of well-defined instructions for accomplishing some task which, given an initial state, will terminate in a corresponding recognizable end-state) with the more commonly understood definition (a step-by-step procedure for solving a problem or accomplishing an end, especially by a computer). There seems to be some references to this in the article, but would it not be better to clarify this distinction up front? At least some form of disambiguation. Medical algorithm links to algorithm, yet does not match the formal definition. MickWest 16:58, 21 July 2005 (UTC)
Algorithm Removal
I removed the following example algorithm, because I think it is incorrect. Starting with the words zzz,yyy,xxx, it will in step one produce yyy,zzz,xxx and then it will produce yyy,xxx,zzz and then it will stop. AxelBoldt 00:33 Oct 8, 2002 (UTC)
An example of an algorithm is this rule (or method or procedure) for alphabetizing a list of names by repeating the specified steps until the job is done:
- Step 1. Compare the first 2 names on the list:
- a. If the 1st one is alphabetically ahead of the 2nd one, go to step 2.
- b. If the 2nd one is alphabetically ahead of the 1st one, swap the two of them and then go to step 2.
- Step 2. Pretend the 2nd and 3rd names on the list are the 1st and 2nd ones, and repeat step 1.
- No, it doesn't stop then -- you forgot to "repeat[] the specified steps until the job is done." The second time thru, you get xxx,yyy,zzz and then you stop, because "the job is done." -- isis 06:33 Oct 8, 2002 (UTC)
We have to formulate the algorithm clearer then. The whole point of an algorithm is to give unambiguous instructions for a process, and these are hardly unambiguous. Step two for instance talks about the 2nd and 3rd names, but doesn't say what to do if there is no 3rd name. I thought it should stop then, but you actually want a loop. If you say "until the job is done", does that imply that after each step 2, I have to scan through the whole list of names to check if it is already alphabetically ordered, and stop if it is? If so, the algorithm should say that clearly (and it wouldn't be bubblesort).
Also, please don't mark your edits as "minor" if you make major changes to an article. AxelBoldt 14:25 Oct 8, 2002 (UTC)
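For comparison, one unambiguous way to spell out the procedure being debated is the following bubble-sort sketch in Python (an assumption about what the removed example intended, with an explicit stopping rule):

 def bubble_sort(names):
     """Sort names in place by repeatedly comparing and swapping adjacent entries."""
     n = len(names)
     swapped = True
     while swapped:                       # "repeat until the job is done"
         swapped = False
         for i in range(n - 1):           # 1st vs 2nd, then 2nd vs 3rd, and so on
             if names[i] > names[i + 1]:
                 names[i], names[i + 1] = names[i + 1], names[i]
                 swapped = True
     return names

On the zzz, yyy, xxx example above it produces yyy, xxx, zzz after the first pass, xxx, yyy, zzz after the second, and stops after a third pass in which nothing moves.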
Ease of Understanding
- The whole idea of an encyclopedia is to explain basic concepts to people who don't know anything about them, including (or especially) 10- or 12-year-olds. I respectfully submit that anyone who knows what a "greatest common divisor" is (and probably anybody who knows what an "integer" is) already knows what an "algorithm" is. It's okay to put in the stuff that reads like a math textbook, but before that you need introductory material for the ordinary people that are never going to read past that expanded definition in lay terms. What you should really do is write something dynamic that would show a bubble sort of a short alpha list that wouldn't take an explanation at all but would show the items swapping and bubbling up the list. -- isis 22:46 Oct 8, 2002 (UTC)
- GCD and HCF are usually covered at junior school level (10-12). Integers are probably called "whole numbers" at that stage, but the concept is graspable. I'm not sure at what age one encounters a term like "algorithm". Later, I think. -- Tarquin 22:50 Oct 8, 2002 (UTC)
There's no question those concepts are taught in math classes at that level, but that's irrelevant to the topic under discussion. My point is that the purpose of an encyclopedia article is to define/introduce its subject to someone who doesn't yet know anything about it. Unless this article explains "algorithm" in terms someone who doesn't already know what one is can understand, it is useless for its intended purpose. Ordinary people (of any age, but I'm talking about only the U.S., and it may be different elsewhere in the world) don't know what GCDs and integers are, and they don't need to know what they are to understand what an algorithm is, so the article should be written (at least at the beginning, and you can put in the technical stuff farther down) in terms they do understand -- that could have been accomplished by using the example of putting a list of numbers in numerical order, for example.
But there's a larger issue I consider pertinent here, and that's that Americans are innumerate, not so much because they're incapable of understanding numerical concepts as that they've acquired a fear of math, and that's the fault of generations of math teachers who didn't understand or couldn't explain the concepts in terms that made them accessible. Those of us who are teaching math now (including teaching it thru the 'pedia) have an obligation to do better by the ones we teach, because math is so much more important a tool than it has ever before been in society. The more non-numerical examples we use to explain the concepts, the less math resistance we have to overcome, and the better (and easier) we get our job done. Therefore "word problems" of practical, everyday matters are better than sets of equations for illustrating mathematical principles, and therefore a list to be alphabetized is better than numbers to be sorted for explaining "algorithm," because it makes the readers comfortable with the concept before they realize it's math and resist it. (I have yet to see a 'pedia article on a mathematical topic that didn't look like it came from a math textbook instead of an encyclopedia.)
- I find it interesting that encyclopedias do not seem to reflect your maxim about what encyclopedia articles are for. I think what you propose is laudable, however. How do we deal with the fact that algorithm is listed under Mathematics in the Featured Articles section? --Orcmid 02:41, 12 Mar 2004 (UTC)
So I respectfully insist that the example in this article needs to be of a simple sorting algorithm (and there is none simpler than a bubble sort, so I'm still voting for that) that anyone can understand. I'd like for it to be of alphabetizing items, because that would obviate math anxiety, but if it has to be numerical, make it something like putting checks or invoices in serial-number order. I'd like for it to be dynamic, showing the items swapping in pairs, but I don't know how to program anything that moves, although I've started introducing animated gifs to encourage contributors that do know how to animate illustrations to do so. It is ridiculous not to take advantage of the capabilities of this medium, especially when equations are so deadly dull, and instead of showing the transformations in a mind-numbing list of them, you could have the elements moving in and out of the equations on the screen, and the novelty of that would suck your readers in, so they would pick up the concept before they realized it. -- isis 00:17 Oct 9, 2002 (UTC)
I completely agree with you that the first example should be as simple and intuitive as possible, and sorting names is simpler than computing gcd's. I just didn't like the bubblesort description since it wasn't explicit enough. By the way, I always thought selection sort is simpler: pick the smallest element, put it at the top. Pick the next smallest element, put it at position two, etc. That's how I usually sort things.
- Interesting. Bubblesort and selection sort are very closely related. For selection sort, you have to remember where the least element was found and exchange it with the leading element, or otherwise slide things. In Bubblesort, you bring the smallest element to the front by starting at the far end and comparing and exchanging adjacent elements as necessary so the smaller of the two is always moved forward and used in the next comparison. That way, when you complete a sweep, the least member is in the leading slot. Then, on the remaining unsorted n-1 elements, do it again, excluding the already-sorted elements. Rinse, repeat. (Termination condition is that nothing moved, or you can just go until there is only one element left - it will be the largest.)
- I think selection sort animates nicely. It is also easier to illustrate as a paper-and pencil method where the sorted list is produced in a separate column. Bubblesort is a little trickier as a computer animation and is not very suitable as a manual procedure (interesting point). I like the Towers of Hanoi because there are two quite different solutions, the problem is easily understood, it exhibits combinatorial explosion, and also shows that (efficient) algorithmic procedures often do not resemble the problem they are solving in a clearcut way -- Euclid's algorithm does that too, more directly. This shows the difference between clarity/directness of solution and optimum effort, and raises important questions about trusting in algorithms we do not understand. I recommend that sorting illustrations be in articles on the specific algorithms, though. The current example of finding the largest element in a list works, although it would be nice to establish the initial conditions (at least one member in the list). One can raise the influence of data representation and use of available elementary operations with even this simple one, including transformation to a similar algorithm employing the same principle (finding the least instead, etc.) --Orcmid 16:39, 15 Mar 2004 (UTC)
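For reference, the "pick the smallest, put it at the top" description above corresponds to this minimal selection-sort sketch in Python:

 def selection_sort(items):
     """Sort items in place by repeatedly selecting the smallest remaining element."""
     n = len(items)
     for i in range(n - 1):
         smallest = i
         for j in range(i + 1, n):          # remember where the least element was found
             if items[j] < items[smallest]:
                 smallest = j
         items[i], items[smallest] = items[smallest], items[i]   # exchange it with the leading element
     return items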
As to your contention that Wikipedia articles are too technical for an encyclopedia: here is the start of Encyclopedia Britannica's algorithm article, (fair use):
- systematic procedure that produces—in a finite number of steps—the answer to a question or the solution of a problem. The name derives from the Latin translation, Algoritmi de numero Indorum, of the 9th-century Muslim mathematician al-Khwarizmi's arithmetic treatise “Al-Khwarizmi Concerning the Hindu Art of Reckoning.”
- For questions or problems with only a finite set of cases or values an algorithm always exists (at least in principle); it consists of a table of values of the answers. In general, it is not such a trivial procedure to answer questions or problems that have an infinite number of cases or values to consider, such as “Is the natural number (1, 2, 3, . . .) a prime?” or “What is the greatest common divisor of the natural numbers a and b?” The first of these questions belongs to a class called decidable; an algorithm that produces a yes or no answer is called a decision procedure.
And it gets worse from there; not a single example. AxelBoldt 00:33 Oct 9, 2002 (UTC)
Articles should of course start with good definition/introduction paragraphs that give a person of average intelligence a good idea what the subject is and why it is important but we shouldn't dumb things down to the point where articles are not useful to people who already know the basics. BTW, we should also aim to be better than Britannica in depth, breadth and accessibility by starting off with the basics and building from there. Just my 2 cents. --mav 00:37 Oct 9, 2002 (UTC)
- As to selection sorts being easier than bubble sorts, I guess my age is showing -- I learned to sort in assembly language without matrix notation, and bubble sorts were much easier (at least for me) than selection sorts, especially because when there wasn't enough room to hold two lists of the given length (which was usually the case), the job would bounce instead of running, so it wouldn't be done when the assignment was due. I think most of us do use selection sorts in sorting alphabetical or numerical items that are physically separate (like file cards or cancelled checks), so I think that would be a good example to use, but I'd still like to have a dynamic image of the cards rearranging themselves, the way my Windows shows me my files flying from one folder to another.
- Did you cite the Encyclopædia Britannica article because you meant to suggest it was worse than the current Wikipedia article? To me, it's not, -- it starts with a definition that is accurate, complete, and concise, something neither the first sentence nor the first whole paragraph of the current Wikipedia article does. Or did you mean to imply that if ours is no (or not much) worse than theirs, it's acceptable? Even if I considered EB the standard for encyclopedias (and I never have -- I've always preferred the Encyclopedia Americana to EB and Funk & Wagnalls to World Book, which were the standard reference books in libraries when I was in grade-school), I'd still like for the 'pedia to be the best it could, not just as good as the supposed competition.
- So mav has expressed my position very well: Each article should start out with introductory material that explains to an average reader what the article is about and why it is important, and then segue into some more advanced material that will appeal to someone who already knows something about the topic (and then there may even be some advanced topics that appeal only to specialists). A reader who wants the basics will read only the introductory part (so it needs to be complete for that purpose), but that part should not be so simplified ("dumbed down") that it will offend the specialist who skims over it getting to the meatier part. -- isis 01:51 Oct 9, 2002 (UTC)
- I gave the EB text because I consider it far worse than our article: they start out with saying "in a finite number of steps" which is wrong; then they enter the topic of table lookups for finite problems which is completely besides the point, and then they go right into primes and gcd's without a single example algorithm. And I don't find the definition "systematic procedure that produces—in a finite number of steps—the answer to a question or the solution of a problem" any better than "well-defined method or procedure for solving a problem, usually a problem in mathematics or otherwise relating to the manipulation of information". AxelBoldt 02:45 Oct 9, 2002 (UTC)
- Whoa! You threw me. Why do you say "in a finite number of steps" is wrong? They are not implying a fixed number of steps, only a finite number. It is common to require that an algorithm (not every computational procedure) terminate. Since the steps are discrete, that kind of means you can count them, and when the algorithm terminates given a particular input, you can read the counter. Any example of sorting, for example, given a finite input (also a requirement in most definitions of algorithm) had better be one that terminates on any valid input, or we would throw it out as an algorithm for sorting. --Orcmid 02:19, 12 Mar 2004 (UTC)
- Well, as the saying goes, "There's no accounting for tastes." -- isis 03:03 Oct 9, 2002 (UTC)
- First, I don't think this is a matter of taste, but of understandability - and I think MickWest raised an excellent point: guaranteed termination and e.g. determinism of an algorithm is sometimes assumed, sometimes not, and it is important to be aware of this when seeing the term used. What is more, the phrasing 'terminates in a finite number of steps' is ambiguous: it can either be interpreted strictly, to mean that termination must happen in a finite number of steps on any input, or liberally, by saying that termination must be within a finite number of steps on any input for which an answer is defined at all. This is an important difference: the difference between primitive recursive and recursive, or if you're a programmer, between for loops and while loops. I think it is important to have at least some mention of this in the text. Rp 02:06, 6 May 2006 (UTC)
- Second, I like sorting as an example, mainly because I once found myself switching from selection sort to merge sort, when I had to sort a large stack of exams. There wasn't a computer in sight! I'm sure anyone can understand the task of alphabetizing or otherwise sorting a very large stack of items, and can also understand the two sorting algorithms. Rp 02:11, 6 May 2006 (UTC)
Choice of Example
We need a different algorithm for the example; this Euclidean GCD one is too unintuitive. I have a master's in math, and I can't figure it out, so I know ordinary readers can't follow it. -- isis 20:54 Oct 27, 2002 (UTC)
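For reference, the Euclidean GCD procedure under discussion amounts to only a few lines; here is a sketch of the standard remainder version in Python (not necessarily the exact presentation used in the article):

 def gcd(a, b):
     """Greatest common divisor of two non-negative integers, not both zero."""
     while b != 0:
         a, b = b, a % b   # replace the pair by the smaller number and the remainder
     return a

For example, gcd(12, 18) returns 6.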
- How about the trial division prime testing algorithm, or maybe one of the bin-packing algorithms ?
How about the Sieve of Eratosthenes? Okay, so it's infinite, but it's a simple procedure to explain, it's clear how & why it works, and you can limit it to, say, numbers up to 100 so it terminates. I remember doing it at age 10 or so. -- Tarquin 21:09 Oct 27, 2002 (UTC)
- I have no idea what that is, but it sounds dirty -- let's do it. -- isis 21:17 Oct 27, 2002 (UTC)
- Sieve of Eratosthenes --Imran 21:35 Oct 27, 2002 (UTC)
- I'll tell you what: You go out to a shopping area, a mall or downtown, and stop 10 people at random and ask them, and if as many as five of them know what a prime is, I'll agree to this one. The problem I'm having is that the purpose of an example is to clarify things for somebody who's looking the topic up because they don't know anything about it, and none of these mathematical concepts (like "divisor" and "prime") is part of most people's vocabulary, not to mention mind-set. I still say this needs to be an algorithm for something ordinary people understand, like alphabetizing or (if you can't live with its not being numbers) counting or sorting. -- isis 21:52 Oct 27, 2002 (UTC)
Sorting is a simpler concept, but is actually a more complex algorithm in terms of the actions involved and what they do. The sieve is easy: you just write the numbers down then cross them out. It's clear that it works, because when you cross out 2,4,6,8, you're removing all the multiples of 2. I think the explanation on Sieve of Eratosthenes could do with a rewrite though. Some 20% of the shopping mall people will be illiterate, but we're not dumbing down our long words here for them. We must make articles accessible to the intelligent lay-person. That means someone who can look up "prime" and "divides" and quickly grasp the concept. Like I said, it's 10-year old maths: I clearly remember doing it. If people are so innumerate that they forget this stuff ... *sighs in despair* -- Tarquin
- I agree with Tarquin, I certainly recall knowing the sieve technique when I was 10-11, possibly it's just a UK education thing.
- I think we should assume knowledge of what a prime is in the same way we assume people know what a noun is. --Imran 22:19 Oct 27, 2002 (UTC)
Guys, if you're just not willing to put it in terms an average reader of the 'pedia can understand without having to look the words up first, then simply take the example out, because it's not going to do what an example is supposed to do. What you remember from 5th grade is totally irrelevant -- you're not the one that needs to look the word up. The problem is that having an example there that's too hard is worse than not having one at all, because it will scare off someone who was interested enough to try to find out what "algorithm" means. And BTW, people in this country (U.S.) don't know what a noun is, either. -- isis 23:03 Oct 27, 2002 (UTC)
- ask the same people you ask about "prime" whether they have any interest in reading about the meaning of the word "Algorithm". Some might -- but those with no grasp of maths concepts will go away happy with the intro text: "Generally, an algorithm is a list of instructions for accomplishing some task..." Knowledge builds on knowledge, and people getting this far should know that. Anyway, why don't we try writing several examples: a sort and the sieve, and leave the GCD too. We can then try and decide which is the simplest to the thick *cough* novice reader. -- Tarquin
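For the record, the sieve limited to a fixed bound (100, as suggested above, so that it terminates) is only a few lines of Python:

 def sieve_of_eratosthenes(limit=100):
     """Return the primes up to limit by crossing out multiples, as described above."""
     is_prime = [True] * (limit + 1)
     is_prime[0] = is_prime[1] = False
     for n in range(2, int(limit ** 0.5) + 1):
         if is_prime[n]:
             for multiple in range(n * n, limit + 1, n):  # smaller multiples are already crossed out
                 is_prime[multiple] = False
     return [n for n in range(2, limit + 1) if is_prime[n]]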
I recommend implementing the selection sort algorithm, whether the sort algorithm will be the only example or there are other ones as well. If being simple to understand is the issue, that's the right algorithm. Its correctness is easy to grasp. TCascardo
Question on Etymology
I don't know how to fix this, but I'd like to point out that the article's etymology of "algorithm" doesn't look entirely coherent. The article seems to provide two different stories:
- It descended from successive translations of an Arabic guy's name.
- It descended from a word in said Arabic guy's chief work, a word which referred to "the rules of performing arithmetic using Arabic numerals"
There seems to be a missing link here somewhere. --Ryguasu 05:34 Jan 12, 2003 (UTC)
- Please read it again more carefully: It says "algorithm" came from his name, and "algebra" came from the name of his book. It may not be expressed clearly enough because all of us who worked on it knew what it meant, so if it doesn't say that to someone who doesn't already know it, please fix it. -- isis 05:46 Jan 12, 2003 (UTC)
Good Examples
I had a thought on the subject of good examples -- how about the Tower of Hanoi algorithm? -- Tarquin 11:52 Jan 24, 2003 (UTC)
- Has there been any more consideration of this? I think the problem of demonstrating that the procedure terminates and always gets the right result might be a little tricky if you want to keep this informal. The other nice aspect is that there are two different approaches, one that is easy to do with a computer, and the "practical" one that is iterative, rather than recursive, and more economical in a number of ways. The number of moves of disks is the same, of course, unless you slip up, but the housekeeping for one is negligible and the other is a function of the number of disks in the stack (though I would not call it impractical for anything a person would be willing to carry out, or that I would be willing to watch carried out on my computer display). But it is a practical, mostly math-free procedure. Of course, it is sort of fun to see why it works to say that to move the stack of n on post i to post j, you first move the top n-1 to post 6-i-j. --Orcmid 02:33, 12 Mar 2004 (UTC)
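To make the recursive approach concrete, a minimal Python sketch (the post names are arbitrary; this is the easy-with-a-computer variant, not the iterative one):

 def hanoi(n, source, spare, target):
     """Print the moves that transfer a stack of n disks from source to target."""
     if n == 0:
         return
     hanoi(n - 1, source, target, spare)             # first move the top n-1 disks out of the way
     print("move disk", n, "from", source, "to", target)
     hanoi(n - 1, spare, source, target)             # then move them back on top of the big disk

It makes 2**n - 1 single-disk moves, which is the combinatorial explosion mentioned above.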
Reworked the First Paragraph
Ok, some of the problems with the opening paragraph:
- The definition of the algorithm was prefaced with generally, when broadly-defined was meant. The definition of the algorithm as presented may be restricted, but it is never excepted unless done so erroneously, so it makes sense to be more precise with the modifier.
- The term "list" has a more sequential denotation than does "set", as usually code is very non-linear. Take any assembly code for example and you get my point. I realize that an argument can be made that the actual storage of an implementation of code may be a list, but this is not very strong as an argument on two points. First, most code today is non-linear in storage with the advent of shared libraries and subprocedures, and secondly the storage of instructions is really insignificant in comparison with the operation of the code, which is what algorithms are about. Set, therefore, is more precise. I realize for non-computing applications such as a recipe, list is generally the norm, but it seems to me that set is equally understandable to lay persons.
- I added the hyperlink iteration and the term decision with the links to Boolean logic and inequality because comparison and logic are the cornerstone of even the simplest cooking recipes.
- I changed 'mnemonics' as an example of an algorithm, because strictly speaking, a mnemonic device is NOT an algorithm. It is the data structure associated with an algorithm, as the process of encoding and decoding the original information IS the algorithm. The phrase ROY G BIV is no more an algorithm than an MP3 file. Both are the byproduct of a codec.
- I added the 'heuristics' link, because of the close relation.
- I thought that the bit on policy and mechanism was important for understanding how different algorithms get the same thing done.
- The rest of it I just cleaned up the syntax for conciseness and clarity.
jtvisona 01:51, 14 Aug 2003 (UTC)
Example algorithm is flawed ...
The algorithm to find the highest number in a list is flawed :
- It uses the first element of the list twice
- It fails when the list is empty
--Stephan Leclercq 08:34, 20 Jul 2004 (UTC)
- That has now finally been corrected. — 131.230.133.185 18:42, 12 July 2005 (UTC)
Corrected ? Maybe :-) But at the expense of readability. How do you expect a non-programmer to understand this Lisp / Smalltalk / Python / Whatever pseudocode ? Why not simply "show List.Max" ?
- I deal with why I used that style of pseudocode below. As for using "List.Max", that eliminates the example (an algorithm is a list of steps, not a command) entirely, which is pointless. The current example shows the steps of the algorithm in a style that is close to the simple, nonprogrammer's English used to describe it. — 131.230.133.185 23:48, 19 July 2005 (UTC)
References
Ok, I'm a lazy bastard and I always meant to do it way back in the day when I first put my hands into this article, but Knuth's 3 volume set really should be included in the references.
jtvisona 072004
An automated Wikipedia link suggester has some possible wiki link suggestions for the Algorithm article, and they have been placed on this page for your convenience.
Tip: Some people find it helpful if these suggestions are shown on this talk page, rather than on another page. To do this, just add {{User:LinkBot/suggestions/Algorithm}} to this page. — LinkBot 10:24, 17 Dec 2004 (UTC)
- (The above leads to Wikipedia:Archive/Refreshing brilliant prose - Science#Applied Arts and Sciences, where there are simply three votes for retention. --Jerzy(t) 21:17, 2005 Feb 8 (UTC))
Etymology
I agree that the "not Greek" insertion is better gone, tho
- a) not popular belief and b) it's not from the word to gore, either, but we don't have to explicitly say that
are bad reasons for removal. There is little popular knowledge that it came via Arabic, and irrational is it, normal human minds are far more confused by the anagram than by a coincidental syllable. There's a stumbling block here, and a better-crafted corrective may well be worth while.
For that reason and others, how about some more etymology: what is the usual relationship between -ithm and -ism; it was certainly de-Arabified, and was it Grecified or Latinized? Is there an early Arabic form other than just the personal and place names? Some of these answers may be worth including.
--Jerzy(t) 21:17, 2005 Feb 8 (UTC)
Hi, If everyone agrees that my edit was unnecessary, so be it, and remove it if you must. However, I agree with you that the reasons given, a) and b), were at least silly. I think it is popular belief, because of the ending -rithm, which is a common Greek ending (logarithm, arithmetic, etc.), and I am also drawing from my experience as a student, where many people thought it had a Greek root. I think it makes more sense for someone who does not know to assume it's Greek rather than Arabic. As for b) and the fact that someone might associate it with "gore" rather than arithmetic, I think that tells a lot about that person. May I suggest a bit less TV and open a book. --Vasileios Zografos 21:29, 8 Feb 2005 (UTC)
The example is becoming a contest ...
It seems to me that the example algorithm (how to find the highest number in a list) is becoming a contest on how to perform it in the smallest number of instructions and in the weirdest syntax possible. It has become totally unreadable to the average person, who does not know about strange constructs from Perl or Lisp. The conventional example, while simplistic, could be read by anyone.
Unless there is strong opposition (please vote here), I will soon revert to a more understandable version, with a conventional "for" loop, the requirement of a non-empty list (so we can talk about preconditions) etc.
Oh, by the way, the example is currently much more flawed than before. It assumes that the > operator returns true when comparing a number with the value "nothing", while most languages would return false (or abort execution) in this case :-)
- From the perspective of a programmer in a C-style language, you are correct. All the flaws you mention should be removed, because they are grating. However, programmers of any style do not really need an example; they've seen dozens. Nonprogrammers (usually) do. We should use their perspective to guide the choice of an example. A nonprogrammer learning of algorithms may not want to learn a C-style language (or any other style). Regardless of how much we want to, we should refrain from forcing off-topic details of C-style languages on the reader. That, not a contest to make C-style programmers angry, was why I changed the example.
- The algorithm given is (in simple, nonprogrammer English): "look at each item of the list; if it's the highest we've seen so far, remember it; after we've gone through the list, what is the highest we've seen ?" All other details are irrelevant and forced. "do something if a > b" is very close to plain English with a commonly understood math symbol. How some languages handle comparisons to nothing strangely is arcane, off-topic detail; at any rate, pseudocode is intended to get the idea, rather than the exact detail. "List.each_item" conveys the loop in a way that a nonprogrammer can understand at a glance. "for i = 1 to len(List)" requires an explanatory comment and a more convoluted way of thinking in the hope of teaching the reader the C style. How certain languages iterate through lists is irrelevant; that you are looking at each item is the point.
- I believe that the syntax in the current example is closest to the simple English description while still being pseudocode and requires no understanding of any programming constructs except assignment. What a C-style programmer, through extensive practice, comes to consider common knowledge that any average person could understand is not necessarily what an average person could understand without coaching (as shown by the necessity for comments). What is nonstrange to a C-style programmer, through extensive familiarity, is not necessarily familiar to nonprogrammers. Most importantly, what is strange to a C-style programmer is not necessarily strange at all to a nonprogrammer who has no idea why the example should be considered strange. — 131.230.133.185 23:43, 19 July 2005 (UTC)
I strongly agree that we do not need to revert to a pure C syntax. However, do you really think that the non-programmer (who is likely also non-mathematician) will understand the dot notation as in list.each_item { some codeblock } ? Does not look like C, but like Smalltalk to me :-)
If we want to move away from programming, we either have to turn to a purely mathematical definition, where the looping nature of the solution, and the need to examine each number only once, would be lost in recursion, or else use plain English with no more than prep-school math symbols.
What's C-minded in a phrase like "examine in turn each number N in List, if N > max then max becomes N" ?
--Stephan Leclercq 18:10, 22 July 2005 (UTC)
- I meant that we should reduce unfamiliar concepts, not reduce unfamiliar appearances. There are no new concepts introduced in the current pseudocode, such as counter-variable iteration, needing the initial maximum to be prefilled for whatever reason, etc. For the same reason, I do not agree with introducing intermediate concepts like recursion.
- As for unfamiliar appearance, we already have the example repeated in English just above the pseudocode. Any confusion from appearances, such as dot notation, can be rapidly cleared up by referencing the English listing :
- 1. Look at each item in the list. If it's larger than any you've seen so far, make a note of it.
- List.each_item { highest = this_item if this_item > highest }
- While I don't wish to waste reader brainpower with extra beside-the-point concepts, I believe most nonprogrammers are competent enough to pick up the example.
Moot point now, as someone (not me :-) re-introduced the loop (and a bug when the list is empty :-). I'm afraid no one (especially computer guys) will accept an example that does not explicitly contain a loop.
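For reference, here is the example written out in Python in the explicit-loop style, with the non-empty-list precondition stated rather than assumed (a sketch, not the article's exact pseudocode):

 def find_highest(numbers):
     """Return the largest number in a non-empty list."""
     if not numbers:
         raise ValueError("the list must not be empty")   # the precondition discussed above
     highest = numbers[0]
     for n in numbers[1:]:     # examine in turn each remaining number in the list
         if n > highest:       # if it is the highest seen so far, remember it
             highest = n
     return highest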
wrong link for "comparison"?
the link "comparison" at:
The concept of an algorithm is often illustrated by the example of a recipe, although many algorithms are much more complex; algorithms often have steps that repeat (iterate) or require decisions (such as logic or comparison)....
takes you to Inequality (math); is this intended?
Legal issues
I removed the reference to the USA as POV. The statement "Certain countries, such as the USA" seems to single out the USA for no reason apparent to the topic Algorithm. I'd be willing to compromise if a number of both pro-patent and anti-patent countries were listed, assuming a noteworthy software industry presence in the country.
I also removed the "provided a physical embodiment is possible" part, because it may give the idea that a physical embodiment isn't possible. If we're defining algorithms in terms of a Turing machine, then every algorithm has a possible physical embodiment. /* Pradeep Arya 18:20, 2 January 2006 (UTC) */
- Why did [User:Mfc] remove "controversially"? It seems to me like it was neutrally reporting that controversy exists. -- Mikeblas 21:53, 2 January 2006 (UTC)
- I agree. I think not mentioning the controversy is POV; slanted toward the pro-patent side in a lie by omission kind of way. However, since I made the last edit to that section, I won't change it myself. If somebody else agrees with me and Mikeblas, feel free to change it back. /* Pradeep Arya 23:18, 2 January 2006 (UTC) */
Inherently serial problem?
Regarding "Inherently serial problems", the text reads:
- One such example is depth-first search of a graph, which happens to be recursive, but can not be parallelized.
Firstly "inherently serial problems" occurs only 5 distinct times in a google search [3]. Secondly there seem to be several parallel implementations of depth-first search [4](and, if you think about it, it seems reasonable to parallelized - you just need a way of knowing which node would have been readed first, which you can do by encoding the node positions in the tree). I'd suggest a better example would be Newton's method
MickWest 18:07, 8 January 2006 (UTC)
- I had the same thought about depth-first search when I read the article. It does not seem like a good example. Perhaps another alternative for an example would be the Three-body problem, which seems hard to parallelize beyond 3 processors. -R. S. Shaw 23:06, 10 January 2006 (UTC)
- Or perhaps repeated hashing, à la: result = MD5(MD5(MD5(MD5(plaintext)))); any good hash algorithm perturbs the result significantly in the presence of more information. It makes it very difficult to begin the next iteration of hashing without having completely finished the previous iterations. —Pradeep Arya 12:27, 11 January 2006 (UTC)
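A small Python sketch of the repeated-hashing idea, just to make the serial dependency concrete; the choice of MD5 and of four rounds simply mirrors the example above and is not prescriptive:

import hashlib

def iterated_md5(plaintext: bytes, rounds: int = 4) -> bytes:
    # Each round consumes the digest of the previous round, so round i
    # cannot begin until round i-1 has completely finished; the chain
    # offers nothing to hand out to a second processor.
    digest = plaintext
    for _ in range(rounds):
        digest = hashlib.md5(digest).digest()
    return digest

print(iterated_md5(b"plaintext").hex())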
Mathematician / Astrologer
User Theodore7 is modifying the article to emphasize al-Khwarizmi's work as an astrologer.
It began here: [5]
This was reverted by Lumos3: [6]
Theodore again modified it: [7]
This was reverted by Duncharris: [8]
Theodore again modified it: [9]
This was reverted by Ruud: [10] who left a message at Theodore7's talk page.
I left a follow-up message to discuss. In the meantime, Theodore again modified it: [11]
I'm going to revert the changes now, and move the discussion from Theodore7's talk page to this newly created subsection.
- Listen, my edit is sourced. Ok. So please, do not revert unless you have a source that runs counter to him being an astrologer. Saying he was a "mathematician" is redundant. All astrologers - particularly the classical astrologers such as Al-Khwarizmi - were mathematicians. Cite sources that say he was not. Thanks.Theo 04:48, 10 January 2006 (UTC)
Your changes to algorithm have been reverted. While they are correct, the fact that al-Khwarizmi was an astrologer is already mentioned in his biography and is not very relevant in the algorithm article. Cheers, —Ruud 12:19, 9 January 2006 (UTC)
I'll agree with Ruud here. al-Khwarizmi's work as a mathematician is germane to the topic of Algorithm. His work as an astrologer is not. /* Pradeep Arya 14:42, 9 January 2006 (UTC) */
I suggest you read Boyer's History of Mathematics, and realize that al-Khwarizmi's main work was as an astrologer - which means that he was a mathematician. Suggest you study the facts of the history and not your POV.Theo 14:46, 9 January 2006 (UTC)
- I'm not sure to which POV you are referring, please elucidate. That said, I seem to understand you saying "astrologer equals mathematician". Even if that were true in al-Khwarizmi's time, it certainly isn't true in our time. We need to write Wikipedia for people to understand it today with the way they use language today (not as language was used in al-Khwarizmi's time). /* Pradeep Arya 15:03, 9 January 2006 (UTC) */
I reverted your most recent changes to Algorithm, please see Talk:Algorithm. /* Pradeep Arya 15:36, 9 January 2006 (UTC) */
My two cents: al-Khwarizmi may have been a great many things in addition to mathematician and astrologer: writer, swordsman, home chef, whatever. This article should not list them all. His role as mathematician is the relevant one here. —Bunchofgrapes (talk) 04:29, 10 January 2006 (UTC)
- That is silly. The facts say otherwise. What anyone's "current view" is concerning this subject is not relevant. Suggest you study the history of astrology - relating to the Hindus and Arabs, as it is primary to the history section of this article. Again, one's personal views (POV) relating to "astrology" does not allow for removal based on views "for" or "against". Check the history before pushing POV. Suggest you refrain from choosing what is "relevant" and what is not. The material is sourced, and many more can be added as well - primarily, al-Khwarizmi's own books. If you have sources that say he was not an astrologer - please cite them.Theo 04:45, 10 January 2006 (UTC)
- Again, he was indisputably a writer, a human, a mammal -- use some judgement. —Bunchofgrapes (talk) 16:09, 10 January 2006 (UTC)
- Again, this is not the point. The material is factual, and sources cited. I suggest using some judgement here Bunchofgrapes, and not the very generalized he was a "writer" "human" and "mammal" - which, for some reason beyond me, you would presume others are not aware of these moot facts? Or, perhaps, you are just joking. Ok.Theo 17:17, 12 January 2006 (UTC)
I think the issue is this: You're trying to get al-Khwarizmi's profession right. I'm trying to get al-Khwarizmi's role in algorithms right. So when I say "astrology is irrelevant" you're seeing it as an anti-astrology POV; as though I'm trying to rewrite a historical fact (al-Khwarizmi's profession). That said, from my side, I'm trying to get his role in algorithms correct. I'm not trying to identify his profession, only characterize the work he did with respect to algorithms. Can we agree that "mathematician" is a better characterization of the work he did with respect to algorithms, than "astrologer"? /* Pradeep Arya 07:14, 10 January 2006 (UTC) */
- Pradeep, I do understand what you mean here, but, my addition to the "history" section of this Algorithm Page does not take away from the encyclopedic version, but ADDS to it. I do not see it as anti-astrology at all, or even pro-astrology - but a fact. If you read the materials, you will see it clearly yourself that astrologers invented mathematics, and that Hindu astrologers invented the algebra that al-Khwarizmi translated, and used. I agree partially w/you on "role" but perhaps Pradeep, you might be better served to actually read al-Khwarizmi, and the very clear connections between astrology, algebra, and yes, algorithms, which is directly connected to astrology itself, and the work as mathematicians. All astrologers are mathematicians, and al-Khwarizmi was an astrologer, foremost. Your view in this matter might be missing this fact, in that you separate Algorithm, and al-Khwarizmi, from whatever idea you may, or may not have of what "astrology" is. Now, I am suggesting that you might really want to see the historical facts of the direct connections here, and not base the entire matter on your possible confusion here. Moreover, my entry is in the history section, Pradeep, and is factual, attributed, and cited. That is all. I would love to help you to see that I am not trying to change the entire article, because I have no such intention, but I can add numerous direct sources on this, and I think, with your interest, and knowledge of modern algorithms, and my historical knowledge of the origin of algorithms, to astrological techniques, principles, and history, that a fascinating new version, or expanded version, could be achieved. Think about it positively, and how exciting this could be. There is definitely an "anti-astrology" view that permeates the "scientific community" that runs counter to HISTORICAL FACT. I am against this always, because it is POV and intends to "rewrite history" to suit conventional POV that is in error when it comes to the facts - and facts that can be sourced, and cited extensively. Even the smallest mention of the word "astrologer" seems to get the blood-pressure up of those who hold strong POV on what they think "astrology" is, but it does not change the facts. And Wikipedia is an encyclopedia with facts on subjects. Now, I would really enjoy working with you on this Pradeep. I mean it. I liked reading what you wrote so far, and it made me want to read more, then I came up on the history section (al-Khwarizmi) and that's what I edited, added, and sourced. So, don't take it personally, please, because it was not, and never will be that way with me, ok? As a fellow Wikipedian, I am on your side, Pradeep, and not against you. What do you say?Theo 18:00, 12 January 2006 (UTC)
- Okay. Perhaps we're coming at "astrology" from different angles, because I have no knowledge of how algorithms connect to astrology. Please share with me your reasoning and logic that leads you to conclude that astrology and algorithms are connected. Personally, I don't see it; but I am willing to listen and learn. —Pradeep Arya 18:15, 12 January 2006 (UTC)
- There is no connection. Theo, you might want to read [12]. —Ruud 18:19, 12 January 2006 (UTC)
I beg to differ R.Koot/Ruud. That is one site, from the University of Scotland. Suggest that you see additional sources other than those that fit your POV. Even the site you gave shows Al-Khawarizmi was an astrologer. Moreover, again, there was NO distinction between astrologer/astronomer/mathematician in that time, and for many centuries until the 17th century. Suggest you read the references cited below on the major texts on astrology and algorithms, and not just one website that one can read from one's own POV - then stating - in error, as you do, that "there is no connection." There clearly is - and historical facts bear this out, extensively, mind you. (See references cited below.) Lastly, again, al-Khwarizmi's role as an astrologer is proven, and is mentioned in the appropriate section, and is cited. You cannot state that he was not, then, replace it with mathematician based on your POV. No, it is not the subject of the entire article, but, as you know, Wikipedia has links from each subject matter to others, so this claim is not valid. If you must persist, then I will extend my entry on the history section with more sources if need be. There was NO distinction between astrologer, astronomer and mathematician with regard to al-Khawarizmi, and others - and just because there is today in the POV of others, does not change that historical fact. I state the obvious regarding him, and it is not a big deal, but is being made one with reverts to the contrary. If you want, I will add numerous sources, but I like Pradeep's entries for the most part, and the article as a whole.
Now, Pradeep, thanks for saying what you did. Good. That's a start. But the fact that you do not personally see it is directly related to your admission that you have no knowledge of how algorithms connect to astrology. Mathematics & astrology are directly connected, and from this connection - came algorithms. This is historical fact. There is a direct connection Pradeep. Go to the origins of Vedic astrological mathematics and see for yourself. I am quite versed on Vedic astrology and its mathematics, and if you conduct a simple internet search - (Astrology and the origin of Algorithms) you will be able to find copious facts. This is one of the major problems with conventional views: that contrary to history - to historical fact - they make a distinction, not only between astrology and astronomy, but between astrology and mathematics. There was no such distinction; yet, it is being stated today as a fact. This is not true. Read Hindu Astrology: Myths, Symbols and Realities, by Anthony Philip (New Delhi, Select Books, 1981) and also Sphujidhvaja, The Yavanajataka of Sphujidhvaja (2 volumes, edited, translated, and commented on by David Pingree, Cambridge MA: Harvard University Press, 1978) among them. These materials are rich with evidence, and give direct proof of my assertions. There are plenty more I can name, but will start with these two, and of course, Boyer's work. I hope this helps to get you started. If you need more references, ask, and I'll be happy to send more your way. Theo 18:58, 12 January 2006 (UTC)
- Maybe you could cite a few relevant paragraphs from those works. Would save me a trip to the library... —Ruud 19:15, 12 January 2006 (UTC)
What is wrong with a trip to the library? Not to sound rude, because it is not meant that way whatsoever, but, perhaps, you might just benefit from a loan from the library, or better yet, purchase the books mentioned. I can cite others for you since you might want to have them in your own personal library for future reference yourself, since it would be good for you to see the direct connection between serious, scientific, applied astrology, to algorithms, astronomy, and yes, even to mathematics itself.Theo 19:20, 12 January 2006 (UTC)
- It takes time and energy better spent on other things, and since you seem to have those books at hand... —Ruud 19:25, 12 January 2006 (UTC)
- I agree that the line between astronomy and astrology was much blurrier back then. However by today's standard parts of al-K's work are clearly mathematics, so there is no reason to call every mathematician who lived before the scientific revolution (for as far as we can call mathematics a science) an astrologer. —Ruud 19:25, 12 January 2006 (UTC)
There was nothing "blurry" about it at all back then, as you say. Hey, I wrote the short paragraph in the history section of the piece, ok, and not in the "today's standard section." What you say are "today's standards" is POV, no matter how many in a community, or club, would share that view. Now, it is ok to say that they "believe" this, or that they don't "believe" that, yes, but, the "scientific revolution" as you call it, is actually the Age of Rationalization, and Astrology, despite what these POVs may claim, is practiced today as an applied science by professional astrologers - using mathematics. I use it every single day. You know, if you would look honestly, without the intrusive point of view, and move along the circle to see more points of view - and historical fact - you just might be surprised by what your open eyes will see. Astrologers were ancient mathematicians. That is all they did. And, they used mathematics not only to map the skies, and calculate the motions of planets, and stars, but they mapped the Earth as well. The used the mathmatics invented by Hindus, and some by the Chinese too - and mapped the celestial skies and the terrain on the Earth. This is fact. The seperation is only in your mind. There IS NO SEPARATION. Algebra was a shorthand used to shorten the long calculations of tracking the movement of the planets - including the Earth - due to the many variables. It was needed because the old arithmatic just took too long, and contained many errors in calculation for creating ephemeris. All this is history man. Astrologers invented the modern mathematics used today - algebra, geometry, trigonomery - and it is ok. That is the major problem here. The POV that denies these historical facts. What is wrong with having this known? More clearly - it was also practiced by people such as al-Khwarzimi, and many, many other astrologers who entered India after the Islamic invasion of that country and those in Europe later. This is historical fact. There is nothing "blurry" about it. The copious materials are there. The sources are there. They can be cited. They can be read. Taken out from the library. Bought, and read by anyone. I am not making this stuff up. Go see it for yourself.Theo 19:34, 12 January 2006 (UTC)
- Alright, explain to me were you think my reasoning goes wrong:
- Al-Khwarizmi wrote Al-Jabr wa-al-Muqabilah, a work of mathematics; therefore, Al-Khwarizmi was a mathematician.
- Al-Khwarizmi practiced astrology; therefore Al-Khwarizmi was an astrologer.
- Al-Khwarizmi was a male.
- Al-Khwarizmi was or did many more things.
- We shouldn't list ALL of these things in the algorithm article because they disrupt the flow of the text and are better explained in Al-Khwarizmi's biography.
- Therefore we should only list the property of Al-Khwarizmi that is the most relevant in the algorithm article.
- Therefore we should refer to Al-Khwarizmi as a mathematician.
- 19:50, 12 January 2006 (UTC)
Listen, don't play semantics with me. Games, ok? I took the time to write to you, cited sources, and then you come back with this? Are you a serious person? I know you are only 19-years-old Ruud, but, perhaps you may want to learn a little more about the world before getting stuck on your absolute knowledge, ok? The above notations by you are picking gnat s__ out of pepper. Give it a rest, ok? I knew this stuff when I was your age. I am in my mid-40s now, ok kid? You are rationalizing to suit your own POV. Stick to the facts. The sources are there, can be cited, and leave it at that. So, please, don't insult my intelligence here with items like the above "... was male" and "... was or did many more things" - please, give it a break Ruud, and either be a serious person, or go play elsewhere. There is NO disruption to the text whatsoever, and it is entered in the appropriate section. I suggest Ruud, that you actually be honest, and read the books, sources mentioned. Is that too hard for you to do? Or will you not let the facts interfere with your POV? That way, if you take a little time to actually read the materials, rather than wasting time on your fixed point POV - you actually see that astrologers were the mathematicians. Or, does it bother you that they were Hindus, and not western Europeans? Is that it. That Hindus discovered the value of zero, and algebra, etc.? Does that bug you? The sources are there and can be cited. No need to be simple here about it and reduce things to silliness. Ok? Read the materials, be an honest Wikipedian. Let the facts and the history speak for themselves.Theo 20:00, 12 January 2006 (UTC)
That is assuming that you think that I am not one of those people. I suggest you take the time to read the Tao of Physics. Perhaps you might learn more than your age suggests.
- And if you had cared to do the maths you would have known I'm 20. —Ruud 20:32, 12 January 2006 (UTC)
That is "math" Ruud, and yes, Happy Birthday Capricorn, I just had a birthday too a few days after your own, and am a January Capricorn too. I do apologize to you for my assertion that perhaps you might not like that Hindus discovered zero by the way; I don't believe you really think that. However, it does not take away from my point. Do your own homework, and do not assume that at the new age of 20 that you've learned all there is about the subject at hand, and would prefer to stick to a POV, and not read the historical fact. I took the time to actually read the link you sent. Suggest you actually take the time to find the books I referenced.Theo 21:07, 12 January 2006 (UTC)
- When someone (Theodore7, in this case, but whoever) resorts to ad hominem arguments, he or she has lost the debate. So, best ignore that one. Needs a thick skin sometimes, but that's life. Personally, I'll bet that an objective 19-year old (or 16-year old) is more likely to be right than a bigoted mid-40-year old. But that's just my experience as an even older person...quota
Perhaps, but then again - ad hominem - who asked you? Express knowledge on the issue at hand, and not POV that perhaps doesn't sit well with your own sensibilities. One size does NOT fit all, Quota. And, as an even older person, you should already know this. I also wouldn't say that I am bigoted, never was, and you don't know me to state this. So, I suggest you add something of value to the issue rather than your POV on what is "objective" and what is not. You would bet - and in this case, you'd lose it too. I do not easily discount age, and knowledge, for nothing. The arguments this 20-year-old makes for not reading the historical fact of the matter at hand make this point even more valid, since he would just as much not for the benefit of POV. Just as Ruud, who just left 19 for 20, knows more than he did when he was 9-10 years old, consider he'll know more at 30 and then 40. Moreover, Quota, if you are to insert yourself into this discussion - then do so from a position of adding something to the discussion, and spend time reading the Talk Page, rather than from a political-correctness that has no place in the discussion. Theo 21:07, 12 January 2006 (UTC)
Theo, let's strike a deal here:
- Give me time to read your sources. My marriage reception is on the 14th, and I'm travelling to the United States on the 18th. After that, I'll go to the library and find your sources and read them. Then we can talk about it.
- In return, I ask that you please not modify the article from mathematician to astrologer; to avoid pointless revert wars regardless of who is correct.
Do we have a deal? —Pradeep Arya 20:15, 12 January 2006 (UTC)
Congrats by the way on your marriage Pradeep. I have no problem; however, to have a mutual agreement, I would suggest that both terms astrologer/mathematician - as one line - be added in your article section. We will leave my edit, w/ the Boyer attribution w/paragraph, out of the history section until you've had the time to read the sources. Agreed?Theo 21:07, 12 January 2006 (UTC)
Well, I can't speak for the consensus, so I can't say "astrologer/mathematician" is an okay edit. The thing is, you've given a source, but nobody can verify that source yet. So like science articles, it needs peer review before it gets published. Changing the article without the peer review puts the cart before the horse, if you know what I'm getting at.
Please give me the time to read your sources. In the meantime, not making a change to the article avoids stirring up the ill-will of the other editors here. If you are correct, I'll have a much easier time convincing them of that fact if they didn't have to fight a revert war every day for a whole week.
Ultimately, everyone has to be convinced, and as you've already noted, the anti-astrology POV can be quite strong, especially in science-related matters. Trust me, getting your version up for five minutes every day isn't much of a victory when you lose the good faith of the other editors working with you.
Just give me a week, that's all I'm asking for. —Pradeep Arya 21:38, 12 January 2006 (UTC)
I see your point. Well said. Yes, I will agree to this. Take longer than a week. I am not a jerk. I'm not pushing POV. I just enjoy seeing the facts of the matter included - and not denied - based on ignorant, and sometimes hostile POV. At least you are willing to go and read the sources. So, take longer than a week. I'm not looking for a "victory" Pradeep, just the facts not being shoved aside to support POVs that are "anti-astrology" - while not even knowing what applied astrology is, and it sure is not "sun-sign newspaper astrology." POVer's not wanting to admit the historical facts that are sourced, and verifiable - while pretending to be the "voice of science and reason." What kind of "reasoning" throws out historical facts, and denies a science as astrology as false when they maintain little knowledge of it, do not practice it to be qualified to peer review? Science is verb to me. Not a staid noun. However, I do agree with your assessment Pradeep. I am tough-minded, but am not closed-minded, or a rude cynic, or ignorant, not to be able to listen, because I do listen to common sense and good advice. And your request is both. So, I will agree to this as long as you give your word, and will be honest enough that when the historical facts bear out as sourced, and verifiable, that you will then be able to help support the factual position, and expand on the piece without the "anti-astrology POV" based on ignorance of the subject, and denial of the true history. That's all I am asking. Ok? Other than this, I will have to agree with you, because you make perfect sense in this request. Have a wonderful reception, and trip Pradeep. All the best to you and your new wife.Theo 21:53, 12 January 2006 (UTC)
Translation
Who the hell came up with the following translation? "De numero Indorum" (Concerning the Art of Hindu Reckoning). If the English translation is an accurate translation of the Arabic title, then the Latin is wrong. If however the Latin is the correct title, then the English is wrong. (De Numero Indorum means "Concerning the Indians'/Hindus' Number"). Jim62sch 15:05, 15 January 2006 (UTC)
- Yes, also see al-Khwarizmi; more of his books suffer from this problem. Most references contradict each other (EB, TAoCP, MacTutor) and you'll find even more names on the Internet. I think part of the problem may lie in the fact that the English names come from the original Arabic names, while the Latin translations (of the text, not only the title) have different titles. Most of the names of his books seem to be abbreviated as well. I'm currently doing some research on this. —Ruud 15:19, 15 January 2006 (UTC)
Read Wikipedia Policy and Guideline
About the article: it is not so bad, and I wonder why you want to remove it.
About external links, I suggest you read the guidelines before deciding to suppress them. Or remove the good one instead.
The dictionary of algos is not marvellous; I spent a lot of time in its pages looking for what I wanted, without finding any valid code. Splang 12:31, 11 May 2006 (UTC)
- I think you may be confused—it's not being proposed to remove the article, it's being proposed that featured article status be removed from the article. That means that the star at the right top of the article would be taken off, as would the "Algorithm is a featured article" box at the top of this talk page. A "not so bad" article is not necessarily good enough to be a "featured" article. -R. S. Shaw 07:02, 13 May 2006 (UTC)
- I suggest you read the guidelines. —Ruud 16:34, 11 May 2006 (UTC)
- Ok. I have just read them. I even read them twice. The link I am speaking about, the "dictionary of algos", is linked twice at the bottom of the page; this is spam.
- I have removed promotional text and the names of two authors, which are useless, and you have restored them. Previously you had removed the link itself. This is not coherent, not logical, and not in line with the policy of Wikipedia. Splang 06:54, 13 May 2006 (UTC)
Proposal of inclusion
I propose we add more links to sources of algorithms. Reading about an algorithm without source code is absolutely useless. I have added one. If someone removes it or removes other links to source code of algorithms, I invite anyone to read the guidelines, visit the websites, and add them again if they think they are valuable content (this is according to the guidelines, to be sure the link is not added by the owner of the site). Splang 06:54, 13 May 2006 (UTC)
- There are plenty of links to implementations of specific algorithms in the articles for those algorithms. See Quicksort for just one example. There's no need to include links to implementations of specific algorithms here as well, particularly since the Algorithm article already contains wikilinks to the more specific articles. This article should be providing the lay-person with a general idea of what the general concept of "algorithm" (which is a far more general concept than just "source code") represents. If a reader wants more detail, they can click through to a more specific article. --Allan McInnes (talk) 17:21, 13 May 2006 (UTC)
- "which is a far more general concept". If you are right, we must create a Computer Algorithm page (this is actually redirected to the Algorithm page. Do you agree?
- About Quicksort, the page has source in Python. The syntax is unusual (significant indentation) and not the best language; links to sources in other languages are required (or the examples should be changed). Splang 06:58, 15 May 2006 (UTC)
- I don't agree with the need to create a "Computer Algorithms" article, because I can't see what you would put there that would be substantially different from the content of the current Algorithm article. Issues related to implementing specific algorithms on computers are covered in the articles for those specific algorithms.
- As for the Quicksort article, it contains:
- A pseudocode description of the algorithm
- An implementation in C
- An implementation in Haskell
- An implementation in Prolog
- A link to quicksort implementations in 28 different languages on Wikibooks
- Links to implementations in Java and C
- A link to implementations in a half-dozen different languages on the LiteratePrograms wiki.
- I fail to see the need for further links to implementations. Nor should further implementations be added to the article. Wikipedia is an Encyclopedia, not a source repository. There are plenty of other websites that provide source repository capabilities. Not to mention websites that provide directories of links. --Allan McInnes (talk) 18:55, 15 May 2006 (UTC)
Proposal of rewrite
The examples in the page must be rewritten in a language with a more conventional syntax. Pascal? Splang 07:03, 15 May 2006 (UTC)
- The examples in this article have been written in a form of pseudocode. This has the advantage of being accessible to lay-readers, language-neutral, more expressive than any one language, and the form used by both mathematicians and computer scientists in describing the way an algorithm works rather than just how it is implemented. I suggest that you read through Wikipedia:Algorithms on Wikipedia, and more importantly Wikipedia talk:Algorithms on Wikipedia, to get a feel for the arguments that have already taken place about which languages to use in articles (or whether any should be used). --Allan McInnes (talk) 15:43, 15 May 2006 (UTC)
- I'll continue the discussion of this page you have linked. Splang 14:08, 17 May 2006 (UTC)
Definition of Algorithm traceable to a documented source
Relative to a question RE "Church Turing Thesis" I pulled out my cc of Donald E. Knuth: The Art of Computer Programming, Second Edition, Volume 1/Fundamental Algorithms, Addison-Wesley, 1973. I am going to paste this into the article:
"Besides merely being a finite set of rules which gives a sequence of operations for solving a specific type of problem, an algorithm has five important features" (p. 4): "1) Finiteness 2) Definiteness 3) Input 4) Output 5) Effectiveness" (pp. 4-6)
With respect to (1) Finiteness:
"An algorithm must always terminate after a finite number of steps" (p. 4). " We should remark that the "finiteness" restriction is really not strong enough for practical use; a useful algorithm should require not only a finite number of steps, but a very finite number, a reasonable number" (p. 6, Knuth's italics) With respect to (2) Definiteness:
"Each step of an algorithm must be precisely definied; the actions to be carried out must be rigorously and unambinguously specified for each case" (p. 5) With respect to (3) Input:
"An algorithm has zero or more inputs, i.e., quantities which are given to it initially before the algorithm begins. This inputs are taken from specified sets of objects" (p. 5) With respect to (4) Output:
"An algorithm has one or more outputs, i.e., quantities which have a specified relation to the inputs" (p. 5) With respect to (5) Effectiveness:
"'This means that all of the opertions to be performed in the algorithm must be sufficiently basic that they can in principle be done exactly and in a finite length of time by a man using paper and pencil" (p. 6).
I am a believer that the algorithm is actually the machine, not just a list of instructions, but the machine itself, programmed to do xyz. But that's just my definition; Knuth's will suffice.wvbaileyWvbailey 15:13, 29 May 2006 (UTC)
Retrieved from "http://en.wikipedia.org/wiki/Talk:Church%E2%80%93Turing_thesis"
- Thanks for digging up the Knuth source following my question. I think it's unfortunate that your contribution got reverted. I suspect if the material had been paraphrased it would have stayed, but large quotes raise copyright issues.
- The following is an exchange with RUUD as in "Rude"
- quoting from taocp
- Hi, I reverted your edits to algorithm, as I doubt quoting that amount is allowed. By the way, in the third edition he gives al-Khwarizmi's name as "Abū ʿAbd Allāh Muḥammad ibn Mūsā al-Khwārizmī". Cheers, —Ruud 18:46, 29 May 2006 (UTC)
- My editions states: "All rights reserved. No part of this publication may be reproduced...". I know you can quote pretty much anything of any length in an academic paper (you have to if you don't want to be accused of plagiarism), but on Wikipedia you also have to think about copyright. I'm afraid you will have to do a bit of creative writing of your own (and attribute Knuth as well). —Ruud 18:56, 29 May 2006 (UTC)
Retrieved from "http://en.wikipedia.org/wiki/User_talk:Wvbailey"
- Grrr [wvbailey here]
- When I calm down I'm going to revert your revert. Of course this is allowed -- it's a quotation, not an unattributed copy. If you want to compromise, take out the first part re the history but put back the second part. It's important for the Church-Turing thesis page. wvbaileyWvbailey 18:50, 29 May 2006 (UTC)
- I'll bring this up-- contest this -- with whatever overseers there are here on Wikipedia. Please see WP:Dick. wvbaileyWvbailey 22:05, 29 May 2006 (UTC)
- Uhh... right? —Ruud 22:23, 29 May 2006 (UTC)
- I have no idea what I have done wrong to you, but stop the immature name calling please. —Ruud 17:36, 30 May 2006 (UTC)
Retrieved from "http://en.wikipedia.org/wiki/User_talk:Wvbailey"
- Look: you behave like a dick, you get treated like a dick. You don't know squat about US copyright law. Learn your manners: open a discussion before you revert. Here is the consequences of your actions: I treat you like a dick. You are not the first dick I've encountered on the Wikipedia, and it is always a self-centered dick attitude: "I'll just tear this guys work up because I can". I tell you what. Try to be constructive <== operative word. Then I won't think you're a dick. Don't just tear stuff up because you're puffed. That was good, thought-out work you tore out. I'm putting this cc on the comments page too. I'm sick of this. Don't write me any more, please.wvbaileyWvbailey 18:51, 30 May 2006 (UTC)
Comments re Historical information
- In the long term I'd like to see each of the criteria expanded and justified. This is such a fundamental issue that it'd be irresponsible to do otherwise! Pgr94 08:51, 30 May 2006 (UTC)
- Good luck. Mr. RUDE will revert you. I haven't decided how to deal with this yet. I have a shit-load more experience (about 40 years more) with this stuff than he does. But what to do?wvbaileyWvbailey 14:56, 30 May 2006 (UTC)
- But, thanks for the support. This is the first time in 100's of contributions that anyone has freaked about the copyright issue -- and believe me, these are short compared to others I've included. The issue should be moot because these are small quotes, not entire chapters, plus they are identified and attributed. What worries me about paraphrasing is the original wording is the best there is, I can't do better. So it ends up a true steal, even if attributed well.
About the 2nd point, I believe that the problem boils down to the nature of induction. If you look at induction, you see a starting point or "0" case that is "the input", then an action of some sort, then the use of the previous value, etc. etc. leading to the output. And why induction? (I can't say what I really want to say because I'm working on some stuff to publish.) Because: "Any property belonging to zero, and also to the immediate successor of every number that has the property, belongs to all numbers" (cf Nagel and Newman, Godel's Proof, p. 103). This is on the boundary between the philosophers (in particular Bertrand Russell, who fretted a lot about induction) and the mathematicians. The next thread is this: recursion theory. This is just another kind of inductive-like set of precisely-defined operations.
Historical information
Hollerith punchcards, Jacquard loom, telephone-switching technologies
The idea of a mechanical process entirely controlled by a fixed set of instructions:
The reference: Bell and Newell, Computer Structures: Readings and Examples, McGraw-Hill, 1971, diagram on page 39, shows these elements (Jacquard loom --> Hollerith punchcards and telephone-switching technologies) as roots of a tree leading to development of the first computers.
Jacquard loom: [?]
Hollerith punched cards: from http://www.maxmon.com/punch1.htm:
- The first practical use of punched cards for data processing is credited to the American inventor Herman Hollerith, who decided to use Jacquard's punched cards to represent the data gathered for the American census of 1890, and to read and collate this data using an automatic machine.
Telephone switching technologies:
From Valley News, Lebanon NH, Thursday, March 31, 1983: George Stibitz, inventor of the Model I Complex Computer in 1940, the "first computer". In 1937 he was working as a mathematical consultant for Bell Laboratories. The article cites the "burdensome" use of mechanical calculators with gears:
- "Steibitz thought that if standard telephone relays could be used to make mathematical calculations, the entire process would be simplified.
- " He went home one evening in 1937 intending to test his idea. He tinkered with two flashlight bulbs, some metal strips, two telephone relays, a dry cell and a few feet of wire.
- " When the tinkering was over, Stibitz had constructed a binary adding device" (p. 13)
Telegraph with tape, Teletype, stock-market ticker tape: the mechanical typewriter extended to automatism
Interesting observation about Turing in his biography --
- "There were, of course machines in existence which manipulated symbols. The typewriter was one such. Alan had dreamt of inventing typewriters as a boy; Mrs Turing had a typewriter; and he could well have bugun by asking himself what was meant by calling a typewriter 'mechanical'"(Hodges, p. 96)
Hodges uses the phrases "exactly determined behavior", "finite numbers of possible 'configurations'", "one at a time", "tape marked off into unit squares" etc. (p. 97)
When was the teletype with paper-tape punch invented? This is a far better example of a contraption that prints (rather, punches) symbols (symbols-as-dots arranged in code) into paper tape that it or another machine can read. The early Control Data computers had this as an input/output device, ditto for the Digital Equipment PDP 8. (I'm an eye-witness to all this stuff -- used it all with me very-own little fingers.)
The ticker tape machine is another example.
Gottlob Frege (1879), Giuseppe Peano (1889)
(boldface added) The following is quoted from van Heijenoort's Introduction to Frege's Begriffsschrift, a formula language, modeled upon that of arithmetic, for pure thought (1879):
- ...although a mere booklet of eighty-eight pages it is perhaps the most important single work ever written in logic. Its fundamental contributions ... are the truth-functional propositional calculus, the analysis of the proposition into function and argument(s) instead of subject and predicate, the theory of quantification, a system of logic in which derivations are carried out exclusively according to the form of the expression, and a logical definition of the notion of mathematical sequence"(p. 1)
- "a language that deals with "the conceptual content" and that he came to call 'Begriffsshrift' [footnote 6: ideography, 'concept writing']. This ideography is a "formula language", that is a ligua characterica, a language written with special symbols, "for pure thought", that is, free from rhetorical embellishments, "modeled upon that of arthmetic", that is, constructed from specific symbols that are manipulated according to definite rules"( p. 1)
Apparently Peano was unaware of the work of Frege [?]. The following is quoted from van Heijenoort's introduction to Peano's The principles of arithmetic, presented by a new method (1889). The symbols he used look very much like modern symbols.
- "Written in Latin, this small book wsa Peano's first attempt at an axiomatization of mathematics in a symbolic language. Peano had already (1888) used the Logic of Boole and Schröder in mathematical investigations and introduced into it a number of innovations...: for instance the use of different signs for logical and mathematical operations, and a distinction between categorical and conditional propostions that was to lead him to quantification...."(p. 81 ff)
David Hilbert (early 1900's)
Can we keep this tethered close to "algorithm"? The emerging idea of "precise" as in "steps of a proof", no wiggle-room allowed, no definitional flexibility: marks on a page without meaning other than that derived from their definitions: Logicism > Kronecker and Hilbert fight [pre-1900's] > Hilbert's program (Paris Conference[?] 1900: Hilbert's [2nd?] Problem. Leads to a quest to axiomatize mathematics and find a "proof of everything") > Subsequent Logicist a.k.a. Hilbert fights with the Intuitionists, in particular Brouwer > Russell's paradox in the emerging "set theory" > Hilbert's quest + Russell's paradox leads to Alfred North Whitehead's and Bertrand Russell's Principia Mathematica (PM), a thorough and complete axiomatization of "the marks on the page" that represent "Logic" and "arithmetic".
Kurt Godel (1931)
From his famous paper: "...proofs can be carried out according to a few mechanical rules"(U. p. 5) "The formulas of a formal system (we limit ourselves here to the system PM) are, considered from the outside, finite sequences of primitive symbols... and one can easily make completely precise which sequences of primitive symbols are meaningful formulas and which are not.... proofs are nothing but finite sequences of formulas"(U. p. 7) But this is not so simple. Davis commentary (U. 39):
- "since Godel's characterization of 'formal mathematical system' uses the notion of a rule's being constructive, that is, of there existing a 'finite procedure' for carrying out the rule, an exact characterization of what constitutes a finite procedure becomes a prime requisite for a completely adequate development.Church's thesis ...[see below, U p. 100-102] identifies the functions whcih can be computed by a finite procedure witt the class of recursive functions (general recursive functions)...[but]...Dr. Godel has stated in a letter that he was, a the time of these lectures [1934], not at all convinced that his concept of recursion comprised all possible recursions: and in that in fact the equivalence between his definition and Kleene's [cf U. pp. 237-252] is not quite trivial"[U p. 40]
Kleene's paper is not trivial at all.
Alonzo Church (1936)
And from the theories of recursion Alonzo Church defined "effective computability":
- "We now define the notion ... of an effectively calculable function of positive integers by identifying it with the notion of a recursive function of positive integers" (U p. 100)
- "It has been already pointed out that, for every function of positive integers which is effectively calculable in the sense just defined, there exists an algorithm for the calculation of its values.
- "Conversely it is true...that every function, an algorithm for the calculation of the values of which exists [thus an algorithm requires values to exist], is effectively calculable... the fact that the algorithm has terminated becomes effectively known... the relation between a positive integer and the expression which stands for it be effectively determinable..."(U pp. 100-101)
Also, commentary by the editor Martin Davis in the same volume:
- "This paper is principally important for its explicit statement (since known as Church's thesis) that the functions which can be computed by a finite algorithm are precisely the recursive functions..." (p. 88, The Undecidable)
Alan M. Turing (1936-1937)
"U" refers to "The Undecidable" (boldface added):
- "The "computable" numbers may be described briefly as the real numbers whose expressions as a decimal are calculable by finite means" (first sentence of turing's paper, U p. 116)
- "According to my definition, a number is computable if its decimal can be written down by a machine"(U p. 116)
Turing then equates his definition of "computability" with Church's "effective calculability"
- "The proof of equivalence...is outlined in an appendix to the present paper"(U. 117)
[His outlined-proof is not trivial. But he states the following at the outset, thus providing us with the thread from Church's "λ-definable" to his work:
- "The theorem that all effectively calculable (λ-definable) sequences are computable and its converse are proved below in outline."(U. p. 149)]
The two definitions in §1 of Turing's paper receive further detailed treatment in his §9:
- "All arguments which can be given are bound to be, fundamentally, appeals to intuition, and for this eason rather unsatisfactory mathematically. The real question at issue is "What are the possible processes which can be carried out in computing a number?" (U p. 135)
- "In particular, it follows that, if there is a general process for deteremining whether a formula of the Hilbert function calculus is provable, then the determination can be carried out by a machine." (U. p. 135)
Turing then approaches his assertion by whittling down the work of a "computer" (his use of this word denotes a man) to its essential form: "Computing is normally done by writing certain symbols on paper. We may suppose this paper is divided into squares like a child's arithmetic book....I assume then that the computation is carried out on one-dimensional paper, i.e. on a tape divided into squares. I shall also suppose that the number of symbols which may be printed is finite. "The behavior of the computer at any moment is determined by the symbols which he is observing, and his "state of mind" at that moment. We may suppose that there is a bound B to the number of symbols or squares which the computer can observe at one moment. If he wishes to observe more, he must use successive observations. We will also suppose that the number of states of mind which need be taken into account is finite [he justifies this]. He then reduces the "operations performed by the computer to be split up into 'simple operations' which are so elementary that it is not easy to imagine them further divided" (U p. 136). His reduction yields the following:
- "The simple operations must threfore include:
- "(a) Changes of the symbol on one of the observed squares
- ("b) Changes of one of the squres observed to another square within L squares of one of the previously observed squares.
- "It may be that some of these change necessarily invoke a change of state of mind. The most general single operation must therefore be taken to be one of the following:
- "(A) A possible change (a) of symbol together with a possible change of state of mind.
- "(B) A possible change (b) of observed squares, together with a possible change of state of mind"(U p. 137)
- "We may now construct a machine to do the work of this computer."(U p. 137)
Emil Post (1936)
Here is a remarkable coincidence of two men not knowing one another but describing a process of men-as-computers working on computations -- and they yield virtually identical definitions. Post's is even more atomized than Turing's. Plus he provides a STOP instruction, unlike Turing, who lets his machine go on forever calculating "figures" (1's and 0's):
- "Finite Combinatory Processes. Formulation I.
- "...a two way infinite seuqence of spaces for boxes... The problem solver or worker is to move and work in this symbol space, being capable of being in, and operating in but one box at a time.... a box is to admit of but two possible conditions, i.e. being empty or unmarked, and having a single mark in it, say a vertical stroke.
- "One box is to be singled out and called the starting point. ...a specific problem is to be given in symbolic form by a finite number of boxes [i.e. INPUT] being marked with a stroke. Likewise the answer is to be given in symbolic form by such a configuration of marked boxes[i.e. OUTPUT].
- "The set of directions ... sets up a deterministic process when applied to each specific problem. This process will terminate only when it comes to the direction of type (C) [STOP]. "
- "The writer expects the formulation to turn out to be logically equivalent to recursiveness in the sense of the Godel-Church development....our aim will be to show that all such [wider and wider formulations] are logically reducible to formulation 1. We offer this ... as a working hypothesis. And to our mind such is Church's identication of effective calculability with recursiveness" (U p. 289-290)
J.B. Rosser (1939)
Then, we see J.B. Rosser state [boldface added]:
- "'Effective method' is used here in the rather special sense of a method each step of which is precisely determined and which is certain to produce the answer in a finite number of steps. With this special meaning, three different precise definitions have been given to date. [foot note 5 references definitions of Church, Herbrand, Godel, Post and Turing. Godel/Herbrand's is "λ-definability", Post's is highly mechanical, similar to Turing uses to a man moving from box to box writing a mark or erasing it according to instructions, but also eventually halting, etc.]. The simplest of these to state (due to Post and Turing) says essentially that an effective method of solving certain sets of problems exists if one can build a machine which will then solve any problem of the set with no human intervention beyond inserting the question and (later) reading the answer. [thus: INPUT and OUTPUT, wvb comment]. All three definitions are equivalent ... the fact that all three are equivalent is a very strong argument for the correctness of any one. (J.B. Rosser, An Informal Exposition of Proofs of Godel's Theorem and Church's Theorem, Journal of Symbolic logic, vol. 4 (1939), reprinted in The Undecidable, p. 223ff).
Stephen C. Kleene (1943)
In 1943, Stephen C. Kleene actually defined as his "Thesis I" what is now known as "the Church-Turing Thesis". But he did this in the following context [boldface added]:
- "12. Algorithmic theories... In setting up a complete algorithmic theory, what we do is to describe a procedure, performable for each set of values of the independent variables, which procedure necessarily terminates and in such manner that from the outcome we can read a definite answer, "yes" or "no," to the question, "is the predicate value true?
- "We can express this by saying that we set up a second predicate: the procedure terminates in such a way as to give the affirmative answer. This last property of the second predicte we designate as the property of being effectively decidable.... etc." (reprinted on p. 273-274, the Undecidable, from Stephen c. Kleene, Recursive Predicates and Quantifiers, American Mathematical Society, 1943, Transactions, Volume 53, No. 1, pages 41-73).
Andrei N. Kolmogorov (1953)
The following is quoted from Blass and Gurevich, Algorithms: A Quest for Absolute Definitions, from the Bulletin of the European Association for Theoretical Computer Science 81, 2003. Kolmogorov attempts to define algorithm in a talk at the Moscow Mathematical Society (bold-face and italics added for emphasis):
- "*An algorithmic process splits into steps whose complexity is bounded in advance, i.e. the bound is independent of the input and the current state of the computation.
- "*Each step consists of a direct and immediate transformation of the current state.
- "*This transformation applies only to the active part of the state and does not alter the remainder of the state.
- "*The size of the active part is bounded in advance.
- "The process runs until either the next step is impossible or a signal says the solution has been reached." (Blass and Gurevich p. 9)
(Together with Uspensky, Kolmogorov created a new model of computation, a primitive machine now called a Kolmogorov machine.) (A Google search does find this, but it is not on Wikipedia.)
The following references are from Blass and Gurevich p. 29-30:
- [40] Andrei N. Kolmogorov and Vladimir A. Uspensky, "On the definition of algorithm", Uspekhi Mat. Nauk 13:4 (1958), 3-28, Russian; translated into English in AMS Translations 29 (1963), 217-245.
- [44] Yiannis N. Moschovakis, "What is an algorithm?", in B. Engquist and W. Schmid, editors, Mathematics Unlimited, Springer-Verlag (2001), 919-936.
- [56] Vladimir A. Uspensky and Alexei L. Semenov, "Algorithms: Main Ideas and Applications", Kluwer, 1993.
Kurt Gödel (1963)
In 1963, Godel appended to his famous paper the following (quoted from From Frege to Godel):
- "Note added 28 August 1963: In consequence of later advances, in particular of the fact that due to A.M. Turing's work [footnote 69] a precise and unquestionably adequate definition of the general notion of formal system [70] can now be given, a completely general version of Theorems VI and XI is now possible. That is, it can be proved rigorously that in every consistent formal system that contains a certain amount of finitary number theory there exist undecidable arithmetic propositions and that, moreover, the consistency of any such system cannot be proved in the system" (p. 616)
- [Footnote 69] references Turing's 1937 paper
- [Footnote 70] "In my opinion the term "formal system" or "formalism" should never be used for anything but this notion...formal systems in the proper sense of the term, whose characteristic property is that reasoning in them, in principle, can be completely replaced by mechanical devices"(p. 616, boldface and itaclics added) wvbaileyWvbailey 23:09, 2 June 2006 (UTC)
So there you have it -- the threads that bind the fabric of algorithm as we use the word today. As I said above there is more, but I'm keeping that to myself for the time being. (Hint: this philosophy goes back as far as David Hume. He made certain observations about minds that were taken up by Russell, in particular the nature of "animal logic" and simple enumeration (cf Russell, The Problems of Philosophy).)wvbaileyWvbailey 14:36, 30 May 2006 (UTC)
Donald Knuth (1970's)
His work represents a maturing of the newly-emergent computer science. Do we bring his work in here? Where he defines "algorithm" with his five bullet points, writes a 3-volume set? Provides algorithms galore, written for a hypothetical machine he calls "MIX". He's still in the process of finishing his work, per a Stanford Alumni magazine article (June[?] 2006).
Gregory Chaitin (2005)
Chaitin attributes "the creat[ion] of metamathemtics" [discussing mathematics with mathematics] to David Hilbert:
- "It was his idea that in order to be able to study what mathematics can achieve, we first have to specify completely the rules of the game. It was his idea to create a formal axiomatic system or FAS for all of mathematics, one that would eliminate all the vaugness in mathematical argumentation, one that would eliminate any doubt whether a mathematical proof is correct or not.
- "Mathematical proofs must be formulated in this language with nothing [his boldface] missing, with every tiny step of the reasoning in place" (p. 29-30, Meta Math! The Quest for Omega, Pantheon Books, NY, 2005)
Reference in article
Please note that it is inappropriate to refer readers of the Algorithm article to material on the talk page. I have removed the recently added reference to this section of the talk page. --Allan McInnes (talk) 23:13, 20 June 2006 (UTC)
- I read this ASR policy and I don't think a temporary reference of this sort violates it. But whatever. It was worth a try. Frankly I'm not too interested in Wiki's supposed "needs" --it has so many little silly "don't do this, don't do that", just like the Army -- but rather, I'm interested in the readers and their needs. I suggest that editors keep their customers always in mind. wvbaileyWvbailey 11:57, 21 June 2006 (UTC)
- It is precisely because of the needs of readers that this policy exists. Specifically, quoting from the policy in question:
- Put simply, this policy is about remembering that the goal of Wikipedia is to create an encyclopedia, not merely to perpetuate itself, so the articles produced should be useful even outside the context of the project used to create it. (emph. mine)
- A talk page is part of the infrastructure used to create an article. It's not part of the article itself. The talk page won't necessarily be picked up by other websites that draw on Wikipedia content (there are already quite a few that carry Wikipedia-derived content), and it certainly won't appear in a printed version of the encyclopedia. Whether or not the reference to the talk page is "temporary" is irrelevant — people take snapshots of this site all the time. If the content you've placed here is valuable (which it appears to be), either integrate it into the article itself, or wait for other editors to get around to doing so. --Allan McInnes (talk) 15:53, 21 June 2006 (UTC)
Is an algorithm the list of instructions or symbols + mind/machine in action?
This is a very serious question; I'm not joking. If the answer is, "Well, we've always stated that 'algorithm' is the list of instructions, so it must be true," I will then ask, "Is this a truth, or just an instance of received-but-flawed wisdom?", because...
...the above quotes would seem to indicate that even if the algorithm is defined as "the list of instructions", the algorithm also includes a proof of its correctness, including its ability to terminate, its ability to handle the set of inputs for which it has been defined, and something to do with "the output", which I can't even conceive of until "the algorithm" does its business.
Example #1: In the context of a Turing machine: Is the algorithm just "the TABLE" of instructions (the instructions in the finite-state portion of the machine, or perhaps the instructions of a U-machine on tape), or the whole Turing machine (including the symbols on the tape at the start and at the end), or the machine in operation actually cranking out its number(s), or what?
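To make the question concrete, here is a minimal sketch (my own illustration in Python, not taken from any source quoted above) in which "the TABLE" is a static data structure and the "machine in action" is the loop that interprets it against a tape:

# TABLE: the static list of instructions -- (state, scanned symbol) -> (write, move, next state)
TABLE = {
    ('find_end', '1'): ('1', +1, 'find_end'),  # skip over the 1s already on the tape
    ('find_end', ' '): ('1', +1, 'halt'),      # append one more 1, then halt
}

def run(table, tape):
    """The machine in action: repeatedly look up the TABLE and act on the tape."""
    tape = list(tape) + [' '] * 10             # finite scratch area standing in for the unbounded tape
    state, head = 'find_end', 0
    while state != 'halt':
        write, move, state = table[(state, tape[head])]
        tape[head] = write
        head += move
    return ''.join(tape).rstrip()

print(run(TABLE, '111'))                       # prints '1111': the unary successor of 111

The TABLE by itself computes nothing; only the run produces the '1111' -- which is exactly the ambiguity Example #1 is pointing at.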
Example #2: A slide rule is three sticks with symbols on them, the center stick sliding between the two outer sticks. Where does "the algorithm" for multiplying two numbers with a slide rule reside? In the mind of the slip-stick operator? In the book of instructions that came with the slip-stick? Both? If our slip-stick operator wants to multiply, say, 2.35 x 4.78, she has to carry out a sequence of mechanical operations. Is this the algorithm, with her reading out the number at the conclusion?
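For what it's worth, the mechanical procedure she follows amounts to adding two lengths proportional to logarithms, since log(a) + log(b) = log(a*b). A rough sketch of that procedure (again my own illustration, not from any source above):

import math

def slide_rule_multiply(a, b):
    """Imitate the slip-stick: add the two 'lengths' log a and log b, then read off the
    product to roughly three significant figures, about what the operator's eye resolves."""
    length = math.log10(a) + math.log10(b)     # physically: sliding the center stick
    product = 10 ** length                     # physically: reading the answer off the fixed scale
    return float(f'{product:.3g}')             # the read-out step, limited by the scale's resolution

print(slide_rule_multiply(2.35, 4.78))         # about 11.2 (exact answer 11.233)

The code only restates the question, of course: is the algorithm these few lines, the logarithm identity, or the operator carrying them out?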
Example #3: The words of a "procedure" (e.g. for multiplication) written on a page are meaningless marks without a (human) body to read/scan them and a (human) mind to assign them "meaning" ("scan right-to-left"), distinguish the marks ("Is that a 1 or a 7?"), recognize them ("the first mark is an L, not a number; subsequent marks signify "Left", and that means..."), associate them to actions ("Left" --> (means) "move pencil one space with muscle group #453"), and then act ("make mark on page, then scan eyes down to next line"), etc. So where is "the algorithm" for multiplication? What is it, really, really?
If you read the quotes above, including the reverted one from Knuth, you see all these words -- "procedure", "formal system", "mechanical device", "sequence of operations", "input", "output", "termination", etc. These sound to me less like "a list of instructions" than like an active process that includes the "input" wedged between a finite sequence of operations and the "output", together with a "proof of termination."
The problem is that, historically, an algorithm always involved a human computer with "a mind" (and, e.g., paper and pencil, a stick and dirt to scribble in, an abacus, a slip-stick, whatever). But now we have automatic machines that seem to confuse what was historically assumed without statement -- that "algorithm" meant symbols plus a mind to interpret and manipulate the symbols.
Comments anyone? wvbaileyWvbailey 16:14, 4 June 2006 (UTC)
- These are interesting questions. However, we can't invent a new definition of algorithm for Wikipedia. Nor can we attempt to infer a definition by synthesizing the works of Turing, Kleene, et al. The best we can do within the policies of Wikipedia is rely on other people's synthesis (Knuth's for example). The quotes and references that you have provided are great, and would certainly make a welcome addition to the "History" section of the article. And perhaps they do point to a "true" definition of "algorithm" that is different than the received wisdom. But we cannot make that leap here — we can only report on leaps that others have made. --Allan McInnes (talk) 19:48, 4 June 2006 (UTC)
Allan McInnes is correct as far as Wiki policy goes. I respect the NOR policy. But I am on a deep-fishing expedition: Has anybody out there ever run into this idea, and if so can they point to some sources? Meanwhile I'm on the hunt. This thought/concept/definition has been brewing and gurgling and percolating for a few months and now it's finally bubbled to the surface. Thanks, wvbaileyWvbailey 22:54, 4 June 2006 (UTC)
- As far as I can see the following article doesn't precisely answer the question it asks -- Blass and Gurevich, "Algorithms: A Quest for Absolute Definitions" -- but it does strongly address the notion of "algorithm" as a "machine that does the computation". Gurevich has stated in private communication that "an algorithm is indeed a machine in action." wvbaileyWvbailey 20:16, 2 July 2006 (UTC)
http://research.microsoft.com/~gurevich/Opera/164.pdf
- This article was originally published in the Bulletin of the European Association for Theoretical Computer Science 81, 2003.
The authors state that an algorithm is indeed a process that is a machine of one sort or another. Apparently Kolmogorov (1953) was the forerunner of this idea; he later continued his thesis with his student Uspensky.
This is the authors' quote from Kolmogorov-Uspensky (1958):
- "...It seems plausible to us that an arbitrary algorithmic process satisfies our definition of algorithms [sic]. We would like to emphasize that we are talking not about a reduction of an arbitrary algorithm to an algorithm in the sense of our definition but that every algorithm essentially satisfies the proposed definition." (quoted from Kolmogorov-Uspensky 1958, translated 1963 [40], page 10 in Blass-Gurevich)
The authors go on to ask the question:
- "Is it possible to capture ( = formalize) sequential algorithms on their natural levels of abstraction? Furthermore, is there one machine model that captures all sequential algorithms on their natural levels of abstraction? According to [27], the answer to both questions is yes. (page 15 of .pdf)
- Reference [27] is:
- Yuri Gurevich, "For every sequential algorithm there is an equivalent sequential abstract state machine", ACM Transactions on Computational Logic, vol. 1, no. 1 (2000), pp. 77-111.
In three postulates the authors define what a "sequential algorithm" is. Then they state this theorem:
- "Theorem 5.6 (ASM [Abstract State Machine] Characterization of Small-Step Algorithms) For every sequential algorithm A there is a sequential abstract state machine B behaviorally equivalent to A. In particular, B simulates A step for step.
The authors define something similar for "wide-step" algorithms in terms of "parallel abstract state machines" -- theorem 6.2.
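For readers who haven't met Gurevich's abstract state machines, here is a very rough approximation (my own sketch in Python, not the formal definition in [27]): the state is a structure, and each small step fires a finite set of guarded updates computed against the current state and applied simultaneously. Euclid's gcd, for instance:

def asm_step(state):
    """One small step in ASM style: compute all updates against the old state, then apply them at once."""
    a, b = state['a'], state['b']
    updates = {}
    if b != 0:                     # the guard
        updates['a'] = b
        updates['b'] = a % b
    state.update(updates)          # simultaneous update of the affected locations
    return state

state = {'a': 252, 'b': 105}
while state['b'] != 0:
    state = asm_step(state)
print(state['a'])                  # 21, the greatest common divisor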
The authors' paper includes an extensive bibliography of 56 references. wvbaileyWvbailey 20:16, 2 July 2006 (UTC)
Jeff Edmonds
Someone (I'm assuming it's User:Jeff Edmonds, although the recent additions have been by User:70.49.118.220) keeps adding external links to:
I've removed them for now as spam. Anyone (aside from Jeff) think these links are worth including? From my reading of WP:EL these links might be considered to count as "textbooks", and therefore acceptable. OTOH, the fact that Jeff is the one adding them seems a little self-promotional, so it'd be nice to get some other opinions. --Allan McInnes (talk) 18:31, 5 June 2006 (UTC)
- Update: I see that someone added
- Edmonds, Jeff A. (2007). How to Think About Algorithms, Loop Invariants and Recursion (Preprint ed.). Cambridge House Press.
- to the reference list. Since there was no editing of the article itself aside from the addition of this "reference", I think it's safe to assume that this book wasn't actually used as a reference in writing the article. As a result, I have removed it from the reference list. --Allan McInnes (talk) 18:34, 5 June 2006 (UTC)
I don't think it was meant as spam (at least the first time), but I see little reason to include the link. There are dozens of lecture notes on algorithms available online and I don't see why this one is any better than the others. —Ruud 12:52, 6 June 2006 (UTC)
- Yeah, "spam" may have been too strong a word. Sorry 'bout that. But I think the links pretty clearly fall into the WP:EL "links to avoid" category, as links to a "website that you own or maintain". I've already pointed Jeff to WP:EL, and asked him to follow the guidance there regarding posting such links on the talk page first. Instead, User:70.49.118.220 reinserted the links. So I've posted the links here myself. --Allan McInnes (talk) 18:33, 6 June 2006 (UTC)
Too big
This article is too big (for me). I suggest splitting it, especially by moving and rewriting the "classification" part. Acaciz 13:14, 3 July 2006 (UTC)