This is the talk page for discussing improvements to the Decimal article. This is not a forum for general discussion of the article's subject.
Article policies
Archives: 1, 2 (auto-archiving period: 2 years)
This level-4 vital article is rated C-class on Wikipedia's content assessment scale. It is of interest to multiple WikiProjects.
Consider adding to Intro
Consider adding the following text to the intro of the article, as discussing the limitations of the base 10 system would be beneficial for readers curious about such information. The proposed text is in brackets. [Though the decimal (base 10) system is widely used, it is not free of criticism. As 10 is divisible only by itself, 1, 2, and 5, proponents of other base number systems (such as base 12) argue that using a base number with more factors would make math simpler (as 3, 4, and 6 go evenly into a dozen). (See https://en.wikipedia.org/wiki/Duodecimal.)]
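For illustration only (not proposed article text), a minimal Python sketch comparing the divisors of the two bases discussed above:
<syntaxhighlight lang="python">
# Compare the divisors of two candidate bases; 12 has more small
# factors than 10, which is the dozenal argument sketched above.
def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(10))  # [1, 2, 5, 10]
print(divisors(12))  # [1, 2, 3, 4, 6, 12]
</syntaxhighlight>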
- This article is about the decimal base, not about the comparison of bases, which is the subject of the article Number base. Moreover, your formulation is not suitable for an encyclopedia; for a suitable tone, see the dozenal entry in the table in Number base § In numeral systems. D.Lazard (talk) 13:58, 28 April 2022 (UTC)
- I see your point, and thanks for informing me. Disregard the edit suggestion. 104.35.147.45 (talk) 22:42, 28 April 2022 (UTC)
Add subsection on conversion
I suggest adding a subsection on conversion to numbers with different bases. AXONOV (talk) ⚑ 15:17, 28 May 2023 (UTC)
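As a sketch of what such a subsection might cover, here is a minimal Python example of the repeated-division method for the integer part; the function name to_base is illustrative, not an established convention:
<syntaxhighlight lang="python">
# Convert a non-negative integer from base ten to its digit string
# in another base (2..16), by repeated division by the target base.
def to_base(n, base):
    digit_chars = "0123456789abcdef"
    if n == 0:
        return "0"
    out = ""
    while n > 0:
        out = digit_chars[n % base] + out  # digits come out least significant first
        n //= base
    return out

print(to_base(2024, 2))   # '11111101000'
print(to_base(2024, 16))  # '7e8'
</syntaxhighlight>
The fractional part can be handled similarly, by repeated multiplication by the target base.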
Decimals in the Vedas
@Dragoon17: Sorry, I overlooked your "fv" edit summary here. If you've checked the Atharva Veda and it doesn't support the claim, tagging the reference with {{Failed verification}} would be appropriate, but IMHO deleting the paragraph is too radical. Peter Brown (talk) 22:39, 11 June 2023 (UTC)
- Hi. Yes, I did check it. It's a primary source, actually, of a hymn. I did search google books/scholar for "Atharvaveda" + decimal fractions (with and without a space in "Atharva Veda") and it turned up nothing beyond a couple of references to place value (related to the text having words for the numbers 10^4, 10^8, and so on). A google search of the specific citation turns up Hindu twitter accounts and websites, and nothing scholarly. This exact citation was, I believe, copied directly from "Hindupedia", although that page does not really assert what this sentence asserts. Certainly none of it appears to support the statement that the text makes use of "mathematical decimal fractions", which also conflicts with the information in this article, which puts the first use of decimal fractions around a thousand years after that. But I do not feel strongly enough about it to undo your reversion. Dragoon17 (talk) 04:46, 12 June 2023 (UTC)
- I have deleted it as it comes under WP:NOR. അദ്വൈതൻ (talk) 22:05, 8 July 2024 (UTC)
"Generally", "usual"
I have some concerns about this text:
Generally, a decimal has only a finite number of digits after the decimal separator. However, the decimal system has been extended to infinite decimals for representing any real number, by using an infinite sequence of digits after the decimal separator (see decimal representation). In this context, the usual decimals, with a finite number of non-zero digits after the decimal separator, are sometimes called terminating decimals.
Where does this "generally" come from? I think the most common usage is actually as an approximation to some measured quantity. For example, "this desk is 90.2 centimeters wide" means something like "I can measure the desk's width to a precision of roughly 0.1 cm, and when I do that it comes out 90.2. It's probably between 90.15 and 90.25, but I'm not signing over my firstborn in case it's a little outside that range."
Now, to the extent that the desk's width is completely well-defined (which of course it probably isn't, but let's go with it for now), if you had better and better measurement tools, you could keep going, adding more and more digits, without any obvious fixed limit.
So "generally" there's no actual fixed finite number of digits to which a decimal should be expanded. The exception would be if you had some reason to believe the exact answer were of the form for some natural numbers and , which is almost never the case.
That's in the "real-life" case. In mathematics, for representing real numbers, actually all decimal expansions are infinite; it's just that some eventually start spitting out the value 0 and never stop.
Not sure how best to fix this; it is good to have somewhere to point terminating decimal, as ultimately unimportant as the concept is. --Trovatore (talk) 01:08, 15 January 2024 (UTC)
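A minimal Python sketch of the test alluded to above: a reduced fraction has a terminating decimal expansion exactly when removing all factors of 2 and 5 from its denominator leaves 1, i.e. when the number is of the form m/10^n. The function name terminates is illustrative only:
<syntaxhighlight lang="python">
from fractions import Fraction

# A reduced fraction terminates in base ten exactly when its
# denominator has no prime factors other than 2 and 5.
def terminates(f):
    d = Fraction(f).denominator
    for p in (2, 5):
        while d % p == 0:
            d //= p
    return d == 1

print(terminates(Fraction(1, 8)))  # True  (0.125)
print(terminates(Fraction(1, 3)))  # False (0.333...)
</syntaxhighlight>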
- IMO, it suffices to replace "Generally" with "Originally and in most uses". Even in mathematics, infinite decimals are not really used, except for the study of the cardinality of the real numbers. I agree that "Generally" is confusing, as it may be wrongly understood as "most decimals are finite". D.Lazard (talk) 08:35, 15 January 2024 (UTC)
- I actually think the first sentence should just be removed, and the second sentence should not talk of an "extension" but rather should say that the expansion may be infinite. Then the third sentence should change to saying that a "terminating decimal" is one that can be expressed with a finite number of digits (of course it isn't really finite, it just starts becoming all zeroes, but we can leave that out in this spot). --Trovatore (talk) 20:40, 15 January 2024 (UTC)
- I strongly oppose your suggestion. Basically, a (finite) decimal is a numeral that allows representing some numbers, namely the decimal fractions. Infinite decimals are not numerals, as most of them cannot be written; they may be viewed as increasing sequences or series, and have a very different nature. Finite decimals have been known in Western Europe since Fibonacci, while the formal definition of an infinite decimal requires actual infinity. So, saying that infinite decimals are an extension of finite decimals is coherent with history and with the level of abstraction needed to understand the concepts. Saying, as you suggest, that finite decimals are only a special case of infinite decimals is logically correct but, IMO, a form of pedantry, consisting of asking the reader to understand complicated things before using simpler and more useful ones. D.Lazard (talk) 21:53, 15 January 2024 (UTC)
- The thing is, the finite decimals are not about representing numbers exactly (and since you brought up history, they never have been about representing numbers exactly). They're about representing numbers approximately. The only ones they can represent exactly are those rationals with a denominator whose only prime factors are 2 and 5, which is not a particularly interesting class of numbers in either mathematics or science (oh, I expect there's some interesting mathematics you can do with them, but whatever it is, I doubt it's very relevant to this discussion).
- So I think it's misleading (and by the way unsourced) to call out the finite decimals as a special class. It's particularly dangerous given that many readers of this article are already foggy on the difference between numbers and numerals, and already think there's something special about base 10.
- My suggestion is not intended to add pedantry; it's mostly about removing language that I think is currently problematic. I'm happy to rethink it to avoid any active suggestion that the finite decimals are a special case. I just don't think we should walk readers down a garden path towards the idea that they're the main case. --Trovatore (talk) 22:21, 15 January 2024 (UTC)
- Nobody is saying that finite decimals represent all numbers exactly. They represent decimal fractions exactly, and are widely used to approximate real numbers. Infinite decimals are essentially used only for the study of the cardinality of the continuum. So the main case is definitely the finite case, except possibly for specialists of mathematical logic. IMHO, infinite decimals should not be taught before college. In any case, they should not be taught to people who do not master approximations. D.Lazard (talk) 22:42, 15 January 2024 (UTC)
- For precision you should say something like "decadic fraction" rather than decimal fraction, as an infinite decimal arguably is a "decimal fraction". ("Fraction" here is in the sense of part of something, not in the sense of ratio of integers.)
- It's true that finite decimals represent decadic fractions exactly, but those are fairly unimportant and definitely not the motivation for the notation.
- One test example: We should probably point out early that 2.3 means something different from 2.30, as the latter implies higher precision (see the sketch after this comment).
- I am not suggesting that we should lead heavily with infinite decimals. I do think we should remove the language that I've called out. --Trovatore (talk) 22:50, 15 January 2024 (UTC)
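For illustration, Python's standard decimal module happens to model this distinction: it preserves the trailing zero of "2.30" even though the two values compare equal.
<syntaxhighlight lang="python">
from decimal import Decimal

# The two numerals denote the same value but record different precision.
print(Decimal("2.3") == Decimal("2.30"))  # True  (equal as numbers)
print(Decimal("2.3"), Decimal("2.30"))    # 2.3 2.30 (trailing zero kept)
print(Decimal("2.30").as_tuple())         # digits (2, 3, 0), exponent -2
</syntaxhighlight>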
"you should say something like "decadic fraction"" – in your own textbook you can make up whatever neologism you like. Wikipedia should stick to standard terminology where it exists (and in the rare exceptional case where a non-standard term is needed for whatever reason, should be extremely clear to point that out).
"finite decimals [are] unimportant and definitely not the motivation for the notation" – what does this mean? Which notation are you referring to? –jacobolus (t) 23:51, 15 January 2024 (UTC)
- I wasn't suggesting that we should add that term to the article. I was pointing out that the term "decimal fraction" does not in itself capture the restriction to finite decimals, and proposing a term D.Lazard could have used in this discussion.
- Also I didn't say finite decimals were unimportant. What I said, or at least what I meant, is that the numbers exactly represented by finite decimals (the ones of the form m/10^n) are unimportant (as a class), and irrelevant to the motivation for decimal-fraction notation. --Trovatore (talk) 00:25, 16 January 2024 (UTC)
- A couple more remarks: Your claims about the limited applications that infinite decimals are "used" for seem to apply only to actually infinite decimal expansions. Potentially infinite expansions have been known much longer, and schoolchildren are familiar with the idea that you can do decimal division of 1 by 3 and it keeps spitting out 3s, and the more advanced ones will know that there's an algorithm for computing (say) √2, and it will keep spitting out digits without ever settling into a fixed repetitive pattern. (A sketch of the long-division algorithm follows this comment.)
- This language that emphasizes finite decimal expansions seems to exclude even this rather basic piece of knowledge.
- But surely we don't want to go onto a digression about actual versus potential infinity, certainly not in the lead, and probably not in this article at all. It's an almost Scholastic distinction that has little to do with the topic given by the title. --Trovatore (talk) 23:10, 15 January 2024 (UTC)
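A minimal Python sketch of the schoolbook long-division process mentioned above, which can keep producing digits for as long as you let it run; the function name digits is illustrative only:
<syntaxhighlight lang="python">
# Produce the first n digits after the decimal point of p/q by
# long division: multiply the remainder by ten, divide, repeat.
def digits(p, q, n):
    out = []
    r = p % q
    for _ in range(n):
        r *= 10
        out.append(r // q)
        r %= q
    return out

print(digits(1, 3, 8))  # [3, 3, 3, 3, 3, 3, 3, 3]  keeps spitting out 3s
print(digits(1, 7, 8))  # [1, 4, 2, 8, 5, 7, 1, 4]  repeating block 142857
</syntaxhighlight>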
redirect from "base 10"
edit"base 10" might as well be binary, seximal, octal, dozenal or almost any other base
perhaps there should be a {{redir}}, something like ""Base 10" redirects here. For other uses, see List of numeral systems#Standard positional numeral systems."? Pandaqwanda (talk) 09:58, 17 February 2024 (UTC)
- Base 10 redirects to Decimal; base 8 redirects to Octal; base 6 redirects to Senary; base 12 redirects to Duodecimal; base 16 redirects to Hexadecimal. Template {{redir}} is useful only for ambiguous redirects. This is not the case here. D.Lazard (talk) 10:19, 17 February 2024 (UTC)
- i started the topic because, in the context of numeral systems, there isn't really such a thing as "base 10". in almost all bases, "10" is how you write the base itself: it can be two and hence base 10 can be binary, it can be five and hence base 10 can be quinary, etc
also, the page in question seemingly doesn't refer to decimal as "base 10" anywhere
Pandaqwanda (talk) 13:11, 17 February 2024 (UTC)
- In English, when 10 is written without other specification, this means the number ten, not the numeral representing another number in another base. It is true that 10 is a numeral that represents a base in the base itself. It seems that you confuse "numeral 10" with "base 10". D.Lazard (talk) 14:07, 17 February 2024 (UTC)
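A small Python illustration of the point just made, using the built-in int parser:
<syntaxhighlight lang="python">
# In any base b >= 2, the two-digit numeral "10" denotes the base itself.
for b in (2, 5, 8, 12, 16):
    print(b, int("10", b))  # int("10", b) == b for every base b
</syntaxhighlight>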
- i see, nevermind then.
resolved – consensus: keep the redirect. Pandaqwanda (talk) 14:58, 17 February 2024 (UTC) (edited the {{resolved}} to lowercase Pandaqwanda (talk) 15:09, 17 February 2024 (UTC)) (edited the {{resolved}} to say "resolved" instead of "redirect", mixed up words at the time pandaqwanda (talk) 08:41, 19 February 2024 (UTC))