This is the talk page for discussing improvements to the GPT-3 article. This is not a forum for general discussion of the article's subject.
Archives: Index, 1 · Auto-archiving period: 3 months
This page is not a forum for general discussion about GPT-3. Any such comments may be removed or refactored. Please limit discussion to improvement of this article. You may wish to ask factual questions about GPT-3 at the Reference desk.
This article is rated C-class on Wikipedia's content assessment scale. It is of interest to multiple WikiProjects.
If your option was "Others" in the previous question please mention the source.*
If your option was "Others" in the previous question please mention the source.* 2409:4055:107:DC6F:0:0:26D0:20A1 (talk) 06:01, 25 January 2024 (UTC)
175B Parameters?
There is much discussion online about the number of parameters in the model, and as far as I can tell there is no clear consensus. The cited paper for the 175B parameter claim does not contain this information. I believe this claim to be unsupported. 156.57.89.183 (talk) 18:42, 9 March 2024 (UTC)
- This article defines GPT-3 as the model family from the paper "Language Models are Few-Shot Learners." That paper explicitly identifies the largest GPT-3 model as having 175 billion parameters.
- The connection between the models in that paper and the models released via the OpenAI API was poorly documented for a long time. However, two and a half years later, it was disclosed by OpenAI via a page that has since been taken down but has been archived [1]. I've updated the link in the article to point to the archived copy of the webpage. We also cite an EleutherAI blog post, which appears to be the first third-party research identifying the correct answer for all model sizes; it predates OpenAI's disclosure by a year and a half. Now that we have official confirmation, the third-party research is less relevant, but it makes sense to me to include it for historical reasons. Stellaathena (talk) 03:15, 4 September 2024 (UTC)
Discussion about procrastinate
Discussion about procrastinate 210.23.168.11 (talk) 12:16, 14 March 2024 (UTC)
- Additional information needed: I think you are referring to the following, although your comment was too sparse to determine your intent.
- I, for one, am unsure what you wish to have done. Peaceray (talk) 16:03, 14 March 2024 (UTC)
Experiential strategy: A way to understand various literature as perceived by the selected literature teacher in Lemery Colleges, Inc., A.Y. 2024-2025
conceptual framework 103.180.50.10 (talk) 13:11, 29 April 2024 (UTC)
- Not done You have provided no citation information, & a preliminary Google search returns nothing useful. Peaceray (talk) 14:03, 29 April 2024 (UTC)