User:JPxG/LLM demonstration

For the proposed guidelines based on this demonstration, see WP:LLM.

The following is the result of my experiments with OpenAI ChatGPT (GPT-3.5, based on InstructGPT) in December 2022 (as well as some experiments with GPT-3-davinci-003). These are both large language models (LLMs) trained on a large corpus of web text; they do not have the capability to "look things up", search the Web, or examine sources. Telling these models to "write an article" about something will produce large volumes of bullshit in the vague style of a Wikipedia article. Do not do this and then put it on Wikipedia.

The models use transformers with multi-head attention to complete sequences, replying to prompt sequences with their best idea of what would be most likely to come after them. This can be hacked very easily: if you say "The following is a list of reasons why it is good to eat crushed glass", it will give you one. Do not do this and then put it on Wikipedia.
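The "most likely continuation" behavior can be illustrated with a toy sketch: a bigram counter standing in for the transformer. Everything below (the training text, the greedy decoding) is invented for illustration and is many orders of magnitude simpler than the real thing, but the failure mode is the same: the model obligingly continues whatever premise you hand it.

```python
from collections import Counter, defaultdict

# Toy "language model": count which word follows which in the training
# text, then complete a prompt by repeatedly appending the most common
# next word. Real LLMs do the same job at vastly larger scale, with
# attention over the whole context instead of just the previous word.
training_text = (
    "eating crushed glass is good because it is crunchy . "
    "eating crushed glass is good because it is shiny ."
)

counts = defaultdict(Counter)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    counts[prev][nxt] += 1

def complete(prompt_word, length=6):
    out = [prompt_word]
    for _ in range(length):
        followers = counts[out[-1]]
        if not followers:
            break
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

print(complete("eating"))  # eating crushed glass is good because it
```

Note that the toy model never asks whether the premise is a good idea; it just extends the sequence. That is the whole "list of reasons to eat crushed glass" problem in miniature.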

That being said, there are obvious situations in which their detailed understanding of virtually all publicly accessible Internet text (e.g. HTML tutorials, codebases from GitHub, the full text of Wikipedia including our policies and guidelines) means that they are able to contribute greatly to the process of editing and writing, if they are used by an intelligent editor who does not blindly paste LLM output into the edit window and press "save".

Much like human beings, they will occasionally make errors, or say things that are not completely true. This means that they should be used only by an intelligent editor who does not blindly paste LLM output into the edit window and press "save".

Asking them to do tasks which they are not suited to (i.e. tasks which require extensive knowledge or analysis of something that you don't type into the prompt window) makes these errors much more likely. This is why a LLM should only be used by an intelligent editor who does not blindly paste LLM output into the edit window and press "save".

Hopefully, you are starting to pick up on a leitmotif. In case I was too subtle, I will say it again: Large language model output should only be used in the process of editing Wikipedia if you are an intelligent editor who does not blindly paste LLM output into the edit window and press "save".

To demonstrate this, I present several examples of these models assisting in routine tasks.

I have included the full text of my own prompts below, as well as the model's responses (which I present unedited).

Note that, while I have not cherry-picked these results, I have worked rather extensively with GPT-series models in the past, and my results are likely to be much better than what a random inexperienced user would be capable of getting. Professional driver, closed course...

ChatGPT

The majority of these experiments were carried out using ChatGPT, which can be accessed through OpenAI's website.

Introduction: please tap the sign

"Please format this markup as an extremely obnoxious floating box with loud colors and large text" (open for full prompt)
The following is MediaWiki markup:
'''Large language model output should only be used in the process of editing Wikipedia if you are an intelligent editor who does not blindly paste LLM output into the edit window and press "save"'''.
Please format this markup as an extremely obnoxious floating box with loud colors and large text.
Output

Large language model output should only be used in the process of editing Wikipedia if you are an intelligent editor who does not blindly paste LLM output into the edit window and press "save"

Can you add a second sentence, about how if you use LLMs to vomit a bunch of nonsense, you are going to get your ass beaten, and nobody will even try to save you, because you are a fucking moron who was told very clearly that you were supposed to know what you were doing and check it before the edits went live? Except make it sound professional so I don't get yelled at for swearing.

Additionally, if you use LLMs to generate content that is not well-researched or well-written, you may face consequences. It is important to thoroughly review and edit any content generated by LLMs before publishing it on Wikipedia. Failure to do so can result in negative consequences for your account and reputation as an editor.

Can you put that into the extremely obnoxious floating box, and also reformat the box so that the text is centered, in searing magenta, on a bright chartreuse background, and surrounded with attention-grabbing emojis like 💥 and ☢️?

💥Large language model output should only be used in the process of editing Wikipedia if you are an intelligent editor who does not blindly paste LLM output into the edit window and press "save"💥

☢️Additionally, if you use LLMs to generate content that is not well-researched or well-written, you may face consequences at the administrators' noticeboard. It is important to thoroughly review and edit any content generated by LLMs before publishing it on Wikipedia. Failure to do so can result in negative consequences for your account and reputation as an editor.☢️

Do you think that's enough to make people read it?

It is possible that the bright colors and attention-grabbing emojis may catch the reader's attention and encourage them to read the message. However, it is ultimately up to the individual reader to decide whether or not they will take the time to read the content.

Notes

Okay, hopefully you have heard it enough times that we can get started.

Wikitext formatting and table rotation

This is a table that I had lying around in my sandbox for when I was setting up the Oracle. Let's see if ol' Gippit can work with it.

Input
speedy keep | keep | merge | redirect | delete | speedy delete
still open | withdrawn | no consensus | transwiki | userfy / draftify | (bot couldn't parse result)
"Please rewrite this table so that it is rotated (i.e. that there are two columns and six rows, with "speedy keep" at the top left and "still open" at the top right)." Open for full prompt.
The following code generates a table with two rows and six columns.
{| class="wikitable" style="width:100%"
|-
| style="background:#34fb34; width:16%" | '''speedy<br/>keep'''
| style="background:#98fb98; width:16%" | '''keep'''
| style="background:#fbfb98; width:16%" | '''merge'''
| style="background:#fbc998; width:16%" | '''redirect'''
| style="background:#fb9898; width:16%" | '''delete'''
| style="background:#fb3434; width:16%" | '''speedy<br/>delete'''
|-
| colspan="6" |
|-
| style="background:#e5eaf5; width:16%" | '''still open'''
| style="background:#a0a6b0; width:16%" | '''withdrawn'''
| style="background:#fbfb34; width:16%" | '''no consensus'''
| style="background:#98fbfb; width:16%" | '''transwiki'''
| style="background:#9898fb; width:16%" | '''userfy / draftify'''
| style="background:#fe36fe; width:16%" | '''(bot couldn't<br/>parse result)'''
|}

Please rewrite this table so that it is rotated (i.e. that there are two columns and six rows, with "speedy keep" at the top left and "still open" at the top right).
Output
speedy keep   | still open
keep          | withdrawn
merge         | no consensus
redirect      | transwiki
delete        | userfy / draftify
speedy delete | (bot couldn't parse result)
Extended dialogue
This doesn't account for the blank row in the original table. Can you include it as a blank column between the two in your previous output?
speedy keep   |   | still open
keep          |   | withdrawn
merge         |   | no consensus
redirect      |   | transwiki
delete        |   | userfy / draftify
speedy delete |   | (bot couldn't parse result)
The middle column is way too wide. Why could this be?

The middle column is too wide because there are two columns in the original table that have been combined into one in the rotated version. To fix this, the colspan attribute should be set to "1" instead of "2" for each cell in the middle column. This will make each cell in the middle column take up only half the width of the original cells in the original table.

Here is the updated code with the correct colspan values:

speedy keep   |   | still open
keep          |   | withdrawn
merge         |   | no consensus
redirect      |   | transwiki
delete        |   | userfy / draftify
speedy delete |   | (bot couldn't parse result)
Thanks, buddy. You did good.

You're welcome! I'm glad I could help. Let me know if you need any further assistance.

Note: Its revised version of the table still looks kind of crappy, because it is only changing the colspan of the middle column, instead of the overall width of the table. The table is specified at width:100%, and each column is specified at width:16%, so the middle column is going to be 68% no matter what it's specified as. This is not a very straightforward error, and I didn't even notice it was wrong until I saved the page and the table looked like crap and I looked at the code a second time.
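The width arithmetic behind that error is easy to check by hand:

```python
# Why the middle column renders at 68% regardless of its colspan:
# the table is specified at width:100%, the two content columns are
# pinned at width:16% each, and the unspecified middle column simply
# absorbs whatever width is left over.
table_width = 100        # style="width:100%"
content_column = 16      # style="width:16%" on each content column
middle_column = table_width - 2 * content_column
print(middle_column)     # 68
```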

Notes

As far as I can tell, ChatGPT was not specifically trained on MediaWiki markup, so its knowledge is entirely based on its understanding of HTML and inferring from the single example of a table I gave. A version which was fine-tuned on a large corpus of MediaWiki markup (i.e. not rendered Wikipedia pages) would probably perform even better.

Parsing of references, estimation of reliability (ARES FMG)

This is an article from Special:Random about a rifle. I converted the source to list-defined references and asked Gippit what it thought about the sources. For this, I used the full unedited spaghetti of the {{cite web}} templates, without attempting to render them into something readable first.

Input
"Can you tell me which of these sources seem reliable, and which are likely to fail to meet Wikipedia's guidelines?" Open for full prompt.
The following is the source for the reference section of a Wikipedia article called "ARES FMG".

{{reflist|1=30em|refs=
<ref name="USPTO">Warin, Francis J. {{US Patent|4625621}} filed Apr. 2, 1984; issued Dec. 2, 1986.</ref>
<ref name="WG-ARES">{{cite web | last = Popenker | first = Max R. | author2 = Oleg Volk | title = ARES FMG / folding submachine gun (USA) | url = http://world.guns.ru/smg/smg70-e.htm | publisher = World.Guns.ru | year = 2005 | accessdate = 2007-11-29 | archiveurl = https://web.archive.org/web/20071024012918/http://world.guns.ru/smg/smg70-e.htm | archivedate = 24 October 2007 | url-status = live}}</ref>
<ref name="ar0yr">{{cite web | author = Kitsune | title = Tomlinson Industries ARES II FMG | url = http://www.kitsune.addr.com/NinjaSpies/NinjaSpies-Equipment/TI_ARES_II_FMG.htm | publisher = Kitsune's Web Page | year = 2001 | accessdate = 2007-11-29 | archiveurl = https://web.archive.org/web/20071027170511/http://kitsune.addr.com/NinjaSpies/NinjaSpies-Equipment/TI_ARES_II_FMG.htm | archivedate = 27 October 2007 | url-status = dead}}</ref>
<ref name="I8SjX">http://www.fullautoclassics.com/ucsar_artical.pdf {{Bare URL PDF|date=March 2022}}</ref>
<ref name="ljCk8">{{cite web | last = Popenker | first = Max R. | title = PP-90 submachine gun (Russia) | url = http://world.guns.ru/smg/smg10-e.htm | publisher = World.Guns.ru | year = 2005 | accessdate = 2007-11-29 | archiveurl = https://web.archive.org/web/20071024163829/http://world.guns.ru/smg/smg10-e.htm | archivedate = 24 October 2007 | url-status = dead | df = dmy-all}}</ref>
<ref name="vnTii">{{cite web | last = Crane | first = David | title = MagPul FMG 9: Prototype 9mm Folding Submachine Gun | url = http://www.defensereview.com/modules.php?name=News&file=article&sid=1098 | publisher = Defense Review | year = 2008 | accessdate = 2008-02-14 | archiveurl = https://web.archive.org/web/20080304130229/http://www.defensereview.com/modules.php?name=News&file=article&sid=1098 | archivedate = 4 March 2008 | url-status = live}}</ref>
}}


Can you tell me which of these sources seem reliable, and which are likely to fail to meet Wikipedia's guidelines?
Output

It is difficult to determine the reliability of these sources without further information. The first source, "USPTO," appears to be a United States patent filed in 1984 and issued in 1986. This may be considered a reliable source as it is a government document.

The second source, "WG-ARES," is a web page from World.Guns.ru, which may not be considered a reliable source as it is a personal website.

The third source, "ar0yr," is a web page from Kitsune's Web Page, which is also likely not considered a reliable source.

The fourth source, "I8SjX," is a PDF file with no specified author or publisher, so it is difficult to determine its reliability.

The fifth source, "ljCk8," is from World.Guns.ru and may not be considered a reliable source.

The sixth source, "vnTii," is from Defense Review, which may be considered a reliable source as it is a publication focused on defense and military technology. Overall, it is best to verify the reliability of these sources before using them in a Wikipedia article.

How do you know how {{cite web}} templates work?

I do not have personal knowledge of how the {{cite web}} templates work in Wikipedia. I was able to identify the template based on its formatting and the presence of information commonly found in citations for web pages, such as the author, title, and URL. Wikipedia has guidelines and documentation on how to use its citation templates, which can be found on its website.

Check out the big brain on ChatGPT. Thanks again.

Thank you for your appreciation!

Notes

So it is possible for Gippit to wade through huge piles of wikitext. This is good news.
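For comparison, merely pulling the individual fields out of a {{cite web}} template doesn't require a language model at all. A rough sketch of that (a naive splitter, not a real wikitext parser; it will mis-handle nested templates or parameter values that contain pipes):

```python
def parse_cite_web(template):
    """Crude field extractor for a single {{cite web}} template.
    Splits on '|' and '=', so nested templates or piped values
    inside a field will confuse it."""
    inner = template.strip().strip("{}")
    fields = {}
    for part in inner.split("|")[1:]:  # skip the "cite web" name itself
        if "=" in part:
            key, _, value = part.partition("=")
            fields[key.strip()] = value.strip()
    return fields

cite = ("{{cite web | last = Crane | first = David "
        "| title = MagPul FMG 9: Prototype 9mm Folding Submachine Gun "
        "| publisher = Defense Review | year = 2008}}")
print(parse_cite_web(cite)["publisher"])  # Defense Review
```

The interesting part of what the model did above is not the parsing, which is mechanical, but the commentary; and, as the next paragraph explains, that commentary has a hard limit.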

But here we can see one of the important limitations of the model: it is physically impossible for it to "look things up". I will say this again, because it is important: it is physically impossible for it to "look things up". There is literally no way for it to browse a website and then return content from that website! In this example, it says that "Defense Review" is a reliable source and "Kitsune's Web Page" is not, but it doesn't know that. For all we know, "Kitsune's Web Page" is the official homepage of the Department of Defense's KITSUNE initiative (note: it is not). If you look up the RSN discussion for Defense Review, you will see that in 2019 it was a personal website, but that as of 2022 it has a separate owner and manager. I'm sure Gippit would be able to figure this out quite easily if it were capable of browsing... but it'sn't.

However, note also that it's aware of this limitation, and tells you about it.

Recommendations for article improvement or deletion (Qarah Daghli)

This is a geostub about an Iranian village. I prompted it with the full source of the article (minus categories, infobox, and authority control templates).

"Can you think of some areas for improvement?" (Open for full prompt)
The following is the source of a Wikipedia page called "Qarah Daghli".
'''Qarah Daghli''' ({{lang-fa|قره داغلي}}, also [[Romanize]]d as '''Qarah Dāghlī''' and '''Qareh Dāghlī''')<ref>{{GEOnet3|-3767814|Qarah Daghli}}</ref> is a village in [[Sheykh Musa Rural District]], in the [[Central District (Aqqala County)|Central District]] of [[Aqqala County]], [[Golestan Province]], [[Iran]]. At the 2006 census, its population was 1,007, in 221 families.<ref>{{IranCensus2006|27}}</ref>
Can you think of some areas for improvement?
Output

One potential area for improvement could be providing more context or background information about Qarah Daghli. The current text only provides basic facts about the village's location and population. Providing additional information about the history, culture, or notable features of the village could make the page more informative and interesting to readers.

Another potential improvement could be to include more references or citations for the information provided on the page. Currently, only two references are provided, but including additional references from reliable sources can help to verify the accuracy of the information and give readers the opportunity to learn more about the village.

Finally, the text could be edited for clarity and readability. For example, the sentence structure and phrasing could be revised to make the information easier to understand, and any grammar or spelling errors could be corrected. These changes could help to improve the overall quality of the page.

What might an AfD nomination for this article look like?

Qarah Daghli is a small village in Iran with a population of only 1,007. The article lacks significant notability and fails to meet the criteria for inclusion on Wikipedia. It is recommended that this article be nominated for deletion.

What might a convincing, well-written AfD nomination for this article look like?

This article, "Qarah Daghli", does not meet the criteria for inclusion in Wikipedia as it lacks significant coverage in reliable sources. The only sources cited in the article are GEOnet and the Iran Census 2006, which do not provide sufficient information about the village's significance or notability. Additionally, the village's population of 1,007 people and its location in Golestan Province, Iran do not make it a notable topic for an encyclopedia. Therefore, I propose that this article be deleted.

What might a convincing, well-written AfD nomination for this article look like, if written in the style of Kurt Vonnegut?

Dear fellow Wikipedia editors,

I come to you with a heavy heart to nominate the article on "Qarah Daghli" for deletion. This obscure village in Iran has no significance to the larger world and its inclusion on our esteemed website serves no purpose.

Sure, it may have a population of 1,007 and be located in the Sheykh Musa Rural District of Aqqala County, but who cares? Its inclusion on Wikipedia only clutters up our pages and serves as a distraction from the truly important articles we have to offer.

Let us rid ourselves of this meaningless blip on the map and focus on the articles that truly matter. Let us say goodbye to Qarah Daghli and move on with our lives.

Yours truly, Kurt Vonnegut

Notes

The suggestions for improvement seem excellent. I don't think people should be allowed to use this for AfD nominations. I especially don't think people should be allowed to use this for Kurt Vonnegut-style AfD nominations: how could we compete?

Plot summary condensation (The Seminar)

This is an episode of The Office with an extremely long plot summary.

"This plot summary is too long. Can you shorten it to be less than five hundred words?" (Open for full prompt)
The following is a plot summary from a Wikipedia article called "The Seminar".
Michael Scott (Steve Carell) bumps into David Brent (Ricky Gervais) outside an elevator. After they exchange several ethnic jokes, David asks if Dunder Mifflin is hiring. Michael says "not right now" and David asks him to drop a line if they do have any openings.
In order to improve his worst-in-office sales totals, Andy Bernard (Ed Helms) holds a seminar on starting a small business. He enlists Jim Halpert (John Krasinski) as one of the guest speakers, but Jim abruptly pulls out when he apparently recognizes one of the attendants. The other speakers—Dwight Schrute (Rainn Wilson), Phyllis Vance (Phyllis Smith), and Stanley Hudson (Leslie David Baker)—lose faith in Andy's plan and also pull out, leaving Andy to replace them with Kevin Malone (Brian Baumgartner), Creed Bratton (Creed Bratton) and Ryan Howard (B. J. Novak). Kevin's speech goes bad when he gets ill from running around the room to Ozzy Osbourne's "Crazy Train," and vomits in front of the audience. Kelly Kapoor (Mindy Kaling), who replaces Ryan, calls up an old professor from her management training time at Yale to talk business lessons. Creed delivers a bizarre speech about the Loch Ness Monster.
Erin Hannon (Ellie Kemper) is battling Gabe Lewis (Zach Woods) in a game of Scrabble played over their phones. The winner of their Scrabble games always picks the movie they watch, and since Erin is poor at the game, Gabe has already subjected her to numerous horror movies such as The Shining, Rosemary's Baby and The Ring. As she has only been using cow-based words, with the help of Oscar Martinez (Oscar Nunez) and Pam Halpert (Jenna Fischer), she almost beats Gabe but loses on the last turn, because she does not realize Oscar is trying to feed her a winning word ("apoplexy") and instead puts in "ape".
After Jim spends all day outside of the office doing meaningless activities, Pam forces him to tell the camera crew why he is avoiding the guy. He and the man were childhood friends who were placed in separate reading groups in school (Jim was in the superior one). After his mom told him to spend time with the kids in his own reading group, Jim told his friend that "my mom thinks you're too dumb to hang out with." Jim bumps into the man in the break room, and tries to laugh off their history, but the man mocks Jim for not being as successful as his superior intellect would have indicated, and snarks "Where's your jet pack, Zuckerberg?"
During a break in Andy's seminar, Dwight talks to one of the attendees, a man who owns a golf supply business, and realizes this could actually be a good opportunity for the sales team. He, along with Phyllis and Stanley, try to rejoin Andy, who refuses at Darryl's advice. At the end of the seminar, Andy is able to sell three packages, thanks to the advice of Michael, who has taken a Greek persona of "Mykonos" in order to impress Holly Flax (Amy Ryan), who has broken up with AJ. With Holly playing his wife "Necropolis", Michael professes his love to her (in character). She becomes uncomfortable at this and walks away but Michael is hopeful.
At the end of the day, Gabe shows Erin the movie they will be watching that night, Hardware, which involves a killer robot (calling it a compromise because Erin wanted to watch WALL-E). As he is leaving, Andy casually loans Erin a copy of Shrek 2, saying he thinks she would like it. A deflated Gabe realizes that Erin is far more excited about the movie Andy recommended her than the one he picked out. 

This plot summary is too long. Can you shorten it to be less than five hundred words?
Output

Michael Scott bumps into David Brent and they exchange jokes. David asks if Dunder Mifflin is hiring, but Michael says no. Andy Bernard holds a seminar to improve his sales, but the speakers pull out. Andy replaces them with Kevin Malone, Creed Bratton and Ryan Howard. Kevin gets ill and vomits in front of the audience. Kelly Kapoor calls an old professor to talk business lessons, and Creed gives a bizarre speech about the Loch Ness Monster. Erin Hannon and Gabe Lewis play a game of Scrabble, but Erin loses. Jim Halpert avoids a guy from his past, but they eventually talk in the break room. During a break in Andy's seminar, Dwight talks to a golf supply business owner and realizes it could be a good opportunity for the sales team. At the end of the seminar, Andy is able to sell three packages, thanks to Michael. At the end of the day, Gabe shows Erin a movie, but she is more excited about the movie Andy recommended her.

I think Gabe and Erin are annoying. Could you write another version of the plot summary where they get transferred to the Chicago branch during the cold open, are written out of the show, and are never heard from again?

In the cold open, Gabe and Erin are transferred to the Chicago branch and are written out of the show. Michael Scott bumps into David Brent and they exchange jokes. David asks if Dunder Mifflin is hiring, but Michael says no. Andy Bernard holds a seminar to improve his sales, but the speakers pull out. Andy replaces them with Kevin Malone, Creed Bratton and Ryan Howard. Kevin gets ill and vomits in front of the audience. Kelly Kapoor calls an old professor to talk business lessons, and Creed gives a bizarre speech about the Loch Ness Monster. Jim Halpert avoids a guy from his past, but they eventually talk in the break room. During a break in Andy's seminar, Dwight talks to a golf supply business owner and realizes it could be a good opportunity for the sales team. At the end of the seminar, Andy is able to sell three packages, thanks to Michael. At the end of the day, the office continues on without Gabe and Erin.

Hell yeah. Thanks again.

You're welcome! I'm glad you enjoyed the revised plot summary.

Notes

The original summary was a more-or-less accurate description of the episode, albeit with some strange aspects. For example, "Michael says 'not right now' and David asks him to drop a line if they do have any openings" kind of implies he blew him off; in the actual scene, Michael is greatly enamored of David, gives him a tight hug, and remarks "what a nice guy" after their conversation.

But this is a problem with the original description. The model cannot fix this. Unlike me, it never watched the episode: all it has to go by are the prompt it's given and whatever background information it's picked up from its training data. This it does with aplomb, hitting the major points of the episode while trimming a lot of background detail. Of course, we can hem and haw about which detail it trims: was Jim meeting his old classmate a major part of the episode? Was Holly breaking up with AJ an important development of her character arc? I think the standard we have to compare it to is not a nerd who has seen every episode of The Office multiple times, but some random editor who's never seen the show and notices that an article has an excessively long plot summary. The latter is much more likely to be the person who eventually copyedits this!

That said: I honestly think that the show would have been better without Gabe and Erin. No offense to the actors -- the guy who played Gabe was outstanding in Silicon Valley -- but it always felt like they were trying too hard to be quirky, and I think part of it was a failed attempt to recreate some of the early-season chemistry between Jim and Pam after their marriage took most of the romantic tension out of the show.

Extraction of additional content from reference metadata (Aradus paganicus)

This is an extremely short stub about a bug. The entire body text of the article is this:

  • Aradus paganicus is a species of flat bug in the family Aradidae. It is found in North America.

But it looks like there is some additional -- cited -- information lurking in the infobox and the references. Is that anything?

"Can you expand the text of the article, using only the information in the infobox and citation templates?" (open for full prompt)
The following is a Wikipedia article called "Aradus paganicus".
{{Short description|Species of true bug}}
{{Speciesbox
| genus = Aradus
| species = paganicus
| authority = Parshley, 1929
}}

'''''Aradus paganicus''''' is a species of [[flat bug]] in the family [[Aradidae]]. It is found in North America.<ref name=itis/><ref name=gbif/><ref name=buglink/>

==References==
{{Reflist|refs=
<ref name=itis>
{{Cite web| title=''Aradus paganicus'' Report
| url=https://www.itis.gov/servlet/SingleRpt/SingleRpt?search_topic=TSN&search_value=107557
| website=Integrated Taxonomic Information System
| accessdate=2019-09-25
}}</ref>

<ref name=gbif>
{{Cite web| title=''Aradus paganicus''
| url=https://www.gbif.org/species/2020063
| website=GBIF
| accessdate=2019-09-25
}}</ref>

<ref name=buglink>
{{Cite web| title=''Aradus paganicus'' species Information
| url=https://bugguide.net/node/view/454193
| website=BugGuide.net
| accessdate=2019-09-25
}}</ref>
}}

{{Taxonbar|from=Q10416736}}

[[Category:Aradidae]]
[[Category:Articles created by Qbugbot]]
[[Category:Insects described in 1929]]

{{pentatomomorpha-stub}}

Can you expand the text of the article, using only the information in the infobox and citation templates?
Output

Aradus paganicus is a species of flat bug in the family Aradidae. It was first described by Parshley in 1929 and is found in North America. It is a member of the pentatomomorpha group of insects.

Notes

Indeed, it was described by Parshley in 1929, and it is a member of Pentatomomorpha. Furthermore, this information was in the references.

This task requires careful phrasing to avoid random addition of supplementary details (e.g. the average size of flat bugs, their feeding habits, and the like). This is why I said "using only the information [...]". This is something you need to keep in mind if you are trying to use this model.

Proseline repair (Scott Tremaine)

""Proseline", a style of writing with extremely short (often single-sentence) paragraphs that all begin with a date and describe a single thing that happened on that date. It is generally considered bad form. Can you fix this, and copyedit the above section so that it is written as a smaller number of longer paragraphs?" (open for full prompt)
The following is a section from a Wikipedia article named "Scott Tramaine". In 2020, he was elected a Legacy Fellow of the American Astronomical Society in 2020.

In 2013, he won the Tomalla Foundation Prize for his work on gravitational dynamics.

In 2010, he received an honorary doctorate from the University of Toronto "in recognition of his scholarly contributions to the field of astrophysics, and his administrative leadership in support of Canadian and international science".

In 2005, he won the Research Award from the Alexander von Humboldt Foundation.

In 2002, he was elected to membership in the National Academy of Sciences.

In 1999, Tremaine also received an honorary Doctor of Science degree from St. Mary's University.

In 1998, he won the Dirk Brouwer Award which is awarded by the Division of Dynamical Astronomy of the American Astronomical Society[36] "in recognition of his many outstanding contributions to a wide range of dynamical problems in both solar-system and galactic dynamics."

In 1997, he was awarded the Dannie Heineman Prize for Astrophysics for "diverse and insightful applications of dynamics to planets, rings, comets, galaxies and the universe."

In 1996, he was awarded an honorary Doctor of Science award by McMaster University.

In 1994, Tremaine became a Fellow of the Royal Society of London and also of the Royal Society of Canada.

In 1990, he was awarded the Rutherford Memorial Medal in Physics by the Academies of Arts, Humanities and Sciences of Canada for "his outstanding contributions to the field to [sic] astrophysics, particularly his spectacular success in predicting the properties of planetary ring dynamics and the extraplanetary objects that control them".

In 1990, he won the C.S. Beals Award from the Canadian Astronomical Society which is awarded for outstanding research to a Canadian astronomer or an astronomer working in Canada.

In 1983, he won the Helen B. Warner Prize for Astronomy given by the American Astronomical Society in recognition of "his many outstanding contributions to a wide range of dynamical problems in both solar-system and galactic dynamics".

"Proseline", a style of writing with extremely short (often single-sentence) paragraphs that all begin with a date and describe a single thing that happened on that date. It is generally considered bad form. Can you fix this, and copyedit the above section so that it is written as a smaller number of longer paragraphs? 
Output

Scott Tramaine is a renowned astrophysicist who has received numerous awards and accolades for his research and administrative leadership in the field. In 2020, he was elected a Legacy Fellow of the American Astronomical Society. In 2013, he won the Tomalla Foundation Prize for his work on gravitational dynamics. In 2010, he received an honorary doctorate from the University of Toronto. In 2005, he won the Research Award from the Alexander von Humboldt Foundation. In 2002, he was elected to membership in the National Academy of Sciences.

In addition to these more recent achievements, Tremaine has also been recognized for his contributions to astrophysics over the past several decades. In 1999, he received an honorary Doctor of Science degree from St. Mary's University. In 1998, he won the Dirk Brouwer Award from the Division of Dynamical Astronomy of the American Astronomical Society. In 1997, he was awarded the Dannie Heineman Prize for Astrophysics. In 1996, he was awarded an honorary Doctor of Science award by McMaster University. In 1994, Tremaine became a Fellow of the Royal Society of London and also of the Royal Society of Canada.

In 1990, Tremaine won two prestigious awards: the Rutherford Memorial Medal in Physics by the Academies of Arts, Humanities and Sciences of Canada, and the C.S. Beals Award from the Canadian Astronomical Society. In 1983, he won the Helen B. Warner Prize for Astronomy from the American Astronomical Society. These awards recognize his significant contributions to the fields of solar-system and galactic dynamics.

Thanks!

You're welcome! I'm glad I could help improve the section. Let me know if you need help with anything else.

Notes

The output text contains all of the awards listed in the section, without adding anything untrue. There are some minor style issues: it adds some editorializing like "renowned astrophysicist" and "prestigious awards". It could also be argued that it's editorializing by taking the "significant contributions" part about the Helen B. Warner Prize out of quotation marks to put it in wikivoice.

However, if I can editorialize for a second: anybody with half a brain (i.e. the ability to look at the input and output text) is capable of recognizing this and amending it before hitting "save".

Identification and tagging of unreferenced statements (KBVA)

"Can you add "[!]" to the end of all the statements in this article that don't cite inline references?" (expand for full prompt)
The following is text from a Wikipedia article titled "KBVA".
KBVA (106.5 MHz) is a commercial FM broadcasts in HD with 2 sub channels radio station licensed to Bella Vista, Arkansas, and serving Southwest Missouri, Northwest Arkansas and northeast Oklahoma. The station is owned by John Lykins, through licensee Rox Radio Group, LLC. The station airs a combination of Oldies and Classic Hits formats, with hits from the 50s, 60, 70s and 80s, with a more enhanced and a larger playlist. The playlist includes a couple of adult standards each hour from artists such as Frank Sinatra, Nat King Cole, Barbra Streisand and Perry Como. KBVA carries the syndicated John Tesh "Intelligence for Your Life" show in afternoons. The station contains hourly news updates which are provided by ABC News Radio.
The studios and offices are on American Street in Springdale, Arkansas.[1] The transmitter is off Y City Road in Decatur, Arkansas.[2] KBVA's signal is heard in sections of Arkansas, Missouri and Oklahoma.
History
In November 1991, the station signed on the air.[3] KBVA's original owner was Gayla Joy Hendren McKenzie, the daughter of Republican politician Kim Hendren and the sister of current state senator Jim Hendren. The station carried political advertising for Kim Hendren's unsuccessful bid in 2010 for the United States Senate.
McKenzie sold KBVA to Hog Radio effective July 31, 2017 for $1.15 million, which rebranded the station from "Variety 106.5" to "Lite 106.5". It is the only radio station in the Fayetteville market to have an Adult Standards format in 2021.
Hog Radio sold KBVA, three sister stations, and a translator to John Lykins' Rox Radio Group, LLC for $3 million effective January 27, 2021. 
Can you add "[!]" to the end of all the statements in this article that don't cite inline references?

KBVA (106.5 MHz) is a commercial FM broadcasts in HD with 2 sub channels radio station licensed to Bella Vista, Arkansas, and serving Southwest Missouri, Northwest Arkansas and northeast Oklahoma. The station is owned by John Lykins, through licensee Rox Radio Group, LLC. [!] The station airs a combination of Oldies and Classic Hits formats, with hits from the 50s, 60, 70s and 80s, with a more enhanced and a larger playlist. [!] The playlist includes a couple of adult standards each hour from artists such as Frank Sinatra, Nat King Cole, Barbra Streisand and Perry Como. [!] KBVA carries the syndicated John Tesh "Intelligence for Your Life" show in afternoons. [!] The station contains hourly news updates which are provided by ABC News Radio.

The studios and offices are on American Street in Springdale, Arkansas.[1] The transmitter is off Y City Road in Decatur, Arkansas.[2] KBVA's signal is heard in sections of Arkansas, Missouri and Oklahoma. [!]

History

In November 1991, the station signed on the air.[3] KBVA's original owner was Gayla Joy Hendren McKenzie, the daughter of Republican politician Kim Hendren and the sister of current state senator Jim Hendren. [!] The station carried political advertising for Kim Hendren's unsuccessful bid in 2010 for the United States Senate. [!]

McKenzie sold KBVA to Hog Radio effective July 31, 2017 for $1.15 million, which rebranded the station from "Variety 106.5" to "Lite 106.5". [!] It is the only radio station in the Fayetteville market to have an Adult Standards format in 2021. [!]

Hog Radio sold KBVA, three sister stations, and a translator to John Lykins' Rox Radio Group, LLC for $3 million effective January 27, 2021. [!]

Notes

It missed a couple (the first sentence, and the one at the end of the first paragraph). Nonetheless, it is easy to see how an intelligent person who does not blindly input the LLM text into the edit box and automatically press "publish changes" might use this to save labor.


GPT-3

This morning, I found that ChatGPT had been kablooeyed by excess volume, and was currently unavailable. In its stead, I will use GPT-3.

Table formatting and color schemes (Ashcroft-Cache Creek-Clinton Transit System)

Take a look at this table, from Ashcroft-Cache Creek-Clinton Transit System.

Route | Route Name | Type | Areas Served | Scheduling | Notes
1 | Ashcroft-Cache Creek-Clinton | Regional | Ashcroft, Cache Creek, Clinton | Monday-Friday | Twice daily

This is a pretty normal-looking table, but what if we wanted it to be different? Luckily, GPT-3 understands hex codes, and is capable of coming up with reasonable color schemes.

"The following is the same table, with the colors changed to be those of the University of Michigan." (open for full prompt)
The following is a table from a Wikipedia article called "Ashcroft-Cache Creek-Clinton Transit System".

{| border="0" cellspacing="4" cellpadding="8" style="border: black solid 1px;"
| Style="background: #FFFFFF; color: black; text-align: center;" | Route
| Style="background: #FFFFFF; color: black; text-align: center;" | Route Name
| Style="background: #FFFFFF; color: black; text-align: center;" | Type
| Style="background: #FFFFFF; color: black; text-align: center;" | Areas Served
| Style="background: #FFFFFF; color: black; text-align: center;" | Scheduling
| Style="background: #FFFFFF; color: black; text-align: center;" | Notes
|-
| Style="background: #050099; color: white; text-align: center;" | '''1'''
| Style="background: #F1F1F1; color: black; text-align: center;" | '''Ashcroft-Cache Creek-Clinton'''
| Style="background: #F1F1F1; color: black; text-align: center;" | Regional
| Style="background: #F1F1F1; color: black; text-align: center;" | Ashcroft, Cache Creek, Clinton
| Style="background: #F1F1F1; color: black; text-align: center;" | Monday-Friday
| Style="background: #F1F1F1; color: black; text-align: center;" | Twice daily
|}

The following is the same table, with the colors changed to be those of the University of Michigan.
Route | Route Name | Type | Areas Served | Scheduling | Notes
1 | Ashcroft-Cache Creek-Clinton | Regional | Ashcroft, Cache Creek, Clinton | Monday-Friday | Twice daily
The following is the same table, with the colors changed to be those of Michigan State University.

(attempt 1)

Route | Route Name | Type | Areas Served | Scheduling | Notes
1 | Ashcroft-Cache Creek-Clinton | Regional | Ashcroft, Cache Creek, Clinton | Monday-Friday | Twice daily

(attempt 2)

Route | Route Name | Type | Areas Served | Scheduling | Notes
1 | Ashcroft-Cache Creek-Clinton | Regional | Ashcroft, Cache Creek, Clinton | Monday-Friday | Twice daily
The following is the same table, with a vaporwave color scheme.
Route | Route Name | Type | Areas Served | Scheduling | Notes
1 | Ashcroft-Cache Creek-Clinton | Regional | Ashcroft, Cache Creek, Clinton | Monday-Friday | Twice daily
Jesus, that's eye-burning! Can you tone it down a bit?
Route | Route Name | Type | Areas Served | Scheduling | Notes
1 | Ashcroft-Cache Creek-Clinton | Regional | Ashcroft, Cache Creek, Clinton | Monday-Friday | Twice daily

Notes

Note that these are, in fact, the U of M and MSU colors. I think this one pretty much speaks for itself.
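For reference, a header cell and data cell restyled in the U of M colors (maize #FFCB05 and blue #00274C, from the university's brand guidelines — an illustrative sketch abbreviated to two columns, not the model's verbatim output) would look like:

```wikitext
{| border="0" cellspacing="4" cellpadding="8" style="border: black solid 1px;"
| style="background: #00274C; color: #FFCB05; text-align: center;" | Route
| style="background: #00274C; color: #FFCB05; text-align: center;" | Route Name
|-
| style="background: #FFCB05; color: #00274C; text-align: center;" | '''1'''
| style="background: #FFCB05; color: #00274C; text-align: center;" | '''Ashcroft-Cache Creek-Clinton'''
|}
```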


Miscellaneous HTML

The following is a block of HTML that renders as the string "Test string" with the background color #d04dac and text color #3d3eea.
Test string
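For reference, one way to mark up the string with the stated colors (an illustrative sketch, not necessarily the model's exact output) is:

```html
<span style="background-color: #d04dac; color: #3d3eea;">Test string</span>
```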
The following is a block of HTML that renders the string "Post free or log off" with the background of the five colors of the Free Speech Flag (#09f911, #029d74, #e35bd8, #4156c5  and #635688).

POST FREE OR LOG OFF
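For reference, markup along these lines (one possible assignment of the five flag colors to the five words; illustrative, not the model's verbatim output) would produce the described effect:

```html
<span style="background-color: #09f911;">POST</span>
<span style="background-color: #029d74;">FREE</span>
<span style="background-color: #e35bd8;">OR</span>
<span style="background-color: #4156c5;">LOG</span>
<span style="background-color: #635688;">OFF</span>
```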

Custom userbox generation

Input
 This user believes that laughter is a medicine that should be taken several times a day!!!

"The following is this same userbox, but with an extremely obnoxious color scheme that is hard to read" (open for full prompt)
The following is a Wikipedia userbox with black text on a gold background, about laughter being a medicine that should be taken several times a day.
{{userbox
| float    = left
| border-c = orange
| border-s = 1
| id       = [[Image:Face-grin.svg|45px]]
| id-c     = orange
| id-fc    = orange
| id-s     = 10
| info     = This user believes that '''[[laughter]]''' is a medicine that should be taken several times a day!!!
| info-c   = gold
| info-fc  = black
| info-s   = 8
}}
The following is this same userbox, but with an extremely obnoxious color scheme that is hard to read.
Output
 This user believes that laughter is a medicine that should be taken several times a day!!!
The following is a userbox with the same text, but a color scheme subtly implying the user is bisexual.
 This user believes that laughter is a medicine that should be taken several times a day!!!
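For comparison, a userbox using the bisexual pride flag colors (#D60270, #9B4F96 and #0038A8 — chosen here for illustration, not taken from the model's actual output) could be written as:

```wikitext
{{userbox
| float    = left
| border-c = #D60270
| border-s = 1
| id       = [[Image:Face-grin.svg|45px]]
| id-c     = #9B4F96
| id-fc    = white
| info     = This user believes that '''[[laughter]]''' is a medicine that should be taken several times a day!!!
| info-c   = #0038A8
| info-fc  = white
}}
```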
The following is the same userbox, but with a color scheme suggesting the user is a fan of the Roman Empire.
File:Roman Eagle Emblem.svg This user believes that laughter is a medicine that should be taken several times a day!!!

Notes

I was hoping it would know about bisexual lighting, but alas. The fact that Image:Rainbow.svg was a real image seems to be coincidental (note that Image:Roman Eagle Emblem.svg isn't).

See also
