User talk:GreenC/2021
Happy New Year!
Thanks for your contributions to Wikipedia, and a Happy New Year to you and yours! North America1000 05:44, 3 January 2021 (UTC)
– Send New Year cheer by adding {{subst:Happy New Year}} to user talk pages.
Iron Law of Oligarchy
You mention this on your userpage. I got indeffed and IP-banned from Conservapedia for a single well-sourced post noting that molality and molarity are respectively absolute and relative measures. If I hadn't bragged about it on RationalWiki, it'd probably still be there... Narky Blert (talk) 18:24, 3 January 2021 (UTC)
I have to say
edit"As a proof, they told me that the Keep vote of [GreenC which you see in the deletion discussion is done by them." If that's true, it's actually pretty cunning. Gråbergs Gråa Sång (talk) 20:47, 25 January 2021 (UTC)
- Eh, my guess is a high-end fraud lawyer is too cunning to fall for it and then post tears of regret publicly on Wikipedia. The story doesn't add up. -- GreenC 20:56, 25 January 2021 (UTC)
- That feels probable, sure. Gråbergs Gråa Sång (talk) 20:59, 25 January 2021 (UTC)
The Wikipedia Motivation Barnstar
You are the true motivator :) Sliekid (talk) 07:16, 27 January 2021 (UTC)
Untitled
Why don't you put the 2020 import/export figures instead of leaving those from 2017? Or is it because the deficit is much higher? — Preceding unsigned comment added by 151.41.56.71 (talk) 19:25, 28 January 2021 (UTC)
Your Opinion Requested at Michael Shellenberger
Hi GreenC,
You've previously weighed in on the issues of publicity at Michael Shellenberger. I recently tried to clean said page up and add academic literature to the page, and it seems the page's subject has recently taken umbrage with said revisions. If you have the time, do you mind taking a look at the issues that recently occurred at Talk:Michael Shellenberger? --Hobomok (talk) 20:41, 28 January 2021 (UTC)
Edits reverted
Hey GreenC, thanks so much for reviewing Saket Modi. I noticed you reverted all of my edits although they were written in a neutral tone and were supported by third party, reliable sources. I read WP:LEAD that you highlighted in your comment and it says "a lead section should contain no more than four well-composed paragraphs and be carefully sourced as appropriate." and "The lead section should briefly summarize the most important points covered in an article." I am not trying to stuff the lead section, rather updating it and adding an award. It was hardly a one-line addition and I provided citations too to back them up. In addition to this, I had made some small changes to the rest of the article with citations and they were also reverted. You seem to be very well versed in policies and guidelines. I would really appreciate your guidance and help with this. Thanks. 2405:204:C:AD2D:18B4:8F41:A22B:98E2 (talk) 16:05, 1 February 2021 (UTC)
Hi Sir, did you have a chance to look at it?
Scripts++ Newsletter – Issue 20
News and updates associated with user scripts from the past month (January 2021).
Hello everyone and welcome to the 20th issue of the Wikipedia Scripts++ Newsletter:
Scripts: Submit your new/improved script here
- As a reminder, the legacy JavaScript globals (like accessing wgPageName without first assigning it a value or using mw.config.get('wgPageName') instead) are deprecated. If your user scripts make use of the globals, please update them to use mw.config instead. Some global interface editors or local interface administrators may edit your user script to make these changes if you don't. See phab:T72470 for more.
- For people interested in creating user scripts or gadgets using TypeScript, a types-mediawiki package (GitHub, NPM) is now available that provides type definitions for the MediaWiki JS interface and the API.
- A GitHub organization has been created for hosting codebases of gadgets. Users who maintain gadgets using GitHub may choose to move their repos to this organization, to ensure continued maintenance by others even if the original maintainer becomes inactive.
- A script to ease reviewing Good Article nominations
- A script to help manage Z number templates
- ...and many more, all available at Wikipedia:User scripts/Requests
As always, if anyone else would like to contribute, including nominating a featured script, help is appreciated. Stay safe, and happy new year! --DannyS712 (talk) 01:17, 3 February 2021 (UTC)
Signature issue on your comment at AfD
It seems like there was a problem with your signature for your comment (Special:Diff/1004919157/1004929875) on the Jack Schlossberg AfD. It was a good comment and you might want to correct this issue. Cheers! - tucoxn\talk 13:42, 5 February 2021 (UTC)
crowd governance
the external link on your user page no longer works :( I don't want to edit your page, but I was able to enjoy the story at this address: https://web.archive.org/web/20200127053758/http://misrc.umn.edu/wise/2014_Papers/110.pdf have a good one Violarulez (talk) 20:39, 11 February 2021 (UTC)
- Thank you, added the archive link. Interesting and non-intuitive story. -- GreenC 20:43, 11 February 2021 (UTC)
Speedy deletion nomination of Category:Esperanto literary awards
A tag has been placed on Category:Esperanto literary awards requesting that it be speedily deleted from Wikipedia. This has been done under section C1 of the criteria for speedy deletion, because the category has been empty for seven days or more and is not a disambiguation category, a category redirect, a featured topics category, under discussion at Categories for discussion, or a project category that by its nature may become empty on occasion.
If you think this page should not be deleted for this reason, you may contest the nomination by visiting the page and clicking the button labelled "Contest this speedy deletion". This will give you the opportunity to explain why you believe the page should not be deleted. However, be aware that once a page is tagged for speedy deletion, it may be deleted without delay. Please do not remove the speedy deletion tag from the page yourself, but do not hesitate to add information in line with Wikipedia's policies and guidelines. Liz Read! Talk! 15:58, 13 February 2021 (UTC)
- @Liz: I didn't create empty categories 11 years ago, I guess whatever was there has been deleted. -- GreenC 16:09, 13 February 2021 (UTC)
TravelMate URL's
Unfortunately I can not find the exact reference for this, but back in June 2020 you stopped bot InternetArchiveBot from archiving links for [1] or possibly a shorter name, as the archived versions do not work - the only reference I can now find is Wikipedia:Australian_Wikipedians'_notice_board/Archive_57#premierpostal.com. I have now encountered a similar problem with Red Cliffs, Victoria where a link to http://www.travelmate.com.au/MapMaker/MapMaker.asp which is dead, is being archived, but the archived versions do not work as the javascript does not operate. The bot has now for the second time archived this link, on both occasions removing the 'dead link' tag. Can you please again help in stopping this bot archiving links for this URL. Fleet Lists (talk) 02:52, 20 February 2021 (UTC)
- Fleet Lists, I believe the correct action is to 'whitelist' the URL, which means the bot will always consider it 'alive' and will not try to add an archive. I just did this, which should stop IABot. It could still be a problem with any other bot trying to save dead links in the future due to the {{dead link}} tag. If the link is dead and there is no viable archive, it might be better to convert these to {{citation}} without a |url=. -- GreenC 03:32, 20 February 2021 (UTC)
- Thank you for your reply and update of the Red Cliffs article. However the bot has now revisited and removed the "dead link" tag. So we are back where we started. How can the URL be "whitelisted"? Fleet Lists (talk) 22:00, 23 February 2021 (UTC)
- Now I'm not sure what is happening. For the moment, I added {{cbignore}}, which tells the bot to stay off the reference. This is fine, except when there are dozens or hundreds of citations, as in this case, as each requires the cbignore. I'm going to ask the developer why the whitelist is not working. -- GreenC 22:45, 23 February 2021 (UTC)
- Ah, now figured it out: at iabot.org, set the URL status to blacklist (not whitelist) and also delete the archive URL from the record. This action can only be done by an administrator. Should be set now. -- GreenC 22:51, 23 February 2021 (UTC)
DYK for George Dinning
On 21 February 2021, Did you know was updated with a fact from the article George Dinning, which you recently created, substantially expanded, or brought to good article status. The fact was ... that in 1897, former slave George Dinning was the first black man to successfully sue a mob of the Ku Klux Klan? The nomination discussion and review may be seen at Template:Did you know nominations/George Dinning. You are welcome to check how many pageviews the nominated article or articles got while on the front page (here's how, George Dinning), and if they received a combined total of at least 416.7 views per hour (ie, 5,000 views in 12 hours or 10,000 in 24), the hook may be added to the statistics page. Finally, if you know of an interesting fact from another recently created article, then please feel free to suggest it on the Did you know talk page.
Disambiguation link notification for February 22
An automated process has detected that when you recently edited Brian Nelson (literature professor), you added a link pointing to the disambiguation page Swann in Love.
(Opt-out instructions.) --DPL bot (talk) 06:14, 22 February 2021 (UTC)
there's a mess...
... in this edit.
—Trappist the monk (talk) 23:02, 2 March 2021 (UTC)
- Bug that caused this fixed. -- GreenC 16:36, 11 April 2021 (UTC)
Removing archived urls
Hi! I'm sure you're doing great work, but not all of it seems to be going well. I've already posted at User talk:GreenC bot to ask why your bot removed an archived link from Louise Blouin. Why did you then again remove this archived url with this edit? Why should that url not be archived in case it ever ceases to be accessible in the future? Are you aware that, because of the General Data Protection Regulation, many North American websites block access for users from Europe? And that archive.org in many cases provides a way of restoring that access? Of course, if we have a policy that links should not be archived unless unavoidably necessary, do please point me to it. Otherwise, can you unconditionally guarantee that neither you nor your bot will again remove a working archived link from Wikipedia? And that you will, as a matter of priority, identify and repair any instance where either you or the bot has done so in the past? Thanks, Justlettersandnumbers (talk) 22:23, 13 March 2021 (UTC)
- We don't use archives with the intention of bypassing policy blocks; that is not what archives are meant for, and there is no community consensus for that. Policy blocks, be it a paywall or government regulation. There is no problem adding archive URLs as a precaution against link rot, but in this instance it was added directly into the URL with no citation template or {{webarchive}}, thus in effect making the live URL inaccessible - literally deleting it. Now, the bot in this case was doing a URL move of observer.com because a user requested it - changing a dead URL to a live URL (there was a change in schemes at observer.com). During URL moves it does preserve the archive, but only if there is a citation template or {{webarchive}}. I probably could add a feature to add a new {{webarchive}} when it's a square URL with an archive, in order to preserve the archive. -- GreenC 22:48, 13 March 2021 (UTC)
- The archived link leads directly to the actual source cited when the content was written (see WP:Text-source integrity). That content may have been changed or completely removed from more recent versions of the external page. There is no obligation that I'm aware of to cite a current link to a page if we already have an archived link; nor is there any obligation to use citation templates or webarchive templates (WP:CITEVAR). Anyway, would you kindly either point me to community consensus that a working archived link may be removed without discussion or unconditionally guarantee that neither you nor your bot will again remove a working archived link from Wikipedia, and that you will, as a matter of priority, identify and repair any instance where either you or the bot has done so in the past? Thank you, Justlettersandnumbers (talk) 11:32, 14 March 2021 (UTC)
- I already added the feature. I'll take a look at re-adding old ones. -- GreenC 13:53, 14 March 2021 (UTC)
Incorrect IABot's edit summary in Russian
The current summary "Добавьте № книги для Википедия:Проверяемость" makes no sense in Russian. A correct summary could be "Добавление ссылок на электронные версии книг" or "Добавление ссылок на электронные версии № (plural|книги|книг)". MBH (talk) 14:10, 24 March 2021 (UTC)
- @MBH: I don't know which is better so I did the first one. Thank you very much. -- GreenC 14:31, 24 March 2021 (UTC)
- Also I advise you not to use machine translation for translating bot messages into any languages you don't know. Maybe machine translation between big Romance and Germanic languages is not very bad, but machine translation from English to Russian is always terrible due to the big difference in the languages' structure. MBH (talk) 14:42, 24 March 2021 (UTC)
Nomination for deletion
An article you created or have significantly contributed to has been nominated for deletion. The article is being discussed at the deletion discussion, located here. North America1000 11:41, 1 April 2021 (UTC)
Backlinks?
Hi GreenC! I'm enjoying using the Backlinks functionality - it's been about a year now. I didn't receive any emails today - did your process stop for April Fools' Day? :-) Thanks! GoingBatty (talk) 13:48, 1 April 2021 (UTC)
- It's not that clever :) I checked the logs and it appears to have run and sent emails; the data looks normal. I just sent you a test email from the server - can you verify it came through? -- GreenC 15:09, 1 April 2021 (UTC)
- I did not receive the test email, and have received emails from other senders. @Certes: Did you receive the Backlinks emails today? GoingBatty (talk) 16:03, 1 April 2021 (UTC)
- Hmm, strange. Certes is using a new system that posts results online instead of email. Do you want to use that instead? For example:
- Config page: https://en.wikipedia.org/w/index.php?title=User:Certes/Backlinks
- Data page: User:Certes/Backlinks/Report
- Otherwise I can try to debug why emails are not coming through. -- GreenC 16:08, 1 April 2021 (UTC)
- Yes, I'm interested in having the results posted online instead. I've created User:GoingBatty/Backlinks/Report. For User:GoingBatty/stopbutton, when stopped, does this mean that results are queued on your side, and then all posted once we set Action=RUN again? If so, I'm interested in using that on the days when I'm away from my computer. Thanks! GoingBatty (talk) 16:38, 1 April 2021 (UTC)
- Just ran it, and it worked. I forgot to adjust the filters you wanted to keep out Template, Project and some others; those will be in effect next run. The stop button is a hard stop; the program does not cache results. Useful when disabling for extended periods. For random days away, I recommend viewing the page history, which serves as a cache of prior runs. -- GreenC 19:13, 1 April 2021 (UTC)
- There were quite a few links to be fixed in the Template, Project and other spaces, so feel free to keep those coming. Thanks! GoingBatty (talk) 00:53, 2 April 2021 (UTC)
- You now have everything except these:
(^Talk:|^Wikipedia:|^Wikipedia talk:|^Template talk:|^Portal talk:|^User:|^User talk:|^File talk:|^MediaWiki:|^MediaWiki talk:|^Help:|^Help talk:|^Category talk:|^Book:|^Book talk:|^Draft:|^Draft talk:|^TimedText:|^TimedText talk:|^Module talk:)
- -- GreenC 01:32, 2 April 2021 (UTC)
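For anyone curious, here is a small illustration of how an exclusion pattern like the one quoted above might be applied to the search results. This is purely illustrative - the function name and sample titles are made up, not the actual Backlinks code:

```python
import re

# Equivalent of the exclusion list quoted above, applied to page titles.
EXCLUDE = re.compile(
    r"^(Talk|Wikipedia|Wikipedia talk|Template talk|Portal talk|User|User talk|"
    r"File talk|MediaWiki|MediaWiki talk|Help|Help talk|Category talk|Book|"
    r"Book talk|Draft|Draft talk|TimedText|TimedText talk|Module talk):"
)

def keep(title: str) -> bool:
    """True for titles that should appear in the report."""
    return not EXCLUDE.match(title)

sample = ["Example article", "Talk:Example article", "Template:Navbox", "Draft:Example"]
print([t for t in sample if keep(t)])  # Template: is not in the exclusion list above
```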
- My Backlinks appeared on the data page as usual at 10:47 UTC today. It has failed to appear a couple of times over the last few months, but worked fine today. I asked to stop receiving Backlinks by e-mail, as my long list produced lots of e-mails. If I'm away for a few days I'll just catch up using the page history. Certes (talk) 23:47, 1 April 2021 (UTC)
IABot bug - "blocked: You have been blocked from editing." despite not being blocked
Hello! I think phab:T274050 is back to bug us again. I'm getting a "blocked: You have been blocked from editing." error when trying to analyse & edit pages despite not being blocked. I can't seem to make the tool report on the exact API message it's getting (e.g. to see if an autoblock of a Toolforge IP is to blame), could you have a look? Thanks! ƒirefly ( t · c ) 15:36, 3 April 2021 (UTC)
Pages using duplicate arguments in template calls
is it possible to remove User:GreenC/test from Category:Pages using duplicate arguments in template calls (easier to see the actual problems when there aren't user pages in there)? thank you. Frietjes (talk) 16:22, 11 April 2021 (UTC)
- Done. -- GreenC 16:33, 11 April 2021 (UTC)
Bot functionality request
Hi GreenC, nice to meet you. I found you by trawling through the bot status report (User:MajavahBot/Bot status report). I was wondering if I could interest you in, or request, a relatively simple bot task? That task is: periodically go through the entries in this category: Category:Peer review requests not opened.
For each peer review talk page there will be a template like {{Peer review|archive=X}}. There should be a corresponding peer review page called Wikipedia:Peer review/PAGENAME/archiveX, but about once a week someone starts the process but doesn't actually create the page, so the template just hangs there. It would be very useful for a bot to remove the template if the peer review wasn't started for, like, a week after the template was placed, as that probably means no review page will be created.
I've had some problems with single functionality bots before so I thought I might ask you because your bot seems unlikely to randomly become inactive :P. Crossing my fingers, Tom (LT) (talk) 10:28, 12 April 2021 (UTC)
- Hi Tom (LT) - I can help with this, though it would be a standalone bot, running on Toolforge from cron ie. servers maintained by Wikimedia in their datacenter, with code accessible to anyone with a Toolforge account. I think once a day it could retrieve the list of page names in the tracking category, along with today's date, and add it to a text file in two columns (page name|added (ie. today's) date). If the page name is already in the text file don't add it again, but check if it has been more than 7 days since the added date. If so, verify there is Peer review archive and if not then remove the Peer review template, and remove from the text file. Likewise if the pagename is in the text file but not in the tracking category then remove the pagename from the file. Sound good? -- GreenC 02:19, 14 April 2021 (UTC)
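A rough sketch of that daily flow, for illustration only - the three wiki-access helpers passed in are hypothetical placeholders, not the actual GreenC bot code, and the archive-page check is simplified:

```python
import time

TRACKING_FILE = "peer_review_tracking.txt"   # two columns: page name|date added
SEVEN_DAYS = 7 * 24 * 3600
CATEGORY = "Category:Peer review requests not opened"

def run_once(category_members, review_page_exists, remove_template):
    """One daily pass. category_members(cat) lists pages in the tracking
    category, review_page_exists(name) tests for the corresponding
    Wikipedia:Peer review/.../archiveX page, and remove_template(name) strips
    the {{Peer review}} template - all three are hypothetical helpers."""
    now = int(time.time())

    # Load the tracking file written by previous runs.
    tracked = {}
    try:
        with open(TRACKING_FILE) as f:
            for line in f:
                name, added = line.rstrip("\n").split("|")
                tracked[name] = int(added)
    except FileNotFoundError:
        pass

    in_category = set(category_members(CATEGORY))

    # New pages get today's date; pages already listed keep their original date.
    for name in in_category:
        tracked.setdefault(name, now)

    for name in list(tracked):
        if name not in in_category:
            del tracked[name]                      # no longer flagged, forget it
        elif now - tracked[name] > SEVEN_DAYS:
            if not review_page_exists(name):
                remove_template(name)              # stale request, pull the template
            del tracked[name]

    # Write the file back for tomorrow's run.
    with open(TRACKING_FILE, "w") as f:
        for name, added in sorted(tracked.items()):
            f.write(f"{name}|{added}\n")
```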
- That would be wonderful. It is just one of those small thankless tasks that a bot could so, so I'm very appreciative of this. There are a couple of similar tasks lying around, would it be possible to pester you in the future if something similar arises? Tom (LT) (talk) 07:48, 14 April 2021 (UTC)
- Alright. Hopefully will get to it this week. It depends on the task how complicated, and how busy I am at the time. There is also BOTREQ. BTW I will need to send this through BRFA which sometimes can take forever but see no trouble in approval given it's simplicity and non-controversial. -- GreenC 15:30, 14 April 2021 (UTC)
- User:GreenC bot/Job 20 & Wikipedia:Bots/Requests for approval/GreenC bot 20 -- GreenC 03:29, 15 April 2021 (UTC)
When you have a moment
Hello Green C. I hope you are well. I asked for a run from Template:Cleanup bare URLs/bot last night that it still hasn't processed. You may already be aware of this but I wanted to let you know just in case. My year and a half long infobox person cleanup project is almost finished so I will have time to use this bot again. Cheers. MarnetteD|Talk 22:24, 17 April 2021 (UTC)
- Hello MarnetteD, there was a stuck/zombie process on one of the Toolforge grid computers blocking the spawning of new processes. That can happen, it's beyond my control to prevent but easily fixed by killing the process (done). If by chance it ever happens again and I am not around for a while, you can request help at Village Pump Technical who will point you to the right place (probably a Phab ticket), the stuck process will be called "tagbot.awk". Last resort waiting for the computer to reboot every couple months would also clear it. You take on big projects :) This one is probably infinite but every change is a huge help. -- GreenC 02:46, 18 April 2021 (UTC)
- You said it :-) Thanks for the info and the fix! MarnetteD|Talk 02:55, 18 April 2021 (UTC)
MarnetteD, looks like it zombied again. If it keeps happening I might need to make another program that monitors for stuck processes. -- GreenC 17:57, 26 April 2021 (UTC)
- I'm glad you noticed. I was waiting a bit to see if it would kick in. It is hard to say when this problem crept up since it wasn't getting used as regularly in the last year or so. Thanks for the update. MarnetteD|Talk 18:12, 26 April 2021 (UTC)
InternetArchiveBot in esWiki
Hi, GreenC. Thanks for taking care of this. Can you assure me that, in addition to fixing the duplicates, the bot won't perform inconsequential edits like this (it's difficult to find, it's just an added space)? That's the other half of the complaint. If that's so, I'll lift the block. Thanks. --Angus (talk) 22:07, 18 April 2021 (UTC)
- This bot is small and purpose-built; it shouldn't make empty edits. With bigger bots that can happen, as they are doing many functions adding and deleting text. -- GreenC 23:28, 18 April 2021 (UTC)
- Sorry, I misunderstood - you mean ensure IABot does not (I was thinking of the smaller fixer bot). I contacted Cyberpower678; this should be an easy bug to detect and avoid by removing all whitespace from the original and new article text, comparing the two strings, and aborting the edit if they are equal. -- GreenC 00:19, 19 April 2021 (UTC)
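A minimal sketch of the check described above (the names are illustrative, not IABot's actual code):

```python
import re

def whitespace_only_change(old_text: str, new_text: str) -> bool:
    """True when the two revisions differ only in whitespace,
    in which case the bot should abort the edit."""
    collapse = lambda s: re.sub(r"\s+", "", s)
    return collapse(old_text) == collapse(new_text)

# Example: an edit that only adds a space would be skipped.
print(whitespace_only_change("foo bar", "foo  bar"))   # True
print(whitespace_only_change("foo bar", "foo baz"))    # False
```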
- GreenC, yes, this will be corrected. I should have a fix for this ready fairly quick. —CYBERPOWER (Message) 02:51, 19 April 2021 (UTC)
Hi guys, thanks for your cooperation. I unblocked the bot. --Angus (talk) 12:44, 19 April 2021 (UTC) cc user:cyberpower678
- Hi Angus, could you recommend wording for a Spanish edit summary equivalent to "Fixing 1 redundant {{wayback}}" and "Fixing 2 redundant {{wayback}}" (plural)? Will also need "Fixing 1 redundant archiveurl/urlarchivo argument" and "Fixing 2 redundant archiveurl/urlarchivo arguments". I've learned not to use Google Translate or guess but to ask a native speaker. Thank you! -- GreenC 14:41, 19 April 2021 (UTC)
- Here:
  - Arreglo {{wayback}} redundante
  - Arreglo 2 {{wayback}} redundantes
  - Arreglo argumento urlarchivo/archiveurl redundante
  - Arreglo 2 argumentos urlarchivo/archiveurl redundantes
  --Angus (talk) 14:52, 19 April 2021 (UTC)
- Angus, btw, the bot has a run page so it doesn’t need to be blocked to stop it. You can find the run page at https://iabot.toolforge.org/index.php?page=runpages&wiki=eswiki —CYBERPOWER (Around) 16:27, 19 April 2021 (UTC)
- Cyberpower678, unfortunately the "IABot Management Console" wants me to give it unnecessary access to private information, like my email address and who knows what else, before it will show me that page. So it remains inaccessible to me. --Angus (talk) 16:49, 19 April 2021 (UTC)
- Angus, as the designer of the bot and the UI I can assure that not only is your email address not saved anywhere unless you explicitly tell the tool to, your email address is not ever passed to the tool on authorization. I have no idea why it says that. All you are giving the tool is your username and public accessible data like your registration date, permissions, and block status. —CYBERPOWER (Chat) 17:05, 19 April 2021 (UTC)
- Angus, user privacy is taken very seriously and is never leaked. Private data is only stored with the users’ permission and critical data is encrypted to prevent unauthorized access. —CYBERPOWER (Chat) 17:06, 19 April 2021 (UTC)
Cyberpower678, it's ok, no worries. Maybe the Mediawiki API (or whatever) should be changed so it doesn't request unneeded data...
GreenC, thanks man! Sorry I wasn't there when needed, I'm glad things are fixed now! --Angus (talk) 22:54, 20 April 2021 (UTC)
esWiki
Hi, I'm not sure if this is the right place to report this bug, but InternetArchiveBot duplicated two articles on esWiki while trying to fix a redundant archive. The first one is es:Anthem Sports (a duplicate of es:Anthem Sports & Entertainment) and the second one is es:Heckler (a duplicate of es:Heckler & Koch MP5). I think these are the only cases so far ([2]). --Soulreaper (talk) 15:08, 20 April 2021 (UTC)
- Yes, I am aware of this bug in the code and have fixed it. I had already redirected Heckler but was not aware of Anthem; now also redirected. If you think they should be deleted instead I'll start that process. -- GreenC 16:34, 20 April 2021 (UTC)
GreenC Bot
What does GreenC Bot do? Cookersweet (talk) 11:44, 22 April 2021 (UTC)
Thanks for helping out at peer review!
editThe Peer Review Barnstar | ||
For your very helpful bot-related contributions to Wikipedia peer review, I present to you the peer review barnstar. Nice work! Tom (LT) (talk) 07:11, 5 May 2021 (UTC)
Tom (LT) (talk) 07:11, 5 May 2021 (UTC)
- No problem! At this rate it will be longest trial period for 25 edits in history :) -- GreenC 01:15, 6 May 2021 (UTC)
Transclusion of deleted template
I've nowiki'ed a transclusion of the now-baleeted {{Wayback}} from a subpage in your userspace, but I will let you know here for the sake of visibility (since I don't know if you're going to see an edit on some random userspace page). jp×g 17:11, 17 May 2021 (UTC)
- @JPxG: thank you, I just pre'd the whole page for now. -- GreenC 18:21, 17 May 2021 (UTC)
Just wanted to tell you about a project I've started recently. Wikipedia:Link rot/Templates is intended to list all our external link templates on one page along with the status of the links to more quickly catch when links go down. I hope to get all templates with over 1000 transclusions on there within a few weeks.
If it would be possible to have a bot assisting with detection of dead links that would be great. If the links were checked to be working weekly by bot that would make the page a lot more useful. Is that plausible or not? I'm sadly completely out of my depth with that kind of bot and can not answer even simple questions like that on my own. --Trialpears (talk) 23:24, 22 May 2021 (UTC)
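For what such a weekly liveness check might look like, here is a bare-bones sketch. Everything here is an illustrative assumption - a production bot would need rate limiting, retries and soft-404 detection:

```python
import requests

def link_is_alive(url: str, timeout: int = 30) -> bool:
    """Rough check: True if the site answers with a non-error HTTP status."""
    try:
        r = requests.head(url, allow_redirects=True, timeout=timeout)
        if r.status_code in (403, 405, 501):   # some servers dislike HEAD requests
            r = requests.get(url, stream=True, timeout=timeout)
        return r.status_code < 400
    except requests.RequestException:
        return False

print(link_is_alive("https://web.archive.org/"))
```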
Disambiguation link notification for May 25
An automated process has detected that when you recently edited Lionel Terray, you added a link pointing to the disambiguation page Mount Huntington.
(Opt-out instructions.) --DPL bot (talk) 06:01, 25 May 2021 (UTC)
Dead link
Hi GreenC, I noticed you marked a link I added as dead. Thanks for pinging me. I added it today, and just checked again, and the link is definitely not dead. ― Tartan357 Talk 21:46, 7 June 2021 (UTC)
- Never mind, I figured it out. The link is uniquely-generated and has a short expiration. I'll just link to the index. ― Tartan357 Talk 22:06, 7 June 2021 (UTC)
User:GoingBatty/Backlinks/Report
Hi GreenC! I've been enjoying the daily updates posted at User:GoingBatty/Backlinks/Report and fixing the appropriate articles. I noticed that the bot didn't post an update today. Could you please check on it? Thanks! GoingBatty (talk) 01:55, 10 June 2021 (UTC)
- It ran and generated the table, which it keeps on hand, but it didn't post for some reason. Maybe a network transient? I just posted it manually. Good thing you asked as it only keeps it for up to the next batch run. -- GreenC 02:06, 10 June 2021 (UTC)
- Thank you for the manual list. The bot worked fine today, as usual. Happy editing! GoingBatty (talk) 22:37, 10 June 2021 (UTC)
- Hi again! Unfortunately, your bot did not post a new version of User:GoingBatty/Backlinks/Report today. Could you please check on it? Thanks! GoingBatty (talk) 13:57, 6 July 2021 (UTC)
OK, just added a loop: it will try 10 times with 30-second pauses to account for timeouts. After 10 failures it will email me. I believe this will solve it. -- GreenC 15:33, 6 July 2021 (UTC)
- Thank you for manually posting an update for today, but that update contains many items not on User:GoingBatty/Backlinks. Did you accidentally provide me someone else's list? Thanks! GoingBatty (talk) 16:56, 6 July 2021 (UTC)
- Oi, that's my list! Certes (talk) 17:31, 6 July 2021 (UTC)
- lol yeah, sorry about that - the procs are called "bw" and "bw2" on the server and I got confused about which is GoingBatty (bw) and which is Certes (bw2). Should be corrected now. -- GreenC 17:34, 6 July 2021 (UTC)
Shadows Commons bot
User:GreenC bot/Job 10 hasn't tagged anything as {{ShadowsCommons}} since 4 May. The bot page says it uses Quarry 18894 but that query doesn't work due to [3]. It seems unlikely that absolutely nothing shadowed Commons after 4 May. — Alexis Jazz (talk or ping me) 22:16, 24 June 2021 (UTC)
- @Alexis Jazz: Ah. Shoot. It was discussed in this Phab a while back and the WMF sysadmins didn't come up with a viable alternative. I just posted an alternative idea but it would take some time to develop, assuming it can even be made to work. The basic issue is that Commons has 60+ million titles and downloading that list takes a very long time, meanwhile ShadowBot needs to run daily. So my idea was to break the problem down into sub-lists; and scrap using database queries which can't deal with this problem effectively. It's an ugly problem. -- GreenC 00:02, 25 June 2021 (UTC)
- GreenC, MGA73, how do I get a list of files on enwiki? As in, without the local description pages. I already have a list that includes those. It seems technically SELECT img_name FROM image should work, but it looks like it'll take about half an hour? — Alexis Jazz (talk or ping me) 09:51, 25 June 2021 (UTC)
- @Alexis Jazz: I have no good solution atm. It seems that it does take a long time to run a quarry. --MGA73 (talk) 11:21, 25 June 2021 (UTC)
- MGA73, 1298.78 seconds to return 894832 rows to be exact. — Alexis Jazz (talk or ping me) 11:31, 25 June 2021 (UTC)
- MGA73, User:Alexis Reggae/The Real Slim ShadyCommons something something — Alexis Jazz (talk or ping me) 14:24, 25 June 2021 (UTC)
- Alexis Jazz hi, sorry, not sure what we are looking at, were you able to devise a working query? -- GreenC 18:49, 25 June 2021 (UTC)
- Not really, just wanted to get a list once to prevent the backlog from getting bigger. It actually returned more than expected, but many problem files so I'm keeping the list. The Real Slim ShadyCommons is essentially a list of files on enwiki (SELECT img_name FROM image) that shadow a Commons file (index from dump) or redirect and don't have the {{keeplocal}} template. It includes what the bot would have tagged, but also much more, so it's not as simple as this. — Alexis Jazz (talk or ping me) 19:00, 25 June 2021 (UTC)
- @Alexis Jazz: It appears shadows bot is working again! All thanks to the amazing work of User:AntiCompositeNumber who wrote a Python program that can produce a list in a matter of minutes. The list is then fed through my shadows.awk as normal and the tags added. -- GreenC 02:26, 9 November 2021 (UTC)
- I see. 27 files in Category:Wikipedia files that shadow a file on Wikimedia Commons, work to do! — Alexis Jazz (talk or ping me) 19:29, 9 November 2021 (UTC)
- Cleared the category except for two files that I listed on FfD instead. — Alexis Jazz (talk or ping me) 21:22, 9 November 2021 (UTC)
- Great, not bad. You must have special powers to rename File: because I can't seem to do that, like with And babies.jpg -- GreenC 21:30, 9 November 2021 (UTC)
- Special powers indeed. — Alexis Jazz (talk or ping me) 22:47, 9 November 2021 (UTC)
- For File:And babies (anti-Vietnam War poster).jpg / c:Commons:Deletion requests/File:And Babies.jpg, you should ask Clindberg. IMHO the lettering doesn't exceed c:COM:TOO US, but the copyright status of the underlying photo is also unclear. Clindberg is more knowledgeable about that. — Alexis Jazz (talk or ping me) 23:05, 9 November 2021 (UTC)
Hi @Alexis Jazz: sorry to bother you but there is kind of a mystery... Back in May, when the bot was running under the original system, it was adding 2-5 templates a day. Almost no days were 0. However, in the past week, using the new system, it only added 1 template, log:
Bot ran on 20211109-04:37:10 ---- ---- ---- 20211109-04:38:08 ---- No data returned by SQL query
Bot ran on 20211110-04:37:10 ---- ---- ---- 20211110-04:38:39 ---- No data returned by SQL query
Bot ran on 20211111-04:37:10 ---- ---- ---- 20211111-04:38:40 ---- No data returned by SQL query
Bot ran on 20211112-04:37:10 ---- ---- File:Chucky_TV_series_logo.png ---- 20211112-04:38:41 ---- Added template
Bot ran on 20211113-04:37:10 ---- ---- ---- 20211113-04:38:33 ---- No data returned by SQL query
Bot ran on 20211113-19:06:25 ---- ---- ---- 20211113-19:07:47 ---- No data returned by SQL query
Bot ran on 20211114-04:37:10 ---- ---- ---- 20211114-04:38:41 ---- No data returned by SQL query
Bot ran on 20211115-04:37:10 ---- ---- ---- 20211115-04:38:37 ---- No data returned by SQL query
Most days are zero. This could be accurate... or a sign of trouble. Do you have any thoughts or insights on whether there is trouble brewing? It's disconcerting there might be some/many undetected. Another idea is they added new processes at Commons to help avoid name collisions. Perplexed! -- GreenC 05:29, 15 November 2021 (UTC)
- @GreenC: I don't really know. Does the SQL query time out? (maybe adding a limit would help?) Maybe try a query that you know should return something to ensure things are working? — Alexis Jazz (talk or ping me) 10:57, 15 November 2021 (UTC)
- @Alexis Jazz:, that's a good idea. It is hard to impossible to test outside the live system as it's unclear where the problem originates, and I can't find existing cases since that is being tested, finding cases. Wonder if I uploaded a File: from Commons to Enwiki under the same name, tested to see if they are detected, do you also have powers to quickly remove once the test is done (which takes minutes)? I'd also like go in reverse, upload a file from Enwiki to Commons under the same name, but might need to find a different image that is CC/PD, but matching a name on enwiki. -- GreenC 22:00, 15 November 2021 (UTC)
- So try https://en.wikipedia.beta.wmflabs.org/ and https://commons.wikimedia.beta.wmflabs.org/. Not 100% the same (many templates are not there and the configuration can be slightly different in some cases) but it's a start. Let me know if you need rights or deletions on betacommons. I tried uploading a file on https://en.wikipedia.beta.wmflabs.org/wiki/Special:Upload with a name that already existed on betacommons. I get a warning "A file with this name exists already. Please check File:Test upload header 3.svg before overwriting. Use a descriptive filename (e.g. "Eiffel Tower, Paris, at night.jpg") to prevent conflicts with existing files." and the upload is denied. (the warning can't be ignored) There is no warning when uploading a file to betacommons that already exists on betaenwiki though. You can take https://commons.wikimedia.beta.wmflabs.org/wiki/File:Test_upload_header_1.svg and make some change with a text editor to avoid warnings about duplicates. Just add a comment, change the ID, the color (fill="#000") etc. — Alexis Jazz (talk or ping me) 23:36, 15 November 2021 (UTC)
- Hmm, for those I need to determine two pieces of info for each, to connect via SQL. For example on enwiki it is database="enwiki_p" and host="enwiki.analytics.db.svc.wikimedia.cloud" -- GreenC 03:28, 17 November 2021 (UTC)
- Maybe the folks at WP:VPT know. — Alexis Jazz (talk or ping me) 11:58, 17 November 2021 (UTC)
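For reference, a minimal sketch of running that query from Toolforge against the enwiki replica, using the host and database names mentioned above. pymysql and the ~/replica.my.cnf credentials file are assumptions here, not the bot's actual code:

```python
import os
import pymysql

# Replica connection details as given above; credentials come from the
# standard Toolforge replica.my.cnf file (assumption: the script runs on Toolforge).
conn = pymysql.connect(
    host="enwiki.analytics.db.svc.wikimedia.cloud",
    database="enwiki_p",
    read_default_file=os.path.expanduser("~/replica.my.cnf"),
)

with conn.cursor() as cur:
    # The slow query discussed above: every locally uploaded file name on enwiki.
    cur.execute("SELECT img_name FROM image")
    local_files = {row[0] for row in cur.fetchall()}

print(len(local_files), "local files on enwiki")
```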
@Alexis Jazz: It looks like Wikipedia:Database reports/Non-free files shadowing a Commons file contains files that are shadowed that were not being detected by this bot? Not entirely sure what's going with this list. Also Wikipedia:Database reports/File description pages shadowing a Commons file or redirect and Wikipedia:Database reports/Local files with a duplicate on Commons -- GreenC 15:43, 24 November 2021 (UTC)
- I guess you'd have to ask User:Fastily and/or wait until 29 November to see if the bot finds anything unexpected. — Alexis Jazz (talk or ping me) 12:53, 25 November 2021 (UTC)
- Your bot should probably tag w:File:JamesDukeStatueAndChapel.jpg as a duplicate of c:File:JamesDukeStatueAndChapel.jpg, no? It doesn't qualify for WP:CSD#F8 as the local file description is better but the file itself is identical. — Alexis Jazz (talk or ping me) 21:08, 11 January 2022 (UTC)
- According to the bot logs:
File:JamesDukeStatueAndChapel1.jpg ---- 20220104-04:38:15 ---- Added template
ie. w:File:JamesDukeStatueAndChapel1.jpg was tagged. Not sure if that helps anything why it didn't tag the other one. Seems like a bunch of duplicates. -- GreenC 21:55, 11 January 2022 (UTC)
Who cares?
editReuters
- Moved to WP:URLREQ#Reuters (again)
Backlinks and common words
I'm thinking of adding selected common words (maybe 20) to my Backlinks list. Of course, a search for "The" would match almost everything and time out, but a search for linksto:"The" is fast. Would these additions be safe, or would they slow things down in an antisocial way? I already have A listed (to catch jokers who link each letter), so you could check whether that runs noticeably slower than less common words. Certes (talk) 01:06, 15 July 2021 (UTC)
Certes, top 50 by size from your list
Extended content:
3449577 china
2551179 London
925721 Boston
899215 Sydney
822928 National Football League
787712 Jazz
742641 Melbourne
637715 The Daily Telegraph
602020 Luxembourg
545562 Athens
470567 Manchester
417654 Liverpool
414820 Birmingham
358707 Perth
298299 Naples
287433 Edmonton
284325 Hollywood
262800 New Brunswick
244875 surrey
244846 Surrey
242423 Oxford
241038 guinea
227722 Havana
224517 Blues
224517 blues
219266 Wellington
208890 National League
198799 Oxygen
197470 Butterfly
197470 butterfly
197311 Cambridge
197190 Norfolk
194592 Hyderabad
191996 Christchurch
190041 ABC News
183295 The Observer
182475 Country
182475 country
179265 Madonna
179265 madonna
174391 Alexandria
164651 Portsmouth
160175 Sculpture
160175 sculpture
143481 The Sunday Times
139840 York
138493 Hanover
135255 Company
135255 company
134502 Stream
That's file size in bytes (each file contains a list of article names), but it gives a relative sense of which ones are the largest. The system was never designed with this many in mind but it seems to be holding up fine. One reason it might have trouble is if it takes > 24 hrs to run, at which point we increase the time period between runs. The last run took 2.5 hours so you're 10% of the way there ;) Or if the number of links is in the millions, like the {{cite web}} template, but it's hard to imagine linked terms much more common than china or london. Gandhi? Jesus? China has 133,717 backlinks. 'The' has less than a thousand. -- GreenC 02:31, 15 July 2021 (UTC)
- Thanks for the list. I'd already removed London, Boston, Sydney, Melbourne, National Football League, Luxembourg, Manchester and others for producing too many false positives. I removed Jazz and Athens last night, so that's most of the top ten gone. china (lower case for pottery) can also go now as it appears rarely. Of the top ten, that just leaves The Daily Telegraph. I still get plenty of links for The Daily Telegraph (Sydney) in semi-automated citations; I think it's linked to the wrong WP article in Trove. However, it would be perfect for a variant which limits the search to articles which also mention Australia (or perhaps NSW, Brisbane, etc.) Here's another Telegraph error today: David Storey (politician). Certes (talk) 13:00, 15 July 2021 (UTC)
- User:Certes, you may already know about this, but just in case, I thought you might be interested in Zipf's law (second paragraph of lead section). External links has a Zipf's list for English. They might contain frequent disambiguation problems. -- GreenC 17:15, 19 July 2021 (UTC)
- I vaguely remember Zipf's law but it had slipped my mind. I just checked a couple of lists and the only word I'd missed was "information", which seems a legitimate enough target for the false positives to dominate the errors. I've deliberately omitted words such as Be, which are or redirect to dabs and will be picked up by WikiProject Disambiguation. The words are now being checked: just one today; no false positives. Certes (talk) 17:39, 19 July 2021 (UTC)
Precious anniversary
editFive years! |
Orphaned non-free image File:Sheikh Zayed Book Award medal.jpg
Thanks for uploading File:Sheikh Zayed Book Award medal.jpg. The image description page currently specifies that the image is non-free and may only be used on Wikipedia under a claim of fair use. However, the image is currently not used in any articles on Wikipedia. If the image was previously in an article, please go to the article and see why it was removed. You may add it back if you think that that will be useful. However, please note that images for which a replacement could be created are not acceptable for use on Wikipedia (see our policy for non-free media).
Note that any non-free images not used in any articles will be deleted after seven days, as described in section F5 of the criteria for speedy deletion. Thank you. --B-bot (talk) 17:38, 8 August 2021 (UTC)
www.independent.co.uk
GreenC bot seems to have blacklisted all links to https://www.independent.co.uk/, causing IABot to repair all links to that domain, despite most still being alive. See e.g. this. Could this be "fixed" somehow? Jonatan Svensson Glad (talk) 01:26, 9 August 2021 (UTC)
- This is mostly undone it will be finished in a few hours. Started rolling back yesterday after the problem was noticed. (It's not all links, but a lot). It was caused by a wrong header where they were returning 406's but the page was 200 (but only for the bot not web browser) - whatever caused it has since stopped since they now return 200 correctly. Some live URLs may have gotten archived by IABot, I don't have a way to determine which. -- GreenC 01:45, 9 August 2021 (UTC)
Blacklisted
Not related to your bot, but do you know how to unmark the domain gothamist.com as dead on IABot? The websites are alive, but it seems the domain is blacklisted and being tagged as dead. Jonatan Svensson Glad (talk) 20:40, 12 August 2021 (UTC)
- Yup, that's an IABot admin action, and I am one. Done. It had been blacklisted since 2017. -- GreenC 22:16, 12 August 2021 (UTC)
Your RFC
I'm pretty annoyed that you've chosen to misrepresent me and ask a completely pointless straw man question in your RFC. Whatever the "result", it will have no bearing on the edit I made. Wtqf (talk) 23:23, 12 August 2021 (UTC)
- I could tell right away there was something wrong with you... belligerence, anger, obvious deep knowledge of Wikipedia but a blank user page and a new account. I figured it was only a matter of time before you would be blocked, at least > 30 days, so I had no choice but to start the RfC. Well, it only took 24 hrs. You had "block permanently" written all over you. Next time, check your attitude; maybe your sock won't be so obvious. -- GreenC 14:15, 13 August 2021 (UTC)
Could I interest you in some more...
Could I interest you in one more peer review related task...? (Wikipedia:Bot_requests#Bot_to_repair_broken_peer_review_links)
Summary: a nearly completed bot exists but the owner went away. Old peer reviews didn't contain a fixed link to the peer review page, which means that over time, as pages are moved, the links get broken. The bot was designed to fix those links. There was one outstanding issue, which was that sometimes it would include a link twice in the output. Once that is fixed it can repair the rest of the 680 outstanding broken links. Would I be able to interest you in picking up and finishing this task...? :D Tom (LT) (talk) 01:31, 9 May 2021 (UTC)
- Tom, do you know if the source available somewhere that I could take a look? -- GreenC 16:59, 9 May 2021 (UTC)
- Ah, looks like the owner has resurfaced and there is a new bot RfA in the works (Wikipedia:Bots/Requests for approval/AWMBot 2). Hurray, and ignore my request! Tom (LT) (talk) 04:08, 10 May 2021 (UTC)
- Ok good! -- GreenC 15:22, 10 May 2021 (UTC)
- Never mind, the original creator has since retired. The bot's code is contained as a link in the bot request. I would be super grateful if you'd be able to take up the baton here - there's still 679 broken links (Category:Pages_using_Template:Old_peer_review_with_broken_archive_link) and having a functioning bot makes it much easier to maintain and repair them.Tom (LT) (talk) 01:34, 17 August 2021 (UTC)
- Tom, I looked at it. JavaScript is not a language I know and could not follow what it is doing. It might be good to first see if any JS programmers want to adopt it. A place they hang out is WP:SCRIPTREQ - it's scripts not bots but both are JS. Also, DannyS712 has made tons of JS bots and runs the scripts newsletter. -- GreenC 14:44, 18 August 2021 (UTC)
- No problem, thanks for your help to date. Tom (LT) (talk) 07:17, 19 August 2021 (UTC)
WikiProject SpaceX
Hi. Would you be interested in joining a WikiProject SpaceX? If you are, can you please make a WikiProject proposal for it? (As an IP, I can not make the proposal because I would be stopped when trying to create the proposal page.) @GreenC:
- You can sign up for an account, that is best. -- GreenC 03:00, 20 August 2021 (UTC)
Manual conversions?
Regarding your posts on converting archive references, such as [4]: did you mean "manual conversions"? isaacl (talk) 16:04, 22 August 2021 (UTC)
- Oh yeah, my ability to type lately has really gone downhill, one word thought and another comes out. I'll just blame the spell checker. -- GreenC 16:09, 22 August 2021 (UTC)
What should we do for cites from punesite dot com?
editInternet Archive bot froze
editIt seems that no jobs are running. Normal single page things are working, but no jobs are progressing or even starting. AManWithNoPlan (talk) 20:45, 26 August 2021 (UTC)
- Aware, thanks. -- GreenC 21:11, 26 August 2021 (UTC)
thanks, did it
editHi, GreenC. Thanks for your help here https://en.wikipedia.org/wiki/Wikipedia:Reliable_sources/Noticeboard/Archive_350#Can_plausible%2C_unsourced_statement_rely_on_sourced_statement%3F. Did go ahead and add the citation needed notation. Felt pretty good like the section is stronger now. Greg Dahlen (talk) 19:54, 31 August 2021 (UTC)
Master Reference sourcer
Hi, you seem to be a person that I could ask...
Has anyone ever discussed trying to create a "single" reference place? Such that articles would not have "individual" references, but be able to find, then tag them from a reference repository? (One that would also mark "BAD"/invalid references) - a centralized (maybe project-based) place to put standard "named" references. A template '<nref name="Green" />' would "look up" a named reference. Though I am loath to create a "reference editor" permission, maybe limit who is allowed to change the "master reference"? (maybe based on "# of pages affected").
I am wondering if the "Cite" function of the WikiText Editor could:
1a) look up the url, (in the "repository")
1b) display a warning if "designated invalid",
1c) if valid, then insert an "nref" (with shortname already on file)
2a) if "new" URL, place it in the repository,
2b) request a "shortname"; checking that is does not already exist..
2c) then insert the nref
3) a "dead" URL would still be searchable, but could be tagged in the "repo" with a "substitute"?
A side benefit would be to make a LOT of articles "much" easier to edit (lack of long references interrupting text), and "much" easier to "alter" a reference when broken (or archived)? Mjquinn_id (talk) 13:34, 8 September 2021 (UTC)
- Yeah, that's basically the vision of WikiCite, and recently there was a proposal to build such a database, but the Wikimedia Foundation declined to fund the project. -- GreenC 15:53, 8 September 2021 (UTC)
Thanks for responding! "What is the Best AI model for Content Moderation on Wikipedia?"
Dear @GreenC: Thanks so much for your participation in our discussion post "What is the Best AI model for Content Moderation on Wikipedia?" in Village Pump! Your insights are really appreciated! I wonder if you would have time to engage a bit more with us and other interested editors on this topic? We plan to host an online zoom discussion session soon to invite editors to discuss further with us and with each other -- of course you can turn your camera off if you want :) If you're interested, please get in touch and I will send you (and others) a WhenToMeet link to schedule the session! Thanks so much again for your time! Bobo.03 (talk) 03:32, 16 September 2021 (UTC)
usurped domains
editHi. When you mark domains as usurped, can you please tell me what, if any, crosswiki processes are then taking place? I need to understand this so we can better document things for the global spamblacklist, and how and when we add domains so as not to impact any processes that are occurring. It also gives me an indication of where I may need to ping admins from other wikis about repair work that they need to do. Thanks. — billinghurst sDrewth 08:08, 19 September 2021 (UTC)
- Hi billinghurst, thanks for asking. I wish IABot had the capability to mark domains usurped. The best we can do is mark them permadead in the IABot database. It requires an IABot admin action: request "permadead" at meta:User_talk:InternetArchiveBot. This results in the bot treating every URL it encounters as if it were a dead link: it doesn't check if the link is alive or dead, it just assumes dead and acts accordingly, i.e. adds an archive URL or {{dead link}}. That's all it does; it doesn't flip the |url-status=usurped or collapse {{webarchive}} into a square link. IABot is currently running on about 120 wikis (the largest sites volume-wise), excluding dewiki and frwiki notably. For enwiki, I still recommend notifying WP:URLREQ as you have done, because WaybackMedic can do usurped properly. Since I am also an IABot admin, I can also permadead links through that channel. I've always thought it would be a good idea to document usurped domains globally somewhere, so tool makers now and in the future have a list to work from. -- GreenC 19:28, 19 September 2021 (UTC)
- Thanks for the detail. For me it sounds that the best solution, where there are significant links, is to simply use the URLREQ page, leverage your helpfulness and let all the magic happen universally, and then catch up with it in a week to add to the global blacklist. If a link is hardly used, then I am typically manually fixing and blacklisting. Bummer that we cannot universally kill it as usurped, c'est la vie. As a note, my documentation will either be on the talk page for the global blacklist or on the xwiki report at meta. If you are unaware of our search function, a report like m:User:COIBot/XWiki/worldaffairsjournal.org has a "find entry" query for a domain, and if you ever need to check a xwiki report I maintain a specific search form on my metawiki user page. Thanks for the work that you are doing. — billinghurst sDrewth 23:27, 19 September 2021 (UTC)
- Those are interesting reports by COIBot. -- GreenC 02:54, 21 September 2021 (UTC)
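For illustration, this is roughly what the distinction described above looks like in a citation (a hypothetical example; the path, title and dates are placeholders, and the domain is the one from the COIBot report above). IABot's permadead treatment only ensures an archive URL or {{dead link}} gets added, while marking the link usurped in the citation itself, as WaybackMedic can, looks something like:
{{cite web |url=http://worldaffairsjournal.org/example-article |title=Example article |archive-url=https://web.archive.org/web/20150101000000/http://worldaffairsjournal.org/example-article |archive-date=1 January 2015 |url-status=usurped}} <!-- hypothetical: with usurped, as I understand CS1 behaviour, only the archived copy is linked in the rendered citation -->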
A barnstar for your efforts
editThe Anti-Vandalism Barnstar | ||
Awarded for you recent efforts in reverting vandalism done across articles relevant to WikiProject Socialism. Awarded by Cdjp1 on 22 September 2021 |
- @Cdjp1: thanks! I'm curious about that, as it wasn't clear if the user was a good-faith incompetent or a vandal; it could have gone either way, difficult to determine. They seem to have quit editing (with that account) after I reverted most of their "work". I suspect there may be other socks that will show up elsewhere. -- GreenC 19:12, 22 September 2021 (UTC)
- @GreenC: yeah, due to the unhelpful results in most of the edits this star seemed the most appropriate, even if it turns out the actions weren't explicitly malicious. -- Cdjp1 (talk) 07:53, 23 September 2021 (UTC)
Disambiguation link notification for September 23
editAn automated process has detected that when you recently edited Mac Ross, you added a link pointing to the disambiguation page Hypoxia.
(Opt-out instructions.) --DPL bot (talk) 05:57, 23 September 2021 (UTC)
Does your bot fix author parameters?
editMy Zotero translator for The Straits Times just went live on Citoid (even though it was in the Zotero repo for two months already, and the last update to the Citoid service with the submodule was a month ago. Hah). Previously, Citoid would put 'herms' or 'hermsauto' as the author name, and many a time this value is included in the references. One example is at Soh Rui Yong. The translator fixes that issue and puts in the correct author names. Can your bot be run to update the author parameters (i.e. first=, last=, etc.) of all Straits Times references that contain 'herms' or 'hermsauto' as the author? – robertsky (talk) 07:00, 26 September 2021 (UTC)
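To make the request concrete, here is a hypothetical before/after of the kind of reference being described (the story URL, title and author name are invented for illustration, and the exact parameter holding 'herms'/'hermsauto' varies):
{{cite news |url=https://www.straitstimes.com/sport/example-story |title=Example story |last=hermsauto |work=The Straits Times}} <!-- before: Citoid filled the author parameter from the wrong meta tag -->
{{cite news |url=https://www.straitstimes.com/sport/example-story |title=Example story |first=Jane |last=Tan |work=The Straits Times}} <!-- after: the actual byline author (a made-up name here) -->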
- This would be good to request at WP:AWBREQ as it's a mass search-replace. -- GreenC 13:23, 26 September 2021 (UTC)
- Hmm... I think I will try to tackle this on my own first then. I have been wanting to learn to extend the functionality of AWB through custom code. – robertsky (talk) 02:08, 27 September 2021 (UTC)
- Whether a bot or AWB is used, would this be a good place to attach a related problem? I see a lot of online works attributed to prolific authors such as Mr. Privacy Statement, Ms. Cookie Policy and Dr. Submitted Content, which have clearly been scraped in a semi-automated way from the website. Certes (talk) 14:03, 26 September 2021 (UTC)
- Ah, that is funny. It probably should be reported at Help talk:Citation Style 1, as they could track them in a category and flag them with a red error message. For bots and AWB the problem is what to replace the names with; it might require a manual fix. And which processes are responsible? Maybe it's one bot, though likely VE. -- GreenC 14:32, 26 September 2021 (UTC)
- I have added the three names to the list of bogus names, see Help_talk:Citation_Style_1#Unlikely_authors
- If there are more such patterns please report them at CS1 talk.
- What about these "herms" and "hermsauto" strings mentioned above? How frequent are these errors?
- --Matthiaspaul (talk) 19:01, 26 September 2021 (UTC)
- Matthiaspaul, this is only seen for Straits Times citations so far, and should affect the majority of the Straits Times citations for the last ten years that were populated through Zotero in Citoid, in VE. Being the newspaper of record for Singapore, without looking into the data, I would say that the majority of Singapore-related articles could be affected.
- This is because the default translator prefers one meta tag over another. The preferred meta tag would return 'herms' or 'hermsauto', while the actual author values are in other meta tags, except for expert opinions, which are in the bylines. – robertsky (talk) 02:06, 27 September 2021 (UTC)
- Okay, thanks for the explanation. If I understand you correctly, "herms" and "hermsauto" are not some kind of universally, semantically impossible values for a name (like putting a date in when asked for an author name); they are just incorrect in specific cases (although possibly even systematically). Then it sounds like an error that we cannot fix by putting it into the blacklist of names in the CS1/CS2 citation templates, and it would be better fixed by humans, possibly script- or bot-assisted.
- --Matthiaspaul (talk) 08:04, 27 September 2021 (UTC)
Test
editTesting unsigned helper.
phab:T36928 Create a user right that allows ignoring the spam ...
editYou may wish to add comment to the ticket. Maybe my opposition should be rescinded. — billinghurst sDrewth 04:37, 22 October 2021 (UTC)
A barnstar for you
editThe da Vinci Barnstar | ||
For your work on WayBackMedic and Wikipedia linkrot Rlink2 (talk) 01:28, 25 October 2021 (UTC) |
Thanks
editThe Barnstar of Integrity | ||
For your help, I appreciated it. Lightburst (talk) 14:28, 4 November 2021 (UTC) |
Intrepidity in the face of adversity
editOrder of the Garter | |
Thank you for your service. Honi soit qui mal y pense. 7&6=thirteen (☎) 15:04, 5 November 2021 (UTC) |
Wargame coverage
editThanks for the heads up on my talk page, old magazine articles like the one you found are always welcome additions. :) I am not a wargamer myself, but as I work my way through old magazine reviews I have definitely done my part to improve our coverage of wargames, along with the generous help of User:Guinness323. I also have been working on User:BOZ/Games deletions which incorporates wargames into the "Board and miniature games" section, in case you see anything there that looks worth taking a stab at recovering. BOZ (talk) 23:57, 6 November 2021 (UTC)
- Yes, I have seen User:BOZ/Games deletions, an impressive list and much needed; game stuff can be hard to source. -- GreenC 09:00, 8 November 2021 (UTC)
It might be a bit of a stretch, but do you see any additional sources for Draft:Phoenix (wargaming magazine)? BOZ (talk) 16:26, 28 November 2021 (UTC)
- As a follow up to that, hopefully the total rewrite on The Longest Day (game) helps. :) BOZ (talk) 14:27, 12 December 2021 (UTC)
- BOZ Oh cool. Yeah, I had not noticed. Interesting, I never knew about the problems with historical accuracy, thanks Wikipedia. -- GreenC
- Yeah, it was just rewritten earlier today by a hardworking editor who has done lots and lots of work on wargame and other articles. :) BOZ (talk) 22:34, 12 December 2021 (UTC)
Do you know of any sources to help get Draft:Keith Poulter back to article space? BOZ (talk) 04:07, 14 December 2021 (UTC)
link rot and actions by your bot
editRegarding this edit:
https://en.m.wikipedia.org/wiki/Special:MobileDiff/1054040399
What was not correct about the removed url-status parameter? Why was it removed? Can you somehow access the page while I can't, no matter where I try from?
Or is there more to this than I know? I remember reading the whole 'link rot' procedure before adding that parameter... Can you explain what I did wrong? It would be nice if your bot added some explanations; even very basic ones would go a long way in helping educate, rather than just doing things without even a hint at preventing similar "issues" in the future.
I'm all for automation but... behaviors that have something to do with cleaning up mistakes or misuse of something should, must (in my opinion), have an explanation of the rationale behind them. I have issues just "trusting" a bot that fixes stuff based on "something" while having no possibility of verifying/understanding/overseeing what has been done and for what reason.
I hope you understand my concern and that you can explain, so I can better my knowledge and refine my usage of templates. We will all benefit from it, after all.
Thanks Oldfart404 (talk) 21:59, 7 November 2021 (UTC)
I guess the URL isn't 'dead' per se. Just... 404. It could maybe be archived, but I thought I tried finding it before adding the parameter. Maybe not, I can't check at the moment. Oldfart404 (talk) 22:03, 7 November 2021 (UTC)
|url-status=dead is a common source of confusion. Its one and only purpose is to be paired with |archive-url=: if |url-status=dead, then the URL in |archive-url= is displayed first in the citation; if |url-status=live, the URL in |url= is displayed first. That's all it does and is meant for: a flag to tell the template which URL to display first. If there is no archive URL available, or known, you would add a {{dead link}} following the citation to mark it as a dead URL. -- GreenC 00:07, 8 November 2021 (UTC)
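A minimal illustration of the two cases (a hypothetical citation; the URLs and dates are placeholders):
{{cite web |url=http://example.com/page |title=Example page |archive-url=https://web.archive.org/web/20200101000000/http://example.com/page |archive-date=1 January 2020 |url-status=dead}} <!-- archive known: the archive link is displayed first -->
{{cite web |url=http://example.com/page |title=Example page}} {{dead link|date=November 2021}} <!-- no archive known: tag the citation instead of setting url-status -->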
Thanks for the reply. I guess I understand. One more thing... have you checked the URL at all? Can you access it? Assuming it's truly dead and can't be found in archives (I couldn't find it, at least), would fixing a template parameter misuse without correcting it, rather than updating to the more relevant {{dead link}} template, be better than leaving somewhat-relevant misuse of the template? It seems to me that the fix only pushed the problem into the ether, hoping someone (if not the original editor, aka me in this case) would eventually go ahead and fix it. I understand bots can be limited in some ways with regard to what they can do versus the ease of programming for each and every possible case. That being said, is there a way to possibly mitigate that and have the 'proper' template applied in the case I described? Does this particular bot log somewhere and have its edits reviewed, or are they just assumed to be 'best' or 'better than' whatever it fixed? I don't mean to be an ass here. Genuinely trying to understand. I also see the link has been updated with an archived URL now. Still, I don't get how it's possible to use an archived page from a different base domain name as an alternative to what 'was', without assurance of it being what was intended. Sorry for the questions... I guess I just want to understand and do the right thing. Of course, it helps if you check the stuff I refer to by looking at the URLs, which I'm sure you do. Don't know why I felt like saying that. Thanks! -- Oldfart404 (talk) 09:10, 8 November 2021 (UTC)
- I did think about that, but the problem is the bot doesn't know who added the stray url-status, or why, or when. It would need to determine independently if the URL is dead. It can do that in a pinch, but ideally it would be done by IABot, which has a better system for dead-link detection. So it leaves it alone, and the next time IABot runs it should detect the dead link and add an archive (as happened here). Nevertheless, I will continue to consider it; I need to be careful of false positives. I suppose if it gets a clear 404 then it can safely assume the URL is dead and add an archive URL. Do not rely on this, however; my bot runs very intermittently and might never see it. -- GreenC 10:59, 8 November 2021 (UTC)
The MAE-East thing
editSo, I was successful at getting more information, but I'm not sure it solves our problem, since it's not citable. If you feel like discussing, my contact info is on my user page. Bill Woodcock (talk) 00:15, 17 November 2021 (UTC)
Template question
editAre Template:Calendar date/holidays/Hanukkah-test.js and Template:Calendar date/holidays/Yom HaShoah.js used somewhere? Gonnym (talk) 15:21, 21 November 2021 (UTC)
Gonnym The live data for Yom HaShoah is at Module:Calendar date/localfiles/Yom HaShoah, and Hanukkah-test appears to be a test. They look like old files made during template development, or ones that were later deprecated. -- GreenC 16:07, 21 November 2021 (UTC)
ArbCom 2021 Elections voter message
editWould it be possible to create something like this for WP:FA, as well? I'm one of the FAC coords, and we recently had a situation crop up where we had a mismatch that proved somewhat difficult to track down manually. Hog Farm Talk 06:22, 1 December 2021 (UTC)
- GA and FA have similar pages, so it should be possible to adapt gambot into a fambot. The main thing is that the syntax of entries in Wikipedia:Good articles/all is different from Wikipedia:Featured articles. To get it approved I would actually have to make test edits over a period of weeks, otherwise it could be rejected as not really needed. It sounds like your mismatch was an edge case. Do you think this would frequently find and report problems? With GA it seems to with every run. -- GreenC 07:19, 1 December 2021 (UTC)
- I'll go ahead and ping the other FA coord groups - @FAC coordinators: , @TFA coordinators , and @FAR coordinators: to see their thoughts. I don't believe it would catch things all that often - I'm aware of two or three instances of ones not having the star when needed this year, and about the same number of ones that weren't FAs getting tagged with it. A monthly run would probably be sufficient, due to lower volume at FAC. The two reasons why there are some with GA most runs are that 1) the FAC process has a very reliable bot, the GA process has a notoriously buggy one and 2) the manual parts of GAN are done by the nominator, who is frequently less experienced with the process, leading to mistake, while for FAC/FAR, the process is done by experienced coords. Hog Farm Talk 14:21, 1 December 2021 (UTC)
- My 2c: I think it would be useful to have this bot, because otherwise it takes more manual work to find any mismatches. With a bot we can immediately tell that someone has put a star on an article where it doesn't belong. However, I think you're right that it would not be used as much as with GAN. (t · c) buidhe 14:46, 1 December 2021 (UTC)
- What Buidhe said. Gog the Mild (talk) 15:10, 1 December 2021 (UTC)
- Hog Farm et al.: Sounds like there is demand. I would like to try it. Right now I am in the middle of a complicated migration of Billboard URLs, and afterwards will have time to look into starting a new project. In the meantime you could create a page similar to Wikipedia:Good articles/mismatches, such as Wikipedia:Featured articles/mismatches (hmm, already exists). Hog Farm, I am going to email you about something. -- GreenC 19:56, 2 December 2021 (UTC)
- Any further work that needs done on the FA mismatches page? I can do page cleanup for that if necessary. Hog Farm Talk 20:22, 2 December 2021 (UTC)
- Yes, as close to the GA version as possible. Probably the same sort of layout and instructions. Actually, just the docs portion, as the bot will create the rest. -- GreenC 20:38, 2 December 2021 (UTC)
- Done, barring a few tweaks like some wording changes, updating the petscan query from GA to FA, and the frequency of runs. Hog Farm Talk 21:50, 2 December 2021 (UTC)
The Teamwork Barnstar | ||
To AmericanLemming for seeing the problem and the idea for a solution; to Hog Farm for grabbing the bull by the horns and running with it; and to GreenC for setting up fambot to create the report at Wikipedia:Featured articles/mismatches that will save all of us many hours of searching for that missing or extra star populating the Featured article categories. Thank you for the speedy solution to a problem spanning more than a decade. SandyGeorgia (Talk) 23:50, 8 December 2021 (UTC) |
- Thank you, User:SandyGeorgia. Glad it will be of use! There is a way to make it so you can run the report on demand, if you want the report right away, but it takes a little work to set up; it just depends how much you think it would help vs. waiting for the weekly report, or increasing to 3x a week or something. Gambot runs 3x a week. -- 15:32, 9 December 2021 (UTC)
- I suspect what we have now will be just fine. Thanks!! SandyGeorgia (Talk) 16:03, 9 December 2021 (UTC)
Nomination for deletion of Template:Calendar date JavaScript subtemplates
editHi, just to let you know that I've nominated the following two templates that you authored for deletion, as they appear to be unused:
Please see the discussion at Wikipedia:Templates for discussion/Log/2021 December 4#Calendar date JavaScript subtemplates. Best — Mr. Stradivarius ♪ talk ♪ 13:57, 4 December 2021 (UTC)
InternetArchiveBot bug on fawiki
editHi, I have reported a bug in this bot on the Persian Wikipedia in your Telegram group. I mentioned it here because I thought you might not see it there in the near future. Thanks Mojtabakd «talk» 09:12, 19 December 2021 (UTC)
Songs of the season
editHoliday cheer | ||
Here is a snowman, a gift, a boar's head and something blue for your listening pleasure. Enjoy and have a wonderful 2022, GC. MarnetteD|Talk 09:47, 19 December 2021 (UTC)
Thank you!
editThe da Vinci Barnstar | ||
For untangling the Australian Web Archive URLs. ClaudineChionh (talk – contribs) 21:58, 21 December 2021 (UTC) |
Thank you for bringing some light to it as well. I feel that da Vinci would agree it is the world's most difficult archive URL scheme (I keep finding more edge cases). -- GreenC 22:54, 21 December 2021 (UTC)
Merry Christmas!
editBOZ (talk) is wishing you a Merry Christmas! This greeting (and season) promotes WikiLove and hopefully this note has made your day a little better. Spread the WikiLove by wishing another user a Merry Christmas, whether it be someone you have had disagreements with in the past, a good friend, or just some random person. Don't eat yellow snow!
Spread the holiday cheer by adding {{subst:User:Flaming/MC2008}} to their talk page with a friendly message.
I'm wishing you a Merry Christmas, because that is what I celebrate. Feel free to take a "Happy Holidays" or "Season's Greetings" if you prefer. :) BOZ (talk) 20:19, 22 December 2021 (UTC)
- BOZ, thank you! Look forward to a constructive 2022. -- GreenC 16:22, 23 December 2021 (UTC)
Scripts++ Newsletter – Issue 22
editHello everyone, and welcome to the 22nd issue of the Wikipedia Scripts++ Newsletter. This issue will be covering new and updated user scripts from the past seven months (June through December 2021).
Got anything good? Tell us about your new, improved, old, or messed-up script here!
- Ahecht:
- draft-sorter sorts AfC drafts by adding WikiProject banners to their talk pages. It supersedes User:Enterprisey/draft-sorter, adding a few features and fixing some bugs.
- massmove, a modified User:Plastikspork/massmove.js that adds a link to the left column, allows adding and removing both prefixes and suffixes.
- watchlistcleaner removes missing pages (redlinks), redirects, pages you haven't edited recently, and/or pages you've never edited from your watchlist.
- Awesome Aasim:
- Infiniscroll adds infinite scrolling to user contributions, page histories, and log pages.
- Quick create allows for the fast creation of red-linked pages with two clicks.
- Caburum:
- UTCclock adds a clock displaying the current UTC time.
- Chlod:
- CopiedTemplateEditor, mainly for CCI case handlers, allows graphically editing a talk page's {{copied}} templates.
- DaxServer:
- BooksToSfn adds a portlet link in Visual Editor's source mode editing, in main namespace articles or in the user's Sandbox. When clicked, it converts one {{cite book}} inside a <ref>...</ref> tag block into an {{Sfn}}.
- FlightTime:
- OneClickArchiver is a custom version of User:Technical_13/Scripts/OneClickArchiver which doesn't prepend {{Clear}} to the top of each section on the archive page.
- Jon Harald Søby:
- diffedit enables editing directly from viewing a diff "when, for instance, you notice a tiny mistake deep into an article, and don't want to edit the entire article and re-find that one line to fix that tiny mistake".
- warnOnLargeFile warns you if you're about to open a very large file (width/height >10,000px or file size >100 MB) from a file page.
- JPxG:
- PressPass adds a collection of tools for Newspapers.com including configurable automatic citation generation in five different formats.
- CurrentSwitcher gives you links on the contribs page to hide duplicate entries, current revisions, rollbacks, huggles, twinkles, and redwarns.
- TrackSum lets you automatically sum the lengths of tracks in templates like {{track listing}} and get total runtimes.
- Nardog:
- CopySectLink adds a button to copy the unencoded page title or section path next to each heading.
- IPAInput allows you to type in IPA symbols by directly looking at an IPA key like Help:IPA/English and clicking on the symbols.
- TemplatePreviewGuard warns when you try to use "Preview page with this template" with a page that doesn't transclude the template.
- NguoiDungKhongDinhDanh:
- ContribsTabVector adds "Contributions" and "Statistics" tabs to user and user talk pages on the Vector skin.
- CopyvioChecker adds a "CopyvioCheck" tab to all pages, except Special (Vector skin only).
- LiveDiffLink is a version of Equazcion's LiveDiffLink which shows a wikilink instead of a URL.
- QuickDiff (by OneTwoThreeFall at Fandom) lets you quickly view any diff link on a wiki, whether on Recent Changes, contribs pages, history pages, the diff view itself, or elsewhere. For more information, view its page on Fandom.
- Novem Linguae:
- DetectSNG scans a list of 1,600 SNG keywords and displays them at the top of the article.
- NotSoFast highlights recently created articles in the new pages feed, to discourage patrolling them too quickly.
- UserRightsDiff concisely displays what perm was added or removed when viewing Special:UserRights.
- VoteCounter displays a rough count of keeps and deletes at XFDs, RFCs, etc.
- WatchlistAFD automatically watchlists the AFDs of any pages you AFC accept or NPP patrol, to help you calibrate your reviewing.
- P.T.Đ:
- TwinkleMobile enables Twinkle on mobile view (Minerva skin).
- Qwerfjkl:
- editRedirect adds a → link after redirects to edit them.
- RegExTypoFix, a script for fixing typos, is a wrapper for User:Joeytje50/RETF.js.
- talkback creates links after user talk page links like this: |C|TB (with the first linking to the user's contributions, and the latter giving the option of sending a {{talkback}} notice). It also adds a [copy] link next to section headers.
- Rublov:
- diff-link shows "copy" links on history and contributions pages that copy an internal link to the diff (e.g., Special:Diff/1026402230) to your clipboard when clicked.
- Rummskartoffel:
- auto-watchlist-expiry automatically watchlists every page you edit for a user-definable duration (you can still pick a different time using the dropdown, though).
- generate pings generates the wikitext needed to ping all members of a category, up to 50 editors (the limit defined by MediaWiki).
- share ExpandTemplates url allows for easy sharing of your inputs to Special:ExpandTemplates. It adds a button that, when clicked, copies a shareable URL to your exact invocation of the page, like this. Other editors do not need to have this script installed in order to access the URL generated.
- show tag names shows the real names of tags next to their display names in places such as page revision histories or the watchlist.
- Tol:
- VisualEditor Citation Needed adds a button (under "Insert") in VisualEditor to add a {{citation needed}} tag.
- Venkat TL:
- ColourContrib color-codes the user contributions page so that pages you've edited last are sharply distinguished from pages where another editor was the last to edit the page.
- Vukky:
- StatusChanger is a fork of Enterprisey's Status Changer which adds a UI to the script (using Morebits, so you'll need to have Twinkle enabled to use it).
All in all, some very neat scripts were written in these last few months. Hoping to see many more in the next issue -- drop us a line on the talk page if you've been writing (or seeing) anything cool and good. Filling in for DannyS712, this has been jp×g. Take care, and merry Christmas! jp×g 07:30, 24 December 2021 (UTC)
Merry Christmas!
editSeason's greetings and Merry Christmas to you and your family. Have a wonderful holiday season. Cheers! RV (talk) 03:14, 25 December 2021 (UTC)
Numberof typo
editSeason's Greetings! I had a look at c:Data:Wikipedia statistics/meta.tab for something I'm playing with and noticed a typo in the "Data source" line at the bottom: "Calculted" should be "Calculated". You might also think about changing "auto" to "automatically", although it's a little longer. Johnuniq (talk) 02:45, 26 December 2021 (UTC)
- err.. indeed. Fixed in production and Git. Thanks! -- GreenC 03:12, 26 December 2021 (UTC)
Happy New Year!
editThanks for your contributions to Wikipedia, and a Happy New Year to you and yours! 7&6=thirteen (☎) 13:56, 1 January 2022 (UTC)
- – Send New Year cheer by adding {{subst:Happy New Year}} to user talk pages.
Hello, GreenC,
It really helps admins if you use Twinkle to tag pages for deletion. When you select CSD>G4 from the drop-down Twinkle menu, there is a field to place a link to the deletion discussion, so there is no need to write long, explanatory edit summaries; there is a direct link in the CSD tag that admins can use to look at the deletion discussion. It really makes things easier and admins are very familiar with its features. Plus it has a lot of other great features you can discover on your own. Thank you! Liz Read! Talk! 03:36, 3 January 2022 (UTC)
- Though, as I read over the G4 explanation, it is for the RECREATION of content deleted in an AFD deletion discussion, and this content is old while Wikipedia:Articles for deletion/Dark Ages (Europe) just closed today, so it is not a recreation of deleted content. I am removing the tag, and I recommend you nominate it for deletion at WP:MFD, where I think you'd get a favorable response. Liz Read! Talk! 03:41, 3 January 2022 (UTC)
- Same goes for User:Kauffner/Europe in the Dark Ages. Liz Read! Talk! 03:48, 3 January 2022 (UTC)
- Oh, I see. Thank you! -- GreenC 05:53, 3 January 2022 (UTC)