User talk:Dispenser/Checklinks/Archive 1


Readability page

What do the highlighted red words signify? (e.g. here) Harryboyles 06:17, 6 January 2008 (UTC)

They're for debugging the sentence counter which just looks for periods. —Dispenser (talk) 08:17, 6 January 2008 (UTC)

Add a separate column, and/or a button that says "details" which, when pressed, expands the link as clicking anywhere does now. —Preceding unsigned comment added by Nergaal (talkcontribs) 04:10, 16 January 2008 (UTC)

Thanks for the direction; I've been trying to keep the interface as clutter-free as possible, and I think the icon is the necessary hint. But we'll have to see in the logs (it's been 13 hits out of 100 utility hits with the message). —Dispenser (talk) 03:45, 18 January 2008 (UTC)

monobook script

Hi. The monobook script doesn't seem to be working for me. It doesn't appear in either Firefox or IE7. I am looking in the right place, right? At the toolbox on the left? Matthew | talk | Contribs 01:02, 12 February 2008 (UTC)

  Done I had been loading the function dynamically before. It should show up below the search box, labeled "Check external links". — Dispenser 03:09, 12 February 2008 (UTC)
It does. Good job! -- Matthew | talk | Contribs 03:54, 12 February 2008 (UTC)

Readability tool

Sometimes the tool comes back with statistics fairly quickly (for example, Introduction to evolution, Bees and toxic chemicals, and Dog), but other times it seems to be slow, so slow that it might be broken (Evolution, for example). What is going on? Are those articles just too complicated? Is something else wrong?--Filll (talk) 01:23, 20 February 2008 (UTC)

  Fixed I made an optimization in template removal that I shouldn't have. I do not put much credence in the tool and have ceased any serious development on it. The problems begin with the syllable counter, which doesn't use a dictionary or any known algorithm. The readability formulas were based on their respective Wikipedia articles, which contain errors, are simplified, and/or were incorrect. Additionally, the readability scores have a standard deviation of roughly 1½ per interval, i.e. accurate to within ±1.5 for 68% of people. — Dispenser 05:58, 20 February 2008 (UTC)

Featured Article Candidates

Dispenser,

First, congratulations on a great tool; very useful.

Now the bad bit ;) Where does the tool get the FAC list from? I ask because it doesn't seem to be up-to-date for the list of all current candidates. Is it a manual job to update it? Cheers. Carré (talk) 11:47, 6 March 2008 (UTC)

It runs automatically starting at 5:00 UTC, using the category list created from the /config template. It takes the HTML output of the page and runs a regex on it to get the pages from the linked headers. That part has been working for some time now. However, it seems as though there is a caching issue somewhere, as it keeps getting a 1½-month-old version of the page. I've changed the address to the purge page in hopes that this will resolve the issue. It will solve it in the short term; we will see if it fixes the problem six months from now. — Dispenser 04:01, 7 March 2008 (UTC)

KeyError

When trying to do a check on a page with an unusual character (such as é), the python script gets confused and throws up a KeyError. — Wackymacs (talk) 16:52, 14 April 2008 (UTC)

  Fixed Thanks! — Dispenser 03:48, 20 April 2008 (UTC)

Error checking Degrassi: The Next Generation

I've just noticed this error that was not occurring in the last few days.

It repeatedly brings back both GLAAD links as not working, however, when I clicked on them to check myself, they do work. Cheers! -- αŁʰƏЩ @ 03:18, 29 April 2008 (UTC)

I am unable to duplicate your results; I have found other bugs, but both GLAAD links continue to come up with rank 0. Perhaps it was a server issue, or it happened during the weekend development of the tool. — Dispenser 02:59, 30 April 2008 (UTC)
Yup. Just ran it 3 times, and it's all fine. Thanks for checking though. -- αŁʰƏЩ @ 03:08, 30 April 2008 (UTC)

Suggestions

I don't really think a sortable table would help that much. A big pink notice that a site is one of the host sites like members.aol.com/geocities/etc. might be nice, but it's so easy now to get a domain name that it's easy to hide if you know what you're doing. Being generally clueless about the sort of programming that you can do with Wikipedia, would it be possible to highlight if the word "blog" was on the page? Or other words that should throw up a red flag? If this isn't possible or would be too hard, I totally understand. In the totally dreaming realm I'd also love something that would look at the list of refs, see if they are using cite web, and check that they have the publisher and last access date filled in, so that I can easily pull up a list of citations missing those two parameters. That's easily the one thing that gets dropped the most. Ealdgyth - Talk 15:45, 14 April 2008 (UTC)

I've changed how templates are handled so they're more flexible. It will display the {{cite web}} information in single {} and italics. Is this alright? The blog thing seems hard to do, since it isn't easy to associate a single link with a word that appears outside that link (e.g. the intro talks about somebody's blog, and it's in reference to link number 10). — Dispenser 03:48, 20 April 2008 (UTC)
By the way, it's working wonderfully for me now. I love the fact that I can see the root domain name, that helps SOOOO much! Thanks for all the work you do on this. It's very much appreciated. Ealdgyth - Talk 14:50, 18 May 2008 (UTC)

False deadlinks?

Someone recently ran the tool against an article and it reported a dead link [1]. I manually checked the link in question, and it is good. I also tried to run the tool, and got the same false reading of a deadlink. The link is on the New York Times website, I'm wondering if they might be filtering traffic of this nature? Yngvarr (c) 12:53, 18 May 2008 (UTC)

I checked the page earlier today and the link in question did not show up with a red row. I suspect that you misinterpreted the addition of «dead link» to the title; I have since changed to the more traditional {{dead link}} format. I suspect that the user who made the edit in question had merely played around with the options. An alternate possibility is that the NYT site was down. I will need to add a history mechanism in the future. — Dispenser 05:29, 19 May 2008 (UTC)

How to run it?

There's no simple, one-stop explanation of how to run it. Philcha (talk) 16:09, 17 June 2008 (UTC)

I've added a {{nutshell}} to the top of the page, but basically you're supposed to figure it out from the examples box, which fills in various pages. — Dispenser 04:05, 30 July 2008 (UTC)

Readability - Simple English

I was in the process of checking an article for Simple English and noticed the option of checking that wiki was removed. I was wondering if this was a permanent change or if I was just imagining that it even happened. The tool is extremely handy for checking the level of our VGA (simple's version of FAs) candidates and I hope we can continue to use it. Creol (talk) 09:41, 6 September 2008 (UTC)

I had forgotten to update it when I was migrating all my tools to the interwiki link syntax. Since your posting I've ensured that the tool is coherent (links on simple don't go to English). You can check pages by using the syntax simple:music or by pasting in the URL, which it will convert for you. — Dispenser 18:48, 11 September 2008 (UTC)

Unwatching

Articles in my watchlist are unwatched when I run this, which is a bit annoying. Other than that, very good tool. --Closedmouth (talk) 05:07, 26 June 2008 (UTC)

I've added an option on the new Preferences page, but it hasn't been implemented in the backend yet. The options are only Remove all or Add all, since I don't have access to user information. — Dispenser 04:05, 30 July 2008 (UTC)
  Done It should work on all the tools now. — Dispenser 21:11, 21 September 2008 (UTC)

great service! correction bottom graph caption, please

"Distribution", not "distrAbution". Thanks. TONY (talk) 05:26, 18 July 2008 (UTC)

  Fixed Thanks, I fixed a few others, then removed the section as I realized it was mostly irrelevant. I need to place things contextually next time. — Dispenser 21:11, 21 September 2008 (UTC)

First, I will add my voice to the chorus of kudos. Nicely done.
I notice that the Checklinks resource does not recognize links generated as part of (at least some) transcluded templates. For instance, {{AMQ}} as used in WDEL; there are several such templates in broad use among radio station articles. This might be something to include in the documentation as a limitation. --User:Ceyockey (talk to me) 02:10, 14 September 2008 (UTC)

  Done I've added a section on the Internal workings. — Dispenser 21:11, 21 September 2008 (UTC)

Possible bug with duplicate ref content

[2] Two references named "GQ" appear in one paragraph. Someone using your tool converted the second name to autogenerated1. Does this look right? (As a separate issue, I've noticed editors putting quotes around ref names, so should autogenerated1 be "autogenerated1" when a name is created?) Gimmetrow 19:13, 18 October 2008 (UTC)

On the same note, it properly combined the ref named "bbc1". Wildhartlivie (talk) 19:34, 18 October 2008 (UTC)
This should be addressed to User:NicDumZ, since he's the author of that particular code. The two refs only differ by a space before "December". — Dispenser 01:06, 20 October 2008 (UTC)

Blacklisted URLs?

Why are certain URLs blacklisted? I was attempting to fix Schiller Institute when reflinks [sic] tripped over one of the URLs. --Adoniscik(t, c) 17:15, 23 October 2008 (UTC)

  Fixed Only www.jstor.org is blacklisted. The link in question resulted from an error that I made in the globalbadtitles when I made a mistake in converting to the new format. Read more about title blacklisting at DumZiBoT approval request. — Dispenser 18:17, 27 October 2008 (UTC)

Cite Web

Regarding edits today to the Tennis page: it tore out a number of "Cite Web" references in favor of an "old style" flat reference...? Is this standard? -- IrishDragon 05:35, 23 November 2008 (UTC) —Preceding unsigned comment added by IrishDragon (talkcontribs)

Bare URLs are not proper references, so it converted them to "flat URLs" and ran the reflinks bot script to add titles. If you wish to convert them to use cite web, you may want to use the webreflinks script. — Dispenser 06:57, 27 November 2008 (UTC)

The tool link is currently not working. Dr.K. (logos) 05:31, 3 December 2008 (UTC)

It's likely the result of the move to the new server. It's probably a disk caching issue, since it worked again when I reloaded it. — Dispenser 05:35, 3 December 2008 (UTC)
Thank you very much. It works after I followed your suggestion and reloaded it. Take care. Dr.K. (logos) 05:47, 3 December 2008 (UTC)
I'm still getting "404 not found" after both F5 and CTRL-F5. --Philcha (talk) 17:38, 8 December 2008 (UTC)
It seems to be a server configuration problem: the new server passes escaped URLs directly to the rewrite script. I suspect it is related to a configuration setting, as it was working at the beginning, but River has filed a bug and has since enabled a workaround. — Dispenser 02:21, 21 December 2008 (UTC)

Table formatting

Hi. I've noticed a problem with some of the changes checklinks has been making in articles that use a table formatting different than a basic wikitable. It's converting them back to the basic wikitable. WP:ACTOR has endorsed and incorporated a more stylized table than this basic one and there is no option allowed to avoid this when running the tool. You ran the tool on Mark Wahlberg which left the filmography table reverted to this, although it had been updated to the new format [3]. When I originally began using checklinks, it didn't do this, and I used it routinely. Could this be removed or converted to allow the exclusion of this table changing? Thanks. Wildhartlivie (talk) 18:32, 26 December 2008 (UTC)

They shouldn't be doing that: it interferes with custom skins, increases article size, doesn't automatically update, and is inconsistent with the rest of Wikipedia. The code they used was from the 21 June 2005 revision of {{prettytable95}}; the prettytable templates were deprecated in late 2005–early 2006. Some pages still have the old code substituted in, which is why the conversion code exists. Since the idea was neither well thought out nor well implemented, I've posted to the wikiproject about changing this. — Dispenser 22:12, 26 December 2008 (UTC)
I didn't report what was happening as a bug, but just as something the tool has started changing more recently. The change that was implemented by WP:ACTOR only changed the font size and the color of the table top - the filmography table itself has been in use since the beginning of the project, so it isn't like a drastic reworking has occurred from the original table used. A lot of projects use variations on tables in the course of their projects and in the past, when I ran the tool on those pages, wikitable changes weren't made. If projects aren't free to adapt changes in tables used by the projects, there is a problem, because something outside of those projects is dictating style. Meanwhile, as one of the handful of people who are consistently active in the project, I have to say, your recommendation for templates is a bit over my head. I don't really know what you are suggesting regarding them. I don't see a lot of difference between the markup that is being used and what is included in Help:Table. What I do know is that mandatory table changes by the tool will cause me to not use it like I have in the past. Wildhartlivie (talk) 23:57, 26 December 2008 (UTC)
Before you changed it [4], it was a standard prettytable; it was reverted [5] two months later, reverted back, and has now been changed to standard wikitable markup. This is the only project I've seen on this wiki that uses non-standard tables just for styling. The reasons given above are sufficient for any of the regulars at common.css to run a bot to clean up the mess. Additionally, the documentation at Help:Table is outdated and includes hacks that we really don't want polluting the data set. — Dispenser 04:42, 26 January 2009 (UTC)

talk pages

great tool. after you check a page, on the tools drop-down, if you click on "talk page" it appends "-talk" at the end of the articlename instead of appending "Talk:" to the beginning. shirulashem (talk) 02:02, 21 January 2009 (UTC)

  Fixed, but since I dropped pseudo-namespace support it means things like Talk:Wikipedia:article will happen, sigh. — Dispenser 05:43, 26 January 2009 (UTC)

Not working?

Checklinks Says "No changes will be maded [sic]" and then does nothing when I click ok. --Closedmouth (talk) 07:53, 9 February 2009 (UTC)

  Fixed, and now that area of the code's been cleaned up. — Dispenser 20:25, 13 February 2009 (UTC)
Thanks! --Closedmouth (talk) 05:54, 14 February 2009 (UTC)

URLs that redirect to another page cannot be "fixed" anymore by the tool, even if we want to? Gary King (talk) 21:31, 17 February 2009 (UTC)

As I've explained on the reflinks discussion, there is no benefit to replacing redirects. Despite the warning, the button was a temptation (like advisor's fix button), so it was misused to replace links with 404 pages. The copy-and-paste method, however, still works. — Dispenser 16:09, 18 February 2009 (UTC)

<references />

The tool arbitrarily replaces <references /> with {{reflist|colwidth=30em}}. That is really bad, since there are valid reasons for using both, and thus it should not be automatically changed. (Generally <references /> is actually preferable if there is no particular reason to use {{reflist}}). So if someone could please stop it from making this change as soon as possible that would be great.
Apis (talk) 18:30, 12 January 2009 (UTC)

I totally agree. Please cease this practice. That kind of thing requires consensus to change. --Adoniscik(t, c) 20:17, 12 January 2009 (UTC)
when is <references/> better? i've always thought {{reflist|colwidth=30em}} was better, to be honest. shirulashem (talk) 02:09, 21 January 2009 (UTC)
Text size gets unnecessarily small, and multiple columns usually make no sense unless the page uses harvard references or similar. The rest of the page is in one column. Bad typography makes the reference section less accessible, which is kind of counterproductive on a site that wants to make information more accessible.
Apis (talk) 13:58, 21 January 2009 (UTC)

The controversy with this template will never be settled. The font size was made consistent (smaller) in IE, and users fired back. Many users keep upping the number of columns and others want hacks implemented in MediaWiki to add columns to every reflist. While I agree with the points raised by Apis, I believe consistency across pages is far more important.

Reflinks will only convert when told to use templates. Commonfixes (used in Reflinks, Checklinks, and PDFbot) will convert if the surrounding divs make the references smaller (i.e. no visual change). Commonfixes also applies a simple heuristic: if there are more than 30 references it changes {{reflist}} and {{reflist|3}} into {{reflist|colwidth=30em}}; if there are fewer than 8 it removes any columns. This is based on edits I have seen, so if somebody could bring me edge cases (even theoretical) I will try to improve this behavior. — Dispenser 05:39, 26 January 2009 (UTC)

The tool changes <references /> even if the result is smaller text?
There is no reason to reduce the fontsize for reflists when using a single column. I agree that colwidth is a much better option than a fixed number of columns and as far as I am concerned it is great if the tool changes {{reflist|3}} into {{reflist|colwidth=30em}}. However, it shouldn't change {{reflist}} and <references /> into {{reflist|colwidth=30em}}. Also 30em is kind of arbitrary and might not be a good number for most of the cases where multiple columns are actually desirable. If you could somehow have the tool check if the page is using harvard style or shortened footnotes that would be an indication that the page would benefit from multiple columns. If not <references /> would probably be better. Anyway, I doubt you would get community consensus for automatically changing reference type like this.
Apis (talk) 08:38, 3 February 2009 (UTC)
The behavior of changing every reference section to a multi-column reflist is almost compulsory for Firefox Wikipedians (only Firefox users can see it). The 30em number was chosen from how it worked on different screen sizes, closely matching what people were hard-coding. 20em might be good for the short footnote style, but coding for that will be hard, and some people improperly mix styles (long + short footnotes). Judging from the edits and discussions of hard-coded multi-column for IE users, I suspect that if we took a poll most would support standardizing on reflist, but I digress.
I've reviewed the original request for Reflinks, realized it was only asking to use reflist when adding the references section, and have changed the tool accordingly. By the way, Checklinks never changed a regular <references />. — Dispenser 05:58, 4 March 2009 (UTC)
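For illustration only, here is a minimal sketch of the column heuristic described above (the function name and regexes are my own, not Commonfixes' actual code):

import re

def adjust_reflist(wikitext, ref_count):
    # Many references: add a column width; few references: drop any columns.
    if ref_count > 30:
        # {{reflist}} or {{reflist|3}} -> {{reflist|colwidth=30em}}
        return re.sub(r"\{\{[Rr]eflist(\|\d+)?\}\}", "{{reflist|colwidth=30em}}", wikitext)
    elif ref_count < 8:
        # Drop any column parameter, e.g. {{reflist|2}} -> {{reflist}}
        return re.sub(r"\{\{[Rr]eflist\|[^}]*\}\}", "{{reflist}}", wikitext)
    return wikitext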

HI, in this edit to D. B. Cooper, you erroneously flagged http://www.msnbc.msn.com/id/23801264/ as a dead link. I've fixed it. TJRC (talk) 02:45, 15 February 2009 (UTC)

Sorry about that, MSNBC seems to have flaky server software. — Dispenser 05:06, 14 March 2009 (UTC)

Localization in pt

Is it possible to use this tool in other languages? I would like to give it a try in pt:wiki to speed up review of Featured article candidates. GoEThe (talk) 16:07, 16 March 2009 (UTC)

The interface messages are not stable enough for translation. But to get pages from pt:wiki you just need to prefix pt: before the page name, just like with interwiki links. — Dispenser 19:07, 16 March 2009 (UTC)
Thanks, it works. Great tool! GoEThe (talk) 21:34, 16 March 2009 (UTC)

Ignore list

Sorry if this is mentioned somewhere, but I don't see it. What URLs would be on the ignore list or more appropriately, why are CNN links on the URL Ignore list? Phydend (talk) 19:46, 18 February 2008 (UTC)

ignorelist = [
    re.compile(r'.*[\./@]example.(com|net|org)(/.*)?'), # reserved for documentation
    re.compile(r'.*[\./@]tools.wikimedia.(org|de)/.*'), # So we don't end up calling ourself
    re.compile(r'.*[\./@]wikimedia.org/.*'),            # Wikipedia media repository
    re.compile(r'.*[\./@]archive.org(/.*)?'),           # Prevent downloading of media
    re.compile(r'.*[\./@]cnn.com(/.*)'),                # CNN has firewalled us
]
Basically, CNN had put a rule in their firewall config to drop all packets from the Toolserver. This caused requests which queried CNN to time out, which takes about 5 minutes. — Dispenser 23:19, 18 February 2008 (UTC)
Alright, that makes sense. Thanks for the quick response, I was just wondering. Phydend (talk) 01:21, 19 February 2008 (UTC)
CNN links currently display as a blue connection time-out issue. Is this the intention per the ignore list, please? Tom B (talk) 11:24, 14 November 2008 (UTC)
Yes, blue indicates that information was not obtained from the server. — Dispenser 19:33, 28 March 2009 (UTC)
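For illustration, a minimal sketch of how an ignore list like the one above might be applied before a link is queued for checking (hypothetical helper, not the tool's actual code):

def is_ignored(url, ignorelist):
    # True if the URL matches any compiled pattern on the ignore list.
    return any(pattern.match(url) for pattern in ignorelist)

# is_ignored('http://www.cnn.com/2008/WORLD/story.html', ignorelist) -> True,
# so the link is skipped instead of being left to time out against the firewall.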

After checking a page's links, I can't click on the links to expand and fix them. Has this functionality been removed to only provide reporting or is there an error? —Ost (talk) 16:11, 26 February 2009 (UTC)

Thanks for the fix. I don't know about anyone else, but I appreciate it. I hadn't noticed it sooner as I was working off a page created weekly where the code was still working.
I was also wondering, what is the expected behavior of checklinks on Category pages? The weekly pages and associated log do not appear to visit all of the pages in the category (e.g., Category:Top-importance Louisville articles). —Ost (talk) 14:22, 9 March 2009 (UTC)
  Fixed, thank you. It was actually two bugs: category detection was not working and the fall back "list link" generator was skipping adjacent links. — Dispenser 15:26, 9 March 2009 (UTC)
Thanks again for the fix. When you reran the tool for the pages last week it worked, but now the reports for the category pages are empty. Looking at the log, the tool seems to have removed the first letter after the category namespace: Getting [[Category:Igh-importance Louisville articles]].... Thanks, Ost (talk) 18:35, 16 March 2009 (UTC)
I decided to release the source and wanted to clean it up before I did. Bugs happen. — Dispenser 19:33, 28 March 2009 (UTC)

I have no idea where this kind of thing should be posted, so please excuse me if this is the wrong place, and feel free to remove my comment.

The toolserver linkchecker tool (a wonderful tool btw) shows the following for a certain webpage:

Media type text/html; charset=UTF-8 is wrong for .xml files

It doesn't seem to make any sense to throw an error for this - firstly as far as I'm aware there's no rule or standard precluding the text/html mime type for any extension whatsoever, xml or other - extensions can/should be assessed completely separately from mime-types. Secondly, even if there were such a rule or standard, it would be the responsibility of webmasters to adhere to it, not Wikipedia editors referencing such webmasters pages. Many sites use various file-extensions for text/html pages completely legitimately, including their own custom file extensions or no extension whatsoever. Also, many apps use xslt, xsl to legitimately serve xml as html - mod-xslt is a good example. ɹəəpıɔnı 06:11, 23 March 2009 (UTC)

This is the scenario in which the error is thrown: the tool opens a PDF file expecting an application/pdf media type; instead it receives a "200 OK" with a media type of text/html, and it is actually a soft 404 error. The tool is designed to give warnings at the level it cannot determine to be either dead or good, so it flags many things for human review. — Dispenser 16:43, 7 April 2009 (UTC)
What about this scenario: the tool opens a .html file expecting a text/html media type, it receives a "200 OK" with a media type of text/html, but it is actually a soft 404 error. The tool throws no error, because the mime type is really no indication of the likelihood of such a thing; in fact soft 404s are almost certainly more likely for .html files than any other. Aren't soft 404s out of scope of this tool really? I should imagine the number of false positives thrown is far greater than the number of genuine errors of this kind. At the least XML could possibly be bundled in with html as a valid extension for text/html... Anyway, sorry if I came across as over-critical, honestly not intended. Thanks for responding all the same. ɹəəpıɔnı 17:36, 7 April 2009 (UTC)
If the tool GETs the page instead of just obtaining the headers (HEAD), it does perform some basic content analysis. The ranking system has been in need of an overhaul, but I have not found a good grouping system for the various errors. — Dispenser 15:26, 13 April 2009 (UTC)
That's grand, I just thought I'd alert you to it. FYI, the site showing this error was stats.yandex.ru/stats.xml Thanks for the replies. ɹəəpıɔnı 16:45, 13 April 2009 (UTC)
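A rough sketch of the kind of mismatch check being discussed, under my own assumptions about how the comparison might be done (the tool's actual logic may differ):

import mimetypes

def media_type_mismatch(url, status, content_type):
    # Flag a "200 OK" whose served media type differs from what the file
    # extension suggests, e.g. a .pdf link served as text/html (possible soft 404).
    expected, _ = mimetypes.guess_type(url)
    served = content_type.split(';')[0].strip().lower()
    return status == 200 and expected is not None and served != expected

# media_type_mismatch('http://example.org/report.pdf', 200, 'text/html; charset=UTF-8') -> True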

You reverted my adding your tool to WP:PW, saying it was a duplicate of the pages you posted on WP:WikiProject Professional wrestling/Broken external links; however, you forgot that the WP:PW page includes FACs and GACs, for which fixing broken links is very important. How can we make a page directly for them? Feedback 03:33, 13 April 2009 (UTC)

If you look at the broken external links page, you will see that it links to an automatic scan of the GA and FA article categories. It appears that the list at WP:PW is not kept up to date, as 66 GAs are missing. — Dispenser 15:26, 13 April 2009 (UTC)
FACs and GACs = Featured Article Candidates and Good Article Candidates. I want for an automatic update for the articles that are nominated before they get promoted. Feedback 02:02, 16 April 2009 (UTC)
That's a pretty bad misread on my part. Using the htmlregex option you can selectively choose only the links preceded by particular icons. So you have a few options for their inclusion: select only GAN/FAC pages, select GA/GAN/FA/FAC pages, or select all main-space links. If you opt for either of the latter two, you should delete or redirect (or soft-redirect to the Checklinks subpage) the broken links subpage.
htmlregex for selecting pages with the FAC or GAN image preceding the link:
<a [^<>]+? title="(Featured article nominee|Good article candidate)"><img [^<>]+?/></a> *<a href="/wiki/(?P<page>[^"]*)" title="[^"]*">
htmlregex for selecting pages with an image preceding the link:
<img [^<>]+?/></a> *<a href="/wiki/(?P<page>[^"]*)" title="[^"]*">
Dispenser 03:12, 16 April 2009 (UTC)
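To illustrate how such an htmlregex might be applied (hypothetical code; Checklinks' own use of the option may differ), the named group is simply collected from the rendered HTML of the candidates page:

import re

FAC_GAN_LINK = re.compile(
    r'<a [^<>]+? title="(Featured article nominee|Good article candidate)">'
    r'<img [^<>]+?/></a> *<a href="/wiki/(?P<page>[^"]*)" title="[^"]*">')

def candidate_pages(html):
    # Collect the wiki page names that immediately follow a FAC or GAN icon.
    return [m.group('page') for m in FAC_GAN_LINK.finditer(html)]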

I used webchecklinks to verify Water fluoridation, and it complained about this citation:

  • Sheiham A (2001). "Dietary effects on dental diseases" (PDF). Public Health Nutr. 4 (2B): 569–91. doi:10.1079/PHN2001142. PMID 11683551.

saying "302 Journal subscription required". It's true that the URL in question causes the web server to respond with a "302 Moved Temporarily" HTTP result, but the download does then succeed, without requiring a subscription or registration. Just thought you'd like to know. Eubulides (talk) 06:22, 28 May 2009 (UTC)

The journal subscription thing is domain-based, so I'll have to see if it's possible to improve it whenever I get around to refactoring that script. — Dispenser 14:38, 16 June 2009 (UTC)

Option to tag sites requiring registration

It would be convenient to have an option to call up {{registration required}} in addition to the dead- or spam-link options. LarryGilbert (talk) 08:00, 26 April 2009 (UTC)

That's funny, I was just about to say the same thing. That would be great if you made it automatically put the {{registration required}} template next to any links that require registration. Logan | Talk 19:30, 21 May 2009 (UTC)
  Declined The last time I checked, the policy regarding registration sites only concerned itself with links in the external links section. The policy there is simply to remove them, which is part of the reason why detection is included in the tool. The template looks rather superficial to me: companies may change their registration policies (New York Times), or access may depend on the IP address the user is connecting from (as is the case with many universities). I may add support for custom templates in the future. — Dispenser 10:23, 21 June 2009 (UTC)

Hi, The reported deadlink at timesonline.co.uk is alive when you click on its link and on the link on the page. Drop me a line on my talk in case you can't reproduce this and it seems to be a domain/browser/OS related issue. Cheers (Cool tool btw.) Enki H. (talk) 04:27, 18 June 2009 (UTC)

Not a bug The server is flaky; the first time I opened that link I was greeted with a 404 Not Found message. I waited for a minute before trying again and finally got the article. This sort of behavior is not uncommon (see #False deadlinks? above) and is likely related to some timeout issue on the hosting server. When I get around to rewriting the core I will add something to indicate this flaky behavior. — Dispenser 10:23, 21 June 2009 (UTC)

URLs with user+password

I recently ran checklinks on Oxygen toxicity and it flagged this citation:

  • <code>{{cite web |url=ftp://downloadfiles:decompression1@ftp.decompression.org/Baker/Oxygen%20Toxicity%20Calculations.pdf |format=PDF|title=Oxygen toxicity calculations |author=Baker, Erik C. |year=2000 |accessdate=2009-06-29 }}</code>
  • Baker, Erik C. (2000). "Oxygen toxicity calculations" (PDF). Retrieved 2009-06-29. {{cite web}}: Check |url= value (help)

I guess checklinks can't handle URLs of the form "ftp://username:password@domain/..."? Thought I'd mention it in case you have time to fix this. Eubulides (talk) 16:37, 1 July 2009 (UTC)
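For reference, Python's urllib.parse (urlparse in Python 2) does split out the credentials from such URLs; this is only an illustration of the URL form, not the tool's handling:

from urllib.parse import urlsplit

parts = urlsplit('ftp://downloadfiles:decompression1@ftp.decompression.org'
                 '/Baker/Oxygen%20Toxicity%20Calculations.pdf')
# parts.username -> 'downloadfiles', parts.password -> 'decompression1'
# parts.hostname -> 'ftp.decompression.org', parts.scheme -> 'ftp'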

File-->Image

Checklinks is 'correcting' links to the file namespace by changing them to the 'image' namespace. While totally innocuous, it is unnecessary and may cause confusion down the road. Thanks for the tool, love it otherwise. Protonk (talk) 22:11, 13 August 2009 (UTC)

Minor spelling changes

Here are some suggested spelling changes for the English language messages.

1. Would you like to run reflinks.py bot script to attempt add missing title on external links and combine idenitical refernces

I suggest changing it to:

Would you like to run reflinks.py bot script to attempt to add missing titles for external links and to combine identical references?

2. Change Excessed to Exceeded in "Excessed redirect limit".

It appears when the tool tries to follow the link:

http://darwin-online.org.uk/content/frameset?itemID=F373&viewtype=text&pageseq=506

from Thomas Henry Huxley.

The links all work, but darwin-online.org.uk does take several seconds to display the section of a large page, in case that is the reason. -84user (talk) 09:16, 16 September 2009 (UTC)

cite.php update

The cite software has been updated to allow definition of references within the reference list. See Wikipedia talk:Footnotes#cite.php update. Both Checklinks and refTools fail when this style is used. See Arthur Rudolph for a sample and Help:Cite messages for the new error messages. ---— Gadget850 (Ed) talk 18:59, 17 September 2009 (UTC)

Bug: Removes parentheses in article name

When clicking on the link on a WP:FLC page of an article which has parentheses in the name (e.g. Wikipedia:Featured list candidates/List of National Treasures of Japan (paintings)/archive1), the parentheses get removed. Checklinks and similar tools then search for the name without parentheses, for which generally no article exists. bamse (talk) 17:22, 11 November 2009 (UTC)

  Works for me on three different browsers using both direct links and copy & pasting. — Dispenser 06:00, 15 November 2009 (UTC)
OK. Tried again. No problem with Opera or IE. Must be a problem with my firefox. Maybe a buggy add-on. bamse (talk) 10:56, 15 November 2009 (UTC)

Manual changes no longer work

For example: on the check for Maurice Garin, there is currently a 302 error on "Trans-Alpine du Livre, Vallée d'Aoste, Article on Maurice Garin [transalplivre.eu]". This is a redirect to a wrong page, so I tried to report the link as dead. I did that the usual way (clicking the plus, and changing the operation to "{{dead link}}"). Then I click "Save changes". I get a message saying that no changes will be made. This happens with every page and every option I try. It seems like the script only uses the default actions, and not the ones that are changed by users. I tried it on three different browsers (Firefox3.0, IE7.0 and Chrome) and all have the same effect. --EdgeNavidad (talk) 09:48, 2 October 2009 (UTC)

I can confirm that the "Save changes" no longer has the effect of making the manual changes. I just tried it on Lulu (company) and none of my manual changes were recognised. -84user (talk) 14:16, 4 October 2009 (UTC)
Now it seems to work again. --EdgeNavidad (talk) 17:14, 5 October 2009 (UTC) False hope, it still does not work.--EdgeNavidad (talk) 17:16, 5 October 2009 (UTC)

I have picked up development on Checklinks again; this unfortunately means lots of stuff will break as I attempt to redesign and rewrite the JavaScript interface, and it likely won't work in IE for a while afterwards. The goal is to increase automation and usability.

Automation will need some backend changes, such as knowing how long a link has been around. It will also know if WebCite or the Wayback Machine has archived copies and automatically substitute those instead of just tagging links with {{dead link}}, if there is a copy close enough to the access date. Possibly also direct saving without previewing.

Since most users encounter this tool through article review processes, it is often overlooked that it can be used to modify articles. The basic changes are to replace icons with text and to reduce the clicks needed to get things done by enlarging/removing the container and adding quick links. Another addition will be more contextual help, making it clearer why some tools are provided.

So while I'm not done, feedback about the design and any other ideas are welcome. — Dispenser 04:33, 6 October 2009 (UTC)

This is a great tool, and if you are working to improve it, great! --EdgeNavidad (talk) 06:38, 6 October 2009 (UTC)
I am glad that improvements are to be made, however at present using Firefox 3.0.14 it doesn't work, i.e. tagging dead links, links to Wayback Machine, etc. Is it possible to reinstate the previous version, with a link to the under development version? Jezhotwells (talk) 20:17, 21 November 2009 (UTC)

Make the + (expand) button switch to a - (reduce) button when it opens

First of all, YAY, what a great tool! I'd been wanting something like this for a while...

Now, for the enhancement request: It'd be nice to be able to close the info windows from the same place as is used to open them (i.e. the + button and/or the (info) link). I (now) figured out that I can close the info window with the x in the right corner, but it'd be better to not have to reach all the way over there. I may look into the code and see if I can hack up a patch. JesseW, the juggling janitor 04:35, 10 December 2009 (UTC)

Dead?

Is this service basically dead? We can no longer replace dead URLs with working ones in this script. Gary King (talk) 03:17, 1 February 2010 (UTC)

Agreed, I did ask about this some time ago, but got no response. I would prefer to use the old version whilst any bug fixes or development are being carried out. Jezhotwells (talk) 10:45, 1 February 2010 (UTC)
Yeah, I prefer the old version. From what I can see, Dispenser was working on a new version of the script, and apparently the other ones too as the new script looks more streamlined with the others, but he never got around to finishing it so now it's not really able to do much that it used to. Gary King (talk) 19:35, 1 February 2010 (UTC)
I've hacked up something in between now; it reminds me that I should really ditch the interface code. It's all based on the index number of the drop-down list; too much depends on that, so it's difficult to rewrite. I made an option (available via &debug=1) that extracted the data from HTML instead of wikitext, but then realized that some links wouldn't be editable since they're transcluded from templates. Anyway, I've included "Replace link", "Tag {{dead link}}", and "Update accessdate". Anything else while I'm still available? — Dispenser 02:28, 2 February 2010 (UTC)
I get a "Error: trDisplay is not defined" error when unchecking any of the checkboxes for "Good", "Warn", "Status", etc. I can't seem to expand any of the links to even change their URLs or modify them in any way anymore. Gary King (talk) 20:58, 2 February 2010 (UTC)
Yikes, I separated the script from the rest of the site, but forgot to add it to webchecklinks.py.   Fixed now. — Dispenser 21:14, 2 February 2010 (UTC)

Not working?

I tried to check an article I'm working on, and got unprocessed html results. I'll try again tomorrow. - UtherSRG (talk) 08:26, 1 June 2010 (UTC)

I just tried the same article, and it seems to work OK now. --Philcha (talk) 12:46, 1 June 2010 (UTC)

Australian Domains

I have a domain ending with .com.au in my External Links, and the tool seems to think it's supposed to be a .au file: http://toolserver.org/~dispenser/cgi-bin/webchecklinks.py?page=Debtors_Anonymous -- Scarpy (talk) 04:37, 15 June 2010 (UTC)

External Wikis

Can this functionality be used on an external wiki? Is there e.g. an extension available for download? --Robinson weijman (talk) 09:50, 19 February 2010 (UTC)

Currently the core source code to Checklinks is unavailable; it's rather messy and I do not wish to release it until after the rewrite. I may be able to add support for your wiki though. — Dispenser 22:08, 11 March 2010 (UTC)
Thanks - but we've got a bot to do it now. --Robinson weijman (talk) 12:05, 16 June 2010 (UTC)

Pages with special characters failing

It seems that page titles with special characters like en dashes, em dashes, and accented letters fail at mergeChanges.py. It appears that the characters aren't being converted back when passed to get(). Below are the results using List of Pokémon (461–480) as an example, though it doesn't currently need changes:

<class 'wikipedia.NoPage'>: (wikipedia.Site('en', wikipedia.Family('wikipedia')), u'[[en:List of Pok\xe9mon (461\x96480)]]') 
      args = (wikipedia.Site('en', wikipedia.Family('wikipedia')), u'[[en:List of Pok\xe9mon (461\x96480)]]') 
      message = ''
/home/dispenser/public_html/cgi-bin/tracebacks/tmpXd6F1n.html contains the description of this error.
So you were that Google Chrome user? That browser is broken, as it ignores the specified charset and the popup doesn't inherit the parent window's charset. I've worked around this by using accept-charset in the popup form. — Dispenser 18:15, 15 July 2010 (UTC)
Yea, that was me. Sorry that I didn't realize it was a problem with Chrome; you've probably seen me in the logs a few times because this wasn't the first time I had the problem. The workaround seems to fix the problem, so thanks for looking into it. —Ost (talk) 21:33, 15 July 2010 (UTC)
I gave up the first time a few weeks ago as I was unable to reproduce the bug (Chrome didn't auto-update yet). — Dispenser 01:44, 16 July 2010 (UTC)

Hi, I believe you are connected with / look after / are put upon in connection with the external link checker? I have a problem with a link to the London Gazette on the British Commando article (its ref 75, Gazette issue no. 37134). If you click on the link it directs to the correct page etc. However, using the A-Class link checker tool here, Wikipedia:WikiProject Military history/Assessment/British Commandos, it comes up as a red dead link. Any idea how to solve the problem? --Jim Sweeney (talk) 16:44, 22 July 2010 (UTC)

Template:London Gazette doesn't take any URL parameters, but if it did, the one you gave it is certainly dead. — Dispenser 16:05, 28 July 2010 (UTC)

Seplling erorr

"Redirect perserves id number" contains a spelling error. SpinningSpark 08:27, 28 July 2010 (UTC)

  Fixed with a few others thanks to vim's source code friendly spellchecker. — Dispenser 03:43, 29 July 2010 (UTC)

Nice tool (from Signpost). But I see it only checks if the link is already in WebCite. Do you know of any tool (and if not, might you add this to yours) which automates or semi-automates making WebCites? Even pre-filling known fields (from Cite templates) on the Create page of WebCite, and then letting you check and submit manually, would be helpful. cheers, Rd232 talk 10:00, 7 September 2010 (UTC)

In Checklinks: for the white rows click "(info)" then click on "Request to archive this link". I've hacked it into Reflinks as well, but it doesn't check if the link works first. You could also just use their bookmarklet or it could be incorporated into reftools. — Dispenser 20:17, 7 September 2010 (UTC)
Fantastic! Thanks! Though (info) isn't terribly intuitive as something to click on - I spent some time looking for it even after your instruction. Rd232 talk 20:41, 7 September 2010 (UTC)

Source code availability

At work we have a growing wiki with increasing link rot and no management of it. Is the link checker available as an extension or such for MediaWiki? Or what would be involved in setting this up on an intranet MediaWiki instance? If the source code is available I might have to port it to Perl or PHP, as we have no Python... but that's separate. Biomimicry (talk) 03:40, 4 August 2010 (UTC)

Unfortunately, I have not made the source code available. However, there is weblinkchecker.py, which is a well-documented command-line link checker for MediaWiki installations, and the W3C has a link checker written in Perl which might also be a good choice. — Dispenser 18:50, 9 September 2010 (UTC)

French Wikipedia

When running Checklinks on fr.wiki, the added dead link template no longer exists. Logan Talk Contributions 02:19, 2 January 2011 (UTC)

I've updated to use the new name. I'll note that fr:Modèle:Dead link still exists. — Dispenser 19:00, 29 January 2011 (UTC)

Hi, the tool checklinks isn't working well, see here. Thanks in advance. --Vitor Mazuco Talk! 16:16, 29 January 2011 (UTC)

  Fixed. It was a bug recently introduced by a security fix and code refactoring. — Dispenser 18:53, 29 January 2011 (UTC)

Thankful. --Vitor Mazuco Talk! 23:30, 29 January 2011 (UTC)

Is it possible to change both the url and the accessdate at the same time? If the url of a source moved, I would like to change the reference to the new url and also change the accessdate (since at the time of the old accessdate the new url did not exist yet). However if I first click on "replace link" (adding the new link) and then on "update accessdate" all that happens is that the accessdate gets changed. Is this a feature or a bug or am I doing something wrong? bamse (talk) 14:09, 26 February 2011 (UTC)

File extension (.do) not in dictionary

Hello, the URL [6] is referenced in de:Nekrolog 1. Quartal 2011. This leads to the message "File extension (.do) not in dictionary". Could you please enhance the Checklinks tool in a way that covers this Java Servlet extension? Thanks and regards. --84.162.185.20 (talk) 15:13, 25 April 2011 (UTC)

Non-problematic URL leads to error message

Hi, checking this link ends up with the return code 503 and the message SERVER: function show_moreinfo(var1){ document.getElementById(var1).style.display="block"; document.getElementById(var. Could you please have a look? Thank you. --217.227.194.113 (talk) 22:10, 27 April 2011 (UTC)

Obey Manual of style (dates and numbers)

I do not use the tool myself, but I understand that it does not respect MOS:DATEUNIFY; instead it puts all dates that it generates in the YYYY-MM-DD format. Jc3s5h (talk) 00:03, 19 May 2011 (UTC)

I tried out the tool, and it appears it is incapable of dealing with accessdates that are not in the YYYY-MM-DD format. When it encounters a date in the DD Month YYYY format, it inserts a second accessdate parameter. Jc3s5h (talk) 14:25, 21 May 2011 (UTC)

Checklinks: Wrong 404

Hello.

Checklinks marks all www.moneyhouse.ch as 404 dead link, but they are active. Two examples. Please check. Thanks. --KurtR (talk) 23:35, 8 June 2011 (UTC)

Linking to Redirects

Using Checklinks is "breaking" the links in the fleet tables of airline articles. For example, it's changing [[Boeing 777|Boeing 777-300ER]] to [[Boeing 777-300ER]]. It's linking to redirects. It seems to do it to the Boeing 777 in particular, but also for other aircraft as well. Before you save the changes you need to manually change these back. Thanks. --JetBlast (talk) 10:09, 12 June 2011 (UTC)

Unable to reproduce/Not a bug: I've identified problematic edits (stripping a section link) in your recent contributions, but I am unable to reproduce that behavior. The behavior of replacing section links with redirects linking to the same section is intended, as it identifies linking errors and eases merging and splitting articles. — Dispenser 17:11, 12 June 2011 (UTC)

Other citation tags

Hi, First off, I love checklinks, it's so useful for checking references while doing GA reviews. Can I suggest including other citation cleanup tags like:

  • {{Registration required}} citation flag
  • {{Subscription required}} citation flag
  • {{Failed verification}} citation flag

I know we can enter them manually in the preview interface but that's a bit difficult when trying to juggle multiple tags. --Deadly∀ssassin 21:33, 17 October 2011 (UTC)

The page at tools:~dispenser/view/Checklinks shows up normally, but when I click on "Check article" I get nothing. Here is a secure link that shows as a blank page. I don't think it matters, but I am using Chromium 15.0.874.106 (Developer Build 107270 Linux) Ubuntu 10.04. – Allen4names 23:58, 8 January 2012 (UTC)

I agree. I, too, get a "messed up" page when I try to check an article.
Allen (talk) 01:23, 9 January 2012 (UTC)
Fixed I've been trying to merge code related to cached copies of Checklinks results. I accidentally flipped the condition that keeps bots out in the HTTP handler. Sorry for the inconvenience. — Dispenser 02:20, 9 January 2012 (UTC)
Thank you for the quick response. – Allen4names 03:59, 9 January 2012 (UTC)
Yes, thank you for the very quick response. My time online is fairly limited, so that was a great help!
Allen (talk) 11:02, 9 January 2012 (UTC)

Hi, what's wrong with Checklinks? An error occurs whenever I try to get the external links checked. Joyson Prabhu Holla at me! 17:03, 14 December 2011 (UTC)

I only have an error on record for an incomplete response from WMF servers for elwiki. Whatever it was, it's probably fixed by now. — Dispenser 17:29, 9 January 2012 (UTC)

iranica.com

The online version of Encyclopædia Iranica has changed domain from iranica.com to iranicaonline.org. The domain iranica.com is now inactive and each page now displays only an error message. Per this bot request, I changed as many links as I could, but now we want to tag the remaining links to iranica.com with {{dead link}}. I was hoping we could use one of your tools to do this automatically, but Checklinks still shows the links as OK. Do you have any suggestions for mass tagging the remaining links to iranica.com? Thanks! GoingBatty (talk) 00:19, 21 January 2012 (UTC)

deadurl not respected

tools:~dispenser/cgi-bin/webchecklinks.py?page=Lily_Serna checks the wrong urls; the deadurl in all the cite templates is =no (I've preemptively archived them), so it's not the webcite urls that need checking - it's the unarchived ones. Josh Parris 13:19, 2 April 2012 (UTC)

Swedish

Hello, could you please change the "{{dead link|date=March 2012}}" to "{{Död länk|datum=2012-03}}" when you edit on the Swedish (sv.) Wikipedia. -Josve05a (talk) 19:54, 5 March 2012 (UTC)

Unfortunately I'd have to rewrite the code to make this work. Instead you can use the following compatibility code {{#if:{{{date|}}}|{{#time:Y-m|{{{date}}}}}|{{{datum|utan datum}}}}} in the template (tech note: <ref>{{subst:#time:Y-m}}</ref> doesn't expand). But I'll keep this in mind going forward for other programs. — Dispenser 22:49, 9 April 2012 (UTC)

Protocol relative URLs

Hi Dispenser, as far as I know MediaWiki supports protocol relative URLs (//example.tld/ instead of http://example.tld/ or https://example.tld/). It seems to me that Checklinks tool does not detect those links. Is there a way to fix this? Cheers, --Alex (talk) 14:38, 8 April 2012 (UTC)

When would it be a good idea to use protocol-relative URLs in an article? What if it's hosted on ftp://, smb://, or file:// (iOS)? Do you have any examples? — Dispenser 22:13, 9 April 2012 (UTC)
It didn't occur to me when I was checking links within an article, but on an internal worklist. Anyway, is there a way to check whether //example.tld/ is a dead link, no matter whether it leads to http://example.tld/ or https://example.tld/? --Alex (talk) 20:20, 15 April 2012 (UTC)
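One possible approach, sketched under my own assumptions (not a description of how Checklinks actually works): expand a protocol-relative URL to both schemes and report it dead only if neither variant responds.

def expand_protocol_relative(url):
    # '//example.tld/page' -> ['http://example.tld/page', 'https://example.tld/page']
    if url.startswith('//'):
        return ['http:' + url, 'https:' + url]
    return [url]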

Trailing slash in URL

Where the URL has a trailing slash, Checklinks marks it as dead. Example:

Markup:
<ref>{{cite news |last=Wells |first=Mike |title=Scouts Take Trip To The Sci-Fi Zone |work=The Tampa Tribune |url=http://www2.tbo.com/news/metro/2008/may/25/me-hey-darth-be-prepared-ar-141509/}}</ref>

Renders as:
  1. ^ Wells, Mike. "Scouts Take Trip To The Sci-Fi Zone". The Tampa Tribune.

When you click on the link within Checklinks, the URL ends with /}}/. ---— Gadget850 (Ed) talk 16:45, 19 April 2012 (UTC)

monobook / vector user scripts?

Is there a way to add this tool to one's toolbox or such? An old post above suggests a monobook script used to exist, but the current page offers no hint that it does. --Piotr Konieczny aka Prokonsul Piotrus| talk to me 18:00, 30 May 2012 (UTC)

The Filipino American article uses unusual ref names which confuse Checklinks (example: <ref name="url=http://www.gov.ph/1">). I've suggested that these unusual names be changed, and I'm reporting the problem here. Wtmitchell (talk) (earlier Boracay Bill) 02:21, 31 July 2012 (UTC)

http://78.136.27.54:8080/programming/article_1142.asp is reported as dead every time I do Checklinks on Dylan and Cole Sprouse. It's been live every time I checked it since I've put it in the article. It says it's giving a 404 error. I figured I should report it. - Purplewowies (talk) 06:23, 14 August 2012 (UTC)

Failure

I get this error: Checklinks had an error 'ascii' codec can't decode byte 0xc3 in position 18: ordinal not in range(128). What is happening? — Preceding unsigned comment added by Ondando (talkcontribs) 04:49, 21 August 2012 (UTC)

geosearch.py

I've been looking at error detection in coordinates lately, and rediscovered your pretty tool. I had been thinking of some new regexps, but the tool can't handle them yet:

  • Possible degree, minute and second characters [°′'`´‘’″"“”] seem to match with article names: regexes should be applied to the coordinates only.
  • Negation of all the allowed characters [^0-9NSEW._-] doesn't seem to show only coordinates with other characters.
  • Negation queries NOT REGEXP might be nice so that a regex of the correct format could be given and the tool would then show all erroneous ones, as kind of a catchall, but that might require a way to give multiple patterns. Or maybe the tool could have that query built in?

--Para (talk) 23:51, 2 December 2008 (UTC)

A few things seem to be going on here. MediaWiki stores the URLs in UTF-8 if provided in UTF-8, but percent-encodes them when rendering. Firefox barfs on Unicode quotes, sad really. MySQL seems to have only byte-wise support for regex matching on UTF-8.
  • params=[^&:]*[°′'`´‘’″"“”]? [°′'`´‘’″"“”] seems to be interpreted as [\xC2\xB0\xE2\x80\xB2'`\xC2\xB4\xE2\x80\x98\xE2\x80\x99\xE2\x80\xB3]
  • params=[^&:a-z]*[^0-9NSEW._&:a-z-]
  • It's a good idea, but I still don't have the database tools design fully fleshed out yet.
Originally I wrote a program that parsed the external link table, with a good amount of error correction. Docu sent me a patch to add more verbose messages. It currently runs daily and dumps its logs into tools:~dispenser/resources/logs/coord-enwiki.log. — Dispenser 05:11, 3 December 2008 (UTC)
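To illustrate the byte-wise interpretation mentioned above (my own Python example, not MySQL itself): a multi-byte character inside a character class is seen as its individual bytes, so the class effectively has to be spelled out byte by byte.

import re

degree_bytes = '°'.encode('utf-8')                     # b'\xc2\xb0', two bytes
byte_class = re.compile(br'[\xc2\xb0\xe2\x80\xb2]')    # what a byte-wise engine sees for [°′]
print(byte_class.search('48°51′N'.encode('utf-8')))    # matches on the first byte of '°'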
Ok, the results from the logging tool look good, much better than a simple catchall query that doesn't classify anything. Would it be possible to have a table interface to prettify the log and sort by error type? --Para (talk) 14:23, 4 December 2008 (UTC)
It's formatted as tab-separated values, which can be imported into Excel and then "filter"ed into a table interface. — Dispenser 16:42, 4 December 2008 (UTC)
Right, I have no trouble reading it, but I was more thinking of casual users who stumble upon a link of things to do. The various errors are listed on Wikipedia:WikiProject Geographical coordinates#Coordinates search tool, with a link to a tool that's easy to read and allows easy access to the problem article. I wouldn't say that the raw log achieves that... and so prettifying without people needing to copy and paste things would be nice, especially if someone happens to have a framework set out already. ;) --Para (talk) 00:33, 5 December 2008 (UTC)
We need a better way to index tools on the Toolserver, as there's probably someone who has written a viewer already. File viewer is my rendition of a simple universal log viewer. It doesn't have drop-down lists or sorting, but that's what the table tools extension is for. — Dispenser 21:25, 20 December 2008 (UTC)
Thanks, that looks just fine. Can you make the Javascript read a url parameter, so that a certain log file in the viewer would open when linked from here? --Para (talk) 17:35, 5 January 2009 (UTC)
Back/forward doesn't work, but it can be wikilinked - tools:~dispenser/view/File_viewer#log:coord-enwiki.log. — Dispenser 06:21, 26 January 2009 (UTC)

ummm.... it's not doing anything

Well, I've used Checklinks successfully several times in the past, and even today, but it all of a sudden stopped working. It does nothing, no table generates, although it does say it's showing a cached version or something.... Aunva6 (talk) 23:58, 1 March 2013 (UTC) Never mind, it's working now.... Aunva6 (talk)

False positive

Checklinks tagged Wayback Machine links as dead link. Why is that? Regards, --Klemen Kocjancic (talk) 10:37, 11 March 2013 (UTC)

Mark as dead

If there is this tag "_ssl.c:499: error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol", then Checklinks should mark that link as dead by default. -(tJosve05a (c) 21:48, 6 April 2013 (UTC)

Problem?

I was going to use checklinks (as I often do, thank you), but my Avast virus scanner said that the site had been infected with malware. Maybe you should check it out. On another PC, Google Chrome blocked it as well. BollyJeff | talk 17:48, 10 April 2013 (UTC)

Okay, problem solved. I tried another title and did not see the same result. The problem happened when I tried it on Thriller (album). I am going to comment out the offending link and move on. In general, do we have a way of rooting out malware from article sources? BollyJeff | talk 17:38, 12 April 2013 (UTC)

Pywikipedia.py error

I got the following error while trying to check dead links at Scottish Gliding Union, Aurora de Chile, Whim, United States Virgin Islands and Military history, at 2013-05-13 10:25 to 10:45 (Swedish time).

<class 'wikipedia.ServerError'>	Python 2.7.1: /usr/bin/python
Mon May 13 08:25:56 2013
A problem occurred in a Python script. Here is the sequence of function calls leading up to the error, in the order they occurred.

 /home/dispenser/public_html/cgi-bin/webchecklinks.py in ()
    150                 wikipedia.handleUrlAndHeader(allowBots=(wikipedia.SysArgs.get('cache')=='yes'))
    151         try:
=>  152                 main()
    153         finally:
    154                 wikipedia.endContent()
main = <function main>
 /home/dispenser/public_html/cgi-bin/webchecklinks.py in main()
     75         if use_cache != 'yes':
     76                 try:
=>   77                         page.get() # first call, handles errors!
     78                 except wikipedia.NoPage as errmsg:
     79                         html('NoPage error encountered <br/><code>%s</code>', (errmsg,))
page = Page('wikipedia:en', u'Scottish Gliding Union'), page.get = <bound method Page.get of Page('wikipedia:en', u'Scottish Gliding Union')>
 /home/dispenser/public_html/cgi-bin/wikipedia.py in get(self=Page('wikipedia:en', u'Scottish Gliding Union'), force=False, get_redirect=False, nofollow_redirects=False, change_edit_time=True)
    377                                         raise NoPage("Please use offical correct interwiki")
    378                                 else:
=>  379                                         raise ServerError("API server error (%r)"%text[:80])
    380                         self.protection = None
    381                         # get values for put()
global ServerError = <class 'wikipedia.ServerError'>, text = '<html>\r\n<head><title>502 Bad Gateway</title></he...<center>nginx/1.1.19</center>\r\n</body>\r\n</html>\r\n'
<class 'wikipedia.ServerError'>: API server error ('<html>\r\n<head><title>502 Bad Gateway</title></head>\r\n<body bgcolor="white">\r\n<ce') 
      args = ('API server error (\'<html>\\r\\n<head><title>502 Ba.../title></head>\\r\\n<body bgcolor="white">\\r\\n<ce\')',) 
      message = 'API server error (\'<html>\\r\\n<head><title>502 Ba.../title></head>\\r\\n<body bgcolor="white">\\r\\n<ce\')'

/home/dispenser/public_html/cgi-bin/tracebacks/tmp_W89F9.html contains the description of this error.

-(tJosve05a (c) 08:30, 13 May 2013 (UTC)

Last update at 08:37, 13 May 2013 (UTC)

Malware?

Hi, I just tried using Checklinks and my browser warned me that the site is malware infected. I use Google Chrome and I had used checklinks before too. It hadn't shown any malware warning before though. --WonderBoy1998 (talk) 15:20, 19 June 2013 (UTC)

http://toolserver.org/~dispenser/cgi-bin/webchecklinks.py answers with 404

Seems to be working now. GoingBatty (talk) 03:27, 5 September 2013 (UTC)

URN resolving

Hi Dispenser, thanks for your Checklinks tool! I'd like to ask for a little change to the whitelisting. In Germany many libraries offer digital books by web access that are identified and referenced by a URN, which has to be resolved; this is done by the nbn-resolving.de service. Example: a book's URN is urn:nbn:de:kobv:517-vlib-231, so resolving it via the web link http://nbn-resolving.de/urn:nbn:de:kobv:517-vlib-231 leads to Potsdam University, but Checklinks always reports "Uncategorized redirects/Changes domain and changes path" for this wanted behavior. --Achim55 (talk) 15:09, 31 October 2013 (UTC)
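
A minimal sketch of what such a whitelist entry could look like, written in Python like the tool itself; the names and structure here are my own illustration, not Checklinks' actual code:

# Hypothetical whitelist of resolver services whose domain-and-path-changing
# redirects are intentional, such as the nbn-resolving.de URN resolver.
RESOLVER_WHITELIST = (
    'http://nbn-resolving.de/urn:',
    'https://nbn-resolving.de/urn:',
)

def is_whitelisted_resolver(url):
    # str.startswith accepts a tuple of prefixes
    return url.startswith(RESOLVER_WHITELIST)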

Whitelist

Please whitelist *.blogspot.* (e.g. www.blogspot.com) from being flagged as a 302 (Changes tld), since the redirect target depends on where the user/server is located. I get redirected to .se, but under "Headers and redirects" it says that the website redirects to .nl. -(tJosve05a (c) 23:05, 1 December 2013 (UTC)
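
A minimal sketch of how a checker could treat such redirects as harmless by comparing hostnames with the country TLD ignored; this assumes the redirect only swaps the country domain and is an illustration, not Checklinks' actual logic:

def same_blogspot_blog(host_a, host_b):
    # True when only the country TLD differs,
    # e.g. example.blogspot.com vs example.blogspot.se
    if '.blogspot.' not in host_a or '.blogspot.' not in host_b:
        return False
    # compare everything before ".blogspot.", ignoring what follows it
    return host_a.split('.blogspot.')[0] == host_b.split('.blogspot.')[0]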

Sv.wikipedia.org

On svwp use |datum= instead of |date=. -(tJosve05a (c) 11:51, 2 December 2013 (UTC)

And 2013-12 instead of December 2013. -(tJosve05a (c) 11:51, 2 December 2013 (UTC)
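
A minimal sketch of per-wiki localisation for the dated parameters the tool writes; the svwiki values come from the two requests above, while the table and function names are illustrative only:

import datetime

DATE_PARAM  = {'en': 'date',  'sv': 'datum'}
DATE_FORMAT = {'en': '%B %Y', 'sv': '%Y-%m'}   # "December 2013" vs "2013-12"

def dated_param(lang, when=None):
    # e.g. dated_param('sv') -> 'datum=2013-12'
    when = when or datetime.date.today()
    return '%s=%s' % (DATE_PARAM.get(lang, 'date'),
                      when.strftime(DATE_FORMAT.get(lang, '%B %Y')))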

Dab Solver

As always, thanks for your great tools. I have two questions about Dab Solver:

  1. Why was the last update for WikiProjects over 50000 minutes ago?
  2. Why do some pages remain on WikiProject lists when they seemingly have no links to dab? For instance, Mario (franchise) on [7] always lists Princess Daisy, but I can never find where it thinks that it is still linked.

Ost (talk) 16:23, 5 December 2013 (UTC)

@Ost316: - Since the Toolserver is going to be phased out, it seems no one cares about fixing the replication lag, which is why the last update hasn't happened lately. For your second question, please see User talk:Dispenser/Dab solver#Accumulation of false positives over time. I look forward to using Dab Solver on Labs! Happy editing! GoingBatty (talk) 03:04, 7 December 2013 (UTC)
Oops; didn't think to post on the tools' subpage and somehow I thought I was on Dispenser's main talk page. Thanks, GoingBatty. I had thought that the Labs thing might have something to do with it, but I thought the move wasn't till next year and this started in October. I do understand the need to prioritize fixes, though. Are Dispenser's tools being moved to Labs? —Ost (talk) 15:17, 7 December 2013 (UTC)
@Ost316: - I hope so, but I would like to get confirmation from Dispenser. GoingBatty (talk) 19:17, 7 December 2013 (UTC)

When I checked https://toolserver.org/~dispenser/cgi-bin/webchecklinks.py?page=University_of_Toronto_Quarterly, all the URLs are 200/301, but one is marked in the wikitext as being a 404. Checklinks doesn't edit the wikitext to remove the 404 tag; is this by design? Josh Parris 21:47, 14 December 2013 (UTC)

Many articles have links to data from the United States Census Bureau, which come up in various forms as problematic or "crufty", appearing in green or blue, despite the fact that the links are accurate. This link from the article for South Hackensack, New Jersey is perfectly fine, though the Census Bureau turns it into a generic link. The data is correct and there seems to be no reason why these links shouldn't be treated as acceptable, though it would be nice if we could convince the Census Bureau to show the actual link. Any thoughts? Alansohn (talk) 02:53, 18 December 2013 (UTC)

Another false dead

Why do links to India Today, such as http://indiatoday.intoday.in/story/Preity+Zinta+to+look+after+34+orphans/1/33516.html always show up as 404 and get marked as dead links, when they come up just fine in a browser? BollyJeff | talk 00:12, 4 February 2014 (UTC)

I tried to use the wikiblame & webcite links just now, but they sent me to URLs with $0, $1, etc. in them, so I had to paste the information in by hand. Still better than not having the links, but not nearly as helpful as if it worked! —SamB (talk) 19:07, 14 February 2014 (UTC)

So I'm not the only one...
@Dispenser: if you're not too busy (I really appreciate all the tools you've made!), here's a bug report:
On the page Fluoxetine, (permalink) I ran the tool and attempted to revive this page, listed as a 404. I tried clicking on both Wayback and Webcite but nothing happened. I opened it in a new tab and got the following: [8] [9]. The same thing happens for any given article and any given link, for both Wayback and Webcite. And Find accessdate (WikiBlame) and Headers and redirects as well: [10] [11].
I think it's a bug common to all the quick links. It looks like a syntax error for regex, offhand, but I really can't tell. That might explain your comment here:

Users don't seem to understand that they can make edits WITH the tool or search the Internet Archive Wayback Machine and WebCite (archive too).

Hopefully this was helpful and not bitchy-sounding, Meteor sandwich yum (talk) 02:45, 18 March 2014 (UTC)
Actually, I should add this: Headers and Redirects loads when clicked (but doesn't do anything), and the page will occasionally load the site in the mini-window to the right. I got this on the link I tried to revive above:
HEAD http://www2.indystar.com/library/factfiles/business/companies/lilly/stories/2001_0802.html
HTTP/1.1 404 Not Found

No matches from the Internet Archive

Checklinks first successfully accessed this url on 2010-05-21
which is odd because it has 19 hits: [12].
It still will tell you when the link went dead, and if Webcite is available, on the right-hand side. That works, and is very helpful. For the same link it states: "Dead since 2013-12-22". Meteor sandwich yum (talk) 02:55, 18 March 2014 (UTC)

Breaks table sorting attributes

As seen in this diff, the data-sort-value custom attributes were changed to data-sort- value, which broke their functionality. --Waldir talk 15:57, 30 April 2014 (UTC)

Is the tool working? It's showing me a cached link for whichever article I'm checking, and not allowing me to check it in real time. —Indian:BIO · [ ChitChat ] 15:14, 3 July 2014 (UTC)

For future GANs and FACs, I really need Checklinks. If it can't be repaired, someone please provide an alternative. SNUGGUMS (talk · contribs) 15:43, 3 July 2014 (UTC)

I get the following error message:

The following error was encountered while trying to retrieve the URL: http://dispenser/cgi-bin/webchecklinks.py?
Unable to determine IP address from host name "dispenser"
The DNS server returned:
Name Error: The domain name does not exist.

This looks like a minor problem with the full URL. --Leyo 22:29, 6 July 2014 (UTC)

Checklinks is still not working after 2-3 more days, and I noticed there is another Reflinks village pump section above, currently at #13 on this page. Could someone take a look at this? For example, on Sonia Sotomayor, footnote #263 does link successfully to Esquire magazine, but Checklinks gives a dead-link notice for it pointing to an Arizona news outlet instead. LawrencePrincipe (talk) 13:42, 7 August 2014 (UTC)
@LawrencePrincipe: It appears you (and others) have done a lot of work on the Sonia Sotomayor article, so we can't reproduce the issue by running Checklinks on the current version. You may want to create a test page in your userspace that will generate the same issues you were seeing, and then provide a link on User talk:Dispenser/Reflinks. Good luck! GoingBatty (talk) 16:40, 7 August 2014 (UTC)
@GoingBatty:, Hi Going Batty, yes, that's a good point. I will stop all editing for the next 24-48 hrs until someone can confirm the reflinks error. Cite Number 263 gives the following deadlink warning:
263 Sotomayor urges anxious grads to embrace future (info) [kold.com]
accessdate=May 19, 2010
date=May 8, 2010
publisher=KOLD-TV
410 Dead since 2011-01-17
However, footnote #263 in the article has nothing to do with the KOLD-TV dead-link notification from Checklinks; it cites Esquire magazine instead. I will postpone all update edits until someone confirms. LawrencePrincipe (talk) 18:21, 7 August 2014 (UTC)
@LawrencePrincipe: Oh, you're using Checklinks, not Reflinks. I confirm that Checklinks is showing that #263 is kold.com, even though the article shows kold.com is #259. The proper place to report this issue is User talk:Dispenser/Checklinks so Dispenser can resolve the issue. Good luck! GoingBatty (talk) 03:23, 8 August 2014 (UTC)
Could someone take a look and see? LawrencePrincipe (talk) 13:58, 8 August 2014 (UTC)

The link given for the Checklinks tool, toolserver.org/~dispenser/view/Checklinks, is obsolete.

We're sorry, but the user-supported tool you have attempted to reach did not leave a forwarding URL where we could automatically redirect you.

A bit ironic. :-) --Maiden taiwan (talk) 15:47, 11 December 2014 (UTC)

(talk page stalker) @Maiden taiwan: Try http://dispenser.homenet.org/~dispenser/view/Checklinks instead. GoingBatty (talk) 18:52, 13 December 2014 (UTC)

Table markup changes

In addition to repairing links, the Checklinks tool changes table markup [13] in what it thinks to be a cleanup. Why is it doing that? This tool is designed to check and repair links; it should leave table markup alone. In the case of the edit ([14]) I referred to, the markup changes it made corrupted the sortability that was provided in the table. Tvx1 22:36, 12 March 2015 (UTC)

align="center" was deprecated long before Wikipedia's founding. Why didn't blacklist this in the v1 parser, we didn't really know better then. — Dispenser 02:21, 13 March 2015 (UTC)
Fine. I have no problem with even changing them. But that's irrelevant. This tool should not be doing that. That's not what it's designed for. It should solely focus on references and links. In the edit I referred to, it changed every instance of style="background-color:#......" to style="background:#......;", which causes the sortable function to stop working (a well-known issue, by the way), even changed some of the colors, and changed every instance of data-sort-value=".." to data-sort- value="..", which messed up the programmed sorting sequence. Why does it do these things? What do these changes have to do with its purpose of helping us to detect and repair broken links? Tvx1 03:50, 13 March 2015 (UTC)

Apologies if this is not the correct place to bring this up but it's the only place I could find. I was using the tool last night to try and repair a link and I think I've come across a bug. I was trying to repair a link on Jamie Clarke (Neighbours) and the tool changed the character's classification. The classification should read "List of past Neighbours characters#C|Former; regular" but the tool wanted to change it to "Edith Chubb|Former; regular"! As amusing as that is, it's not correct. Just to make sure that this wasn't a one-off with this article, I also ran the tool on Carmella Cammeniti and again it tried to change the classification to the above.

Any ideas what's wrong?--5 albert square (talk) 14:13, 19 March 2015 (UTC)

arxiv is giving 403

This is just a heads-up that arxiv links are getting 403 in Checklinks, though they appear fine on my end. For example, try http://dispenser.homenet.org/~dispenser/cgi-bin/webchecklinks.py?page=en:Rule_110#view:0.0.0.1.1.1 SamB (talk) 02:52, 25 June 2015 (UTC)
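
One possible explanation is that some servers refuse HEAD requests or unfamiliar user agents. A sketch of a probe that falls back to GET before reporting a 403; this is my own illustration (Python 2, matching the tracebacks above), not necessarily how Checklinks actually behaves:

import urllib2

def probe(url):
    # Return the HTTP status code, retrying with GET if HEAD is rejected.
    headers = {'User-Agent': 'Mozilla/5.0 (compatible; linkcheck-sketch)'}
    for method in ('HEAD', 'GET'):
        req = urllib2.Request(url, headers=headers)
        req.get_method = lambda m=method: m   # force the request method
        try:
            return urllib2.urlopen(req, timeout=30).getcode()
        except urllib2.HTTPError as e:
            # only retry with GET when HEAD came back 403/405
            if method == 'GET' or e.code not in (403, 405):
                return e.code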

The Wayback Machine has a new domain

So the Wayback Machine has started handing out URLs that look like http://wayback.archive.org/web/20140416031355/http://www3.alcatel-lucent.com/bstj/vol46-1967/articles/bstj46-3-497.pdf, and Checklinks doesn't recognize them. As a workaround, changing "wayback" back to "web" at the beginning of the domain name after pasting it into Checklinks seems to work, but it would be nice if it recognized the new domain so I wouldn't have to do that. —SamB (talk) 16:06, 25 June 2015 (UTC)
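
A sketch of a pattern that accepts snapshots on either hostname; the regex and names are illustrative, not taken from Checklinks' source:

import re

# Wayback Machine snapshots may arrive on web.archive.org or wayback.archive.org
WAYBACK_RE = re.compile(r'^https?://(?:web|wayback)\.archive\.org/web/(\d{14})/(.+)$')

m = WAYBACK_RE.match('http://wayback.archive.org/web/20140416031355/'
                     'http://www3.alcatel-lucent.com/bstj/vol46-1967/articles/bstj46-3-497.pdf')
if m:
    timestamp, original_url = m.groups()   # '20140416031355', 'http://www3...'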

Nice, it's working now! —SamB (talk) 00:46, 29 June 2015 (UTC)
Except, uh, only in some cases? —SamB (talk) 00:47, 29 June 2015 (UTC)

dead -> Dead

Hi, is there a reason why the script likes to replace dead link with Dead link? You can see an example of this capitalisation here. Thanks! --Cpt.a.haddock (talk) 13:55, 31 July 2015 (UTC)

Notice

  There is currently a discussion at Wikipedia:Administrators' noticeboard/Incidents regarding an issue with which you may have been involved. Thank you. Tvx1 16:20, 3 August 2015 (UTC)

Discussion moved to Village pump (technical)Dispenser 16:53, 11 August 2015 (UTC)

Context of tagger in article lead?

What is the context of tagger in article lead?

Checkingfax (talk) 01:20, 3 September 2015 (UTC)

Not working

Anyone else finding that the tool is not working tonight? It keeps telling me that the page cannot be displayed.--5 albert square (talk) 21:40, 14 September 2015 (UTC)

Never mind, seems to be working now.--5 albert square (talk) 22:38, 14 September 2015 (UTC)
There was a power outage from 15:20 to 21:35 UTC; the server was restored approximately one hour later. — Dispenser 02:08, 15 September 2015 (UTC)

I would like to report a bug in Checklinks. Can Checklinks be fixed so that it ignores links to paywall/subscription sites? Links to these sites (newspaperarchive.com, ProQuest, Newsbank, etc.) can only be visited by people who happen to be on a network that has a subscription to those sites (i.e., universities, public libraries, and some secondary school systems). A person at the University of Texas or the New York Public Library can easily retrieve articles from these sites, while a person sitting at Starbucks will not be able to access these articles.

I would suggest modifying the tool to ignore the following domains:

  • *.newspaperarchive.com
  • *.proquest.com
  • *.newsbank.com
  • *.ebscohost.com
  • *.lexisnexis.com

It is a waste of time for Checklinks to check links at these sites since access by Checklinks to these sites would most likely be blocked and Checklinks would not be able to find anything on Archive.org.

In particular, Checklinks receives a 404 Not Found response when it visits any link at the access.newspaperarchive.com domain, and it erroneously marks them as dead links.

For example, the following link would give a person at a subscribing institution access to an archived page of the Oakland Tribune newspaper, but Checklinks would label it as a dead link:

 http://access.newspaperarchive.com/oakland-tribune/1960-08-17/page-74

68.45.68.54 (talk) 18:23, 29 August 2015 (UTC)
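
A minimal sketch of the skip-list suggested above; the domain names are taken from the list, while the structure and function name are illustrative rather than Checklinks' own:

PAYWALLED_DOMAINS = (
    'newspaperarchive.com', 'proquest.com', 'newsbank.com',
    'ebscohost.com', 'lexisnexis.com',
)

def is_paywalled(hostname):
    # matches the domain itself and any subdomain, e.g. access.newspaperarchive.com
    return any(hostname == d or hostname.endswith('.' + d)
               for d in PAYWALLED_DOMAINS)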

Or maybe offer to tag those links with |subscription=yes or {{subscription needed}}? GoingBatty (talk) 18:34, 29 August 2015 (UTC)
I think GoingBatty's suggestion of having the tool and/or bot add |subscription=yes or {{subscription needed}} tags, instead of wasting resources by performing pointless checks and then adding {{dead link}} tags, is a great idea!
Adding the suggested new tags would help point out to the viewer that the nearby link is just restricted and not actually dead. We just need to point out to the viewers that the material is semi-restricted and can still be obtained under certain circumstances, not permanently broken. To make this work, the tool and/or bot will also need to look for existing |subscription=yes or {{subscription needed}} tags. (Now if we can only have the tool/bot remove the {{dead link}} tags it erroneously just placed...) 68.45.68.54 (talk) 19:58, 29 August 2015 (UTC)
Editors who are paywall subscribers are supposed to use a template similar to this format with the subscription=yes and via= parameters added:
<ref> {{cite journal | title=Bowling alone: America's declining social capital | author1= Putnam, Robert | url=http://muse.jhu.edu/journals/journal_of_democracy/v006/6.1putnam.html | year=1995 | journal=Journal of Democracy | volume=6 | issue=1 | pages=65–78 | subscription=yes | via=[[Project MUSE]]}}</ref>
It would be helpful if the bots were able to work with the paywall subscription companies and work this out. Cheers! {{u|Checkingfax}} {Talk} 03:25, 31 October 2015 (UTC)

Wikiproject

Great tool. Can we use it on all the pages tagged with a WikiProject at once? If yes, please explain the procedure (where, how). -- Pankaj Jain Capankajsmilyo (talk · contribs · count) 10:32, 31 October 2015 (UTC)

duplicate archiveurl

I had to do this fix and this fix after an edit attributed to this tool. It looks like the articles had the archiveurl and url backwards before the changes, which is probably what caused the problem? Thank you. Frietjes (talk) 14:40, 30 October 2015 (UTC)

had to fix another one here. Frietjes (talk) 12:23, 31 October 2015 (UTC)
and another one here. Frietjes (talk) 15:11, 1 November 2015 (UTC)

Hello, Dispenser: I was wondering why Checklinks removed the brackets from [[MTV]] in this diff? I've noticed it before. Please ping me back. Thank you. Cheers! {{u|Checkingfax}} {Talk} 20:26, 2 November 2015 (UTC)

Typo on Checklinks: *Avalible* should be: *Available*

It's in the right hand column.

Great tool! Cheers! {{u|Checkingfax}} {Talk} 04:54, 25 October 2015 (UTC)

Weirdness

Hello. On Checklinks, when the tool finds any existing dead link parameter, it changes it to Dead link (it upcases the d); however, when Checklinks inserts a new dead link parameter, it inserts it as dead link (all lowercase).

Just so you know. Cheers! {{u|Checkingfax}} {Talk} 00:12, 27 October 2015 (UTC)

Hello. Still waiting for a ping back from you. Here is a diff showing the upcasing of the letters, and it also shows the de-wikilinking of sources like [[MTV]] and [[Rolling Stone]]. Cheers! {{u|Checkingfax}} {Talk} 11:24, 11 November 2015 (UTC)

I'd like to reuse this program and I'd like to encourage more bugfixing

Today was my first time using http://toolserver.org/~dispenser/cgi-bin/webchecklinks.py?page=List_of_Arabic_loanwords_in_English

The program falsely reported that the following link is "Dead since 2012-05-14": http://www.nma.gov.au/shared/libraries/attachments/publications/metal_04_proceedings/section_4_composite_artefacts/files/7859/NMA_metals_s4_p03_theophilus_shrine_vitus.pdf

I re-ran the program, and the second time it completely omitted the above link from its output. Then, later on in its list of output, it says: "Checklinks had an error 'ascii' codec can't decode byte 0xc3 in position 35: ordinal not in range(128)"

Keep up the good work. — Preceding unsigned comment added by Seanwal111111 (talkcontribs) 20:07, 14 May 2012 (UTC)

Hi Dispenser. See [[Beauty and the Bestie]] where in this Diff Checklinks changes [[action comedy]] to [[action comedy]] whereas Checklinks could more easily change it to just [[action comedy]]. This is popping up a lot lately. Ping me back. Cheers! {{u|Checkingfax}} {Talk} 07:35, 19 January 2016 (UTC)

How to run Checklinks

Please explain how to install(?) and use Checklinks.--DThomsen8 (talk) 16:23, 14 January 2016 (UTC)


Hi Dthomsen8.

To install, go to: Special:Preferences#mw-prefsection-gadgets

Scroll down to the Appearances section

Tick off the checkbox for: MoreMenu: add Page and User dropdown menus to the toolbar with links to common tasks, analytic tools and logs (documentation: Vector, Monobook or Modern)

Go to the bottom of the page and click on Save

Go to an article page and refresh the page

You should now have a Page and a User drop-down tab on your menu-bar

Hover on the Page tab

Hover on the Tools link

Click on the Check external links link

This will launch Checklinks and run a check on the page you launched it from

If you want to go the easy route, just click on the Save changes button when Checklinks is fully done with evaluating the page

If you want to do it the right way, work on each dead link and try to find archived versions of them; paste the archive link and archive date into the data box, and put a pipe (vertical bar) between the archive-url and the archive-date. It's best to find an archive from about the same time the page was originally used as a citation, or from about the time the citation went dead and link-rotted. Hit me back with any questions. Checklinks is awesome once you get used to it. Cheers! {{u|Checkingfax}} {Talk} 17:37, 19 March 2016 (UTC)

Checklinks is tagging links that are already archived via the {{wayback}} template as dead links. Spirit Ethanol (talk) 05:49, 26 March 2016 (UTC)

(talk page stalker). Hello Spirit Ethanol. That wayback template could be improved. For one thing, it includes the URL from the citation. For another, it could parse the date from the wayback URL anyway. So Checklinks is actually tagging twice each citation that has a wayback template in it: once for the cited instance, and once for the wayback instance (of the identical URL). Having two identical URLs in a citation seems redundant. This is something for the wayback template folks and Dispenser to collaborate on. Checklinks should learn to ignore citations that already have a wayback template or a piped archive-url= within a regular CS1 citation template. Cheers! {{u|Checkingfax}} {Talk} 11:47, 26 March 2016 (UTC)
Thanks for the prompt response and pointers. It would also be nice if the Checklinks source were freely available on some collaboration platform (e.g. GitHub); that way users could improve the software and fix bugs the same way Wikipedia articles are improved. Spirit Ethanol (talk) 11:53, 26 March 2016 (UTC)
Hi, why doesn't the Checklinks tool work anymore for ro.wiki? I used it for GA and FA dead-link checking and it was very useful. Can you fix it, or do you know another tool that checks for dead links? Ionutzmovie (talk) 18:38, 19 April 2016 (UTC)

Hello. In the past, Checklinks worked well on zhwiki. Recently, however, it always says "Checklinks had an error 'ascii' codec can't decode byte 0xe5 in position 29: ordinal not in range(128)" on every article on zhwiki (e.g. [15][16]). Can you help fix this? Thanks!--- Earth Saver (talk) at 07:43, 28 April 2016 (UTC)

Hi Dispenser!
In elwiki we also get "Checklinks had an error 'ascii' codec can't decode byte 0xce in position 18: ordinal not in range(128)". Same byte, same position in every article. -geraki TL 07:55, 1 May 2016 (UTC)
Fixed A mistake when I hastily added i18n without double checking the encoding. — Dispenser 02:30, 2 May 2016 (UTC)
Thank you! -geraki TL 04:47, 2 May 2016 (UTC)
  Thank you! -- Earth Saver (talk) at 14:25, 2 May 2016 (UTC)

Hello

I use this page to check the web links, but I can't save because the script applies unneeded fixes (example). So I use this tool to check and make the changes manually.

So the question is: how do I disable the redirect fix and the addition of the archive date or URL? Also, a dot at the end of each reference is required by the French manual of style, but the script deletes it. How can I disable this too?

So, how do I disable this "regex fixing" and apply only the changes to the web links?

I tried to look but didn't find a help page for configuring this, sorry, and sorry for my English. --Archimëa (talk) 11:22, 6 October 2016 (UTC)

Hi, there are a number of problems with Checklinks and the Internet Archive. I'll start with hopefully the easiest one. The proper URL format for the Wayback Machine is https://web.archive.org/web/...; currently Checklinks is using the old format "http://wayback.archive.org/web/...", which adds the overhead of a redirect and doesn't use HTTPS. Could this be corrected in the source? -- GreenC 19:43, 27 October 2016 (UTC)

I recently rewrote {{wayback}} in Lua and added error traps and red error messages. It is catching syntax problems introduced by Checklinks. Example. Note the red error messages in footnotes #287, 422, 247, 250, 224, etc. Here's an example before Checklinks and after Checklinks:

Pre:

<ref name=Abbas-090900>[http://unispal.un.org/UNISPAL.NSF/0/172D1A3302DC903B85256E37005BD90F ''Abu Mazen's speechat the meeting of the PLO's Palestinian Central Council''], 9 September 2000 {{webarchive |url=https://web.archive.org/web/20140908061418/http://unispal.un.org/UNISPAL.NSF/0/172D1A3302DC903B85256E37005BD90F |date=8 September 2014 }}</ref>

After:

<ref name=Abbas-090900>[http://wayback.archive.org/web/20111026110339/http://unispal.un.org/UNISPAL.NSF/0/172D1A3302DC903B85256E37005BD90F ''Abu Mazen's speechat the meeting of the PLO's Palestinian Central Council''], 9 September 2000 {{webarchive |url=https://web.archive.org/web/20140908061418/http://wayback.archive.org/web/20111026110339/http://unispal.un.org/UNISPAL.NSF/0/172D1A3302DC903B85256E37005BD90F |date=8 September 2014 }}</ref>

It did two things wrong. The source URL should not have been modified to an archive URL (in either case), and the date field in {{wayback}} should have been updated. It should look like this:

<ref name=Abbas-090900>[http://unispal.un.org/UNISPAL.NSF/0/172D1A3302DC903B85256E37005BD90F ''Abu Mazen's speechat the meeting of the PLO's Palestinian Central Council''], 9 September 2000 {{webarchive |url=https://web.archive.org/web/20111026110339/http://unispal.un.org/UNISPAL.NSF/0/172D1A3302DC903B85256E37005BD90F |date=26 October 2011 }}</ref>

Notice the only thing that changed was |date=20111026110339. The source URL that precedes {{wayback}} remains as-is otherwise there is no point in using the {{wayback}} template. Corrections diff.

However, before you make any changes for {{wayback}}, there is an ongoing TfM to merge {{wayback}} into {{webarchive}} as a universal web archive template agnostic of the service provider. Assuming the TfM goes through, which looks likely, {{wayback}} will be deprecated. However, {{webarchive}} will have an argument for the original URL (dead link), and it will be important that Checklinks not try to rescue it; in other words, it should ignore the contents of {{webarchive}} entirely, as well as any links in a ref pair where that template is in use. -- GreenC 19:59, 27 October 2016 (UTC)
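
A sketch of the transformation described above: take the snapshot URL, leave the source URL in the reference untouched, emit the canonical https://web.archive.org/web/ form, and derive |date= from the 14-digit timestamp. The regex and function are my own illustration, not Checklinks' code:

import re, datetime

SNAPSHOT_RE = re.compile(r'https?://(?:web|wayback)\.archive\.org/web/(\d{14})/(.+)')

def webarchive_template(snapshot_url):
    # Build a {{webarchive}} call from an archive.org snapshot URL.
    m = SNAPSHOT_RE.match(snapshot_url)
    if not m:
        return None
    stamp, original = m.groups()
    day = datetime.datetime.strptime(stamp, '%Y%m%d%H%M%S').strftime('%d %B %Y').lstrip('0')
    canonical = 'https://web.archive.org/web/%s/%s' % (stamp, original)
    return '{{webarchive |url=%s |date=%s }}' % (canonical, day)

For the snapshot timestamp 20111026110339 this yields |date=26 October 2011, matching the corrected reference above.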

@Dispenser: - Hoping you can respond. IABot is adding 2 to 3 thousand new instances of {{wayback}} every day, so the intersection between articles with {{wayback}} and Checklinks is quickly growing. I had to manually fix over 500 instances of broken {{wayback}}, many caused by Checklinks; it took me 4 days. I look forward to hearing from you. It will be the same with {{webarchive}} once that merger is approved soon. -- GreenC 04:28, 28 October 2016 (UTC)

Very minor capitalisation point

Using this tool to save changes now changes all existing "dead link" templates to "Dead link" (capital D), but adds new ones as "dead link" (lower case). Odd! CMD (talk) 12:51, 15 November 2016 (UTC)

Updating accessdates

@Dispenser: Is there a way that Checklinks can recognize that there is already an accessdate parameter, rather than adding another one? MCMLXXXIX 11:07, 20 February 2017 (UTC)
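
A rough sketch of such a check; the pattern is illustrative and assumes the parameter may be spelled either |accessdate= or |access-date=:

import re

def has_accessdate(citation_wikitext):
    # look for an existing, non-empty access date before adding one
    return re.search(r'\|\s*access-?date\s*=\s*\S', citation_wikitext) is not None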

"Disabled until Fixed"

Since no one else seems to have asked, why is Checklinks down? The "save changes" button is disabled and says "save changes (disabled until fixed)". Elisfkc (talk) 18:21, 19 April 2017 (UTC)

Related to User_talk:Dispenser#Checklinks_retire.3F. Recommend using IABot in the meantime, which accomplishes much the same thing. In the History tab, click on "Fix dead links". -- GreenC 19:58, 19 April 2017 (UTC)
Thanks Elisfkc (talk) 20:05, 19 April 2017 (UTC)

Error when trying to log in

This is probably because I'm using an old version of Firefox (39.0.3). On the dab solver page, upper right corner, when I try to log in, I get the pop-up window, but when I click the blue Allow I get the message, "Server error

There was a script error

A problem occurred in a Python script.

/home/dispenser/public_html/cgi-bin/tracebacks/connect_TypeError_110_fgFjwj.html contains the description of this error.
Vmavanti (talk) 04:03, 3 January 2018 (UTC)

Thank you - and - 404 Troubleshooting

Hello! First off: thank you very much for creating the Checklinks tool! It is fantastic, and very useful. Thank you.

I used Checklinks to fix most of the links for The Phenomenauts, but there is one link that still appears as red/broken. It is this link here. The tool says it receives a 404 from the site, but when I visit it manually it loads in my browser. I also see that when I click the '+' button in Checklinks to expand the details, the page appears to load normally in the Checklinks preview window.

Is there a way to figure out why Checklinks says it's getting a 404? Could this be an issue with wait times, if perhaps the page loads too slowly? Are there logs I could look at to help troubleshoot? I can provide a screenshot to illustrate what's happening if that helps.

Thanks for your time! --Culix (talk) 04:11, 7 June 2018 (UTC)

Is this tool broken?

Two things:

  • I cannot get the greyed out button that says "Save changes (disabled until fixed)" to be un-greyed out, even though I corrected everything, and I even tried redirecting all items so that they were all bold green, but the button still remained greyed out. This means that I can't actually edit any articles and fix links.
  • I get a browser error saying "Load unsafe scripts" at the webpage, but I assume this is normal for normal operation.

WinterSpw (talk) 21:40, 6 June 2018 (UTC)

@Dispenser: I too only see a grayed-out save key. Is this by design? Is the tool supposed to carry out the changes editors are able to select within it? If so, how are those changes implemented? Through the grayed-out save key? Please advise.  spintendo  00:32, 3 August 2018 (UTC)
@Dispenser: I see that it says "Disabled until fixed". The Checklinks program is one used to fix references. The label you've added to the save key can be misleading, in that it suggests to editors unfamiliar with the program that they must "fix" the listed references before the button activates. The "Disabled until fixed" label should be re-worded to say "Disabled" only. This should limit confusion over whether the button is supposed to be grayed out or not.  spintendo  01:02, 3 August 2018 (UTC)
But Checklinks is basically broken though right? Beatpoet (talk) 12:54, 15 August 2018 (UTC)
It sure is, which is super sad because it was a pretty useful tool. It looks like Dispenser is still active, but I haven't seen him respond to issues on Checklinks, so I'm not sure why it's disabled but has been for close to a year now. - Scarpy (talk) 14:30, 15 August 2018 (UTC)

Bump, for a response from the author or anyone else with a solution to the problem of the greyed out checklinks button. Apart from, of course, manually editing an article by hand while using the website, which is undesirable. WinterSpw (talk) 17:50, 17 August 2018 (UTC)

Wrong rtl languages

https://dispenser.info.tm/~dispenser/sources/wikipedia.py thinks that Hungarian is right-to-left. In fact, it's as left-to-right as a language can be (it uses the Latin script, just like English). You probably meant Hebrew (he), which uses its own right-to-left script. It's quite inconvenient to read Hungarian from right to left, so please fix it. Thanks in advance, —Tacsipacsi (talk) 20:21, 1 June 2018 (UTC)
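
For illustration, the right-to-left set should contain Hebrew ('he') rather than Hungarian ('hu'); a minimal sketch of what a corrected lookup might look like (the actual list in wikipedia.py may differ):

RTL_LANGUAGES = frozenset(['ar', 'arc', 'dv', 'fa', 'he', 'ps', 'ur', 'yi'])

def text_direction(lang_code):
    # 'he' -> 'rtl', 'hu' -> 'ltr'
    return 'rtl' if lang_code in RTL_LANGUAGES else 'ltr'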

Still wrong. —Tacsipacsi (talk) 10:33, 31 August 2018 (UTC)

common.js

Can I add this tool to my tools on the left side of the Wikipedia user interface through common.js? --Hanyangprofessor2 (talk) 08:19, 24 October 2018 (UTC)

@Hanyangprofessor2: Yes you can. See User:Smartse/vector.js and copy this part (not sure whether the line breaks make a difference):
function linkcheck(){addPortletLink("p-tb", "http://toolserver.org/~dispenser/cgi-bin/webchecklinks.py?page=" + wgPageName, "Check Links");}
SmartSE (talk) 13:23, 16 November 2018 (UTC)
Your script is very out-of-date, and it will break sooner or later (it’s already kinda broken because of the warning on the second page redirect after each toolbox click). The full, working, future-proof version is the following:
function linkcheck() {
	mw.util.addPortletLink("p-tb", "https://dispenser.info.tm/~dispenser/cgi-bin/webchecklinks.py?page=" + mw.config.get("wgPageName"), "Check Links");
}

if (mw.config.get("wgNamespaceNumber") === 0) {
	$.when(
		mw.loader.using("mediawiki.util"),
		$.ready
	).then(linkcheck);
}
(all whitespace in this code is entirely optional, except for the space after function). —Tacsipacsi (talk) 00:02, 17 November 2018 (UTC)