When it comes to keeping your website or blog healthy and strong in search engine listings, regularly checking for broken or otherwise bad links is a very good idea.
Not only is it good for your site’s overall standing in search listings, it’s also good for your visitors. No one wants to encounter broken links on a site – it’s a sign of a poorly cared-for website.
Then again, regularly checking your site for bad links can be a tedious chore. We’ve covered a few tools here at MUO that can help you stay on top of things if you have the time, like Ann’s article on Xenu Link Sleuth, and Saikat’s article on Hyperlink Checker.
Those are good solutions, but again, they require your time. Checking for bad links takes work. A better solution is to set up an automated script on your own server (or any computer connected to the Internet) that checks your website for bad links and then issues a report with the results. Better yet, schedule it to run weekly or monthly, and you never have to think about the task again until bad links are discovered.
Setting Up Your LinkChecker Script
The tool that I want to focus on is one that we’ve listed in the MUO directory before called LinkChecker.
The reason I chose that tool is that LinkChecker provides a command-line alternative to the standard GUI. In the command line, all you need to do is call the linkchecker application, followed by the URL that you want to check, as well as an assortment of parameters (which I’ll get to below).
In the case where everything goes well, and there are no issues, the command window will look something like this.
There are no errors, so you’re just seeing a regular status update of link counts checked by the tool. On the other hand, if you have any bad links, you’ll see the errors echoed back in the command window as shown here.
While this is useful if you only ever want to launch the script manually, what we’re trying to do here is automate this command. To do that, you need to understand some of the parameters available in this command-line mode. You can find all of the parameters and flags on the LinkChecker help page. The ones we’re going to focus on are -o and -F: the output-format and file-output parameters.
For example, you can issue the command, “linkchecker -r1 -ocsv -Fcsv http://www.topsecretwriters.com” to get a CSV file with all of the link-check results (the -r1 flag limits how deep the tool recurses into your site). The file shows up in the root linkchecker directory under the default file name “linkchecker-out.csv”.
The CSV file shows a full listing of all bad links found, with the checked link in one column, alongside the time and date the broken link was found and the associated warning or error message.
This is a very cool thing – because now all you have to do is schedule a batch command that issues the above linkchecker command, and every time it runs you’ll have an updated CSV file listing all bad links on your website. You can use this list to go through and fix them.
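Once that CSV report exists, pulling out just the failing rows is easy to script. Here’s a minimal sketch in shell. It assumes LinkChecker’s default semicolon-separated layout with “urlname”, “result” and “valid” columns, and it runs against an inline sample file rather than a real report – so check the header row of your own linkchecker-out.csv before relying on it:

```shell
#!/bin/sh
# Sketch: list only the failing rows from a LinkChecker CSV report.
# Assumes the default semicolon-separated output with a header row
# containing "urlname", "result" and "valid" columns -- verify this
# against your own linkchecker-out.csv first.

# Inline sample standing in for a real linkchecker-out.csv:
cat > linkchecker-out.csv <<'EOF'
urlname;result;valid
http://example.com/ok;200 OK;True
http://example.com/missing;404 Not Found;False
EOF

# Find the column positions from the header row, then print the URL
# and result for every row not marked valid.
awk -F';' '
  NR == 1 { for (i = 1; i <= NF; i++) {
              if ($i == "urlname") u = i
              if ($i == "result")  r = i
              if ($i == "valid")   v = i }
            next }
  $v != "True" { print $u " -> " $r }
' linkchecker-out.csv
# Prints: http://example.com/missing -> 404 Not Found
```

Reading the column positions from the header row, rather than hard-coding them, keeps the script working even if your version of LinkChecker orders the columns differently.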
If you’re unfamiliar with how to set up such a scheduled job, all you have to do is create a batch file containing the above command, name it something like “CheckLinks.bat”, and then schedule that batch file to run every week or every month. Just go to Start -> All Programs -> Accessories -> System Tools -> Scheduled Tasks.
Click on “Add Scheduled Task” to add the time/date you want it to run and just browse to your batch file when asked.
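If the machine doing the checking runs Linux rather than Windows, cron fills the same role as Scheduled Tasks. A sketch of a crontab entry (added via “crontab -e”) is below; the script path is hypothetical and would point at a shell version of your link-check commands:

```
# Run the link check every Monday at 3:00 AM.
# The script path is an assumption -- substitute your own.
0 3 * * 1 /home/user/bin/checklinks.sh
```

The five fields before the command are minute, hour, day of month, month, and day of week, so this line fires once a week without any further attention.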
An even cooler solution, if you’re running your own web server, is to run the LinkChecker command against your website, output the results in HTML format, and then move that HTML results page to a special page on your website that you can call up in any web browser.
You can have the LinkChecker tool output to HTML by issuing the following command:
“linkchecker -r1 -ohtml -Fhtml/badlinks.html http://www.topsecretwriters.com”
This outputs the results to an HTML file in the local linkchecker path. You’ll want to add a line to your batch file to move that HTML file to the directory where your public HTML files are stored. Just add:
“MOVE badlinks.html d:/web-server/xampp/htdocs/badlinks/”
This way, you’ll have a web directory on your website /badlinks/ where you can call up the badlinks.html page. This is what the results look like in HTML format.
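On a Unix-like server, the same two-step flow – check, then publish – can be sketched as a single shell script. The web-root path here is an assumption, and since LinkChecker may not be installed, the real command is shown as a comment while a placeholder report stands in for its output so the publish step can be demonstrated:

```shell
#!/bin/sh
# Sketch: run the link check, then publish the HTML report to the
# web root so it can be viewed in a browser. Paths below are
# assumptions -- adjust them for your own server layout.
SITE="http://www.topsecretwriters.com"
WEBROOT="/tmp/demo-htdocs/badlinks"   # e.g. your htdocs/badlinks path

mkdir -p "$WEBROOT"

# The real command (requires LinkChecker to be installed):
#   linkchecker -r1 -ohtml -Fhtml/badlinks.html "$SITE"
# For this sketch we create a placeholder report instead, so the
# publish step works without LinkChecker present.
echo "<html><body>report placeholder</body></html>" > badlinks.html

mv badlinks.html "$WEBROOT/badlinks.html"
echo "Published $WEBROOT/badlinks.html"
```

Run this from cron (or the batch equivalent from Scheduled Tasks) and the report page refreshes itself on every run.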
Using this approach, you never have to even log into the server or computer where the link check script is running. You can just check your special page and deal with the errors that you find there.
Keeping on top of bad links on your site is more important than most people realize. It also happens more often than you might think: many of the web pages you linked to in an article a few years ago may since have gone out of service, leaving your visitors out of luck when they try to follow your link.
So, give the command-line option of LinkChecker a shot and see if it lets you improve your website link quality without taking up all of your valuable time. Did you take the approach above, or did you try some other technique? Share your own experiences with the script in the comments section below.
Image Credit: Shutterstock