
Linux Shell Scripting Cookbook, Second Edition


Overview of this book

The shell remains one of the most powerful tools on a computer system, yet a large number of users are unaware of how much they can accomplish with it. Using a combination of simple commands, we will see how to solve complex problems in day-to-day computer usage. Linux Shell Scripting Cookbook, Second Edition takes you through useful real-world recipes designed to make your daily life easier when working with the shell. Starting with the basics of the shell and general-purpose commands, we will learn how to perform operations on files of different kinds. The book then proceeds to text processing and web interaction, and concludes with backups, monitoring, and other sysadmin tasks. It serves as an excellent guide to solving day-to-day problems by combining the shell with a few powerful commands.

Finding broken links in a website


Some people manually check every page on a website to search for broken links. This is feasible for websites with very few pages, but becomes difficult as the number of pages grows. The task becomes easy if we automate the process of finding broken links using HTTP tools. Let's see how to do it.

Getting ready

To identify the links and find the broken ones among them, we can use lynx and curl. lynx has an option, -traversal, which recursively visits pages on the website and builds a list of all the hyperlinks in the website. We can then use curl to check each link and report whether it is broken.
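The two steps can be sketched as follows. In a real run, lynx -traversal writes the links it collects into data files (such as reject.dat) in the current directory; here the crawl output is simulated with a small hand-made file so that the deduplication step can be shown on its own:

```shell
#!/bin/bash
# Step 1 (real run): lynx -traversal "$url" crawls the site and writes the
# hyperlinks it finds into files such as reject.dat in the current directory.
# Here we simulate that output so the post-processing is self-contained.
printf '%s\n' \
  'http://example.com/a' \
  'http://example.com/b' \
  'http://example.com/a' > reject.dat

# Step 2: deduplicate the collected links before testing each one with curl.
sort -u reject.dat > links.txt
cat links.txt
```

Running this prints the two unique links, one per line; the real script would then loop over links.txt and probe each URL with curl.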

How to do it...

Let's write a Bash script with the help of the curl command to find out the broken links on a web page:

#!/bin/bash 
#Filename: find_broken.sh
#Desc: Find broken links in a website

if [ $# -ne 1 ]; 
then 
  echo -e "Usage: $0 URL\n" 
  exit 1; 
fi 


echo Broken links: 

mkdir /tmp...
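The per-link check at the heart of the script can be sketched in isolation. The is_broken helper name is our own for illustration; in the script, the status line it inspects would come from a HEAD request such as curl -I -s "$link" | head -n 1:

```shell
#!/bin/bash
# Sketch: decide whether a link is broken from the first line of curl's
# header output (e.g. "HTTP/1.1 404 Not Found").
# is_broken succeeds (returns 0) when the status is not a 2xx success code.
is_broken() {
  local status_line=$1
  # The numeric status code is the second field of the status line
  local code
  code=$(echo "$status_line" | awk '{print $2}')
  case $code in
    2??) return 1 ;;  # 2xx: link is fine
    *)   return 0 ;;  # anything else: treat as broken
  esac
}

# In the real script the status line comes from curl -I on each link
is_broken "HTTP/1.1 404 Not Found" && echo "broken"
is_broken "HTTP/1.1 200 OK" || echo "ok"
```

Checking the status line rather than grepping for the literal string "OK" keeps the test robust across HTTP versions and reason phrases.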