I have a wget script named Chktitle.sh -- this script takes a command like below:

$ Chktitle.sh "my url"

I then have a file named url.txt with over 100 lines of URLs and IPs to check for web-page titles. Then I have results.txt as a blank file.
Is there any way I can perform a repetitive action like below for each line in the file:
Grab line1 from url.txt
-----
then execute Chktitle.sh "line1"
-----
Now save the result for line1 in results.txt
-----
Now goto Line2 ........
etc etc etc
I need to make sure that it will only execute the next line after the previous one has finished. Can anyone show me an easy way to perform this? I am happy to use Perl or sh, and will consider other languages.
The contents of Chktitle.sh:

#!/bin/bash
# Append /search/ to the given URL, fetch the page quietly,
# and print whatever appears between <title> and </title>.
string="$1/search/"
wget --quiet -O - "$string" \
| sed -n -e 's!.*<title>\(.*\)</title>.*!\1!p'
Here is how you could do this in Perl:
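A sketch of such a Perl loop (assuming Chktitle.sh sits in the current directory and using the url.txt and results.txt file names from the question):

#!/usr/bin/perl
use strict;
use warnings;

open my $in,  '<', 'url.txt'     or die "Cannot open url.txt: $!";
open my $out, '>', 'results.txt' or die "Cannot open results.txt: $!";

while ( my $url = <$in> ) {
    chomp $url;
    next if $url =~ /^\s*$/;               # skip blank lines
    my $title = `./Chktitle.sh "$url"`;    # backticks block until the command finishes
    print {$out} $title;
}

close $in;
close $out;

Because the backticks wait for each command to finish, the next URL is only processed once the previous call has completed.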
See xargs, especially the -I switch. An xargs call like the one sketched below will read the input (url.txt) line by line and call ./Chktitle.sh with each line read as a parameter. The {} is the placeholder for the line read. You can also write the call with another placeholder (for example foo), but {} is the placeholder that is usually used with xargs.
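A sketch of such a call (reading url.txt from standard input and redirecting the combined output to results.txt):

xargs -I {} ./Chktitle.sh {} < url.txt > results.txt

xargs runs the commands one after another by default, so each URL is only processed after the previous call has finished.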
You can create your script with 2 parameters as follows: the first parameter ($1) is the URL file and the second ($2) is your Chktitle.sh script.

HOW THE SCRIPT WORKS ON THE COMMAND LINE

The code is broken down into steps below, each with an explanation.
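As an example invocation (run_checks.sh is just an illustrative name for the combined script shown at the end):

./run_checks.sh url.txt ./Chktitle.sh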
STEP 1
Remove any existing file named result.txt so that a new blank file can be created.
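For example, something like:

rm -f result.txt      # remove any previous result.txt
touch result.txt      # create a new blank result.txt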
STEP 2
Set up a while loop to read all lines in the URL file (passed as "$1"). Each line read is saved as "my_url". The loop takes your script (Chktitle.sh, passed as "$2") followed by the line read ("my_url"), executes it on the command line, and redirects the output to result.txt. This is done for each line, as sketched below.
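A sketch of that loop:

while read -r my_url; do
    "$2" "$my_url" >> result.txt   # run the checker script on this line, append its output
done < "$1"

The loop only starts the next iteration after the current command has finished, so the URLs are processed strictly one at a time.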
NOW LET US SUMMARIZE ALL THE CODE INTO ONE SCRIPT AS FOLLOWS
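A sketch of the combined script (saved here as run_checks.sh, an illustrative name, and made executable):

#!/bin/bash
# Usage (example): ./run_checks.sh url.txt ./Chktitle.sh
#   $1 - file containing the URLs/IPs, one per line
#   $2 - the title-checking script to run on each line

rm -f result.txt          # remove any previous result.txt
touch result.txt          # create a new blank result.txt

while read -r my_url; do
    "$2" "$my_url" >> result.txt   # run the checker and append its output
done < "$1"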
Maybe something like this could help (provided that I understood correctly):
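A sketch of the loop (the path to your script is illustrative; the input and output files are described just below):

while read -r line; do
    /path/to/Chktitle.sh "$line" >> results.txt
done < /path/to/input.txt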
For each line in /path/to/input.txt, execute your script and append the output (>>) to results.txt.
Of course you could always add additional statements in your while loop:
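For example (the echo is just a placeholder for whatever else you want to do per line):

while read -r line; do
    echo "Checking $line ..."                   # any additional per-line statement
    /path/to/Chktitle.sh "$line" >> results.txt
    # more statements can go here
done < /path/to/input.txt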