Bill Jonas on Thu, 20 Apr 2000 19:28:55 -0400 (EDT)
On Thu, 20 Apr 2000, neodem wrote:

>I'd like to be able to quickly test them to see if they still exist (ie.
>they don't return a 404 error). Is there a quick way I can do this without
>writing a dedicated application?

Let's see... you have two options thus far, with Python and Perl.  You
shell experts correct me if I'm wrong, but how about something like:

#!/bin/sh
# (or bash, or whatever)
# If the fetch fails, wget exits non-zero, so flag the URL as bad.
if ! wget -q "$1"; then
    echo "BAD: $1"
fi

Proper syntax, of course, depends on the return codes for wget (which I
can't check right now because I'm at work on a SunOS box which does not
have wget).  I'm assuming that if wget is successful it returns zero and
a 404 returns a non-zero value.

Call the script, say, urlcheck or something.  Would you be able to just
do "urlcheck < $URL_LIST" (whatever you call your file full of URLs)?
You could then redirect the output to a file.
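Come to think of it, since the script above reads its URL from $1 rather
than from standard input, plain redirection probably won't cut it; you'd
want a little loop around it.  An untested sketch (urls.txt and
bad_urls.txt are just stand-in names for your list and the output file):

# Run urlcheck on every URL in the list, one per line,
# and collect the "BAD:" lines in bad_urls.txt.
while read -r url; do
    ./urlcheck "$url"
done < urls.txt > bad_urls.txt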
If you take the "BAD: " prefix out, couldn't you then "cat $URL_LIST
$BAD_URLS" (or whatever your output file is) and pipe the output to uniq
with the proper options to get just a list of the *good* URLs?
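If that's right, the incantation would be something like this (again
untested, same stand-in file names as above; it assumes the bad list
holds bare URLs so the lines match the original list exactly):

# Bad URLs appear in both files, good ones only in urls.txt.
# uniq wants sorted input, and -u keeps lines that occur exactly once.
cat urls.txt bad_urls.txt | sort | uniq -u > good_urls.txt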
I need some shell expert to tell me why this wouldn't work.  I'm looking
at it, and logically I think it should, but I've got this nagging feeling
that it won't.  I need to study up on shell... O'Reilly has a book, don't
they?

Bill
-- 
Linux: Because CTRL-ALT-DEL is for rebooting, not logging on.
Harry Browne for President: http://www.harrybrowne2000.org/
Visit me at http://www.netaxs.com/~bj/
______________________________________________________________________
Philadelphia Linux Users Group - http://plug.nothinbut.net
Announcements - http://lists.nothinbut.net/mail/listinfo/plug-announce
General Discussion - http://lists.nothinbut.net/mail/listinfo/plug