Michael Bevilacqua on 15 Apr 2008 15:51:40 -0700



Re: [PLUG] Capture web page as image?


On Tue, Apr 15, 2008 at 3:35 PM, Eugene Smiley <eug+plug@esmiley.net> wrote:
> I'm looking for a Linux scripting solution for this without much luck. I use
>  Firefox and like the idea, but not the implementation, of the Speed Dial addon
>  (https://addons.mozilla.org/en-US/firefox/addon/4810).
>
>  What I want is a Cron-able script that will take a text file of URLs and create a
>  JPG of the site(s). I'd then crop and resize it down to 100x100 and put it into
>  a folder on a web server where I can have the layout and styling to my liking.
>
>  Anyone know of anything?
>
>
>  Edit to add: The server in question has no X installed, so I think KHTML and FF
>  from the command line are out of the question since they're not installed.
>
>
>

Without X?  The dependencies on a binary distro might be overwhelming.
Is this just simple HTML? If so, you might want to go through PDF as
an intermediate step rather than rendering straight to JPEG. Do
something like:

elinks or curl -> htmldoc -> ImageMagick


A very simple (untested) Bash script might look like:

#!/bin/bash

# Usage: ./webshot.sh example.com   (webshot.sh is just a placeholder name)
URL=$1

# Grab the raw HTML; -source keeps the markup, where -dump would
# render it to plain text and lose the tags htmldoc needs.
elinks -source "http://$URL" > "$URL.html"

# Render the HTML to a single continuous PDF page.
htmldoc --webpage --size letter --outfile "$URL.pdf" "$URL.html"

# Convert the PDF to JPEG, then make a 100x100 thumbnail.
# (ImageMagick needs its Ghostscript delegate for the PDF step.)
convert "$URL.pdf" "$URL.jpg"
convert "$URL.jpg" -resize 100x100 "$URL-thumb.jpg"
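
Since you mentioned driving it from cron over a text file of URLs, a
wrapper along these lines (also untested; it assumes the script above
is saved as webshot.sh and that urls.txt holds one URL per line) would
cover that part:

#!/bin/bash
# Hypothetical wrapper: run webshot.sh once per line of urls.txt.
while read -r url; do
    [ -z "$url" ] && continue    # skip blank lines
    /path/to/webshot.sh "$url"
done < /path/to/urls.txt

Then a crontab entry something like:

# regenerate the thumbnails once an hour
0 * * * * /path/to/webshot-all.sh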


-- 
Michael D. Bevilacqua
michael@bevilacqua.us
___________________________________________________________________________
Philadelphia Linux Users Group         --        http://www.phillylinux.org
Announcements - http://lists.phillylinux.org/mailman/listinfo/plug-announce
General Discussion  --   http://lists.phillylinux.org/mailman/listinfo/plug