Guest glossywhite Posted September 30, 2011 (edited)

Just knocked up this bash script to download ALL of the high-resolution images from a Flickr photo set. It seems to work on my own Flickr, but then again I am logged in there; it didn't work on my brother's sets. Anyhow, this is almost two hours' worth of coding work, and you can do what you wish with it, as long as you leave the credits intact.

Usage: simply chmod +x the script, then run it with the set URL as its only argument. You get the URL by going to an image set page on Flickr and copying it from the address bar. The script will then download ALL of the high-resolution images from that set into a folder (named after the set ID) alongside the script. It needs work, but it roughly does what it is supposed to. If I fix it properly, I'll edit this post.

THIS SCRIPT DOES NOT WORK WITH PRIVATE SETS!

Syntax: ./fetchflickr.sh <setUrlHere>

#!/bin/bash
### FLICKR SET LARGE IMAGE GRABBER ###
### 30/09/2011 Matt Foot ###

# Name the download folder after the set ID at the end of the URL
foldername=$(echo "$1" | sed -e 's/.*sets\///')
mkdir "$foldername"
cd "$foldername" || exit 1

# Inner curl: fetch the set page and extract each photo's own page URL.
# Outer curl: fetch every photo page and keep the _o.jpg (original size) URLs.
curl -L $(curl -L "$1" | grep .jpg | grep /photos/ | grep href | sed -e 's/.*href="//' -e 's/title=.*<\/span>//' -e 's/<\/div>//' -e 's/\/photos.*<\/a>//' -e 's/\/"/"/' -e 's/"//' -e 's/\/photos/http:\/\/www.flickr.com/') | grep url: | grep _o.jpg | sed -e 's/url: *//' -e "s/'http/http/" -e "s/',//" > largeurls.txt

fetchlargeimgs=$(cat largeurls.txt)
echo "NOW DOWNLOADING ALL LARGE IMAGES FROM THAT SET"
wget $fetchlargeimgs   # deliberately unquoted so each URL becomes its own argument
echo "ALL LARGE IMAGES DOWNLOADED"
rm largeurls.txt
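For anyone unsure of the syntax, an invocation looks like this (the username and set ID in the URL below are made up for illustration; use a real set URL copied from your address bar):

chmod +x fetchflickr.sh
./fetchflickr.sh http://www.flickr.com/photos/someuser/sets/72157600000000001/

That run would create a folder named after the set ID next to the script and fill it with the original-size JPEGs from the set.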
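If that one-liner is hard to read, here is the same scrape split into labelled stages. This is only a readability sketch of the pipeline above, assuming Flickr's page markup as it stands today (September 2011); the logic is unchanged, so if Flickr changes its HTML, both versions break the same way.

#!/bin/bash
# Readability sketch of the one-liner above: same logic, split into stages.
# Assumes Flickr's September 2011 markup; the greps/seds break if that changes.

seturl="$1"

# Stage 1: fetch the set page and pull out each photo's own page URL
photopages=$(curl -L "$seturl" \
  | grep .jpg | grep /photos/ | grep href \
  | sed -e 's/.*href="//' \
        -e 's/title=.*<\/span>//' \
        -e 's/<\/div>//' \
        -e 's/\/photos.*<\/a>//' \
        -e 's/\/"/"/' \
        -e 's/"//' \
        -e 's/\/photos/http:\/\/www.flickr.com/')

# Stage 2: fetch every photo page, keep only the _o.jpg (original size) URLs
curl -L $photopages \
  | grep url: | grep _o.jpg \
  | sed -e 's/url: *//' -e "s/'http/http/" -e "s/',//" \
  > largeurls.txt

# Stage 3: download them all (unquoted so each URL is a separate argument)
wget $(cat largeurls.txt)

rm largeurls.txt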