17.1.MDDN242 PS3
license: mit
files: downloads, resized, colorgrid, smartgrid, customgrid

PS3 MDDN 242 2017

Final Smart Grid - Festival Images

The main motivation behind how I chose the images for my montages was the desire to create visually interesting and beautiful results. I wanted montages that I could use as desktop backgrounds on my home computer.

The images used in the final montage were scraped from flickr using the tag 'festival', retrieving all 80 pages that were available. This left me with over 3000 images before any manual filtering, so the curation process was quite daunting and I tried to work through it quickly.

My main goal in curation was to remove images that didn't fit the festival theme or weren't visually interesting, so that the final montage would better represent the theme and stay in line with my overall goal of creating beautiful, interesting montages. I also tried to remove any images that were duplicates or too similar to one another, as sketched below.
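
Exact duplicates can at least be caught automatically before the manual pass. The following is only a minimal sketch of that idea (an assumption on my part, not one of the course scripts): hash every download and delete byte-for-byte copies, while near-duplicates still have to be spotted by eye.

#!/bin/bash
# sketch: delete exact duplicate downloads by comparing md5 hashes
# assumes the filenames contain no spaces
cd downloads
md5sum * | sort | awk 'seen[$1]++ { print $2 }' | xargs -r rm -v --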

I also attempted to write a script that would remove all of the unwanted images automatically as part of the run script. This worked well on a site I scraped earlier, because its results rarely changed, but it didn't work for flickr, where the results are constantly changing. A rough sketch of the idea is included below.
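
The idea was roughly the following (a sketch only; the list file unwanted.txt and its layout are my assumptions): keep a hand-maintained list of the filenames rejected during curation and delete them from downloads after every fresh scrape.

#!/bin/bash
# sketch: remove previously rejected images after a fresh scrape
# unwanted.txt is a hypothetical file holding one rejected filename per line
while read -r FILE; do
rm -f "downloads/$FILE"
done < unwanted.txt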

I thought the unsorted images were excellent: visually interesting and mostly high quality. They were definitely evocative of the festival theme, and I think even the unfiltered grids would work pretty well.

The colour grid worked well and looks fantastic; a lot of the images are very colourful, which obviously helped the layout a great deal. The smart grid also looks great, though not quite as good when zoomed out. When you zoom in it becomes much more interesting, as you start to see how the images have been grouped together.

In my resize script I chose dimensions of 240x216, and in the customgrid script I set the tile argument to 48x40. These dimensions fit my initial idea perfectly: the full montage comes out at 48x240 = 11520 pixels wide by 40x216 = 8640 pixels high, so it can be cut cleanly into 12 (3x4) 4K background images of 3840x2160, each consisting of a 16x10 grid of thumbnails. A sketch of how that slicing could be done follows.
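
Slicing the montage into wallpapers is straightforward with ImageMagick; this is only a sketch of how I would do it (montage.jpg is the output file linked from the viewer page below, everything else here is my assumption). A bare -crop geometry with no offset tiles the image into equal-sized pieces.

#!/bin/bash
# sketch: cut the 11520x8640 montage into 12 separate 4K (3840x2160) wallpapers
mkdir -p wallpapers
convert montage.jpg -crop 3840x2160 +repage wallpapers/wallpaper_%02d.jpg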

In the final montage the most obvious zones are in the top right-hand and bottom right-hand corners. Both zones consist of images with the sky in the background: the top right holds images taken during the day, and the bottom right images taken at night.

Because of the large number of images used, other zones aren't evident until you zoom in and start looking around. One zone I found quite interesting is near the bottom left-hand corner, where the smart grid has done an excellent job of grouping the festival posters and graffiti in the same area. Another good zone sits near the top right-hand corner, next to the sky images; there you can see all the close-ups of musicians playing their instruments.

#!/bin/bash
# this script collects images from flickr using a tag; the tag should be passed to this script as the only argument
# show commands and stop if there is an error
set -ex
# make the directory if it is not there
mkdir -p downloads
# clean the directory if there are old results
rm -f downloads/*
# get 80 pages
for PAGE in {1..80}
do
# build the url
URL='https://www.flickr.com/photos/tags/'$1'/page'$PAGE
# fetch the images
wget --adjust-extension \
--random-wait \
--limit-rate=100k \
--span-hosts \
--convert-links \
--backup-converted \
--no-directories \
--timestamping \
--page-requisites \
--directory-prefix=downloads \
--execute robots=off \
--accept='*jpg*,*png*' \
"$URL"
# other unused arguments
# --recursive \
# --level 1 \
# --domains en.wikipedia.org \
done
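
The tag is the script's only argument, so a run for this project would look something like this (assuming the file above is saved as downloads.sh, which is my guess based on the gist's file list):

bash downloads.sh festival
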
#!/bin/bash
# show commands and stop if there is an error
set -ex
# make the directory if it is not there
mkdir -p downloads
# clean the directory if there are old results
rm -f downloads/*
PAGE=0
# get 58 pages (the page offset in the URL goes up by 50 each iteration)
for i in {0..57}
do
PAGE=$((i * 50))
# build the url
URL='http://www.publicdomainpictures.net/browse-category.php?page='$PAGE'&c=lightnbspeffects&s=9'
# fetch the images
wget --adjust-extension \
--random-wait \
--limit-rate=100k \
--span-hosts \
--convert-links \
--backup-converted \
--no-directories \
--timestamping \
--page-requisites \
--directory-prefix=downloads \
--execute robots=off \
--accept='*jpg*,*png*' \
"$URL"
# other unused arguments
# --recursive \
# --level 1 \
# --domains en.wikipedia.org \
done
#!/bin/bash
# show commands and stop if there is an error
set -ex
# make the directory if it is not there
mkdir -p downloads
# clean the directory if there are old results
rm -f downloads/*
#SEARCH_STRING="robot"
# Create an array of keywords to use in the scrape.
# This allows me to scrape images for a set of keywords instead of just one.
# I needed to implement this because the search functionality of the site I am scraping
# was not complex enough to allow for "OR" based searches.
declare -a arr=("guitar" "piano" "keyboard" "turntable" "violin" "cello" "saxophone" "trumpet" "bass" "horn" "drum" "tuba" "harp")
for i in "${arr[@]}"
do
# get 32 pages for each keyword
for PAGE in {1..32}
do
# build the url
URL='http://www.sluniverse.com/snapzilla/Home/Search?term='$i'&page='$PAGE'#pictures'
# fetch the images
wget --adjust-extension \
--random-wait \
--limit-rate=100k \
--span-hosts \
--convert-links \
--backup-converted \
--no-directories \
--timestamping \
--page-requisites \
--directory-prefix=downloads \
--execute robots=off \
--accept='*jpg*,*png*' \
"$URL"
# other unused arguments
# --recursive \
# --level 1 \
# --domains en.wikipedia.org \
done
done
# URL='http://www.trademe.co.nz/Browse/SearchResults.aspx?&cid=0&searchType=&searchString='$SEARCH_STRING'&x=0&y=0&type=Search&sort_order=&redirectFromAll=False&rptpath=all&page='$PAGE'&user_region=100&user_district=0&generalSearch_keypresses=8&generalSearch_suggested=0&generalSearch_suggestedCategory='
#!/bin/bash
# show commands and stop if there is an error
set -ex
# make the directory if it is not there
mkdir -p downloads
# clean the directory if there are old results
rm -f downloads/*
# get 3 pages
for PAGE in {1..3}
do
# this is an example with a group
URL='https://www.flickr.com/groups/hdr/pool/page'$PAGE
# this is an example with tags
# URL='https://www.flickr.com/photos/tags/'$SEARCH_STRING'/page'$PAGE
echo "about to fetch URL: " $URL
sleep 3
# fetch the images
wget --adjust-extension \
--random-wait \
--limit-rate=100k \
--span-hosts \
--convert-links \
--backup-converted \
--no-directories \
--timestamping \
--page-requisites \
--directory-prefix=downloads \
--execute robots=off \
--accept=.jpg \
"$URL"
# other unused arguments
# --recursive \
# --level 1 \
# --domains en.wikipedia.org \
done
#!/bin/bash
if [ ! -d "/usr/local/anaconda/extras" ]; then
# Control will enter here if DIRECTORY doesn't exist.
echo "smartgrid program not found"
echo "please first install using directions on blackboard"
exit 1
fi
# show commands and stop if there is an error
set -ex
HOME="/usr/local/anaconda/extras/home"
export PATH="/usr/local/anaconda/bin:$PATH"
python /usr/local/anaconda/extras/smartgrid.py \
--tile 48x40 \
--input-glob 'resized/*' \
--left-image 'resized/15472998524_26f69c0173_n.jpg' \
--right-image 'resized/15139800171_7191c9e8e8_n.jpg' \
--output-path customgrid
<head>
<link rel="stylesheet" href="http://cdn.leafletjs.com/leaflet-0.7.3/leaflet.css">
<style>
body {padding: 0; margin: 0;}
#image-map {
width: 960px;
height: 500px;
border: 1px solid #ccc;
margin-bottom: 10px;
}
</style>
</head>
<body style="background-color:white">
<div id="image-map"></div>
<script src="http://cdn.leafletjs.com/leaflet-0.7.3/leaflet.js"></script>
<script language="javascript" type="text/javascript" src="zoom_image.js"></script>
<br>
<a href="montage.jpg">full size montage</a><br>
<a href="left_right.jpg">left right images</a><br>
<a href="tsne.png">tsne</a><br>
<a href="tsne_spun.png">tsne spun</a><br>
<a href="movement.png">movement</a><br>
</body>
#!/bin/bash
if [ -d "/usr/local/anaconda/extras" ]; then
# Control will enter here if the directory already exists.
echo "smartgrid program already installed"
exit 1
fi
# show commands and stop if there is an error
set -ex
# make the directory if it is not there
mkdir -p /tmp/smartgrid
# clean the directory if there are old results
rm -f /tmp/smartgrid/*
cd /tmp/smartgrid
wget http://deeptom.staff.vuw.ac.nz:9000/smartgrid.tgz
cd /usr/local
tar xvfz /tmp/smartgrid/smartgrid.tgz
echo "DONE: smartgrid program installed"