Sucks images from imgur for a given subreddit and gives you a usable JSON document containing image vitals.
`npm install imgur-sucker`
- Version: 0.1.3 (last updated 5 months ago)
- Keywords: imgur, reddit, images, cats
- Dependencies: commander, progress, lodash, moment
A command line utility for vacuuming a subreddit's images right off of imgur
You can run this command from any location, but know that it will create a `sucked/` folder in your current working directory, with subdirectories named for the subreddits you pull. If logging is turned on, it will also create a JSON file with image vitals.
You will need an Imgur application client ID to run this app. To get one, register at https://api.imgur.com/oauth2/addclient
Usage: `imgursucker [options]`

Options:

- `-h, --help`: output usage information
- `-V, --version`: output the version number
- `-c, --client-id <imgur App Client Id>`: imgur client id
- `-p, --pages <pages>`: number of pages to suck
- `-s, --subreddit <subreddit>`: subreddit to suck [cats]
- `-l, --logging <boolean>`: log downloaded image data to JSON file [true]
- `-d, --download <boolean>`: download the files found [true]
- `-r, --rate-limit-check`: check your current imgur credits to prevent rate limiting
Pretty much does what it says on the box. Tests have shown that with 10 pages' worth of sucking, you can acquire about 1,750 unique images per subreddit.