Packages depending on eyespect


atc Manage fleet spawns

couch-profile Store profile information in couchdb

couchdb-update-views Keep couchdb views up to date

dispatch-request-spawn-all Continually request a Dispatch server to spawn all commands until successful

docparse DocParse is an integrated solution for processing data from supplier bills. Data is scraped from supplier websites and matched against user-supplied uploads of scanned paid bills. From there the system outputs the data to a specially formatted website which is delivered to the client

docparse-api api server for the docparse project

docparse-parse-scraped-worker Parse data fetched by docparse scrapers

docparse-router router for the docparse project

docparse-scraped-parser Server for parsing documents scraped from supplier websites

docparse-scraper handle the initial setup needed for processing add-scraper requests

docparse-scraper-add-node allow node based scrapers to add new data via the docparse api

docparse-scraper-bills-fetch fetch bills from the docparse scraper api server

docparse-scraper-nge scrape Hess Energy bills for use in the docparse system

docparse-scraper-nst Node Zombie based scraper to scrape bills from the NStar (supplier code "NST") website

docparse-scraper-runner manage running docparse scraper jobs on a schedule

docparse-scraper-server Serve scraper api requests

docparse-secure-proxy Handle ssl connections and forward them to a router server over standard http

docparse-supplier-hes Hess Energy (HES) parsing for the docparse system

docparse-supplier-nga Handle scraped data from the Ngrid Gas website as well as the OCR text from scanned Ngrid Gas bills. Ngrid Gas uses the *supplier_code* **NGA** in the Docparse system

docparse-supplier-nge process ngrid electric utility bill data for use in the docparse system

docparse-supplier-nst NStar (NST) supplier specific code for use in the DocParse system

docparse-upload-fetch Fetch the latest version of an upload document from the docparse database

docparse-upload-process process upload api requests for the docparse server

fleet-atc Manage fleet spawns

fleet-stopall Helpers for managing fleet drones

fleet-stopregex Fleet stop with fields and regex parameters

hello-world-server Super simple node.js server which listens on process.env["PORT"] and responds to all requests with "hello world\n"

hess-at-account-homepage Test if the current page loaded in cheerio is the Hess Energy account homepage

hess-bills-table-do-bills-exist Test if bill rows appear on the current page

hess-scrape-for-profile Scrape all accounts for all types of commodities for a given profile and customer login on the Hess Energy website

joyent-ip Get the internal ip address of a Linux machine hosted by Joyent

little-popo will be retired when popo reaches v0.1

nst-process-bills Process html bills downloaded from the NStar website

nstar-login Login to the NStar website

parse-server Server boilerplate for supplier specific parsing both scraped and raw

parse-test-server Start a test server and create the test users

pdf-extract Node PDF is a set of tools that takes in PDF files and converts them to usable formats for data processing. The library supports both extracting text from searchable PDF files and performing OCR on PDFs which are just scanned images of text

pdfer-api api server for the pdfer service

pdfer-job-collector Collect results of pdfer worker jobs

pdfer-job-pusher Push new pdf extract jobs out to workers

pdfer-jobs Job processing for the pdfer service

pdfer-router bounce api and web requests to the appropriate server for the pdfer service

pdfer-seaport-server Create a seaport server so that all the various pdfer services can register in a central registry

quick-proxy Proxy all requests on port 80 to port 8080

renew Execute an asynchronous function repeatedly until it completes or the maximum number of attempts is reached
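
As a sketch of the behavior described (the package's actual API is not shown on this page, so the `retry` signature below is an assumption), a callback-style retry helper might look like:

```javascript
// Call fn repeatedly until it succeeds or maxAttempts is reached.
// fn takes a Node-style callback (err, result); the final outcome is
// passed to callback.
function retry(fn, maxAttempts, callback) {
  var attempt = 0;
  function run() {
    attempt += 1;
    fn(function (err, result) {
      if (!err) return callback(null, result);          // succeeded
      if (attempt >= maxAttempts) return callback(err); // out of attempts
      run();                                            // try again
    });
  }
  run();
}

// Usage: retry(flakyOperation, 5, function (err, result) { ... });
```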

riak-streaming Basic riak client that is fully streaming

riaks Riak client with a streaming interface

scraped-parse Parse data fetched by docparse scrapers

sea-free Free a service from a seaport server by id on the command line

secure-proxy Handle ssl connections and forward them to a router server over standard http

service-router router to bounce requests to either api or web servers

userific-test Test suite for any userific backend

