Crawl data from multiple sources for openfreecabs.org
Currently we crawl data from:
```
mkdir -p $GOPATH/src/github.com/maddevsio/
cd $GOPATH/src/github.com/maddevsio
git clone https://github.com/maddevsio/openfreecab-crawler
cd openfreecab-crawler
make depends
make
```
Or the Go way:
```
mkdir -p $GOPATH/src/github.com/maddevsio/
cd $GOPATH/src/github.com/maddevsio
git clone https://github.com/maddevsio/openfreecab-crawler
cd openfreecab-crawler
go get -v
go build -v
go install
```
```
GLOBAL OPTIONS:
   --storage_root_url value  OpenfreeCabStorage root url (default: "http://localhost:8090") [$OPEN_FREE_CAB_STORAGE_URL]
   --loglevel value          set log level (default: "debug") [$LOG_LEVEL]
   --test_mode               set test mode [$TEST_MODE]
   --update_interval value   set update interval (default: 15) [$UPDATE_INTERVAL]
   --help, -h                show help
   --version, -v             print the version
```
```
./openfreecab-crawler
```
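As a sketch of how the options above might be used together (the flag and variable names come from the help output; the storage URL value here is just the documented default), the crawler can be configured either via flags or via the matching environment variables:

```shell
# Point the crawler at a storage instance and slow the update interval
./openfreecab-crawler --storage_root_url http://localhost:8090 --update_interval 30

# The same configuration via environment variables
OPEN_FREE_CAB_STORAGE_URL=http://localhost:8090 UPDATE_INTERVAL=30 ./openfreecab-crawler
```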
Feel free to create issues and send pull requests.
- Fork the repo
- Make your changes
- Commit
- Create a pull request