@@ -11,7 +11,7 @@ This crawler automates the following step:
 * download source code and book cover
 * upload files to Google Drive, OneDrive or via scp
 * store data on Firebase
-* notify via email, IFTTT or Join (on success and errors)
+* notify via Gmail, IFTTT, Join or Pushover (on success and errors)
 * schedule daily job on Heroku or with Docker
 
 ### Default command
@@ -41,7 +41,7 @@ python script/spider.py --config config/prod.cfg --all --extras --upload googled
 python script/spider.py -c config/prod.cfg -t epub --upload onedrive
 python script/spider.py --config config/prod.cfg --all --extras --upload onedrive
 
-# download and notify: gmail|ifttt|join
+# download and notify: gmail|ifttt|join|pushover
 python script/spider.py -c config/prod.cfg --notify gmail
 
 # only claim book (no downloads):
@@ -303,15 +303,15 @@ More info about Heroku [Scheduler](https://devcenter.heroku.com/articles/schedul
 
 Build your image
 ```
-docker build -t niqdev/packtpub-crawler:2.3.0 .
+docker build -t niqdev/packtpub-crawler:2.4.0 .
 ```
 
 Run manually
 ```
 docker run \
   --rm \
   --name my-packtpub-crawler \
-  niqdev/packtpub-crawler:2.3.0 \
+  niqdev/packtpub-crawler:2.4.0 \
   python script/spider.py --config config/prod.cfg
 ```
 
@@ -320,7 +320,7 @@ Run scheduled crawler in background
 docker run \
   --detach \
   --name my-packtpub-crawler \
-  niqdev/packtpub-crawler:2.3.0
+  niqdev/packtpub-crawler:2.4.0
 
 # useful commands
 docker exec -i -t my-packtpub-crawler bash