I’m trying to figure out what I’m doing wrong with my Python-based scraper at https://morph.io/mithro/numato-opsis-crowdfunding-campaign
The scraper was supposed to record stats about the crowdfunding campaign, so on every run I was expecting it to just add a new record to the sqlite database. However, it seems to be adding and removing a record each time:
Auto ran revision 8fdc669b and completed successfully a day ago.
run time 36 s
1 record added, 1 record removed in the database
1 page scraped
The line which uses the scraperwiki to write the data is at https://github.com/mithro/numato-opsis-crowdfunding-campaign/blob/master/scraper.py#L114
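For context on the "1 record added, 1 record removed" stat: scraperwiki's `sqlite.save()` appends a new row when the unique key differs from existing rows, and replaces the matching row when it doesn't. Here's a minimal sketch of that upsert behavior using only stdlib sqlite3 (table and field names are illustrative, not the actual scraper's schema):

```python
import sqlite3

def save(conn, record):
    # Mimic scraperwiki.sqlite.save(unique_keys=['id'], data=record):
    # with 'id' as the unique key, a matching row is replaced, not appended.
    conn.execute(
        "INSERT OR REPLACE INTO data (id, url, pledged, ts) VALUES (?, ?, ?, ?)",
        (record["id"], record["url"], record["pledged"], record["ts"]),
    )

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE data (id INTEGER PRIMARY KEY, url TEXT, pledged INTEGER, ts REAL)"
)

# Two "runs" recording the same campaign id at different times:
save(conn, {"id": 8725, "url": "https://www.crowdsupply.com/numato-lab/opsis",
            "pledged": 34720, "ts": 1445994145.9})
save(conn, {"id": 8725, "url": "https://www.crowdsupply.com/numato-lab/opsis",
            "pledged": 34720, "ts": 1445998316.9})

count = conn.execute("SELECT COUNT(*) FROM data").fetchone()[0]
print(count)  # → 1: the second run replaced the first row
```

If the unique key covers only fields that don't change between runs, every run after the first shows up as "1 record added, 1 record removed"; including a timestamp in the key (or omitting unique keys) would make each run append instead.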
Tim ‘mithro’ Ansell
It looks fine to me.
Is this a problem with morph.io or the scraperwiki library? When you run it locally, does it create new records?
When running multiple times locally, I get multiple entries in my sqlite database.
sqlite> select * from data;
8725|https://www.crowdsupply.com/numato-lab/opsis|34720|397|1445994145.86833|159|Oct 26 funded on
8725|https://www.crowdsupply.com/numato-lab/opsis|34720|397|1445998316.89963|159|Oct 26 funded on
When running on morph.io, it appears that a new sqlite database is getting created every time?
This is really weird then. It should behave exactly the same on morph.io as it does on your machine.
You’ll love this: I forked it and my forked copy works fine. WTH?!