I’m involved in building a platform for sharing data extraction and processing scripts for energy time-series data. We would like to require our developers to use IPython Notebooks; mostly they will use Pandas / NumPy to clean up the time-series data after downloading it from the sources.
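For context, a typical clean-up step might look like this (a minimal sketch with made-up data; the column names, values, and resampling frequency are all assumptions, not from a real source):

```python
import pandas as pd

# Hypothetical raw time-series data as it might arrive from a source:
# timestamps stored as strings, load values in MW.
raw = pd.DataFrame({
    "timestamp": ["2015-01-01 00:00", "2015-01-01 00:30", "2015-01-01 01:00"],
    "load_mw": [41000.0, 40500.0, 39800.0],
})

# Parse the timestamps and use them as a DatetimeIndex.
raw["timestamp"] = pd.to_datetime(raw["timestamp"])
ts = raw.set_index("timestamp")["load_mw"]

# Resample the half-hourly values to hourly means.
hourly = ts.resample("60min").mean()
print(hourly)
```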
Now, I’m wondering whether it would be possible to put these IPython Notebooks to a second use by running them on Morph.io, executing them regularly as scraper scripts.
This is an example Notebook (not necessarily all best practice, but it shows the idea):
From what I’ve heard, there is a problem running NumPy/Pandas in scrapers, right? Do you think this is a problem that could eventually be solved, or is it a hard constraint that will never work?
Also, I heard that there are scripts which enable executing IPython Notebooks headlessly. Would that be an option for integrating them into a scraper, perhaps with some glue code around it?
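To make this concrete: headless execution does exist, e.g. via `jupyter nbconvert --to notebook --execute` (or, historically, tools like `runipy`), which re-runs every cell without a browser. The glue code could be as thin as shelling out to that command; here is a minimal sketch (the notebook filename and timeout are placeholders I made up):

```python
import sys

def build_execute_cmd(notebook_path, timeout_seconds=600):
    """Build the command line that runs a notebook headlessly via nbconvert.

    `--to notebook --execute` re-runs every cell top to bottom; `--inplace`
    writes the executed result back to the same file so outputs are kept.
    """
    return [
        sys.executable, "-m", "jupyter", "nbconvert",
        "--to", "notebook", "--execute", "--inplace",
        "--ExecutePreprocessor.timeout={}".format(timeout_seconds),
        notebook_path,
    ]

cmd = build_execute_cmd("scraper.ipynb")  # placeholder filename
print(" ".join(cmd))
# A scraper wrapper would then run it with subprocess.check_call(cmd).
```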
Thanks in advance for any helpful reply or pointer!