Python, Web Scraping and Content Management: Scrapy and Django Sammy Fung http://sammy.hk OpenSource.HK Workshop 2014.07.05
Sammy Fung ● Perl → PHP → Python ● Linux → Open Source → Open Data ● Freelance → Startup ● http://sammy.hk ● email@example.com
Can a computer program read this?
Is this UI easy to understand?
Five Star Open Data
1. Make your stuff available on the Web (whatever format) under an open license.
2. Make it available as structured data (e.g., Excel instead of an image scan of a table).
3. Use non-proprietary formats (e.g., CSV instead of Excel).
4. Use URIs to denote things, so that people can point at your stuff.
5. Link your data to other data to provide context.
5stardata.info by Tim Berners-Lee, the inventor of the Web.
Open Data
● Data.One
  – Led by the OGCIO of the Hong Kong Government.
  – Uses the term "public sector information" (PSI) instead of "open data".
  – Much of the data is not available in a machine-readable format with a useful data structure.
  – A lot of data still requires web scraping with customized data extraction to collect useful machine-readable data.
Web Scraping with Scrapy
Web Scraping: a software technique for extracting information from websites. (Wikipedia)
Scrapy
● Python.
● Open source web scraping framework.
● Scrape websites and extract structured data.
● From data mining to monitoring and automated testing.
Scrapy
● Define your own data structures.
● Write spiders to extract data.
● Built-in XPath selectors for extracting data.
● Built-in JSON, CSV, XML output.
● Interactive shell console, telnet console, logging, and more.
Creating a Scrapy Project
● Define your data structure.
● Write your first spider.
  – Test with the scrapy shell console.
● Output / store collected data.
  – Output with built-in supported formats.
  – Store to a database / object store.
Define your data structure

items.py:

class Hk0WeatherItem(Item):
    reporttime = Field()
    station = Field()
    temperture = Field()
    humidity = Field()
Write your first spider
● Generate a spider skeleton and list the spiders in your project.
  – $ scrapy genspider -t basic <YOUR SPIDER NAME> <DOMAIN>
  – $ scrapy list
● Import the class of your own data structure.
● Import any scrapy classes you require.
  – e.g. Spider, XPath selector.
● Extend the parse() function of the Spider class.
● Test with the scrapy shell console.
  – $ scrapy shell <URL>
Output / Store collected data
● Use the built-in JSON, CSV, XML output at the command line.
  – $ scrapy crawl <Spider Name> -t json -o <Output File>
● pipelines.py
  – Import the class of your own data structure.
  – Extend the process_item() function.
  – Add the pipeline to ITEM_PIPELINES in settings.
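A pipeline's process_item() is plain Python, so a minimal sketch needs no Scrapy imports at all. The class name and the cleanup it performs (converting the temperature string to a float) are illustrative assumptions; only the field names come from the item definition above.

```python
class Hk0WeatherPipeline(object):
    """Illustrative pipeline: normalize a scraped reading before storage."""

    def process_item(self, item, spider):
        # Scrapy items behave like dicts, so plain dict access works.
        # Scraped values arrive as strings; convert temperature to float.
        if item.get("temperture") is not None:
            item["temperture"] = float(item["temperture"])
        return item
```

The pipeline is then activated in settings.py, e.g. `ITEM_PIPELINES = {'hk0weather.pipelines.Hk0WeatherPipeline': 300}` (the module path is hypothetical).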
Create a django app
● Define your own data model.
● Define and activate your admin UI.
● Furthermore:
  – Define your data views.
  – Add URL routers to connect with the data views.
Define a django data model
● Define it in models.py.
● Import the django data model base class.
● Define your own data model class.
● Create the database table(s).
  – $ python manage.py syncdb
Define a django data model

class WeatherData(models.Model):
    reporttime = models.DateTimeField()
    station = models.CharField(max_length=3)
    temperture = models.FloatField(null=True, blank=True)
    humidity = models.IntegerField(null=True, blank=True)
Define your admin UI
● admin.py
  – Import the admin class.
  – Import your own data model class.
  – Extend the admin class for your data model.
  – Register the admin class
    ● with the admin.site.register() function.
Define your admin UI

class WeatherDataAdmin(admin.ModelAdmin):
    list_display = ('reporttime', 'station', 'temperture', 'humidity', 'windspeed')
    list_filter = ['station']

admin.site.register(WeatherData, WeatherDataAdmin)
Enable the django admin UI
● Add django.contrib.admin to INSTALLED_APPS in settings.py.
● Add the URL router in urls.py.
● Run the development server.
  – $ python manage.py runserver
● Access the admin UI.
  – http://127.0.0.1:8000/admin
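The urls.py change can be sketched as below, using the Django 1.6-era style current at the time of this talk (newer Django versions drop patterns() and autodiscover(), but the admin URL include works the same way):

```python
# urls.py -- route /admin/ to the django admin site (Django 1.6 style).
from django.conf.urls import patterns, include, url
from django.contrib import admin

# Discover admin.py modules in all installed apps.
admin.autodiscover()

urlpatterns = patterns('',
    url(r'^admin/', include(admin.site.urls)),
)
```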
Scrapy + Django
Scrapy + Django
● Define the django environment in the scrapy settings.
  – Load the django configuration.
● Use the Scrapy DjangoItem class.
  – Instead of the Item and Field classes.
  – Define which django data model it should be linked with.
● Query and insert data in scrapy pipelines.
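The two halves of that wiring can be sketched as follows. The project path, settings module, and app/model names are placeholders; DjangoItem lived in scrapy.contrib.djangoitem at the time of this talk and is the separate scrapy_djangoitem package in newer Scrapy releases.

```python
# settings.py (Scrapy side) -- make the Django project importable
# so its models can be used; paths here are hypothetical.
import os
import sys

sys.path.append('/path/to/your/django/project')
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings'

# items.py -- link the scrapy item to the django model.
from scrapy.contrib.djangoitem import DjangoItem  # scrapy_djangoitem in newer Scrapy
from myapp.models import WeatherData

class Hk0WeatherItem(DjangoItem):
    django_model = WeatherData
```

In the pipeline, calling item.save() then inserts a row through the Django ORM instead of writing to a flat file.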
hk0weather
● Hong Kong Weather Data.
  – 20+ HKO weather stations in Hong Kong.
  – Regional weather data.
  – Rainfall data.
  – Weather forecast report.
hk0weather
● Set up and activate a python virtual environment, and install scrapy and django with pip.
● Clone hk0weather from GitHub.
  – $ git clone https://github.com/sammyfung/hk0weather.git
● Set up the database connection in Django and create the database, tables and first django user.
● Scrape regional weather data.
  – $ scrapy crawl regionalwx -t json -o regional.json