Civic Tech Toronto Hacknight: October 27

Fifteenth hacknight – 13 participants.

No speaker this week; a working session.

Breakout groups:

  • City of Toronto development application scraper – Gabe
  • Exploring geospatial data, looking for correlations between voter turnout and other variables – Ushnish
  • Government website plain language audit / report card – Matthew

Thanks to Lighthouse Labs for hosting us, and for sponsoring dinner! 

Comments:

  • Group: Development application scraper
    Participants: Gabe, Mahshid, Hanifa, Josh, Howard

    We’ve set a goal of completing this project by the end of November, so we’ve kept the scope small: it’s a system that regularly scrapes development applications from the City website and dumps them into a database that we can easily access and query. For bonus points, we might choose to keep old versions of updated records.

    There are lots of things that could be built on top of that once it exists, but we’re not going there yet.

    First we spent some time exploring the data.

    Howard looked for planning applications at neighbouring municipalities, and found that Brampton’s planning applications are open data, and way more complete than what’s available on the Toronto site.

    Howard also found what appear to be consistency errors: some properties seem to be tagged with the wrong ward, and others are tagged with ward “00”. We haven’t yet figured out whether this is meaningful or simply incorrect.

    Gabe asked the Orders-in-Council group what they used for scraping; they suggested Beautiful Soup.

    Josh grabbed the form response data and shared it with us on Slack. Then he started writing a basic scraper using Beautiful Soup.
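A basic Beautiful Soup scraper along the lines Josh started might look like the sketch below. The HTML snippet, CSS classes, and field names here are hypothetical stand-ins — the City’s actual form response has its own structure — but the parsing pattern is the same.

```python
# Minimal sketch of a Beautiful Soup scraper for a development-application
# listing. SAMPLE_HTML and the "address"/"status" classes are made up for
# illustration; a real scraper would fetch the City page and adapt the
# selectors to its actual markup.
from bs4 import BeautifulSoup

SAMPLE_HTML = """
<table>
  <tr><td class="address">123 Main St</td><td class="status">Under Review</td></tr>
  <tr><td class="address">45 Queen St W</td><td class="status">Approved</td></tr>
</table>
"""

def parse_applications(html):
    """Extract (address, status) records from an application listing."""
    soup = BeautifulSoup(html, "html.parser")
    records = []
    for row in soup.find_all("tr"):
        address = row.find("td", class_="address")
        status = row.find("td", class_="status")
        if address and status:
            records.append({
                "address": address.get_text(strip=True),
                "status": status.get_text(strip=True),
            })
    return records

print(parse_applications(SAMPLE_HTML))
```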

    Mahshid and Gabe looked at the data that’s returned by the form. The basic records are straightforward (though unfortunately not in JSON). There’s also a bunch of extra data in the response; we haven’t yet figured out whether it’s useful.

    Josh agreed to share his work in progress on GitHub. Gabe will create a MySQL database for storing the data.
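One way the storage side could work, including the “keep old versions of updated records” bonus, is to append a new row per scrape rather than updating in place. The sketch below uses sqlite3 as a stand-in for the planned MySQL database, and the table and column names are hypothetical.

```python
# Storage sketch: append-only versioning of scraped records.
# sqlite3 stands in for MySQL here; table/column names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE applications (
        application_id TEXT,
        address        TEXT,
        status         TEXT,
        scraped_at     TEXT,  -- when this version was scraped
        PRIMARY KEY (application_id, scraped_at)
    )
""")

def save_version(app_id, address, status, scraped_at):
    # Insert a new row instead of overwriting, so history is preserved.
    conn.execute("INSERT INTO applications VALUES (?, ?, ?, ?)",
                 (app_id, address, status, scraped_at))

save_version("A-001", "123 Main St", "Under Review", "2015-10-27")
save_version("A-001", "123 Main St", "Approved", "2015-11-10")

history = conn.execute(
    "SELECT status FROM applications WHERE application_id = ? "
    "ORDER BY scraped_at", ("A-001",)).fetchall()
print(history)  # both versions of the record are retained
```

Querying by `application_id` then returns the full change history of a record, which is what makes the append-only design attractive for tracking applications as they move through the process.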

    Back at it next week!
