How To Use Open APIs

Yesterday I gave a talk entitled “How Citizens Can Use Open APIs to Create New City Services” at Asheville Cloud Day 2011. I aimed to inspire city IT personnel to create open APIs and to show everyone else how they could leverage those APIs. You can find the slides of my talk here.

While I’ve been aware of open APIs for some time, I had never used one until I was preparing for this talk. I learned how about a month ago after watching the excellent video Google Docs Unleashed from AppSumo. Open APIs are surprisingly easy to learn and use, and I encourage everyone to try the import functions in Google Spreadsheet that let you access them.

In preparation for my talk, I did some research about open APIs and the “mashup” applications built on top of them. This post details the results of that research and walks through the live examples I gave during the talk for those who want to learn more.

Government Mashup Applications

Below are government mashup examples I bookmarked for my talk. The application name link takes you to the application itself, but I’ve included links to the ProgrammableWeb listing for each application so you can see which APIs were used to build the app.

For this list, I selected only a handful of applications. For a broader list, check out the government tag on ProgrammableWeb.

Live Examples

During my talk, I walked through two examples of how to access open APIs in Google Spreadsheet.

New Corporations Example

I wanted to give a local government example. Unfortunately, North Carolina does not yet have many open APIs. Instead, I chose to screen scrape the North Carolina Secretary of State web site for new corporations formed in Buncombe County last month, then map those corporations onto Google Maps.

The steps I took to create this example include:

  1. I used Needlebase to screen scrape the data after doing a search for new corporations in July 2011. Needlebase uses machine learning algorithms to automatically detect and parse data. I used Needlebase to execute the search form, parse the results page, then link into each detail page and parse that.
  2. After parsing, I published the data set so public services like Google could access it. Links at the bottom let me grab the data as plain text, CSV or JSON. For this example, I used the CSV data.
  3. I created a new Google Spreadsheet and imported the data using the formula:
    =IMPORTDATA("https://pub.needlebase.com/actions/api/V2Visualizer.do?domain=NC-New-Corporations-Buncombe-Cou&render=Csv&query=company")

    I renamed the sheet containing this data “Data Feed”.

  4. I added a new tab to stage the data I wanted to import into Google Maps. The Maps gadget takes two columns, an address and a tooltip, so I created those two columns and entered the following two formulas into the first two columns:
    =IF('Data Feed'!G2 = "NO ADDRESS", "", 'Data Feed'!G2)

    =IF('Data Feed'!G2 = "NO ADDRESS", "", CONCATENATE("<b>", 'Data Feed'!A2, "</b><br>",A2 , "<p>Type: ", 'Data Feed'!M2, "<br>Citizenship: ", 'Data Feed'!E2, "<br>Incorporated: ",'Data Feed'!H2, "<br>Incorporated State: ",'Data Feed'!J2, "<p><u>Agent</u><br>",'Data Feed'!B2, "<br>",'Data Feed'!C2))

    I then copied these formulas down for each remaining row of the spreadsheet.

  5. Finally, I created a third tab and added a Maps gadget using the Insert > Gadget menu item. For the data range, I used the two columns from my reformatted data sheet, then specified that I wanted tooltips and mouse-wheel zooming.
  6. It took a minute or so for the Google Map to appear, but when it did, all the July incorporations for Buncombe County were mapped, and you could clearly see that some corporations had out-of-state offices.
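The staging logic from step 4 can also be sketched as a plain JavaScript function, which makes the IF/CONCATENATE formula easier to follow. The field names below are my own labels for the Data Feed columns the formulas reference (company name, address, type, citizenship, incorporation date and state, agent name and address); they are assumptions about the scraped data, not part of the original spreadsheet.

```javascript
// Build the [address, tooltip] pair the Maps gadget expects for one
// scraped corporation record, mirroring the step 4 spreadsheet formulas.
// Rows flagged "NO ADDRESS" yield empty strings and so are not plotted.
function buildMapRow(rec) {
  if (rec.address === "NO ADDRESS") {
    return ["", ""];                      // nothing to plot for this row
  }
  var tooltip =
      "<b>" + rec.company + "</b><br>" + rec.address +
      "<p>Type: " + rec.type +
      "<br>Citizenship: " + rec.citizenship +
      "<br>Incorporated: " + rec.incorporated +
      "<br>Incorporated State: " + rec.incorporatedState +
      "<p><u>Agent</u><br>" + rec.agentName + "<br>" + rec.agentAddress;
  return [rec.address, tooltip];          // [address column, tooltip column]
}
```

Calling this once per scraped record produces exactly the two staging columns the Maps gadget reads in step 5.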

You can see the final spreadsheet here.

YouTube Example

Since the government example didn’t use a true API, I wanted an example that demonstrated mashing up multiple services using actual API calls. I chose to use the YouTube API to retrieve 10 recent videos geotagged near Asheville, North Carolina, and then see how often each was shared on Facebook. Since the YouTube search requires latitude and longitude coordinates, I used another API to translate a zip code into those coordinates.

The steps I took to create this example include:

  1. I created a new Google Spreadsheet and decided the first sheet was going to be my main “application” sheet. At the top I added the label “Zip Code:” and entered “28801” in the cell next to it.
  2. I added another sheet, calling it Lookups. In the first row I called the Geocoder.us API to translate from the zip code entered on my main sheet into a latitude and longitude using:
    =IMPORTDATA(CONCATENATE("http://geocoder.us/service/csv/geocode?zip=",Main!B1))
  3. I looked up the syntax for the YouTube API call for searching videos. I grabbed their example link, modified it to use the “location” and “location-radius” parameters, removed the keyword query parameter and tested it with the Asheville latitude and longitude:
    https://gdata.youtube.com/feeds/api/videos?orderby=published&start-index=11&max-results=10&v=2&prettyprint=true&location=35.595661,-82.55632&location-radius=10km
  4. I converted this to a formula with the latitude and longitude coming from the result of my API call on the first row:
    =CONCATENATE("https://gdata.youtube.com/feeds/api/videos?orderby=published&start-index=11&max-results=10&v=2&prettyprint=true&location=", Lookups!B1, ",", Lookups!C1, "&location-radius=10km")
  5. I switched back to my main sheet and, underneath the zip code line, used the ImportFeed function to pull in the data from the YouTube feed URL defined on the Lookups sheet:
    =ImportFeed(Lookups!B3)
  6. This gave me a listing of the most recent Asheville videos with links I could click to see each one. Unfortunately, most of them were advertising videos for car dealerships or real estate (not surprising, since only videos that are geocoded for Asheville appear in this listing).
  7. To also demonstrate pulling data from a JSON API, I wrote a script based on a technique from Google Docs Unleashed. I started by opening the script editor using Tools > Script editor. Then I entered:
    function FacebookShares(url) {
        // Query the Facebook Graph API for metadata about the given URL.
        var service  = "http://graph.facebook.com/" + encodeURIComponent(url);
        var jsondata = UrlFetchApp.fetch(service);

        // The response body is JSON; parse it into a JavaScript object.
        var object   = Utilities.jsonParse(jsondata.getContentText());

        // The "shares" field is omitted for URLs that have never been shared.
        return object.shares != null ? object.shares : 0;
    }

    This defined a custom function for my spreadsheet called FacebookShares, which retrieves the number of times a given URL has been shared on Facebook, using the JSON response from the Facebook Graph API.

  8. I used this function to retrieve the number of shares for each link I retrieved using:
    =FacebookShares(C3)
  9. Not surprisingly, none of the advertising videos had been shared on Facebook, so to confirm my function was working I created another sheet called “Facebook Shares” and entered some popular web sites. Sure enough, I got the number of times each of these links had been shared on Facebook.
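The JSON-handling core of the FacebookShares function in step 7 is plain JavaScript once the UrlFetchApp call is out of the way, so it can be exercised without a network connection. This sketch takes the response body as a string and uses the standard JSON.parse in place of the Apps Script Utilities.jsonParse; the sample responses in the usage note are made up for illustration, not real Graph API output.

```javascript
// Extract the share count from a Facebook Graph API JSON response body,
// defaulting to 0 when the "shares" field is absent -- the same
// null-check the FacebookShares spreadsheet function uses.
function sharesFromResponse(jsonText) {
  var object = JSON.parse(jsonText);   // Utilities.jsonParse in Apps Script
  return object.shares != null ? object.shares : 0;
}
```

For example, a body of `{"id":"1","shares":42}` yields 42, while a body with no `shares` field yields 0, which is why the unshared advertising videos all showed zero.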

You can see the final spreadsheet here.

Resources

Resources I found while researching my talk include:

Have you used open APIs, or are you thinking about it? Comment below.

Credits: The photo used in this article was taken by Jeff Keyzer.

8 comments

  1. Jonathan Feldman says:

    Great job Trevor! Thanks for the writeup!

  2. DataViz says:

    Thanks for the great article.

    I’ve been searching for a simple way to import JSON feed data into a CSV format without having to write code.

    Products such as DabbleDB or Needlebase formerly provided this functionality prior to their acquisitions.

    Any recommendations for a business user friendly Zoho like database or utility which can natively import a JSON feed url or data output file & can output CSV data?

    Thanks

    1. trevor says:

      When I was working on my demos, my first demo was going to parse a JSON feed into a Google Spreadsheet directly, similar to the ImportData function. I didn’t have time to make this work, but if I wind up going back to it, I’ll let you know. Once in Google Spreadsheet, you can easily export to CSV.

  3. David says:

    I have been searching for hours and hours on how to parse json objects in google spread sheets. Every single cool tutorial is about xml or csv or parsing json that comes from a Google doc.

    Most of the cool api’s that I want to use are json. I see you were working on it also? :) I have been able to get some elements like:

    return object.title;

    and it pulls the title text and sticks it in the cell. But the problem is inside that there is:

    actor

    displayName

    I am pulling for Google Plus api. No matter what I do I can not get displayName to show up in the cell. I kinda hacked it by doing find and replace with nothing for all the text besides the name as a spreadsheet function but I want a clean parse. :)

  4. David says:

    I figured it out. Here is the relevant part of the Google Apps Script code that will return an array to the Google Docs spreadsheet and automatically separate it into columns.

    var jsondata = UrlFetchApp.fetch(url);
    var object = JSON.parse(jsondata.getContentText());
    var displayName = object.displayName;
    var gender = object.gender;
    var relationshipStatus = object.relationshipStatus;
    var aboutMe = object.aboutMe;
    var profileImage = object.image.url;

    return [displayName, profileImage, gender, relationshipStatus, aboutMe];

    1. trevor says:

      Thanks. At some point I may use this as a basis for a generic function to do the same. Creating an ImportJson function has been on my todo list for a while. This code definitely helps.
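A minimal sketch of that kind of generic helper, flattening the top-level scalar fields of a parsed JSON object into a header row and a value row so a custom function can spread them across columns, might look like this. The function name and behavior here are hypothetical, not taken from the thread or tested against any particular API.

```javascript
// Flatten the top-level scalar fields of a parsed JSON object into
// [headers, values] -- the 2D-array shape a custom spreadsheet function
// can return so values spread across adjacent columns.
function flattenJson(object) {
  var headers = [];
  var values  = [];
  for (var key in object) {
    // Skip nested objects/arrays; only scalars map cleanly to one cell.
    if (typeof object[key] !== "object") {
      headers.push(key);
      values.push(object[key]);
    }
  }
  return [headers, values];
}
```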

  5. John says:

    David,

    I’ve tried your piece of code, but I’m having a hard time with “var gender = object.gender”, for example. “object.gender” is an array, and we’re trying to put that into a regular variable; if you use Firebug, you will see that the variable stays “undefined”.

    Also, about the part that auto-separates into separate columns…

    Can you share your solution?

    Thanks

    John

  6. vivek says:

    Thanks for an awesome write up sir.
    I am a student who was trying to learn the basics of Open APIs, and I couldn’t find a simpler or more precise article than this.

1 ping

  1. MIT Course Majors Male Female Ratio, Fall 2011 (scraping with google docs) | orange narwhals says:

    […] I referred to http://blog.ouseful.info/2008/10/14/data-scraping-wikipedia-with-google-spreadsheets/ http://blog.fastfedora.com/2011/08/how-to-use-open-apis.html *shudders* speaking of government apis, I really should be less terrible and fix my github […]
