ImportJSON

ImportJSON imports data from public JSON APIs into Google Spreadsheets. It aims to operate similarly to how the native Google Spreadsheet functions ImportData and ImportXML work.

The ImportJSON library contains four public functions:

  • ImportJSON
    Use to import a JSON feed from a URL.

  • ImportJSONViaPost
    Use to import a JSON feed from a URL using POST parameters.

  • ImportJSONAdvanced
    Use if you are a script developer and want to easily extend the functionality of this library.

  • URLEncode
    Use to URL encode a string to concatenate it to a URL.

Usage

ImportJSON

Imports a JSON feed and returns the results to be inserted into a Google Spreadsheet. The JSON feed is flattened to create a two-dimensional array. The first row contains the headers, with each column header indicating the path to that data in the JSON feed. The remaining rows contain the data.

ImportJSON takes 3 parameters:

  • url
    The URL to a JSON feed.

  • query
    A comma-separated list of paths to import. Any path starting with one of these paths gets imported.

  • parseOptions
    A comma-separated list of options that alter processing of the data.
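
The starts-with matching used for queries can be sketched in plain JavaScript. This is a simplification for illustration, not the library's exact code:

```javascript
// Sketch of the starts-with query matching described above.
function matchesQuery(query, path) {
  if (query === "") {
    return true; // an empty query imports everything
  }
  // Import any path that starts with one of the comma-separated queries.
  return query.split(",").some(function (q) {
    return path.indexOf(q.trim()) === 0;
  });
}
```

For example, with the query "/feed/entry/title,/feed/entry/content", the path "/feed/entry/title" is imported while "/feed/id" is not.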

By default, data gets transformed so it looks more like a normal data import. Specifically:

  • Data from parent JSON elements gets inherited to their child elements, so rows representing child elements contain the values of the rows representing their parent elements.
  • Values longer than 256 characters get truncated.
  • Headers have slashes converted to spaces, common prefixes removed and the resulting text converted to title case.
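
As a rough sketch, the header prettifying (slashes to spaces, title case) looks like the following; the removal of common prefixes is omitted, and this is not the library's exact code:

```javascript
// Rough sketch of the default header prettifying: slashes become
// spaces and the result is title-cased. Common-prefix removal omitted.
function prettifyHeader(header) {
  return header
    .replace(/\//g, " ")  // "/feed/entry/title" -> " feed entry title"
    .trim()
    .replace(/\b[a-z]/g, function (c) { return c.toUpperCase(); });
}
```

So a raw header like "/feed/entry/title" becomes "Feed Entry Title".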

To change this behavior, pass in one of these values in the parseOptions parameter:

  • noInherit
    Don’t inherit values from parent elements

  • noTruncate
    Don’t truncate values

  • rawHeaders
    Don’t prettify headers

  • noHeaders
    Don’t include headers, only the data

  • debugLocation
    Prepend each value with the row & column it belongs in

For example, to return the number of shares and comments for the URL http://www.yahoo.com/ from the Facebook Graph API, you could use:

=ImportJSON(CONCATENATE("http://graph.facebook.com/", URLEncode("http://www.yahoo.com/")),  "", "")

If you wanted to get rid of the headers, you would pass “noHeaders” as the last parameter.

=ImportJSON(CONCATENATE("http://graph.facebook.com/", URLEncode("http://www.yahoo.com/")), "", "noHeaders")

As an advanced example, if you wanted to query YouTube for the most popular videos, but only see the data returned relating to the ‘title’ and the ‘content’, you could use:

=ImportJSON("http://gdata.youtube.com/feeds/api/standardfeeds/most_popular?v=2&alt=json", "/feed/entry/title,/feed/entry/content", "noInherit,noTruncate,rawHeaders")

The “rawHeaders” option allows us to see the full path to each column of data in the spreadsheet.

ImportJSONViaPost

Imports a JSON feed via a POST request and returns the results to be inserted into a Google Spreadsheet. This function works the same as ImportJSON, but allows you to specify a payload and fetch options to perform a POST request instead of a GET request.

To retrieve the JSON, a POST request is sent to the URL and the payload is passed as the content of the request using the content type “application/x-www-form-urlencoded”. If the fetchOptions define a value for “method”, “payload” or “contentType”, these values will take precedence. For example, advanced users can use this to make this function pass XML as the payload using a GET request and a content type of “application/xml; charset=utf-8”.
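
That precedence rule can be sketched as a simple merge in which user-supplied fetchOptions override the POST defaults. This is illustrative only, not the library's exact code:

```javascript
// Sketch: defaults for the POST request, overridden by any
// user-supplied fetchOptions ("method", "payload", "contentType", ...).
function buildFetchOptions(payload, fetchOptions) {
  var merged = {
    "method": "post",
    "payload": payload,
    "contentType": "application/x-www-form-urlencoded"
  };
  Object.keys(fetchOptions || {}).forEach(function (key) {
    merged[key] = fetchOptions[key]; // explicit options take precedence
  });
  return merged;
}
```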

ImportJSONViaPost takes 5 parameters:

  • url
    The URL to a JSON feed.

  • payload
    The content to pass with the POST request; usually a URL encoded list of name-value parameters separated by ampersands. Use the URLEncode function to URL encode parameters.

  • fetchOptions
    A comma-separated list of options used to retrieve the JSON feed from the URL.

  • query
    A comma-separated list of paths to import. Any path starting with one of these paths gets imported.

  • parseOptions
    A comma-separated list of options that alter processing of the data.

For more information on the available fetch options, see the documentation for UrlFetchApp. At this time the “headers” option is not supported.

For a list of the supported parseOptions and how to use queries, see ImportJSON.

ImportJSONAdvanced

An advanced version of ImportJSON designed to be easily extended by a script. This version cannot be called from within a spreadsheet.

Imports a JSON feed and returns the results to be inserted into a Google Spreadsheet. The JSON feed is flattened to create a two-dimensional array. The first row contains the headers, with each column header indicating the path to that data in the JSON feed. The remaining rows contain the data.

ImportJSONAdvanced takes 6 parameters:

  • url
    The URL to a JSON feed.

  • fetchOptions
    An Object whose properties are the options used to retrieve the JSON feed from the URL.

  • query
    A comma-separated list of paths to import. Any path starting with one of these paths gets imported.

  • parseOptions
    A comma-separated list of options that alter processing of the data.

  • includeFunc
    A function with the signature func(query, path, options) that returns true if the data element at the given path should be included or false otherwise.

  • transformFunc
    A function with the signature func(data, row, column, options) where data is a 2-dimensional array of the data and row & column are the current row and column being processed. Any return value is ignored. Note that row 0 contains the headers for the data, so test for row==0 to process headers only.

The function returns a two-dimensional array containing the data, with the first row containing headers.

The fetchOptions can be used to change how the JSON feed is retrieved. For instance, the “method” and “payload” options can be set to pass a POST request with post parameters. For more information on the available parameters, see the documentation for UrlFetchApp. The fetchOptions must be an Object.

Use the include and transformation functions to determine what to include in the import and how to transform the data after it is imported.

For example:

ImportJSONAdvanced("http://gdata.youtube.com/feeds/api/standardfeeds/most_popular?v=2&alt=json",
          { "method" : "post", "payload" : "user=bob&apikey=xxxx" },
          "/feed/entry",
          "",
          function (query, path) { return path.indexOf(query) == 0; },
          function (data, row, column) { data[row][column] = data[row][column].toString().substr(0, 100); })

In this example, the include function checks to see if the path to the data being imported starts with the query. The transform function truncates each value to 100 characters. For more robust versions of these functions, see the internal code of this library.

URLEncode

Encodes the given value to use within a URL.

URLEncode takes 1 parameter:

  • value
    The value to be encoded.

The function returns the given value encoded using the URL encoding scheme.

For instance:

=URLEncode("Value with spaces")

Returns the value “Value%20with%20spaces”.
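
In JavaScript terms, URLEncode behaves much like the built-in encodeURIComponent, though the library's exact escaping may differ in edge cases:

```javascript
// Approximation of URLEncode using JavaScript's encodeURIComponent.
function urlEncode(value) {
  return encodeURIComponent(String(value));
}
```

For example, `urlEncode("Value with spaces")` returns "Value%20with%20spaces".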

You can use the URLEncode function to create URL (GET) and POST parameters by combining it with the CONCATENATE function. For instance:

=CONCATENATE("param1=", URLEncode("Value 1"), "&", "param2=", URLEncode("Value 2"))

Would be encoded as “param1=Value%201&param2=Value%202”. This could then be added to the query of a URL or passed as a payload in the ImportJSONViaPost function.

Source Code

The code is now available in a GitHub repository.

Bugs & Feature Requests

If you have features you’d like to see added, or notice bugs in this library, please submit a new issue on GitHub. If you don’t have a GitHub account, you can always leave a comment on this page too.

The library isn’t under active development, so don’t expect issues to be fixed quickly. But I do occasionally revisit it and add new features or fix bugs. If you would like to contribute to the development of this library, let me know.

136 comments

  1. Nathan Ziarek says:

    I can’t figure out what the xPath would be for something like this:

    {
       blah: "blah",
       aaData: [
          [
             "John",
             "Smith",
             "jsmith@example.com"
          ],
          [
             "Toby",
             "Harris",
             "tharris@example.com"
          ]
       ]
    }

    …transformed to a table like this:

    John  |  Smith  |  jsmith@example.com
    Toby  |  Harris  |  tharris@example.com

    Am I barking up the wrong tree?

    1. trevor says:

      The ImportJSON library doesn’t use XPath syntax. Eventually it should use a JSONPath syntax, but for now it does a simple starts-with match.

      If you want to use the function to import the JSON data you showed, use the query “/aaData”. Though currently the ImportJSON function would return the JSON you provided as a comma-separated list within a single cell because it currently collapses all scalar arrays into single values.

      If I have time I’ll work on adding an option to allow you to import scalar arrays as arrays in the final result. For my initial test cases it made sense to import them as a single cell, but now that I have your test case, I can add to it. Thanks.

      1. Nick says:

        Hello trevor,

        Thank you for your awesome script!! It is amazing!
        I am really looking forward to what Nathan Ziarek is looking for.

        Do you have any news regarding that request?

        Thank you,
        Nick

        1. trevor says:

          I haven’t had time to work on this script, so no. But your request is definitely noted, so I’ll bump up the priority of this feature when I do start working on it.

          You can follow the issue at GitHub: Add option to return scalar arrays as arrays. Click Watch Thread to receive e-mail updates about this specific feature.

  2. Bo says:

    I’m trying to pull just the 240750 number from “subscriberCount”: “240750” into a cell, from this JSON:

    http://gdata.youtube.com/feeds/api/users/vikkstar123?v=2&alt=json&prettyprint=true&fields=yt:statistics

    How would I structure the query to do that?

    1. trevor says:

      Use the path to the JSON attribute you want to retrieve, using a slash to separate elements and then set the options parameter to “noHeaders”.

      In your example, the path would be “/entry/yt$statistics/subscriberCount”, so your import function would look like:

      =ImportJSON("http://gdata.youtube.com/feeds/api/users/vikkstar123?v=2&alt=json&prettyprint=true&fields=yt:statistics", "/entry/yt$statistics/subscriberCount", "noHeaders")

  3. Laura- says:

    Nice library! One suggestion would be to tag it with a version, so that other developers can include it in their own scripts.

    Currently, I had to fork it to give it a version, which isn’t ideal.

    https://developers.google.com/apps-script/guide_libraries#includeLibrary

    1. trevor says:

      Thanks. I just re-published the script with versions enabled. The current version is now v1.1.1 which translates to the Google script version 3. It may take a couple days for the script to be published in the main directory.

      1. Laura- says:

        Hmm… you sure you published this? I still only see the former version available in Google :/

        1. trevor says:

          I did publish it, but didn’t check the Script Gallery beforehand. It looks like someone using the e-mail address “tlckreative@gmail.com” republished my library back in January and hijacked my listing.

          It looks like I may not be able to get this republished until I can get Google to re-assign ownership back to me. I’m working on figuring out how to do this.

          Unfortunately, it looks like Google Scripts is not well supported by Google. The Report Abuse link points to an old e-mail that bounces and I never wind up getting notifications when the scripts do get published.

          I’m trying multiple avenues, so hopefully they’ll get the problem fixed soon. Thanks for letting me know!

        2. trevor says:

          Laura, I was able to get everything sorted out with the Google Apps team, so now my version with versioning is available in the Script Library.

          I also just updated the library to provide support for POST requests. I just submitted it, so it should be available in the next couple days. Thanks for alerting me to the issue.

  4. Besim Karadeniz says:

    Hello Trevor,
    I’m experimenting with the Facebook API and I’m seeing strange behaviour. I have the following API string to get information for two Facebook pages at once:

    =ImportJSON("http://graph.facebook.com/?ids=15765808054,116335138405853", "", "noHeaders")

    The result of this call is put into only one line of my spreadsheet, although I get results for two pages. The results of the second page are put in the same line, after the results of the first page. I expected the data of the second page to appear under the data of the first page, in a second line.

    The JSON output looks good to me; I cannot find any error.

    I think there could be a problem in your script with the comma in the Facebook API call.

    Cheers,
    Besim

    1. trevor says:

      Besim,

      The problem is not with the script, but with the way Facebook is returning multiple objects in the API. Rather than returning the objects as a top-level array, which ImportJSON would correctly parse as multiple rows, Facebook is returning the objects as a key-value map where the key is the id of each object.

      This makes each object appear to be attributes of a single top-level object, rather than multiple objects. Hence the single row.

      There may be a way to add another function parameter, or change how the query gets parsed, to make this work the way you’re expecting it to. I added an enhancement request in Github for this (see Figure out how to handle multiple objects in Facebook API using id fields).

      Though it’s not a trivial fix, and I don’t know when I’ll have time to work on it. If I do add this functionality, I’ll definitely let you know. In the meantime, your best bet is to use separate ImportJSON calls, one per line.
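
      A rough sketch of such a pre-processing step (a hypothetical helper, not part of the library) would turn the id-keyed map into a top-level array, which would then flatten to one row per object:

```javascript
// Hypothetical helper: convert Facebook's id-keyed response into a
// top-level array so each object becomes its own row when flattened.
function idMapToArray(jsonText) {
  var map = JSON.parse(jsonText);
  return Object.keys(map).map(function (id) {
    var obj = map[id];
    obj.id = obj.id || id; // keep the id with each row
    return obj;
  });
}
```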

      – Trevor

      1. Besim Karadeniz says:

        Hello Trevor,

        I just wanted to ask just what the status is of this feature. Is there hope that it might be implemented in the near future?

        Regards,
        Besim

        1. trevor says:

          Besim,

          Unfortunately I don’t have any time right now to work on this feature. My business is booming and is consuming all of my time. The soonest things might slow down is November.

          The code is on Github, so if someone else decides to add the feature, I’ll go ahead and publish it. Otherwise, I’ll let you know when I’m able to get around to it.

          — Trevor

  5. Nathan says:

    This is so great! I’ve been running this for a bit and pretty quickly hit the Google Docs rate limit. It returns #ERROR! and the following message:
    Service invoked too many times in a short time: urlfetch. Try Utilities.sleep(1000) between calls. (line 167, file “Code”)

    I don’t know how to add this myself. How could I add this pause so that I wouldn’t keep hitting the rate limit? Thanks!

    1. trevor says:

      From within your Google Spreadsheet:

      1. Use the Tools > Script editor menu item to launch the script editor.
      2. Add Utilities.sleep(1000); on line 167 so it becomes the first line of the ImportJSONAdvanced function. The code var jsondata = UrlFetchApp.fetch(url, fetchOptions); should now be on line 168 after you do this.
      3. Use File > Save to save the script.
      4. Go back to your spreadsheet, which should refresh in a couple seconds to use the new code.

      Experiment with smaller values than 1000 milliseconds if this is taking too long between calls.

      1. Nathan says:

        As long as I only fetch 50 at a time, I don’t hit the rate limit. If I push it up to over 100 at a time, I hit the #ERROR!, even after adding the sleep instruction. From watching the page load, it doesn’t appear that the sheet loads one JSON record, pauses for 1 second, then loads the next. It looks like they just snowball. Maybe that’s just how it looks, though.

        Pulling down 10,000 records is a bit of a slog, 50 at a time, but it’s sure better than the process I used previously.

        1. trevor says:

          Now that I think about it, that makes sense. Each function on the spreadsheet is likely executed in a separate thread. So the sleep being executed on one instance doesn’t affect the other instances.

          One question though: why do you need separate statements for each row?

          If the API you’re querying was designed well, it should have a way to pull multiple records at the same time. The ImportJSON places each record into a separate row if it receives multiple rows of data. This way you’d only use a single ImportJSON statement instead of multiple statements, and you shouldn’t hit the rate limit.

          If the API doesn’t have a way to query multiple rows of data, consider contacting the creator of the API and asking them to add this capability.

          The other thing you can try is to write your own JavaScript function that calls ImportJSONAdvanced multiple times with different parameters and concatenates the results. This would then run in a single thread and the sleep function would work as intended.

        2. Rabah Rahil says:

          Having the same issues Nathan. Did you ever resolve this?

          P.S. Thanks soo much Trevor this script is amazing!

  6. Martin Snygg says:

    Hey looks like the exact thing I need but i get:

    error: SyntaxError: Unexpected token: < (line 168, file "Code")

    Trying to use this:

    https://github.com/ninetwozero/BF3-Battlelog/wiki/API:-Get-platoon-members

    What should the right formatted =ImportJSON look like in my case?

    1. trevor says:

      The API you refer to requires you to substitute {PLATOON_ID} in the URL with an actual platoon ID. It looks like if the wrong platoon ID is entered, an HTML error page is returned rather than JSON. Which is probably what’s causing the error you’re seeing.

      Try using:

      =ImportJSON("http://battlelog.battlefield.com/bf3/platoon/{PLATOON_ID}/listmembers/")

      And replace {PLATOON_ID} with the ID of the platoon you want to query. If that’s still generating an error, try accessing the URL directly in the browser to confirm you’re getting JSON back (it should look like plain text with lots of { and } in the text).

      If you are getting back JSON, but still getting the error, post the function exactly as you’re using it and I can look into it.

  7. Andriy says:

    Hi

    Thanks for the info, this is exactly what I needed

    But I have some issues and I don’t know how to deal with them (it’s 3 in the morning and I can’t go to bed :))

    So here it is:

    I’m trying to parse some info from the http://prodvigator.ua/api/v1/domain_info?query=glasko.com.ua&token=8f678b10caa5cc28d3f9b66e3a94376d

    I need the “phrases_count” data

    I’ve tried different variations based on your example, but I get the parse error every time :(

    I tried to use this one:

    =ImportJSON("http://gdata.youtube.com/feeds/api/standardfeeds/most_popular?v=2&alt=json", "/feed/entry/title,/feed/entry/content", "noInherit,noTruncate,rawHeaders")

    And I got the damn error

    But everything works just fine if I use only the following code:

    =ImportJSON("http://gdata.youtube.com/feeds/api/standardfeeds/most_popular?v=2&alt=json")

    I don’t get any errors, just lots of tables and data.

    So my question is:

    Why do your examples return the error, and can you please tell me how I can get only the phrases_count data from http://prodvigator.ua/api/v1/domain_info?query=glasko.com.ua&token=8f678b10caa5cc28d3f9b66e3a94376d

    And here’s the screenshot if the link wouldn’t work:

    http://clip2net.com/s/5uknu1

  8. Andriy says:

    The problem was solved by using ; instead of , as the formula argument separator.

  9. Peter Daly-Dickson says:

    You sir, are a feckin’ genius.

    A few hours on Google to find ImportJSON and http://fixer.io/ has given me EXACTLY what I was looking for in my Google Doc.

    Can’t thank you enough! :)

    cheers

    Pete

  10. Michael Hirsch says:

    This is really great! However I am experiencing a strange issue.

    I set up a time-based trigger to update my spreadsheet automatically every minute (I’m creating a visualization that tracks NYC’s Citi Bike program). However, the spreadsheet isn’t updating based on the trigger, and instead only updates when I go into the Script Editor and press save. I don’t even have to make any changes.

    Here are the execution logs… can you help me out? Thanks so much!

    [13-10-25 12:04:17:290 EDT] Starting execution
    [13-10-25 12:04:17:298 EDT] UrlFetchApp.fetch([[object Object], null]) [0 seconds]
    [13-10-25 12:04:17:339 EDT] Execution failed: Invalid argument: http://object%20Object (line 167, file “Code”) [0.04 seconds total runtime]

    1. trevor says:

      Michael,

      Sorry for the delayed response. I’ve been travelling.

      I’ve never used time-based triggers, so can’t help you out there.

      I do know why it updates when you press Save in the Script Editor. By pressing Save, you’re telling Google Spreadsheet the script has been updated and needs to be re-run.

      I’m not sure how to interpret your logs. It almost looks like the URL being used to fetch the results is the string “object Object”, which is the string representation of an object. This shouldn’t be occurring in the ImportJSON library (nor are there normally execution logs created from the library), so I don’t really know where to point you.

      If you do figure out the cause, or how to get time-based triggers to work, can you leave a reply here so others can learn from it. Thanks.

      Best of luck getting it to work.

      – Trevor

  11. aweesom1 says:

    Currently: Data from parent JSON elements gets inherited to their child elements, so rows representing child elements contain the values of the rows representing their parent elements.

    Is it possible to switch over to COLUMNS representing child elements contain the values of the COLUMNS representing their parent elements?

    I am running into the problem where there are 100+ child elements for one parent, and google docs is not wide enough to display all of them. It is however tall enough, but how do we get the function to display the data in rows instead of across columns?

    Sample Data
    =ImportJSON("http://api.worldoftanks.com//2.0/clan/info//?application_id=16924c431c705523aae25b6f638c54dd&clan_id=1000008386", "/data/1000008386/members", "")

    Thanks!

    1. trevor says:

      Use the TRANSPOSE function built into Google Spreadsheet. For example:

      =TRANSPOSE(ImportJSON("http://api.worldoftanks.com//2.0/clan/info//?application_id=16924c431c705523aae25b6f638c54dd&clan_id=1000008386", "/data/1000008386/members", ""))

  12. Dominik says:

    how do you get this to update the data every few minutes?

    1. trevor says:

      You’d have to set up a time-based trigger. I don’t know how to do this, and Michael Hirsch commented earlier that he was having problems getting this to work with ImportJSON.

      Be careful about setting up a time-based trigger in the Script Editor. I believe these only run the scripts, rather than asking the spreadsheet to update. The ImportJSON function doesn’t work standalone in the script editor; you have to create a parameterless wrapper function which passes parameters to ImportJSON to get it to work.

      Or at least, I’ve never been able to figure out how to run a function with parameters from within the Script Editor. During debugging, I just create a wrapper function and test it.

    2. David Dollar says:

      Add the following value to a field on your sheet:

      =CONCATENATE(YEAR(NOW()), "-", MONTH(NOW()), "-", DAY(NOW()), " ", HOUR(NOW()), ":", MINUTE(NOW()))

      You can set the foreground and background color the same to hide it.

      You should then specify that field as an extra argument to ImportJSON() and friends.

      Google caches the result of functions based on the parameters input. You need to bust this cache with a column that updates every minute like above.

      1. Craig says:

        How do I specify that field as an extra argument to importJSON?

        I need to update the JSON script every few minutes, but the trigger doesn’t seem to be working. When I try to run the script it says “DNS error: http://undefined”.

        Thank you

        Craig

        1. Rory says:

          Hi Craig,

          Did you ever get this sorted? I’m after the same sort of thing and having similar problems.

          Thanks,

          Rory

          1. Tuomo says:

            Hello. Does anyone know how to make this work? I have similar problem but I haven’t found any solution yet.

  13. Ardalan says:

    Do you know if this can handle cursor paging with Facebook’s GRAPH API?

    Essentially what happens is it only displays ONE child data element, and the other child elements appear as a paginated link…yet the last parent element shows all of its child elements.

    JSON:
    https://graph.facebook.com/102158363185/likes?fields=likes.limit(51).fields(talking_about_count,likes,link,name)&limit=200&access_token=CAAHrXZB23ZCxgBAKsi4JkENJyhZB0LZC02jOWzB8ZAFPQdhaqqh7ZCshLSwyQKhPPZBuNkipV3icIE2ufJZBh15ZBVjzUzG8B2MQjn6oKSULU3xjrku7yDEjE8xBjDSSXU09IXem67STZC5qZBN6mbuxY3tpZB5Snx0R3ZCnXUO978lqABvkjndATYOUP

    1. trevor says:

      No, the ImportJSON library doesn’t handle paginated links.

      You could use the ImportJSONAdvanced function within a loop so it retrieves the data, grabs the paging/next link at the end and then uses that to retrieve the next set of data. Then append all the results together into a single array to return them to the spreadsheet.
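
      That loop might look like the sketch below, with the page fetch injected as a function so the flow is clear. In Apps Script the fetch step would use UrlFetchApp and JSON.parse; the paging.next field name here follows Facebook's convention and is an assumption:

```javascript
// Sketch of the pagination loop: fetch a page, append its records,
// then follow the paging/next link until there is none.
function fetchAllPages(firstUrl, fetchPage) {
  var results = [];
  var url = firstUrl;
  while (url) {
    var page = fetchPage(url);               // parsed JSON for one page
    results = results.concat(page.data);     // append this page's records
    url = page.paging && page.paging.next;   // undefined ends the loop
  }
  return results;
}
```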

  14. Greg Gerber says:

    Turns out it’s called “time-driven triggers” and it’s surprisingly simple. Instructions here:
    https://developers.google.com/apps-script/understanding_triggers#TimeTriggers

  15. Patrick says:

    Is there a way to pass a request header using this function? The site I am currently trying to pull data from requires that you post an identifier header.

    1. trevor says:

      Depending on what you’re trying to do, you may be able to use the ImportJSONViaPost function and set the fetchOptions to include the header you want sent. See the list of options you can set in the documentation for UrlFetchApp.fetch, which is what ImportJSON uses to retrieve the JSON from the server.

      If you need to send a custom header, you likely need to call ImportJSONViaPost from a script, or modify the library to convert text headers into a JavaScript object, since the ‘headers’ option requires an object, not a string.
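
      Such a conversion could be sketched as below. Note the “Name=value, Name=value” text format here is an invented convention for illustration, not something the library defines:

```javascript
// Hypothetical: parse "Name=value, Name=value" text into the object
// form that the 'headers' fetch option requires.
function parseHeaders(text) {
  var headers = {};
  text.split(",").forEach(function (pair) {
    var idx = pair.indexOf("=");
    if (idx > 0) {
      headers[pair.substring(0, idx).trim()] = pair.substring(idx + 1).trim();
    }
  });
  return headers;
}
```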

      Also, though the function is called ImportJSONViaPost, you can do GET requests by setting the ‘method’ fetch option to “GET”.

  16. Brendan says:

    Thanks for the script. However, it’s not working with a JSONP url (Example: http://api.sportsdatabase.com/nba/query.json?sdql=date%2Cteam%2Co%3Ateam%2Cpoints%2Co%3Apoints%40team%3DJazz%20and%20season%3D2013&output=json).

    Any ideas?

    1. trevor says:

      The problem is that JSON-P adds extra JavaScript code that the JSON parser can’t parse. This can be worked around by trimming the start and end of the JSON-P response.

      I just created the issue Figure out how to import JSON-P data in the issue tracking system. Check this out for a way you can modify the code to get what you want. If you try it, please add a comment that describes whether it was successful or not.
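
      A minimal sketch of that trimming, assuming the response is a single callback(…) wrapper around plain JSON:

```javascript
// Strip a JSON-P wrapper like `callback({...});` down to the JSON
// inside it. Assumes one outer call; plain JSON passes through.
function stripJsonP(text) {
  var start = text.indexOf("(");
  var end = text.lastIndexOf(")");
  if (start === -1 || end === -1 || end < start) {
    return text; // no wrapper found; assume it is already plain JSON
  }
  return text.substring(start + 1, end);
}
```

For example, `JSON.parse(stripJsonP('callback({"a": 1});'))` parses cleanly where the raw response would not.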

  17. Rory says:

    Hi, thanks a lot for the script. It’s great for what I’m doing.

    I can’t work out how to tailor it to my needs though. I’m using this code to gather photos for a foursquare venue:

    =ImportJSON("https://api.foursquare.com/v2/venues/4bc20cf2920eb7134b881b2c?oauth_token=TOMSBZZ3OPENPMWHFITMVLEM4YQA41UCTSPRPWUCBQIDXQ3C&v=20140304", "/response/venue/photos/groups/items/prefix,/response/venue/photos/groups/items/width,/response/venue/photos/groups/items/height,/response/venue/photos/groups/items/suffix", "noHeaders")

    This creates a column for prefix, suffix, width and height and a row for each photo then populates the cells with the relevant data. However, I wish to combine the prefix, suffix, width and height for each photo along with the letter ‘x’ to generate a full url for each photo and display the URLS in a single cell separated by commas, like so:

    URL1,URL2,URL3,URL4 etc

    Is there any way to do this?

    Thanks for any help.

    1. trevor says:

      If you have a known number of rows, you should be able to do this by setting up your worksheet as follows:

      1. Enter your ImportJSON function in B1

      2. In A1, enter =CONCATENATE(B1, C1, D1, E1) to concatenate the prefix, width, height and suffix into a single column. Add whatever other text you need as parameters here. So you might use =CONCATENATE("http://someserver/images/", B1, "_image-", C1, "x", D1, ".", E1) which would produce something like “http://someserver/images/bu_image-200×420.jpg”.

      3. Copy the formula in A1 down the rest of column A for as many rows as you expect to receive from the server

      4. In F1 (or any other column which doesn’t have data), enter =JOIN(",", A:A). This will combine all the individual URLs into a single cell, separating them by commas.

      The only issue here is that you have to manually copy the A1 formula and know how many rows you’re receiving.

      To do this with an unknown number of rows, you’d have to write a script using the ImportJSONAdvanced function to do this.

  18. Dave says:

    Trevor –

    Thank you for this script! And I hate to add a real noob question, but here goes (I tried googling for a few hours before I posted this, so if I missed something obvious please forgive my fried brain).

    So I have an import source from a url that feeds me order data, but unfortunately it does so in order of most recent to oldest.

    I’m trying to make an additional column to where we can assign a value for the order (a room number), and the problem I’m running into is the resultant structure looks something like this:

    Room Order Person
    105 3 Joe
    461 2 Curly
    107 1 Moe

    Once the script refreshes after a new order, the first column, being static, does this:

    Room Order Person
    105 4 Shemp
    461 3 Joe
    107 2 Curly
    1 Moe

    (So on and so forth for each order thereafter)

    Now, it would be easy if I could just sort the data by the order column, but when I do that I get a partial sort that removes most of the entries, because the data is referenced by formulas such as “=CONTINUE(A1, 2, 3)” instead of by values.

    I guess is there any way to work around this, e.g. to have the JSON data sent to the cells as actual values instead of formulas, or something else I’m not aware of?

    Any assistance you could offer would be a godsend. Thanks for the script though -for the advertised purpose, it worked BEAUTIFULLY!

    1. trevor says:

      If you only need to update once in a while, the simple solution is to click the upper-left corner of the worksheet to highlight the entire worksheet, press Copy, create a new worksheet, then use Paste Special > As Values into that sheet. Then you can sort normally.

      You could probably also do this programmatically by calling ImportJSONAdvanced, then calling some Google Spreadsheet methods to insert the data directly into the spreadsheet. I’m pretty sure this can be done, but I don’t know how to do it. You could search for how to insert data programmatically into the spreadsheet.

      The problem right now is that ImportJSON is designed to be used as a function, so it returns the results dynamically, which doesn’t allow for sorting, as you’ve noticed.

      1. Dave says:

        Thanks for the tip! I’ll see what I can find, and if I can figure it out I’ll post it back here in case anyone comes up with the same dilemma. :)

  19. Joey says:

    Hey Trevor,
    Thanks for the great script. I’m trying to use it to pull data from basecamp into a spreadsheet, but it seems to get tripped up by the html basic authentication. Any way to get around this?

    Here’s my function call formatting:

    =ImportJSON("https://username:password@basecamp.com/1111111/api/v1/projects/1111111-my-project/todolists.json", "/name", "noHeaders")

    And this is the error I get:

    error: Unexpected error: https://username:password@basecamp.com/1111111/api/v1/projects/1111111-my-project/todolists.json (line 130, file “Code”)

    1. trevor says:

      Joey,

      ImportJSON uses the UrlFetchApp class from Google, which I don’t think supports using the username & password in the URL itself.

      It may be possible to do this with ImportJSONAdvanced by creating a custom function in a script. UrlFetchApp supports arbitrary headers, so you could pass the username & password using the Authorization HTTP header.

      Try adding this function into the ImportJSON codebase:

      function ImportJSONBasicAuthentication(url, query, parseOptions) {
         var username = 'myUser';
         var password = 'myPassword';
         var fetchOptions = {
            "headers" : {
               "Authorization" : "Basic " + Utilities.base64Encode(username + ":" + password)
            },
            muteHttpExceptions : true
         };

         return ImportJSONAdvanced(url, fetchOptions, query, parseOptions, includeXPath_, defaultTransform_);
      }

      I didn’t have time to test this. But let me know if this works.

      References that may help you are the UrlFetchApp.fetch documentation, RFC 2617 – HTTP Authentication and this post on the Google Apps Script Issues forum. Good luck!

      1. Ralph says:

        Nice one Trevor. I can confirm that this works, though anybody else will want to change the “myUser”,”myPassword” variables …

  20. Anton B says:

    I use your script for writing JSON data to Google Spreadsheet and I have encountered one problem.

    I use the ‘parseJSONObject_’ function on the JSON object from one API and I get data like this:

    [["Contextlimitsum", "Status", "Disableddomains", "Keyword"],
     [100, 'Active', 'mail.ru'],
     [100, 'NoActive'],
     [200, 'Active', 'mail.ru', 'diamond']]

    As you can see, the rows aren’t the same length (the 1st has 4 columns, the 2nd has 3, and so on). This data can’t be written to the spreadsheet using the setValues() function, because all rows need to have the same length. And if I write it one row at a time with sheet.appendRow(), the execution time limit is exceeded.

    Can you help me?

    1. trevor says:

      Since you’re doing this in code, try normalizing the array before you pass it into the setValues() function. That is, go through the array and increase the length of any rows that are too short by populating the extra places with blank or empty data (e.g. '' or null).
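      I haven’t tested this, but a sketch of that normalization (the helper name is just mine for illustration) could look like:

```javascript
// Pads every row of a two-dimensional array with empty strings so that
// all rows end up as long as the longest row. Untested sketch; the
// function name is just for illustration.
function normalizeRows_(data) {
  var maxLength = 0;

  // First pass: find the longest row.
  for (var i = 0; i < data.length; i++) {
    maxLength = Math.max(maxLength, data[i].length);
  }

  // Second pass: pad every shorter row with empty strings.
  for (var j = 0; j < data.length; j++) {
    while (data[j].length < maxLength) {
      data[j].push('');
    }
  }

  return data;
}
```

      After that, one call like sheet.getRange(1, 1, rows.length, rows[0].length).setValues(normalizeRows_(rows)) should work, since every row now has the same width.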

  21. Ethan says:

    Hey Trevor! Loving ImportJSON. Any reason: http://www.google.com/trends/fetchComponent?q=qwerty&cid=TIMESERIES_GRAPH_0&export=3 would not work?

    Here is my formula: =ImportJSON(urlencode(“http://www.google.com/trends/fetchComponent?q=qwerty&cid=TIMESERIES_GRAPH_0&export=3″), “”, “”)

    1. trevor says:

      The response coming back from the Google Trends URL you posted is JSON-P, not pure JSON. So it can’t be parsed by ImportJSON.

      See this issue in the issue tracking system for a potential workaround using a custom function that calls into ImportJSONAdvanced.

      Basically, you need to trim off the beginning and end of the response to get at the pure JSON. Also, if Google ever decides to change the callback function or comments in the JSON-P, you’ll need to adjust where you do that trimming (in case you get it to work & it suddenly stops working at some point).
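      If you do go down that road, a rough, untested sketch of the trimming (the function name is mine) might be:

```javascript
// Strips a JSON-P wrapper such as "callbackName({...});" down to the raw
// JSON between the outermost parentheses. Untested sketch; assumes a
// single wrapping function call with no trailing junk inside the parens.
function stripJsonP_(text) {
  var start = text.indexOf('(');
  var end = text.lastIndexOf(')');

  if (start === -1 || end < start) {
    throw new Error('Response does not look like JSON-P');
  }

  return text.substring(start + 1, end);
}
```

      You would call this on the response text inside a custom function built on ImportJSONAdvanced, before the JSON is parsed.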

  22. Steven says:

    Hello

    Nice work with the JSON script and have a quick question for you. I am trying to pull member details from a page based on the Member ID from another sheet.

    Here is the syntax I have now.

    =ImportJSON(CONCATENATE(ʺhttp://grimreapergamers.com/members/ʺ,URLEncode(ʺʺ&=Memberlist!A3&ʺ.jsonʺ)),,ʺnoHeadersʺ)

    If I put the actual memberID number in where the cell reference is it works, but this way I get a parse error.

    Thanks

    1. trevor says:

      Steven,

      What jumps out right away is the equals sign before the cell reference. Try taking that out.

      Also, you can simplify this in two ways. First, if the member ID is always going to be a number, you don’t need to URL encode the value, since it’ll be the same thing after you encode it. So you can do:

      =ImportJSON("http://grimreapergamers.com/members/" & Memberlist!A3 & ".json",,"noHeaders")

      Or

      =ImportJSON(CONCATENATE("http://grimreapergamers.com/members/", Memberlist!A3, ".json"),,"noHeaders")

      If the member ID might have spaces or non-alphanumeric characters, then you do need to encode it. But just encode that single attribute:

      =ImportJSON(CONCATENATE("http://grimreapergamers.com/members/", URLEncode(Memberlist!A3), ".json"),,"noHeaders")

      Hope that works for you.

      1. Steven says:

        Trevor

        Thanks for the quick reply. Missed the equals sign, thanks. But I am still getting a parse error on all three of the options you suggested. I even tried referencing a cell from the same sheet to see if that was the problem.

        Thanks

        1. Steven says:

          Trevor,

          Nevermind. It seems it does not like copy and paste. I had to retype the entire formula. It works now.

          Thanks Again for the help.

  23. Roy says:

    Hi Trevor,

    This importJSON function just saved me loads of work! Great work! I have one issue I’ve been trying to solve for some days now. I have several URLs with a different parameter that I want to include in one spreadsheet. So:
    http://www.staagg.com/webservices/v4/getTransactions/mypersonalparameters/networkAccId/1367/
    and
    http://www.staagg.com/webservices/v4/getTransactions/mypersonalparameters/networkAccId/1369/

    Is there a way to combine this?

    Thanks!
    Roy

    1. trevor says:

      Roy,

      If you know the response is always a fixed number of rows, you can use the ImportJSON function on the row after the previous response ends. For instance, if the response is always 5 rows, you can insert your first ImportJSON function with the first URL at line 1, the second one at line 6, the third one at line 11, etc.

      If you have a variable number of rows, you’ll need to get into editing the code to create a new function. This function can iterate through your list of potential URLs, call ImportJSON to retrieve the data for each one, and then merge the results together.

      The key to know is that all ImportJSON is doing is returning a 2-dimensional array. So all you need to do is call it multiple times, merge the arrays and return the result.

      Conceptually, you want something like:

      function ImportJSONVariations(baseUrl, variations, query, parseOptions) {
         var variables = variations.toString().split(",");
         var result = new Array();
         var url, variantArray;

         for (var i = 0; i < variables.length; i++) {
            url = baseUrl + variables[i];
            variantArray = ImportJSON(url, query, parseOptions);
            result = result.concat(variantArray);
         }

        return result;
      }

      Then you could use it by passing in a comma-separated list of variations from a base URL:

      =ImportJSONVariations("http://www.staagg.com/webservices/v4/getTransactions/mypersonalparameters/networkAccId/", "1367/,1369/")

      One thing to watch for in the above code: concat() returns a new array rather than modifying the array in place, so make sure its result is assigned back. If the merge still doesn’t behave the way you expect, this answer on StackOverflow might be a useful start to create a function that merges 2-dimensional arrays.
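      For instance, here’s an untested sketch of a merge helper (the name is mine) that also drops the duplicate header rows each ImportJSON call returns:

```javascript
// Merges several two-dimensional ImportJSON-style results into one,
// keeping the header row of the first result and only the data rows of
// the rest. Untested sketch; the function name is just for illustration.
function mergeResults_(results) {
  var merged = [];

  for (var i = 0; i < results.length; i++) {
    // Skip the header row on all but the first result.
    var startRow = (i === 0) ? 0 : 1;

    for (var row = startRow; row < results[i].length; row++) {
      merged.push(results[i][row]);
    }
  }

  return merged;
}
```

      You would call ImportJSON once per URL, collect the returned arrays into a list, and pass that list to this helper.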

      If you do work out a generic solution, please post it here so others can use it. Thanks and good luck!

  24. lam says:

    Hi Trevor, your script is very helpful.
    I get this error when i try import api

    error: Request failed for xxxx returned code 403. Truncated server response: 403 Forbidden Forbidden You don’t have the permission to access … (use muteHttpExceptions option to examine full response) (line 167, file “Code”)

    It seems the site doesn’t allow GAS user agent. Can your script manually change the user agent?

    1. trevor says:

      ImportJSON doesn’t currently have a way to set the headers for a request. So you can’t set the user agent.

      In theory, the UrlFetchApp class that ImportJSON uses can set the headers of the request, and you could try explicitly setting the “User-Agent” header. You could do this by modifying ImportJSON to pass in a ‘headers’ object in the fetchOptions:

      function ImportJSON(url, query, parseOptions) {
        var fetchOptions = { 'headers' : { 'User-Agent' : 'My user agent' } };

        return ImportJSONAdvanced(url, fetchOptions, query, parseOptions, includeXPath_, defaultTransform_);
      }

      I wouldn’t be surprised if Google doesn’t let you override the user agent though. See here for full documentation on UrlFetchApp, which is what ImportJSON uses to do the retrieval. The ‘fetchOptions’ are what are passed through as the ‘params’.

      However, it’s unclear from the error message you posted whether the problem is the user agent. A 403 Forbidden error can occur for many reasons. You may want to set muteHttpExceptions to see the full exception. You can do this by modifying the ImportJSON function above:

      var fetchOptions = { 'muteHttpExceptions' : true };

      Or by using ImportJSONViaPost, which allows you to pass in a limited set of the fetch options:

      =ImportJSONViaPost("http://myurl.com/json", "", "muteHttpExceptions=true")

      I haven’t tested ImportJSON with muteHttpExceptions set, so I don’t know if this will generate another error or not. But it at least gives you an avenue of investigation. Good luck.

  25. Simon Thompson says:

    Thank you so much for this – I’ve used it as the basis of a URL checker for bit.ly libraries. I will blog this on my web site at some point and will be happy to give you credit.

  26. Ethan says:

    Hi Trevor,

    Again, can’t thank you enough for writing this script. I’ve tried many variations of the formula to make it work on this url: http://api.grepwords.com/lookup?apikey=carter&q=mesothelioma|seo. What would the parameters be to get all elements? I know it’s something probably really elementary that I’m missing.

    Thank you for your help in advance.

    - Ethan

    1. trevor says:

      Ethan,

      The problem is the pipe character ‘|’ in your URL. The UrlFetchApp class that ImportJSON uses rejects this as an invalid character. You need to encode this in the URL before you pass it to ImportJSON.

      Two ways of doing this:

      1. Use %7C instead of | in your URL

      This is the encoded value for the pipe character. For instance:

      ImportJSON("http://api.grepwords.com/lookup?apikey=carter&q=mesothelioma%7Cseo")

      2. Use the URLEncode function to encode your parameter

      If you think you might have other non-alphanumeric characters in your ‘q’ parameter, it’s better to just let the URLEncode function do the encoding. For instance:

      ImportJSON(CONCATENATE("http://api.grepwords.com/lookup?apikey=carter&q=", URLEncode("mesothelioma|seo")))

      Note that you cannot encode the entire URL, since the colon, slashes and question mark need to remain unencoded in the URL (i.e., they are special characters whose meaning you want to remain special). Just encode the parameters you need to.
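      In plain JavaScript terms, you can see the same distinction with encodeURIComponent, which behaves similarly to the URLEncode function here:

```javascript
// Encoding only the parameter turns the pipe into %7C while leaving the
// rest of the URL intact.
var param = encodeURIComponent('mesothelioma|seo');
var url = 'http://api.grepwords.com/lookup?apikey=carter&q=' + param;

// Encoding the whole URL breaks it: the colon and slashes get escaped too,
// which is why you should only encode the individual parameter values.
var wholeUrl = encodeURIComponent('http://api.grepwords.com/lookup');
```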

      Trevor

      1. Ethan says:

        Perfect! Worked like a charm. Thank you!

  27. Gabriela says:

    Hi, this is a great script. I got it working in one of my spreadsheets but not in another.

    In the spreadsheet where I got it working, I’ve set the JSON URL and it displays data. My question is, how do I set it to refresh the data automatically? I have tried everything: I changed the spreadsheet configuration, set a trigger in the script and even added Minute(Now()) at the end of my JSON URL. I can’t get it to refresh the data every minute. Can you help me with this?

    The other thing is, when I load the script using ImportJSON button in the script editor it gives me an error at line 130. It says DNS error: http://undefined (line 130, file “importJSON”).

    Thank you.

    1. trevor says:

      To get ImportJSON to refresh every minute, you need to use the function GOOGLECLOCK() instead of NOW(). This forces an update every minute, whereas NOW() only recalculates when the spreadsheet is opened or re-calculated.

      An easy way to do this is to add GOOGLECLOCK() as a 4th parameter:

      =ImportJSON("http://myurl.com", "/", "", GOOGLECLOCK())

      Read the documentation linked above; if you are using the new Google Spreadsheets, you must instead use the Google NOW() function and change the recalculation setting to every minute under File > Spreadsheet settings.

      The error you are getting in the script editor using the ImportJSON button is because you are calling the function with no parameters. ImportJSON needs at least a URL passed in. When you execute it with no parameters, the URL is undefined (hence the error you’re seeing).

      To fix this, add a wrapper function into the script file:

      function ImportJSONTest()
      {
         return ImportJSON("http://myurl.com", "/", "");
      }

      Then use the ImportJSONTest button to run the function. This is how I do debugging of the code.

      1. Leadustin says:

        Hi Trevor,

        hope you can help me with my problem. I have created a new spreadsheet, added your ImportJSON script in the script editor and put this in a cell:

        =ImportJSON("http://www.gw2spidy.com/api/v0.9/json/all-items/5")

        The import works, but the solution with NOW() and the recalculation settings does not.

        Is it possible to add a function which reloads the .csv completely when I refresh the browser?

        1. trevor says:

          You can only use the NOW() function if you are using the new Google Spreadsheets. If you are still using the old Google Spreadsheets, you have to use GOOGLECLOCK().

          Re-read the comment above that you replied to for full details. Unfortunately, this is all I know about how to get ImportJSON to automatically refresh. I did a test using the old Google Spreadsheets and GOOGLECLOCK() worked for me. I haven’t tried using the new Google Spreadsheets and changing the recalculation settings.

          1. Greg says:

            Looks like NOW() won’t work with custom functions:

            https://developers.google.com/apps-script/migration/sheets

            :(

          2. crypt says:

            I am having issues with your fix here Fedora.
            GoogleClock has been disabled, and the refresh menu has been removed from the new Google Sheets.

            Although the API changes on my link http://poloniex.com/public?command=returnOrderBook&currencyPair=BTC_XMR

            Sheet: https://docs.google.com/spreadsheet/ccc?key=0AmSWZWkGJ1V3dFRwYmoySElGbGxsTEFqRVJZc3Y0Qnc#gid=1

            My table will not refresh automatically, no matter what I do or how long I wait. Help!

          3. trevor says:

            Crypt,

            The GoogleClock() function still should be active in the old Google Spreadsheets. It’s not active in the new version. As Greg has noted, there is no way to update your data automatically using custom functions in the new Google Spreadsheets. You have to use the old version.

            The sheet you linked to is using the old version of Google Spreadsheets. To make ImportJSON execute every minute, simply add GoogleClock() as the fourth parameter:

            =ImportJSON("http://poloniex.com/public?command=returnOrderBook&currencyPair=BTC_XMR","","noTruncate", GOOGLECLOCK())

            This fourth parameter is ignored by ImportJSON, but causes the function to be re-evaluated every minute.

            To create a new spreadsheet using the old Google Spreadsheets, go to http://g.co/oldsheets, then save it to your Google Drive as you normally would.

            Hope this helps.

            Trevor

          4. crypt says:

            I think that’s working Thanks!

  28. RICHARD says:

    The script works great. Sometimes, though, an API response’s data is placed by the script into columns instead of rows. For example, I get the following headers back.

    /478755/x1 /478755/x2 /478755/x3 /478755/x4 /478755/x5 /615424/x1

    I do not want 478755 to be in the header; it is basically the value of a row id. And I want 615424 to be on the next row. Is there a way to achieve this?

    1. trevor says:

      Richard,

      Can you point me to the URL that generates the JSON or copy-and-paste part of the JSON into a comment?

      JSON is a key-value pair format. For ImportJSON, the column headers are always created using the keys and the data always comes from the values.

      If the JSON you’re trying to retrieve uses the row IDs as the keys for the rows rather than the values, then you’ll see those in the column headers. Right now there’s no way to format this differently.

      Trevor

  29. Tim says:

    Hi Trevor, thanks a lot for your great library.
    I’d like to use it to extract the direct iTunes link out of an iTunes Search API result.

    The Search Results URL would be e.g.:
    https://itunes.apple.com/search?term=%22notorious%20big%20juicy%22&country=de&limit=1%22

    So I’ve set up a spreadsheet with your script.
    =Importjson(“https://itunes.apple.com/search?term=”&A5&”&country=de&limit=1″)
    works fine and extracts the entire search result into the spreadsheet. However, I’m only interested in the URL that’s located in

    trackViewUrl”: “https://itunes.apple.com/de/album/juicy/id204669326?i=204669559&uo=4″

    Unfortunately, whenever I’m trying to use a query to only extract this part, I’m getting a Parsing Error.

    Then I noticed that even when I paste the example code (see below) from your page, it creates a parsing error.

    =ImportJSON(“http://gdata.youtube.com/feeds/api/standardfeeds/most_popular?v=2&alt=json”, “/feed/entry/title,/feed/entry/content”, “noInherit,noTruncate,rawHeaders”)
    -> Error

    However,
    =ImportJSON(“http://gdata.youtube.com/feeds/api/standardfeeds/most_popular?v=2&alt=json”)
    works fine.

    Do you have any ideas what I’m missing? Everything works except the use of the query path.

    Thanks a lot!
    Tim

    1. trevor says:

      Tim,

      I suspect your problem is due to the format of the formula, rather than a problem with ImportJSON itself.

      When I copy and paste the test you pasted above into Google Spreadsheets, I get a parsing error. But this is because the double quotes being used are smart quotes rather than regular double quotes. Google Spreadsheets doesn’t recognize smart quotes as string delimiters, so throws a Parsing Error trying to parse the formula. This prevents ImportJSON from ever being called.

      If you convert your smart quotes to regular quotes, you should be able to use:

      =ImportJSON("https://itunes.apple.com/search?term=”&A5&”&country=de&limit=1", "/results/trackViewUrl", "noInherit,noTruncate,rawHeaders")

      Try that and see if that works.

      Trevor

  30. Ralph says:

    Hi Trevor,

    I think you have a bug with null values. When a new row contains columns with a null value, you are inserting the non-null value from the previous row rather than null, as it should be.

    No biggy, as I changed my API to replace nulls with something else as a workaround.

    Cheers

    1. trevor says:

      Ralph,

      Can you provide an example that demonstrates the problem so I can test it?

      Also, confirm whether the bug still exists if you use the “noInherit” option. By design, rows inherit the values of their parent rows, but they shouldn’t inherit the values of their sibling rows.

      Thanks.

      Trevor

      1. Brian Lewis says:

        I’m having the same problem, and the first thing I tried was the noInherit option.

        JSON for reproduction:

        {
          "version": "1.0",
          "items":
          [
            {
              "user_id": "1002",
              "first_name": "Sarah",
              "last_name": "Supervisor",
              "title": "Supervisor",
              "last_login_date_utc": "2014-05-09T21:18:11.363"
            },
            {
              "user_id": "1003",
              "first_name": "Edward ",
              "last_name": "Employee",
              "title": null,
              "last_login_date_utc": null
            }
          ]
        }

        The row with user_id = 1003 consistently shows a title of “Supervisor” and a last login date of 2014-05-09T21:18:11.363, where both values should be null.

        1. Brian Lewis says:

          On line 286, why are you checking for i > 0? Shouldn’t we check whether data[state.rowIndex] has a value, even on the first pass?

          I took out i > 0 && and not only did my data start appearing correctly, but I got back the first row of values I didn’t know I was missing.

          My current line 286 now says:

                  if (data[state.rowIndex]) {
                    state.rowIndex++;
                  }

          I don’t know what else I may have messed up by removing that check, but at least in my single situation I now consistently get the values I was expecting.

  31. Rabah Rahil says:

    Hi, I am trying to install the script to a spreadsheet, but cannot find it in the add-ons library. How do I install it? I cannot find the script manager, as it has been deprecated in light of add-ons. Any thoughts? Thanks!

    1. trevor says:

      Until I have time to figure out how to convert the script to an add-on, you’ll have to install the script manually.

      To do this, open the script editor using the Tools > Script Editor menu item. Then copy-and-paste the ImportJSON code from here and use File > Save to save the script. You should then be able to use ImportJSON within your spreadsheet.

  32. Brad says:

    Hi Trevor,

    Awesome script. It works flawlessly! Thank you so much for your continued contribution to the community.

    I was just wondering if there is an option (presumably in ImportJSONAdvanced()) that allows for the return of only the first value of an array? I have scoured the documentation and found a hint that it may require row == 0 in transformFunc somewhere, but I can’t make heads or tails of it.

    My end-requirement is to return the first value of “/return/ANC/sellorders/price” from the API “http://pubapi.cryptsy.com/api.php?method=singleorderdata&marketid=66″.

    At the moment, I use the function =ImportJSON("http://pubapi.cryptsy.com/api.php?method=singleorderdata&marketid=66", "/return/ANC/sellorders/price", "noHeaders") but this returns 100 values; I simply need the first.

    Thanks again for such an awesome function.

    Brad

    1. trevor says:

      Brad,

      If you are using the new Google Spreadsheets, try using the ARRAY_CONSTRAIN function.

      If you are using the old Google Spreadsheets, then you’ll have to create a custom function which calls ImportJSON. Something like this should work:

      function ImportJSONFirstCell(url, query, parseOptions)
      {
         var result = ImportJSON(url, query, parseOptions);

         return [ result[0][0] ];
      }

      Trevor

  33. statdude says:

    Receiving error ” exceeded maximum execution time” on some calls. What gives?

    Thanks,

    1. trevor says:

      This can occur when the resource you’re connecting to is taking a long time to respond.

  34. telefrancisco says:

    Regarding updating the data in Google Spreadsheets, just to mention that Google Drive time-based triggers do not work. As said earlier, ImportJSON is not a standalone function.

    Regarding the new Google Spreadsheets: you can update ImportJSON every minute by configuring the spreadsheet’s options to recalculate every minute and using NOW() instead of the deprecated GOOGLECLOCK() (which, in fact, updated the spreadsheet every minute, so there is no loss of functionality on this topic in the new spreadsheets).

  35. Daniel Seijo says:

    I think I have some trouble with the return format, because Spreadsheets treats cells with importJSON as strings, not numbers.

    I.e. I have a column with the results 1, 30, 5, 3, 2 from importJSON; then I create a “select A1:A5 order by A desc” and obtain: 5, 30, 3, 2, 1.

    With the same query, but typing the values in manually, I get 30, 5, 3, 2, 1.

    I tried to force the cells to numeric format, but it’s the same. Any idea?

    Thank you!

    1. trevor says:

      Where are you doing the SELECT statement?

      I just did a test in Google Spreadsheets using a YouTube JSON feed and the results were coming back as numbers. Or, at least all of Google’s internal functions were treating them as numbers. So it may be a problem with executing queries against the spreadsheet.

      You could try forcing the numbers by modifying the defaultTransform_ function. To force every value to a float, at the end you could put:

      if (!isNaN(parseFloat(data[row][column]))) {
         data[row][column] = parseFloat(data[row][column]);
      }

      Check the documentation here for a more robust version of parseFloat that you may want to use if your data contains string values that start with number characters.
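      Putting that together, here’s an untested, standalone sketch (the helper name is mine) that coerces a whole result array while leaving mixed strings alone:

```javascript
// Walks a two-dimensional result array (header row first) and converts
// any value that round-trips cleanly through parseFloat into a number.
// Values like '5 apples' are left as strings. Untested sketch; the
// function name is just for illustration.
function coerceNumbers_(data) {
  for (var row = 1; row < data.length; row++) {
    for (var column = 0; column < data[row].length; column++) {
      var parsed = parseFloat(data[row][column]);

      // Only convert when the string is purely numeric, so values that
      // merely start with digits are left untouched.
      if (!isNaN(parsed) && String(parsed) === String(data[row][column])) {
        data[row][column] = parsed;
      }
    }
  }

  return data;
}
```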

      Let me know what you find out. If none of these work, post an link to a sample Google Spreadsheet that demonstrates the problem.

      1. frank says:

        I tried to add the option above, but I didn’t get any number out of my importJSON function.

        Is there anything else I could try?

        1. trevor says:

          Not that I can think of. The code I gave should ensure the values being returned are formatted as numbers.

          Where are you doing a SELECT statement? Are you trying to use the Google Visualization Query Language, or is this from another client?

          Make sure you don’t have any other data in the column besides the header that could force the column to be treated as a string.

          1. frank says:

            I just copied the importJSON code as a new script, and then use importJSON on the tab where I want the data. I don’t use any other program to import. I’m just a noob at this :)

          2. frank says:

            The column is just used for numbers; the rows have some text, but the columns I want to use are just numbers.

          3. frank says:

            In a strange way it worked: when I changed some code to properly format it in my sheet and script, it now has numbers which I can calculate with.

            Thank you very much for your great work :)

  36. John says:

    Hi Trevor,
    Great work. Really nice. I was curious whether I am compliant with the GNU GPL-3.0 license with the change I made below in my copy of your script, which I made in order to get it to work with what I needed. I simply added a headers object directly to the UrlFetchApp call so I could authenticate with the JSON feed I needed to pull into the spreadsheet. It works great. No issues. My question is whether I am technically allowed to make this modification to a copy of the script in my spreadsheet under the GNU GPL 3.0 license.

    function ImportJSONAdvanced(url, query, options, includeFunc, transformFunc) {
      var jsondata = UrlFetchApp.fetch(url, {
        headers: {
          'Authorization': 'apikey ' + 'MyUser:MyAPIKEY'
        }
      });

    1. trevor says:

      John,

      If this is for your own internal use, you definitely are compliant with the license.

      If you are thinking about distributing your changes, check out an overview of the GPL 3.0 license here, or here for a detailed breakdown of the changes new in version 3.0.

      The simplest way to remain compliant is to include your modified source with your distribution, note the changes you made and keep the GPL 3.0 license.

      Trevor

      1. John says:

        Thanks Trevor.
        I linked to you in my blog post.
        http://www.bdna.com/blogs/pull-technopedia-data-google-sheets/
        Thanks again for the great google sheets function.

  37. Fabio Montefuscolo says:

    It’s really awesome!

  38. Raffaele Negro says:

    Hi Trevor,
    I can’t see the ImportJSON formula in my cells even though I added it in the editor, nor do I see it in my Tools submenu. Any thoughts?

    Thanks

    1. trevor says:

      Raffaele,

      Are you entering the ImportJSON formula into your cells?

      ImportJSON is a custom function, not an add-on. It won’t appear on any menus within Google Spreadsheets. Nor does Google Spreadsheets show it as an auto-complete suggestion when you’re typing formulas in cells. You need to use the documentation at the top of this page to use it within your spreadsheet.

      Trevor

  39. Petar says:

    Hello Trevor,

    I’m trying to import data from a game API into a Google Spreadsheet, but I constantly get the error: The coordinates or dimensions of the range are invalid. (line 19, file “Code”)
    I’m new at this and would very much appreciate any help.

    function readJSON() {
    var ss = SpreadsheetApp.getActiveSpreadsheet();
    var ss = SpreadsheetApp.openById("1taOtpenF4-VQez2Pq4-ZpctRtiuagMk_ha14ZGnHcsU");
    var sheet = ss.getActiveSheet();
    var startRow = 1; // Change if needed

    var url = "http://www.cscpro.org/secura/citizen/86935.json"; // Paste your JSON URL here
    var response = UrlFetchApp.fetch(url); // get feed
    var dataAll = JSON.parse(response.getContentText()); //
    var dataSet = dataAll;

    var rows = [],
    data;

    for (i = 0; i < dataSet.length; i++) {
    data = dataSet[i];
    rows.push([data.id, data.name, data.age, data.is_online, data.alive, data.ban, data.level, data.exp, data.strengh]); //your JSON entities here
    }
    dataRange = sheet.getRange(1, 1, rows.length, 3); // 3 denotes total number of entities
    dataRange.setValues(rows);

    }

    1. trevor says:

      Petar,

      Try asking your question on the Google Docs Help Forum.

      I’m sorry, but I’m not an expert at Google Spreadsheets. I wrote the ImportJSON for a specific project and open sourced it for others to use. My experience with Google Spreadsheets is limited to what I needed to learn for this project.

      Trevor

  40. Jumbo Panda says:

    Hey Trevor – Awesome snippet! What a time saver…

    Question on slowing down the API calls when I use this in a Google Docs worksheet to pull in Flurry data. I have a worksheet with 10+ =importJSON(blah) cells in it. They seem to all fire at once, and I exceed the 1-API-call-per-second Flurry limit.

    Any suggestions? Sleep function perhaps, but where?

    Thanks,
    John

    1. trevor says:

      John,

      I’d create a wrapper function that sleeps before executing, e.g.:

      function DelayedImportJSON(url, query, parseOptions, delay, random) {
          Utilities.sleep((random ? delay * Math.random() : delay) * 1000);

          return ImportJSON(url, query, parseOptions);
      }

      This would allow you to sleep a specific number of seconds or a random number of seconds up to a maximum number of seconds.

      To use this, just add a different number after each import statement:

      =DelayedImportJSON("http://myurl/", "", "", 1)
      =DelayedImportJSON("http://myurl/", "", "", 2)
      =DelayedImportJSON("http://myurl/", "", "", 3)

      The statements should then get executed with time in between each one.

      Trevor

      1. Jumbo Panda says:

        Works great! Many thanks.

        Owe you a beer next time you're in Seattle…

        John

  41. James says:

    Hi

    This script is just what I was looking for and I have it working. But I’ve run into a problem, possibly caused by the json format of the API call I am making.

    The json I am loading can be seen at http://jsonviewer.stack.hu/#http://europeana.eu/api//v2/search.json?wskey=ISXVIQBYBQ&query=*:*&start=1&rows=0&profile=facets

    The call I am using is =ImportJSON("http://europeana.eu/api//v2/search.json?wskey=ISXVIQBYBQ&query=*:*&start=1&rows=0&profile=facets", "/facets/name,/facets/fields/label,/facets/fields/count", "noTruncate") which can be seen at https://docs.google.com/spreadsheets/d/18CBvDafs8t8vSc0Y88gx12C-WAbSTrgLDgRk97tTXl4/edit?usp=sharing

    But what I want to do is just list, for example, all the label/count pairs within name:LANGUAGE, as currently only the very first entry is being displayed.

    Is this possible?

    Many thanks, James

    1. trevor says:

      James,

      Apologies for the delay in responding. The problem looks to be two-fold.

      First, there is a bug in the ImportJSON library so not all the rows from your JSON file are being returned. I’ve documented it as Issue 8, but don’t have time right now to look into it. Based on my current schedule, it may be several months before I’ll have time to do another round of bug fixes.

      The other issue is that there is no way to filter based on the values of what is returned. So even if the bug was fixed, you would get all the rows of your data (including all of the name:LANGUAGE ones, not just the first one). The query parameter lets you filter columns, but there's no way currently to filter rows.

      Sorry I couldn’t be more helpful.

      Trevor

  42. ratman720 says:

    Hey Trevor, the script is phenomenal and works wonders for me. Thanks so much for the effort. I'm trying to work through a spreadsheet of cryptocoin data from Cryptsy and the API get functions simply aren't kind. The trouble I am having is I only want the first entry in the orderbook data. Is there any way to get ImportJSON to do this? Advice would be greatly appreciated.

    this is my function for now.

    =ImportJSON(concatenate("http://pubapi.cryptsy.com/api.php?method=singleorderdata&marketid=",A1),concatenate("/return/",left(B1,len(B1)-4),"/sellorders/price"),"noTruncate,noheaders,noInherit")

    I get all the orderbook data but I’m only interested in the first row.

    1. trevor says:

      To only show a subset of rows from the ImportJSON result, use the ARRAY_CONSTRAIN function from the new Google Spreadsheets:

      =ARRAY_CONSTRAIN(ImportJSON(concatenate("http://pubapi.cryptsy.com/api.php?method=singleorderdata&marketid=",A1),concatenate("/return/",left(B1,len(B1)-4),"/sellorders/price"),"noTruncate,noheaders,noInherit"), 1, 20)

      The second parameter is the number of rows to display and the third parameter is the number of columns to display. Set the number of columns to be equal to or greater than the number of columns you’re fetching.

      1. ratman720 says:

        Nice! My big sheet of doom is in the old Sheets and I have needed an excuse to switch over. Thanks for the fast reply; I had no idea that function existed.

  43. Tim says:

    I installed the script and tried to run ImportJSON, but it says “SyntaxError: Empty JSON string (line 168, file “Code”)”

    1. trevor says:

      Check the URL that you are passing into ImportJSON and confirm you are getting data back.

      If the data is coming back when you request it directly in the browser, then post the parameters for the ImportJSON function you are using here, and I’ll see if I can figure it out. If you have a sample Google Spreadsheet showing the problem, that’s ideal.
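      One quick way to confirm what the URL returns is to run a throwaway function in the script editor and check the logs. This is only a sketch; the URL here is a placeholder for your own feed:

```javascript
// Sketch: log the raw response for a feed URL to help diagnose "Empty JSON
// string" errors. Run from the Apps Script editor and check the logs.
// The URL below is a placeholder, not a real endpoint.
function debugFetch() {
  var response = UrlFetchApp.fetch('http://example.com/feed.json');
  Logger.log(response.getResponseCode());  // e.g. 200 for success
  Logger.log(response.getContentText());   // the raw body ImportJSON would parse
}
```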

  44. Justin says:

    Hi Trevor – Thanks for creating this great tool. I’ve been trying to use it with the Edmunds API (vehicle data) to populate a google spreadsheet with all makes/models and some specifications of each vehicle. However ImportJSON is only pulling in one model from each make (e.g. Acura is a make that has several different models, but only the first model is being pulled in to the spreadsheet). I think the issue has something to do with the way that recursion is set up because at the bottom of the google spreadsheet a few additional models are brought in but it seems to stop importing when it gets back up to a make that only has one model listed (e.g. Tesla). Basically, I want column A to contain all makes and column B to contain all models that exist. Do you have any idea why ImportJSON is not pulling the entire population of models in? Is it user error? Any help would be much appreciated.

    Here is an example URL: https://api.edmunds.com/api/vehicle/v2/makes?&year=2014&view=full&fmt=json&api_key=2eq33utnmuh88tuwsbcrednt

    1. trevor says:

      Justin,

      There was a bug in the ImportJSON code in how it handled nested arrays. I just fixed it.

      Since the Script Library is now deprecated, you’ll have to install the code manually. To do so, open the script editor using the Tools > Script editor… menu item. Then copy and paste the new code from here.

      Let me know if this solves the problem. I only had time to do basic testing.

      Trevor

      1. Chris says:

        Hi Trevor,

        I appreciate your work on this very much!

        This fix addressed one issue I was having, but substituted another. Now values from nested arrays are showing up, but the first item in each array is missing. It’s like the index started at 1 rather than 0.

        Thank you in advance for any time you can spend on this one.

        CR

        1. Chris says:

          Trevor,

          I was able to fix this problem by making a minor change to the ParseData_ function. I changed line 286 in version 1.2.1 to:

          if (i >= 0 && data[state.rowIndex]) {

          and it seems to have addressed the issue.

          Thank you!

          CR

  45. Greg Wright says:

    Hi Trevor,
    I’m trying to access ZenDesk ticket information. We have API token authentication enabled.

    ZenDesk expects this:
    https://{domain}.zendesk.com/api/v2/ticketmetrics.json -u {username}/token:{API token}
    When I test via cURL, the format above works correctly.

    I’ve tried every variation of =ImportJSON that I can think of but I cannot pass authentication. Error: Couldn’t authenticate you

    Do you have any suggestions?
    Thanks!

    1. trevor says:

      Greg,

      The current version of ImportJSON does not support password-protected JSON resources.

      However, I describe code in this comment which can be used to add basic authentication support using a hardcoded username & password. ZenDesk appears to use basic authentication, so this should work for you too.

      Hope this helps.

      Trevor
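      For reference, here is a minimal sketch of what such a basic-auth wrapper can look like. The function names are illustrative, not part of the ImportJSON library; the wrapper simply fetches the protected URL with an Authorization header and parses the body:

```javascript
// Sketch: fetch a password-protected JSON feed using HTTP basic authentication.
// Function names are illustrative; adapt the parsed result to ImportJSON's
// flattening (e.g. by extending ImportJSONAdvanced to accept fetch options).

// Build the value for the HTTP Authorization header.
function basicAuthValue(username, password) {
  var credentials = username + ':' + password;
  // Utilities.base64Encode is the Apps Script API; the Buffer fallback exists
  // only so the helper can be exercised outside Apps Script.
  var encoded = (typeof Utilities !== 'undefined')
      ? Utilities.base64Encode(credentials)
      : Buffer.from(credentials).toString('base64');
  return 'Basic ' + encoded;
}

// Fetch the protected URL and return the parsed JSON object.
function fetchJsonWithBasicAuth(url, username, password) {
  var options = {
    headers: { 'Authorization': basicAuthValue(username, password) }
  };
  return JSON.parse(UrlFetchApp.fetch(url, options).getContentText());
}
```

      For ZenDesk API tokens, the username takes the form {email}/token and the password is the token itself, matching the curl example above.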

      1. Greg Wright says:

        I added the code you cited but I still get the "error":"Couldn't authenticate you". I tried a couple of variants with and without the '/token' syntax that ZenDesk asks for, but was not able to get it to work.

        By using the following syntax I am no longer getting the "error":"Couldn't authenticate you":

        =ImportJSON("-u {username}/token:{token} https://{domain}.zendesk.com/api/v2/ticket_metrics.json")

        But now I get Error: SyntaxError: Empty JSON string (line 185).

        Line 185 = var object = JSON.parse(jsondata.getContentText());

        Any other suggestions? Have you ever tried connecting to ZenDesk with ImportJSON?

        Thanks!

        1. trevor says:

          Greg,

          Can you post the code you used?

          Make sure your username is “{username}%2Ftoken” and your password is “{token}”, as this is what your curl arguments are indicating. If you’re not familiar with curl, check out the manpage, which explains the arguments and their formats.

          Note that curl is a completely different program from ImportJSON and none of the curl options work with ImportJSON, nor with URLs in general. So passing in “-u {username}/token:{token} https://{domain}.zendesk.com/api/v2/ticket_metrics.json” is essentially meaningless. You’re passing an invalid URL, so you should be getting an error.

          I haven’t used ZenDesk before with ImportJSON, but based on the Security and Authentication section of the Core API, it should work fine with the code for basic authentication.

          If you’re still having problems, you may have to hire a contractor on Elance or oDesk to help connect. You may need to authenticate using OAuth2, which is also supported by ZenDesk, but that is outside the scope of what I can help you with. You can read about OAuth2 support in UrlFetchApp, the library used by ImportJSON. It requires custom scripting.

          Good luck.

          Trevor

  46. frank says:

    I have a problem with an import job. I'm trying to import an API feed from a website, but it gives me an error.

    It says
    error: SyntaxError: Unterminated object literal (line 140, file “importjson3″)

    I have no idea what is going wrong with this job. Most pages tell me that a \ is missing.

    1. trevor says:

      Frank,

      This error most likely comes from a missing closing brace within the JSON you’re trying to parse. You’ll have to fix the JSON you’re importing for that.

      Trevor

      1. frank says:

        OK, so if I understand correctly, I have to do something about the data that's going to be imported, and not with the import script itself?

        1. trevor says:

          Correct. That is the most likely problem, since “Unterminated object literal” is an error that comes from Google’s JSON parser.

          You can check the validity of your JSON by copying the URL to the JSON, or the full JSON that you get when you access the URL in your browser, into this tool or this tool.

          Trevor
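          Alternatively, a tiny Apps Script sketch can report whether a string parses as JSON at all (the function name is just illustrative):

```javascript
// Sketch: report whether a string is valid JSON, surfacing the parser's
// error message when it is not. Useful for pinpointing feed problems.
function checkJsonFeed(text) {
  try {
    JSON.parse(text);
    return 'OK';
  } catch (e) {
    return 'Invalid JSON: ' + e.message;
  }
}
```

          Paste the feed body into this and run it; an "Unterminated object literal"-style message points at a break in the source data, not in ImportJSON.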

          1. frank says:

            I'm still struggling with the data. I couldn't find out what's wrong with it.
            I'm trying to import this URL:
            http://pubapi.cryptsy.com/api.php?method=marketdatav2

            It always worked before, but one day I started getting the error above.

  47. Jim e says:

    I’ve noticed the script seems to have issues when values have empty strings as such
    [
    {"key":"value"},
    {"key":""}
    ]

    Both rows will import as having “value” under “key” despite one being empty. I’ve been looking through the code but I’m not certain where the fix may lie.

    1. Jim e says:

      I see what I was doing wrong; now it's working exactly as expected. Thank you.

  48. Silver says:

    I’m having an issue with the script not importing ALL of the values for a specific set…

    URL:
    http://www.bungie.net/platform/Destiny/TigerXbox/Account/4611686018437676868/

    I’m using the function like so:
    =IMPORTJSON("http://www.bungie.net/platform/Destiny/TigerXbox/Account/4611686018437676868/","/Response/data/characters/characterBase/characterId","noHeaders")

    The first part yields the address given above. However, only 2 of the 3 characterId values are being displayed. Specifically it is leaving out the FIRST item in the “characters” element….

    Any idea what the issue is?

  49. John says:

    Hi,

    I love that you’ve put up this script, however I’m having a random issue with my spreadsheet.

    When I use it to import values into my sheet I occasionally get the following:
    #ERROR
    Error: SyntaxError. Empty JSON string (line 131).

    I’ve found that if I modify the URL slightly, for example, change http to https and then back again to http, it works. But it means I have to do it for each of the multiple cells I’m importing data for.

    Any ideas on what could be causing Sheets to suddenly find an error and then resolve it upon edit?

  50. Jakke says:

    Hey, I love the script, perfect to import what I want from the WoW Armory. However, since I am loading rather big amounts of info (650 rows, 5 columns, almost 3000 cells), I tend to get the famous “Use utilities.sleep(1000)” error often.

    Is there any way that I can tell ImportJSON not to refresh the information it fetches every time, and instead add a button to my custom menu that allows me to update it? It would make it a lot easier for both the Google servers and for the use we have for the spreadsheet (it's very sluggish until you have had it open for some minutes).
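    One approach, sketched here with placeholder sheet name, URL, and menu labels (none of this is part of the ImportJSON library), is to call ImportJSON from a custom menu item and write the result as static values, so the sheet stops re-fetching on every recalculation:

```javascript
// Sketch: refresh imported data only when a menu item is clicked, instead of
// keeping a live =ImportJSON() formula in the sheet. Assumes the ImportJSON
// library is installed; the sheet name, URL, and labels are placeholders.
function onOpen() {
  SpreadsheetApp.getUi()
      .createMenu('Armory')
      .addItem('Refresh data', 'refreshArmoryData')
      .addToUi();
}

function refreshArmoryData() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Data');
  var rows  = ImportJSON('http://example.com/armory.json', '', 'noTruncate');
  // Static values: the sheet no longer re-fetches on every recalculation.
  sheet.getRange(1, 1, rows.length, rows[0].length).setValues(rows);
}
```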

  51. Andy says:

    I love this. I have a feeling it can do something more advanced I want to see, but I’m new to scripting and having trouble with some of the lingo.

    How do I use this to do a series of json imports by inserting different {client} variables into a url formatted like “https://example.com/{client}/search?term={item}”

    I want to find a way to insert the list of

    https://example.com/{client1}/search?term={item}
    https://example.com/{client2}/search?term={item}
    https://example.com/{client3}/search?term={item}…
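    A wrapper along these lines could loop over the client IDs, build each URL, and stack the imported rows. This is a sketch under the assumption that the ImportJSON library is installed; the URL pattern just follows the question:

```javascript
// Sketch: import the same JSON query for several clients and stack the rows.
// buildClientUrl follows the URL pattern from the question; both function
// names are illustrative.
function buildClientUrl(client, item) {
  return 'https://example.com/' + encodeURIComponent(client) +
         '/search?term=' + encodeURIComponent(item);
}

function ImportJSONForClients(clients, item, query, parseOptions) {
  var rows = [];
  for (var i = 0; i < clients.length; i++) {
    // ImportJSON returns a two-dimensional array; concatenate per client.
    rows = rows.concat(ImportJSON(buildClientUrl(clients[i], item), query, parseOptions));
  }
  return rows;
}
```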

  52. Mike says:

    Does the Google Sheet make a local call?
    That is, if the endpoint is in my network and not publicly available, is it reasonable to expect results? Or does the endpoint have to be public?

  53. Mickael says:

    Hello,

    It looks really useful, two questions:

    1. Do you know how to import this script onto the NEW google spreadsheets? I created the script, but then none of the functions appear on the new spreadsheets… do we need to do anything else?

    2. Do you know the query that I’d have to run here, http://d.yimg.com/autoc.finance.yahoo.com/autoc?query=carnival%20plc&callback=YAHOO.Finance.SymbolSuggest.ssCallback , to get the symbol just for the LSE exch for example?

    Many thanks in advance!!!

    1. Mickael says:

      I managed to see the function in the new spreadsheet, however, a basic try to import it with:

      =ImportJSON("http://d.yimg.com/autoc.finance.yahoo.com/autoc?query=carnival&callback=YAHOO.Finance.SymbolSuggest.ssCallback","","")

      and I’m getting:
      Error: Syntax Error. Unexpected token: Y (line 166)

      I saw that what’s coming back in the browser starts with the Yahoo callback wrapper, and there’s a closing bracket at the end as well. Do you know how to remove that part so that we just have real JSON?
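      That feed is JSONP rather than plain JSON: the payload is wrapped in a callback call. A sketch of stripping the wrapper before parsing (the function name is illustrative):

```javascript
// Sketch: strip a JSONP callback wrapper, e.g.
// YAHOO.Finance.SymbolSuggest.ssCallback({...}), leaving plain JSON.
function stripJsonpWrapper(text) {
  var start = text.indexOf('(');
  var end   = text.lastIndexOf(')');
  if (start === -1 || end === -1 || end < start) {
    return text; // not wrapped; return unchanged
  }
  return text.substring(start + 1, end);
}
```

      The stripped text can then go through JSON.parse, or through a custom function built on ImportJSONAdvanced.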

3 pings

  1. The training spreadsheet, part 4 | Neil's Ironman Blog says:

    […] a custom script to do this. This is much easier than it sounds. I got all of this from here: http://blog.fastfedora.com/projects/import-json, which is a brilliant overview of it. If you don’t want to read it, do […]

  2. How to use triggers with functions that require parameters from the spreadsheet? | Question and Answer says:

    […] I found a simple and great solution to retrieve JSON into a Google spreadsheet. […]

  3. Scraping the Ofsted website. Part 2 – front-end (Javascript and D3) | Philip Nye, freelance journalist - data, Hackney, public policy and more says:

    […] it doesn’t contain an option for importing JSON data – not out the box anyone. Helpfully, someone has written a Google Apps Script that adds an ImportJSON function to Sheets with minimal […]
