Note: This Is Archived Documentation ONLY

The maintenance of this script has been transferred to Brad Jasper. This page exists only for archive purposes.

If you have bugs, feature requests or other issues related to ImportJSON, please submit them via the Issues page for the new GitHub project. The maintained source code is available in the bradjasper/ImportJSON repo.

Many thanks to Brad Jasper, who took over this project in 2017 and has reactivated it to continue development.

ImportJSON imports data from public JSON APIs into Google Spreadsheets. It aims to operate similarly to how the native Google Spreadsheet functions ImportData and ImportXML work.

The ImportJSON library contains four public functions:

  • ImportJSON
    Use to import a JSON feed from a URL.

  • ImportJSONViaPost
    Use to import a JSON feed from a URL using POST parameters.

  • ImportJSONAdvanced
    Use if you are a script developer and want to easily extend the functionality of this library.

  • URLEncode
    Use to URL encode a string to concatenate it to a URL.



ImportJSON

Imports a JSON feed and returns the results to be inserted into a Google Spreadsheet. The JSON feed is flattened to create a two-dimensional array. The first row contains the headers, with each column header indicating the path to that data in the JSON feed. The remaining rows contain the data.
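To illustrate the shape of the result, here is a small sketch in plain JavaScript (not the library's actual flattening code; the feed data is invented):

```javascript
// A tiny feed and the two-dimensional array ImportJSON would conceptually
// produce from it: a header row of paths, then one row per entry.
// (Simplified sketch for illustration only.)
var feed = { entry: [ { title: 'A', content: 'aaa' },
                      { title: 'B', content: 'bbb' } ] };

var rows = [['/entry/title', '/entry/content']].concat(
  feed.entry.map(function (e) { return [e.title, e.content]; })
);

console.log(JSON.stringify(rows));
// [["/entry/title","/entry/content"],["A","aaa"],["B","bbb"]]
```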

ImportJSON takes 3 parameters:

  • url
    The URL to a JSON feed.

  • query
    A comma-separated list of paths to import. Any path starting with one of these paths gets imported.

  • parseOptions
    A comma-separated list of options that alter processing of the data.

By default, data gets transformed so it looks more like a normal data import. Specifically:

  • Data from parent JSON elements gets inherited to their child elements, so rows representing child elements contain the values of the rows representing their parent elements.
  • Values longer than 256 characters get truncated.
  • Headers have slashes converted to spaces, common prefixes removed and the resulting text converted to title case.
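As a rough sketch of the header prettification (a hypothetical re-implementation for illustration; the real library differs in details such as how the common prefix is detected):

```javascript
// Prettify a header path: strip a common prefix, turn slashes into spaces,
// and title-case each word. (Illustrative only, not the library's code.)
function prettifyHeader(path, commonPrefix) {
  var trimmed = path.indexOf(commonPrefix) === 0 ? path.slice(commonPrefix.length) : path;
  return trimmed
    .split('/')
    .filter(function (part) { return part.length > 0; })
    .map(function (part) { return part.charAt(0).toUpperCase() + part.slice(1).toLowerCase(); })
    .join(' ');
}

console.log(prettifyHeader('/feed/entry/title', '/feed/entry'));          // "Title"
console.log(prettifyHeader('/feed/entry/media/content', '/feed/entry'));  // "Media Content"
```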

To change this behavior, pass in one of these values in the parseOptions parameter:

  • noInherit
    Don’t inherit values from parent elements

  • noTruncate
    Don’t truncate values

  • rawHeaders
    Don’t prettify headers

  • noHeaders
    Don’t include headers, only the data

  • debugLocation
    Prepend each value with the row & column it belongs in

For example, to return the number of shares and comments for a URL from the Facebook Graph API, you could use:

=ImportJSON(CONCATENATE("", URLEncode("")),  "", "")

If you wanted to get rid of the headers, you would pass “noHeaders” as the last parameter:

=ImportJSON(CONCATENATE("", URLEncode("")), "", "noHeaders")

As an advanced example, if you wanted to query YouTube for the most popular videos, but only see the data returned relating to the ‘title’ and the ‘content’, you could use:

=ImportJSON("", "/feed/entry/title,/feed/entry/content", "noInherit,noTruncate,rawHeaders")

The “rawHeaders” option lets us see the full path to each column of data in the spreadsheet.


ImportJSONViaPost

Imports a JSON feed via a POST request and returns the results to be inserted into a Google Spreadsheet. This function works the same as ImportJSON, but allows you to specify a payload and fetch options to perform a POST request instead of a GET request.

To retrieve the JSON, a POST request is sent to the URL and the payload is passed as the content of the request using the content type “application/x-www-form-urlencoded”. If the fetchOptions define a value for “method”, “payload” or “contentType”, those values take precedence. For example, advanced users can use this to make this function pass XML as the payload using a GET request and a content type of “application/xml; charset=utf-8”.
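The precedence rule can be pictured as a simple merge where user-supplied options win (a sketch; Object.assign here stands in for however the library actually applies the overrides):

```javascript
// Defaults ImportJSONViaPost would otherwise use, and user overrides from
// fetchOptions. Later sources win in Object.assign, so the overrides take
// precedence. (Illustrative values only.)
var defaults = {
  "method"      : "post",
  "contentType" : "application/x-www-form-urlencoded"
};
var overrides = {
  "method"      : "get",
  "contentType" : "application/xml; charset=utf-8"
};

var merged = Object.assign({}, defaults, overrides);

console.log(merged.method);       // "get"
console.log(merged.contentType);  // "application/xml; charset=utf-8"
```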

ImportJSONViaPost takes 5 parameters:

  • url
    The URL to a JSON feed.

  • payload
    The content to pass with the POST request; usually a URL encoded list of name-value parameters separated by ampersands. Use the URLEncode function to URL encode parameters.

  • fetchOptions
    A comma-separated list of options used to retrieve the JSON feed from the URL.

  • query
    A comma-separated list of paths to import. Any path starting with one of these paths gets imported.

  • parseOptions
    A comma-separated list of options that alter processing of the data.

For more information on the available fetch options, see the documentation for UrlFetchApp. At this time the “headers” option is not supported.

For a list of the supported parseOptions and how to use queries, see ImportJSON.


ImportJSONAdvanced

An advanced version of ImportJSON designed to be easily extended by a script. This version cannot be called from within a spreadsheet.

Imports a JSON feed and returns the results to be inserted into a Google Spreadsheet. The JSON feed is flattened to create a two-dimensional array. The first row contains the headers, with each column header indicating the path to that data in the JSON feed. The remaining rows contain the data.

ImportJSONAdvanced takes 6 parameters:

  • url
    The URL to a JSON feed.

  • fetchOptions
    An Object whose properties are the options used to retrieve the JSON feed from the URL.

  • query
    A comma-separated list of paths to import. Any path starting with one of these paths gets imported.

  • parseOptions
    A comma-separated list of options that alter processing of the data.

  • includeFunc
    A function with the signature func(query, path, options) that returns true if the data element at the given path should be included or false otherwise.

  • transformFunc
    A function with the signature func(data, row, column, options) where data is a 2-dimensional array of the data and row & column are the current row and column being processed. Any return value is ignored. Note that row 0 contains the headers for the data, so test for row==0 to process headers only.

The function returns a two-dimensional array containing the data, with the first row containing headers.

The fetchOptions can be used to change how the JSON feed is retrieved. For instance, the “method” and “payload” options can be set to pass a POST request with post parameters. For more information on the available parameters, see the documentation for UrlFetchApp. The fetchOptions must be an Object.

Use the include and transformation functions to determine what to include in the import and how to transform the data after it is imported.

For example:

          ImportJSONAdvanced(url,
             { "method" : "post", "payload" : "user=bob&apikey=xxxx" },
             query,
             parseOptions,
             function (query, path) { return path.indexOf(query) == 0; },
             function (data, row, column) { data[row][column] = data[row][column].toString().substr(0, 100); })

In this example, the include function checks to see if the path to the data being imported starts with the query. The transform function takes the data and truncates it to 100 characters. For more robust versions of these functions, see the internal code of this library.
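Both callbacks can be exercised outside a spreadsheet. A minimal runnable sketch (paths and data invented for illustration):

```javascript
// The include function: a simple starts-with match on the path.
function includeFunc(query, path) {
  return path.indexOf(query) === 0;
}

// The transform function: truncate each cell to 100 characters in place.
function transformFunc(data, row, column) {
  data[row][column] = data[row][column].toString().substr(0, 100);
}

console.log(includeFunc('/feed/entry', '/feed/entry/title'));  // true
console.log(includeFunc('/feed/entry', '/feed/id'));           // false

var data = [['Title'], [new Array(150).join('x')]];  // one 149-character value
transformFunc(data, 1, 0);
console.log(data[1][0].length);  // 100
```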


URLEncode

Encodes the given value to use within a URL.

URLEncode takes 1 parameter:

  • value
    The value to be encoded.

The function returns the given value encoded using the URL encoding scheme.

For instance:

=URLEncode("Value with spaces")

Returns the value “Value%20with%20spaces”.

You can use the URLEncode function to create URL (GET) and POST parameters by combining it with the CONCATENATE function. For instance:

=CONCATENATE("param1=", URLEncode("Value 1"), "&", "param2=", URLEncode("Value 2"))

Would produce “param1=Value%201&param2=Value%202”. This could then be added to the query string of a URL or passed as a payload to the ImportJSONViaPost function.
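The same results can be reproduced in plain JavaScript, where encodeURIComponent behaves like URLEncode for these inputs (a sketch for illustration, not the library's implementation):

```javascript
// URL encoding a value with spaces:
console.log(encodeURIComponent('Value with spaces'));  // "Value%20with%20spaces"

// Building the two-parameter payload from the CONCATENATE example:
var payload = 'param1=' + encodeURIComponent('Value 1') +
              '&param2=' + encodeURIComponent('Value 2');

console.log(payload);  // "param1=Value%201&param2=Value%202"
```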

Source Code

The code is now available in a GitHub repository.

Bugs & Feature Requests

If you have features you’d like to see added, or notice bugs in this library, please submit a new issue on GitHub. If you don’t have a GitHub account, you can always leave a comment on this page too.

The library isn’t under active development, so don’t expect issues to be fixed quickly. But I do occasionally revisit it and add new features or fix bugs. If you would like to contribute to the development of this library, let me know.


Comments

  1. Nathan Ziarek says:

    I can’t figure out what the xPath would be for something like this:

       blah: "blah",
       aaData: [
          ["John", "Smith"],
          ["Tody", "Harris"]
       ]

    …transformed to a table like this:

    John  |  Smith  |
    Tody  |  Harris  |

    Am I barking up the wrong tree?

    1. trevor says:

      The ImportJSON library doesn’t use XPath syntax. Eventually it should use a JSONPath syntax, but for now it does a simple starts-with match.

      If you want to use the function to import the JSON data you showed, use the query “/aaData”. Though currently the ImportJSON function would return the JSON you provided as a comma-separated list within a single cell because it currently collapses all scalar arrays into single values.

      If I have time I’ll work on adding an option to allow you to import scalar arrays as arrays in the final result. For my initial test cases it made sense to import them as a single cell, but now that I have your test case, I can add to it. Thanks.

      1. Nick says:

        Hello trevor,

        Thank you for your awesome script!! It is amazing!
        I am really looking forward to what Nathan Ziarek is looking for.

        Do you have any news regarding that request?

        Thank you,

        1. trevor says:

          I haven’t had time to work on this script, so no. But your request is definitely noted, so I’ll bump up the priority of this feature when I do start working on it.

          You can follow the issue at GitHub: Add option to return scalar arrays as arrays. Click Watch Thread to receive e-mail updates about this specific feature.

          1. Eugene says:

            Hi! Any luck with scalar arrays parsing?

  2. Bo says:

    I’m trying to pull;

    “subscriberCount”: “240750” just the 240750 number into a cell from this json;

    How would I structure the query to do that?

    1. trevor says:

      Use the path to the JSON attribute you want to retrieve, using a slash to separate elements and then set the options parameter to “noHeaders”.

      In your example, the path would be “/entry/yt$statistics/subscriberCount”, so your import function would look like:

      =ImportJSON("", "/entry/yt$statistics/subscriberCount", "noHeaders")

  3. Laura- says:

    Nice library! One suggestion would be to tag it with a version, so that other developers can include it in their own scripts.

    Currently, I had to fork it to give it a version, which isn’t ideal.

    1. trevor says:

      Thanks. I just re-published the script with versions enabled. The current version is now v1.1.1 which translates to the Google script version 3. It may take a couple days for the script to be published in the main directory.

      1. Laura- says:

        Hmm… you sure you published this? I still only see the former version available in Google :/

        1. trevor says:

          I did publish it, but didn’t check the Script Gallery beforehand. It looks like someone using the e-mail address “” republished my library back in January and hijacked my listing.

          It looks like I may not be able to get this republished until I can get Google to re-assign ownership back to me. I’m working on figuring out how to do this.

          Unfortunately, it looks like Google Scripts is not well supported by Google. The Report Abuse link points to an old e-mail that bounces and I never wind up getting notifications when the scripts do get published.

          I’m trying multiple avenues, so hopefully they’ll get the problem fixed soon. Thanks for letting me know!

        2. trevor says:

          Laura, I was able to get everything sorted out with the Google Apps team, so now my version with versioning is available in the Script Library.

          I also just updated the library to provide support for POST requests. I just submitted it, so it should be available in the next couple days. Thanks for alerting me to the issue.

  4. Besim Karadeniz says:

    Hello Trevor,
    I’m experimenting with the Facebook API and I’m seeing some strange behaviour. I have the following API string to get information for two Facebook pages at once:

    =ImportJSON(",116335138405853", "", "noHeaders")

    The result of this call is put in only one line of my spreadsheet although I get results for two pages. The results of the second page are put in the same line after the results of the first page. I expected the data of the second page to appear under the first page in a second line.

    The JSON output looks good to me; I cannot find any error.

    I think there could be a problem in your script with handling the comma in the Facebook API call.


    1. trevor says:


      The problem is not with the script, but with the way Facebook is returning multiple objects in the API. Rather than returning the objects as a top-level array, which ImportJSON would correctly parse as multiple rows, Facebook is returning the objects as a key-value map where the key is the id of each object.

      This makes each object appear to be attributes of a single top-level object, rather than multiple objects. Hence the single row.

      There may be a way to add another function parameter, or change how the query gets parsed, to make this work the way you’re expecting it to. I added an enhancement request in Github for this (see Figure out how to handle multiple objects in Facebook API using id fields).

      Though it’s not a trivial fix, and I don’t know when I’ll have time to work on it. If I do add this functionality, I’ll definitely let you know. In the meantime, your best bet is to use separate ImportJSON calls, one per line.

      – Trevor

      1. Besim Karadeniz says:

        Hello Trevor,

        I just wanted to ask just what the status is of this feature. Is there hope that it might be implemented in the near future?


        1. trevor says:


          Unfortunately I don’t have any time right now to work on this feature. My business is booming and is consuming all of my time. The soonest things might slow down is November.

          The code is on Github, so if someone else decides to add the feature, I’ll go ahead and publish it. Otherwise, I’ll let you know when I’m able to get around to it.

          — Trevor

  5. Nathan says:

    This is so great! I’ve been running this for a bit and pretty quickly hit the Google Docs rate limit. It returns #ERROR! and the following message:
    Service invoked too many times in a short time: urlfetch. Try Utilities.sleep(1000) between calls. (line 167, file “Code”)

    I don’t know how to add this myself. How could I add this pause so that I wouldn’t keep hitting the rate limit? Thanks!

    1. trevor says:

      From within your Google Spreadsheet:

      1. Use the Tools > Script editor menu item to launch the script editor.
      2. Add Utilities.sleep(1000); on line 167 so it becomes the first line of the ImportJSONAdvanced function. The code var jsondata = UrlFetchApp.fetch(url, fetchOptions); should now be on line 168 after you do this.
      3. Use File > Save to save the script.
      4. Go back to your spreadsheet, which should refresh in a couple seconds to use the new code.

      Experiment with smaller values than 1000 milliseconds if this is taking too long between calls.

      1. Nathan says:

        As long as I only fetch 50 at a time, I don’t hit the rate limit. If I push it up to over 100 at a time, I hit the #ERROR!, even after adding the sleep instruction. From watching the page load, it doesn’t appear that the sheet loads one JSON record, pauses for 1 second, then loads the next. It looks like they just snowball. Maybe that’s just how it looks, though.

        Pulling down 10,000 records is a bit of a slog, 50 at a time, but it’s sure better than the process I used previously.

        1. trevor says:

          Now that I think about it, that makes sense. Each function on the spreadsheet is likely executed in a separate thread. So the sleep being executed on one instance doesn’t affect the other instances.

          One question though: why do you need separate statements for each row?

          If the API you’re querying was designed well, it should have a way to pull multiple records at the same time. The ImportJSON places each record into a separate row if it receives multiple rows of data. This way you’d only use a single ImportJSON statement instead of multiple statements, and you shouldn’t hit the rate limit.

          If the API doesn’t have a way to query multiple rows of data, consider contacting the creator of the API and asking them to add this capability.

          The other thing you can try is to write your own JavaScript function that calls ImportJSONAdvanced multiple times with different parameters and concatenates the results. This would then run in a single thread and the sleep function would work as intended.
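That single-threaded approach could be sketched like this (hypothetical helper: importOne stands in for a call to ImportJSONAdvanced, and the Utilities.sleep call is noted as a comment so the sketch stays runnable outside Apps Script):

```javascript
// Hypothetical helper: importOne stands in for a call to ImportJSONAdvanced
// for a single URL; running everything in one function keeps it in one thread.
function importMany(urls, importOne) {
  var all = [];
  urls.forEach(function (url, i) {
    var rows = importOne(url);
    if (i > 0 && rows.length > 0) rows = rows.slice(1);  // drop repeated header row
    all = all.concat(rows);
    // In Apps Script, call Utilities.sleep(1000) here to pace the fetches.
  });
  return all;
}

// Demo with a stubbed importOne (no network access):
var result = importMany(['a', 'b'], function (url) {
  return [['Header'], [url + '-row']];
});

console.log(JSON.stringify(result));  // [["Header"],["a-row"],["b-row"]]
```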

        2. Rabah Rahil says:

          Having the same issues Nathan. Did you ever resolve this?

          P.S. Thanks soo much Trevor this script is amazing!

          1. Cara says:

            Hello guys,

            I, too, am having this issue with the “Send to Google Drive” script (

            Can any of you guys help me with the usage of Utilities.sleep()? The error I’m getting is “Service invoked too many times in a short time: driveWriteVolume rateMax. Try Utilities.sleep(1000) between calls.”

            Any help is appreciated! Thank you!

  6. Martin Snygg says:

    Hey looks like the exact thing I need but i get:

    error: SyntaxError: Unexpected token: < (line 168, file "Code")

    Trying to use this:

    What should the right formatted =ImportJSON look like in my case?

    1. trevor says:

      The API you refer to requires you to substitute {PLATOON_ID} in the URL with an actual platoon ID. It looks like if the wrong platoon ID is entered, an HTML error page is returned rather than JSON. Which is probably what’s causing the error you’re seeing.

      Try using:


      And replace {PLATOON_ID} with the ID of the platoon you want to query. If that’s still generating an error, try accessing the URL directly in the browser to confirm you’re getting JSON back (it should look like plain text with lots of { and } in the text).

      If you are getting back JSON, but still getting the error, post the function exactly as you’re using it and I can look into it.

  7. Andriy says:


    Thanks for the info, this is exactly what I needed

    But I have some issues and I don’t know how to deal with them (it’s 3 in the morning and I can’t go to bed :))

    So here it is:

    I’m trying to parse some info from the

    I need the “phrases_count” data

    I’ve tried different variations based on your example, but I get the parse error every time :(

    I tried to use this one:

    =ImportJSON(“”, “/feed/entry/title,/feed/entry/content”, “noInherit,noTruncate,rawHeaders”)

    And I got the damn error

    But everything works just fine if I use only the following code:


    I don’t get any errors, just lots of tables and data.

    So my question is:

    Why do your examples return the error and can you please tell me how can I get only phrases_count data from

    And here’s the screenshot if the link wouldn’t work:

  8. Andriy says:

    the problem solved by using ; instead of ,

  9. Peter Daly-Dickson says:

    You sir, are a feckin’ genius.

    A few hours on Google led me to ImportJSON, and it has given me EXACTLY what I was looking for in my Google Doc.

    Can’t thank you enough! :)



  10. Michael Hirsch says:

    This is really great! However I am experiencing a strange issue.

    I set up a time based trigger to update my spreadsheet automatically every minute (I’m creating a visualization) that tracks NYC’s citibike program. However, the spreadsheet isn’t updating based on the trigger, and instead only updates when I go into the Script Editor and press save. I don’t even have to make any changes.

    Here are the execution logs… can you help me out? Thanks so much!

    [13-10-25 12:04:17:290 EDT] Starting execution
    [13-10-25 12:04:17:298 EDT] UrlFetchApp.fetch([[object Object], null]) [0 seconds]
    [13-10-25 12:04:17:339 EDT] Execution failed: Invalid argument: http://object%20Object (line 167, file “Code”) [0.04 seconds total runtime]

    1. trevor says:


      Sorry for the delayed response. I’ve been travelling.

      I’ve never used time-based triggers, so can’t help you out there.

      I do know why it updates when you press Save in the Script Editor. By pressing Save, you’re telling Google Spreadsheet the script has been updated and needs to be re-run.

      I’m not sure how to interpret your logs. It almost looks like the URL being used to fetch the results is the string “object Object”, which is the string representation of an object. This shouldn’t be occurring in the ImportJSON library (nor are there normally execution logs created from the library), so I don’t really know where to point you.

      If you do figure out the cause, or how to get time-based triggers to work, can you leave a reply here so others can learn from it. Thanks.

      Best of luck getting it to work.

      – Trevor

  11. aweesom1 says:

    Currently: Data from parent JSON elements gets inherited to their child elements, so rows representing child elements contain the values of the rows representing their parent elements.

    Is it possible to switch over to COLUMNS representing child elements contain the values of the COLUMNS representing their parent elements?

    I am running into the problem where there are 100+ child elements for one parent, and google docs is not wide enough to display all of them. It is however tall enough, but how do we get the function to display the data in rows instead of across columns?

    Sample Data
    =ImportJSON(, "/data/1000008386/members", "")


    1. trevor says:

      Use the TRANSPOSE function built into Google Spreadsheet. For example:

      =TRANSPOSE(ImportJSON(, "/data/1000008386/members", ""))

  12. Dominik says:

    how do you get this to update the data every few minutes?

    1. trevor says:

      You’d have to set up a time-based trigger. I don’t know how to do this, and Michael Hirsch commented earlier that he was having problems getting this to work with ImportJSON.

      Be careful about setting up a time-based trigger in the Script Editor. I believe these only run the scripts, rather than asking the spreadsheet to update. The ImportJSON function doesn’t work standalone in the script editor; you have to create a parameterless wrapper function which passes parameters to ImportJSON to get it to work.

      Or at least, I’ve never been able to figure out how to run a function with parameters from within the Script Editor. During debugging, I just create a wrapper function and test it.

    2. David Dollar says:

      Add the following value to a field on your sheet:

      =CONCATENATE(YEAR(NOW()), "-", MONTH(NOW()), "-", DAY(NOW()), " ", HOUR(NOW()), ":", MINUTE(NOW()))

      You can set the foreground and background color the same to hide it.

      You should then specify that field as an extra argument to ImportJSON() and friends.

      Google caches the result of functions based on the parameters input. You need to bust this cache with a column that updates every minute like above.

      1. Craig says:

        How do I specify that field as an extra argument to importJSON?

        I need to update the JSON script every few minutes but the trigger doesn’t seem to be working. When I try to run the script it says “DNS error: http://undefined”.

        Thank you


        1. Rory says:

          Hi Craig,

          Did you ever get this sorted? I’m after the same sort of thing and having similar problems.



          1. Tuomo says:

            Hello. Does anyone know how to make this work? I have similar problem but I haven’t found any solution yet.

      2. Mohan says:

        Hi David,

        Can you please provide a sample on how to implement the extra argument – “You should then specify that field as an extra argument to ImportJSON() and friends.”

        FYI: I am trying to invoke ImportJson() from within the script.

        Your help is appreciated. Thanks.


  13. Ardalan says:

    Do you know if this can handle cursor paging with Facebook’s GRAPH API?

    Essentially what happens is it only displays ONE child data element, and the other child elements appear as a paginated link… yet the last parent element shows all its child elements.


    1. trevor says:

      No, the ImportJSON library doesn’t handle paginated links.

      You could use the ImportJSONAdvanced function within a loop so it retrieves the data, grabs the paging/next link at the end and then uses that to retrieve the next set of data. Then append all the results together into a single array to return them to the spreadsheet.
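A sketch of that loop (hypothetical: fetchPage stands in for a call to ImportJSONAdvanced that also extracts the paging/next link from the response):

```javascript
// Follow paging/next links until there are none, concatenating all rows.
// fetchPage is a stand-in for a wrapper around ImportJSONAdvanced.
function importAllPages(firstUrl, fetchPage) {
  var rows = [];
  var url = firstUrl;
  while (url) {
    var page = fetchPage(url);      // expected shape: { rows: [...], next: urlOrNull }
    rows = rows.concat(page.rows);
    url = page.next;
  }
  return rows;
}

// Mock pages to show the loop terminating on a null next link:
var pages = {
  'p1': { rows: [['a'], ['b']], next: 'p2' },
  'p2': { rows: [['c']],        next: null }
};
var all = importAllPages('p1', function (url) { return pages[url]; });

console.log(JSON.stringify(all));  // [["a"],["b"],["c"]]
```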

  14. Greg Gerber says:

    Turns out it’s called “time-driven triggers” and it’s surprisingly simple. Instructions here:

  15. Patrick says:

    Is there a way to pass a request header using this function? The site I am currently trying to pull data from requires that you post an identifier header.

    1. trevor says:

      Depending on what you’re trying to do, you may be able to use the ImportJSONViaPost function and set the fetchOptions to include the header you want sent. See the list of options you can set in the documentation for UrlFetchApp.fetch, which is what ImportJSON uses to retrieve the JSON from the server.

      If you need to send a custom header, you likely need to call ImportJSONViaPost from a script, or modify the library to convert text headers into a JavaScript object, since the ‘headers’ option requires an object, not a string.

      Also, though the function is called ImportJSONViaPost, you can do GET requests by setting the ‘method’ fetch option to “GET”.

  16. Brendan says:

    Thanks for the script. However, it’s not working with a JSONP url (Example:

    Any ideas?

    1. trevor says:

      The problem is that JSON-P adds extra JavaScript code that the JSON parser can’t parse. This should be able to be gotten around by trimming the start and end of the JSON-P response.

      I just created the issue Figure out how to import JSON-P data in the issue tracking system. Check this out for a way you can modify the code to get what you want. If you try it, please add a comment that describes whether it was successful or not.

  17. Rory says:

    Hi, thanks a lot for the script. It’s great for what I’m doing.

    I can’t work out how to tailor it to my needs though. I’m using this code to gather photos for a foursquare venue:

    =ImportJSON("", "/response/venue/photos/groups/items/prefix,/response/venue/photos/groups/items/width,/response/venue/photos/groups/items/height,/response/venue/photos/groups/items/suffix", "noHeaders")

    This creates a column for prefix, suffix, width and height and a row for each photo then populates the cells with the relevant data. However, I wish to combine the prefix, suffix, width and height for each photo along with the letter ‘x’ to generate a full url for each photo and display the URLS in a single cell separated by commas, like so:

    URL1,URL2,URL3,URL4 etc

    Is there any way to do this?

    Thanks for any help.

    1. trevor says:

      If you have a known number of rows, you should be able to do this by setting up your worksheet as follows:

      1. Enter your ImportJSON function in B1

      2. In A1, enter =CONCATENATE(B1, C1, D1, E1) to concatenate the prefix, width, height and suffix into a single column. Add whatever other text you need as parameters here. So you might use =CONCATENATE("http://someserver/images/", B1, "_image-", C1, "x", D1, ".", E1) which would produce something like “http://someserver/images/bu_image-200×420.jpg”.

      3. Copy the formula in A1 down the rest of column A for as many rows as you expect to receive from the server

      4. In F1 (or any other column which doesn’t have data), enter =JOIN(",", A:A). This will combine all the individual URLs into a single cell, separating them by commas.

      The only issue here is that you have to manually copy the A1 formula and know how many rows you’re receiving.

      To do this with an unknown number of rows, you’d have to write a script using the ImportJSONAdvanced function to do this.

  18. Dave says:

    Trevor –

    Thank you for this script! And I hate to add a real noob question, but here goes (I tried googling for a few hours before I posted this, so if I missed something obvious please forgive my fried brain).

    So I have an import source from a url that feeds me order data, but unfortunately it does so in order of most recent to oldest.

    I’m trying to make an additional column to where we can assign a value for the order (a room number), and the problem I’m running into is the resultant structure looks something like this:

    Room Order Person
    105 3 Joe
    461 2 Curly
    107 1 Moe

    Once the script refreshes after a new order, the first column, being static, does this:

    Room Order Person
    105 4 Shemp
    461 3 Joe
    107 2 Curly
    1 Moe

    (So on and so forth for each order thereafter)

    Now, it would be easy if I could just sort the data by the order column, but when I do that I get a partial sort that removes most of the entries, because the data is referenced by formulas such as “=CONTINUE(A1, 2, 3)” instead of by values.

    I guess is there any way to work around this, e.g. to have the JSON data sent to the cells as actual values instead of formulas, or something else I’m not aware of?

    Any assistance you could offer would be a godsend. Thanks for the script though -for the advertised purpose, it worked BEAUTIFULLY!

    1. trevor says:

      If you only need to update once in a while, the simple solution is to click the upper-left corner of the worksheet to highlight the entire worksheet, press Copy, create a new worksheet, then use Paste Special > As Values into that sheet. Then you can sort normally.

      You could probably also do this programmatically by calling ImportJSONAdvanced, then calling some Google Spreadsheet methods to insert the data directly into the spreadsheet. I’m pretty sure this can be done, but I don’t know how to do it. You could search for how to insert data programmatically into the spreadsheet.

      The problem right now is that ImportJSON is designed to be used as a function, so it returns the results dynamically, which doesn’t allow for sorting, as you’ve noticed.

      1. Dave says:

        Thanks for the tip! I’ll see what I can find, and if I can figure it out I’ll post it back here in case anyone comes up with the same dilemma. :)

  19. Joey says:

    Hey Trevor,
    Thanks for the great script. I’m trying to use it to pull data from basecamp into a spreadsheet, but it seems to get tripped up by the html basic authentication. Any way to get around this?

    Here’s my function call formatting:

    =ImportJSON("", "/name", "noHeaders")

    And this is the error I get:

    error: Unexpected error: (line 130, file “Code”)

    1. trevor says:


      ImportJSON uses the UrlFetchApp class from Google, which I don’t think supports using the username & password in the URL itself.

      It may be possible to do this with ImportJSONAdvanced by creating a custom function in a script. UrlFetchApp supports arbitrary headers, so you could pass the username & password using the Authorization HTTP header.

      Try adding this function into the ImportJSON codebase:

      function ImportJSONBasicAuthentication(url, query, parseOptions) {
         var username = 'myUser';
         var password = 'myPassword';
         var fetchOptions = {
            "headers" : {
               "Authorization" : 'Basic ' + Utilities.base64Encode(username + ':' + password)
            },
            "muteHttpExceptions" : true
         };

         return ImportJSONAdvanced(url, fetchOptions, query, parseOptions, includeXPath_, defaultTransform_);
      }

      I didn’t have time to test this. But let me know if this works.

      References that may help you are the UrlFetchApp.fetch documentation, RFC 2617 – HTTP Authentication and this post on the Google Apps Script Issues forum. Good luck!

      1. Ralph says:

        Nice one Trevor. I can confirm that this works, though anybody else will want to change the 'myUser' / 'myPassword' variables …

        1. Gowthami says:

          Somehow this way of passing parameters also doesn’t work for me. Can you please help me resolve this?

          1. Johannes says:

            I couldn’t get this to work with the JIRA REST API either, although I’ve been able to make the same call with basic auth via Postman. Has anybody found out how to do this?

  20. Anton B says:

    I use your script for writing JSON data to Google Spreadsheet and I have encountered one problem.

    I use the ‘parseJSONObject_’ function on a JSON object from one API and I get the following data:

    [["Contextlimitsum", "Status", "Disableddomains", "Keyword"],
     [100, 'Active', ''],
     [100, 'NoActive'],
     [200, 'Active', '', 'diamond']]

    As you can see, the rows aren’t the same length (1st row – 4 columns, 2nd – 3 columns, ...). This data can’t be written to a spreadsheet using the setValues() function, because all rows need to have the same length. If I write it one row at a time with the sheet.appendRow function, the time limit is exceeded.

    Can you help me?

    1. trevor says:

      Since you’re doing this in code, try normalizing the array before you pass it into the setValues() function. That is, go through the array and increase the length of any rows that are too short before calling setValues(), populating the extra places with blank or empty data (e.g. '' or null).
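A sketch of what that normalization might look like; normalizeRows_ is a hypothetical helper name, not part of ImportJSON:

```javascript
// Sketch: pad every row of a 2-D array to the longest row's length so
// the result is rectangular and safe to pass to setValues().
// normalizeRows_ is a hypothetical name, not part of ImportJSON.
function normalizeRows_(data) {
  var maxLength = 0;
  for (var i = 0; i < data.length; i++) {
    if (data[i].length > maxLength) {
      maxLength = data[i].length;
    }
  }
  for (var j = 0; j < data.length; j++) {
    while (data[j].length < maxLength) {
      data[j].push('');  // fill the missing cells with blank values
    }
  }
  return data;
}
```

After normalizing, the rectangular array can be handed to sheet.getRange(1, 1, data.length, data[0].length).setValues(data) in a single call, avoiding the appendRow time limit.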

  21. Ethan says:

    Hey Trevor! Loving ImportJSON. Any reason: would not work?

    Here is my formula: =ImportJSON(urlencode(“″), “”, “”)

    1. trevor says:

      The response coming back from the Google Trends URL you posted is JSON-P, not pure JSON. So it can’t be parsed by ImportJSON.

      See this issue in the issue tracking system for a potential workaround using a custom function that calls into ImportJSONAdvanced.

      Basically, you need to trim off the beginning and end of the response to get at the pure JSON. Also, if Google ever decides to change the callback function or comments in the JSON-P, you’ll need to adjust where you do that trimming (in case you get it to work & it suddenly stops working at some point).
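A minimal sketch of that trimming, assuming the response has the common callbackName({...}); shape; stripJsonPadding_ is a made-up helper name:

```javascript
// Sketch: strip a JSON-P wrapper like "callbackName({...});" down to
// the pure JSON between the first '(' and the last ')'. The exact
// wrapper varies by API, so the trimming may need adjusting.
function stripJsonPadding_(text) {
  var start = text.indexOf('(');
  var end = text.lastIndexOf(')');
  if (start === -1 || end === -1 || end < start) {
    return text;  // no wrapper found; assume it is already pure JSON
  }
  return text.substring(start + 1, end);
}
```

In a custom function built on ImportJSONAdvanced, this trimming would be applied to the fetched response text before it is parsed as JSON.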

  22. Steven says:


    Nice work with the JSON script and have a quick question for you. I am trying to pull member details from a page based on the Member ID from another sheet.

    Here is the syntax I have now.


    If I put the actual memberID number in where the cell reference is it works, but this way I get a parse error.


    1. trevor says:


      What jumps out right away is the equals sign before the cell reference. Try taking that out.

      Also, you can simplify this in two ways. First, if the member ID is always going to be a number, you don’t need to URL encode the value, since it’ll be the same thing after you encode it. So you can do:

      =ImportJSON("" & Memberlist!A3 & ".json",,"noHeaders")

      Or, using CONCATENATE:

      =ImportJSON(CONCATENATE("", Memberlist!A3, ".json"),,"noHeaders")

      If the member ID might have spaces or non-alphanumeric characters, then you do need to encode it. But just encode that single attribute:

      =ImportJSON(CONCATENATE("", URLEncode(Memberlist!A3), ".json"),,"noHeaders")

      Hope that works for you.

      1. Steven says:


        Thanks for the quick reply. Missed the equals sign, thanks. But I am still getting a parse error on all three of the options you suggested. I even tried referencing a cell from the same sheet to see if that was the problem.


        1. Steven says:


          Nevermind. It seems it does not like copy and paste. I had to retype the entire formula. It works now.

          Thanks Again for the help.

  23. Roy says:

    Hi Trevor,

    This ImportJSON function just saved me loads of work! Great work! I have one issue I’ve been trying to solve for some days now. I have several URLs with a different parameter that I want to include in one spreadsheet. So:

    Is there a way to combine this?


    1. trevor says:


      If you know the response is always a fixed number of rows, you can use the ImportJSON function on the row after the previous response ends. For instance, if the response is always 5 rows, you can insert your first ImportJSON function with the first URL at line 1, the second one at line 6, the third one at line 11, etc.

      If you have a variable number of rows, you’ll need to get into editing the code to create a new function. This function can iterate through your list of potential URLs, call ImportJSON to retrieve the data for each one, and then merge the results together.

      The key to know is that all ImportJSON is doing is returning a 2-dimensional array. So all you need to do is call it multiple times, merge the arrays and return the result.

      Conceptually, you want something like:

      function ImportJSONVariations(baseUrl, variations, query, parseOptions) {
         var variables = variations.toString().split(",");
         var result = new Array();
         var url, variantArray;

         for (var i = 0; i < variables.length; i++) {
            url = baseUrl + variables[i];
            variantArray = ImportJSON(url, query, parseOptions);
            result = result.concat(variantArray);
         }

         return result;
      }

      Then you could use it by passing in a comma-separate list of variations from a base URL:

      =ImportJSONVariations("", "1367/,1369/");

      One caveat: each call to ImportJSON returns its own header row, so when you merge the results you may want to strip the headers from all but the first array (or pass the ‘noHeaders’ parse option). This answer on StackOverflow might also be a useful start for merging 2-dimensional arrays.

      If you do work out a generic solution, please post it here so others can use it. Thanks and good luck!
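For the merging step, one possible sketch (mergeResults_ is a hypothetical name, not part of ImportJSON) keeps the header row from the first result only:

```javascript
// Sketch: merge several 2-D result arrays row-wise, keeping the header
// row only from the first array so headers aren't repeated in the output.
// mergeResults_ is a hypothetical helper, not part of ImportJSON.
function mergeResults_(results) {
  var merged = [];
  for (var i = 0; i < results.length; i++) {
    var rows = (i === 0) ? results[i] : results[i].slice(1);  // drop repeated header rows
    merged = merged.concat(rows);
  }
  return merged;
}
```

Inside a variant of ImportJSONVariations, each ImportJSON result would be collected into an array and passed through this helper before returning.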

  24. lam says:

    Hi Trevor, your script is very helpful.
    I get this error when i try import api

    error: Request failed for xxxx returned code 403. Truncated server response: 403 Forbidden Forbidden You don’t have the permission to access … (use muteHttpExceptions option to examine full response) (line 167, file “Code”)

    It seems the site doesn’t allow GAS user agent. Can your script manually change the user agent?

    1. trevor says:

      ImportJSON doesn’t currently have a way to set the headers for a request. So you can’t set the user agent.

      In theory, the UrlFetchApp class that ImportJSON uses can set the headers of the request, and you could try explicitly setting the “User-Agent” header. You could do this by modifying ImportJSON to pass a ‘headers’ object in the fetchOptions:

      function ImportJSON(url, query, parseOptions) {
        var fetchOptions = { 'headers' : { 'User-Agent' : 'My user agent' } };

        return ImportJSONAdvanced(url, fetchOptions, query, parseOptions, includeXPath_, defaultTransform_);
      }

      I wouldn’t be surprised if Google doesn’t let you override the user agent though. See here for full documentation on UrlFetchApp, which is what ImportJSON uses to do the retrieval. The ‘fetchOptions’ are what are passed through as the ‘params’.

      However, it’s unclear from the error message you posted whether the problem is the user agent. A 403 Forbidden error can occur for many reasons. You may want to set muteHttpExceptions to see the full response. You can do this by modifying the ImportJSON function above:

      var fetchOptions = { 'muteHttpExceptions' : true };

      Or by using ImportJSONViaPost, which allows you to pass in a limited set of the fetch options:

      =ImportJSONViaPost("", "", "muteHttpExceptions=true")

      I haven’t tested ImportJSON with muteHttpExceptions set, so I don’t know if this will generate another error or not. But it at least gives you an avenue of investigation. Good luck.

  25. Simon Thompson says:

    Thank you so much for this – I’ve used it as the basis of a URL checker for libraries. I will blog this on my web site at some point and will be happy to give you credit.

  26. Ethan says:

    Hi Trevor,

    Again, can’t thank you enough for writing this script. I’ve tried many variations of the formula to make it work on this url:|seo. What would the parameters be to get all elements? I know it’s something probably really elementary that I’m missing.

    Thank you for your help in advance.

    - Ethan

    1. trevor says:


      The problem is the pipe character ‘|’ in your URL. The UrlFetchApp class that ImportJSON uses rejects this as an invalid character. You need to encode this in the URL before you pass it to ImportJSON.

      Two ways of doing this:

      1. Use %7C instead of | in your URL

      This is the encoded value for the pipe character. For instance:


      2. Use the URLEncode function to encode your parameter

      If you think you might have other non-alphanumeric characters in your ‘q’ parameter, it’s better to just let the URLEncode function do the encoding. For instance:

      ImportJSON(CONCATENATE("", URLEncode("mesothelioma|seo")))

      Note that you cannot encode the entire URL, since the colon, slashes and question mark need to remain unencoded in the URL (i.e., they are special characters whose meaning you want to remain special). Just encode the parameters you need to.
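The same rule can be expressed in code. This sketch (buildQueryUrl_ and the example URL are made up for illustration) encodes only the parameter value, leaving the scheme, slashes and question mark intact:

```javascript
// Sketch: encode only the query parameter value, not the whole URL,
// so the colon, slashes and question mark keep their special meaning.
// buildQueryUrl_ is a hypothetical helper name.
function buildQueryUrl_(baseUrl, param, value) {
  return baseUrl + '?' + param + '=' + encodeURIComponent(value);
}
```

encodeURIComponent turns the pipe into %7C (and handles spaces and other reserved characters too), which mirrors what the URLEncode spreadsheet function does for a single parameter.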


      1. Ethan says:

        Perfect! Worked like a charm. Thank you!

        1. Jason says:

          Thanks to both of you! Grepwords does support XML responses now so importXML will work as well.

  27. Gabriela says:

    Hi, this is a great script. I got it working in one of my spreadsheets, but not in another.

    In the spreadsheet where I got it working, I’ve set the JSON URL and it displays data. My question is, how do I set it to refresh the data automatically? I have tried everything: I changed spreadsheet configurations, set a trigger in the script and even added Minute(Now()) at the end of my JSON URL. I can’t get it to refresh the data every minute. Can you help me with this task?

    The other thing is, when I load the script using the ImportJSON button in the script editor, it gives me an error at line 130. It says DNS error: http://undefined (line 130, file “importJSON”).

    Thank you.

    1. trevor says:

      To get ImportJSON to refresh every minute, you need to use the function GOOGLECLOCK() instead of NOW(). This forces an update every minute, whereas NOW() only recalculates when the spreadsheet is opened or re-calculated.

      An easy way to do this is to add GOOGLECLOCK() as a 4th parameter:

      =ImportJSON("", "/", "", GOOGLECLOCK())

      Read the documentation linked above; if you are using the new Google Spreadsheets, you must use the Google NOW() function and change the recalculation setting to every minute under File > Spreadsheet settings instead.

      The error you are getting in the script editor using the ImportJSON button is because you are calling the function with no parameters. ImportJSON needs at least a URL passed in. When you execute it with no parameters, the URL is undefined (hence the error you’re seeing).

      To fix this, add a wrapper function into the script file:

      function ImportJSONTest() {
         return ImportJSON("", "/", "");
      }

      Then use the ImportJSONTest button to run the function. This is how I do debugging of the code.

      1. Leadustin says:

        Hi Trevor,

        I hope you can help me with my problem. I have created a new spreadsheet, added your JSON import script into the script editor and put this in a cell:


        The import works. But the solution with NOW() and the recalculation settings does not.

        Is it possible to add a function which reloads the .csv completely when the browser is refreshed?

        1. trevor says:

          You can only use the NOW() function if you are using the new Google Spreadsheets. If you are still using the old Google Spreadsheets, you have to use GOOGLECLOCK().

          Re-read the comment above that you replied to for full details. Unfortunately, this is all I know about how to get ImportJSON to automatically refresh. I did a test using the old Google Spreadsheets and GOOGLECLOCK() worked for me. I haven’t tried using the new Google Spreadsheets and changing the recalculation settings.

          1. Greg says:

            Looks like NOW() won’t work with custom functions:



          2. crypt says:

            I am having issues with your fix here Fedora.
            GoogleClock has been disabled and the refresh menu on the new Google Sheets removed.

            Although the API changes on my link


            My table will not refresh automatically, no matter what I do or how long I wait. Help!

          3. trevor says:


            The GoogleClock() function still should be active in the old Google Spreadsheets. It’s not active in the new version. As Greg has noted, there is no way to update your data automatically using custom functions in the new Google Spreadsheets. You have to use the old version.

            The sheet you linked to is using the old version of Google Spreadsheets. To make ImportJSON execute every minute, simply add GoogleClock() as the fourth parameter:

            =ImportJSON("","","noTruncate", GOOGLECLOCK())

            This fourth parameter is ignored by ImportJSON, but causes the function to be re-evaluated every minute.

            To create a new spreadsheet using the old Google Spreadsheets, go to, then save it to your Google Drive as you normally would.

            Hope this helps.


          4. crypt says:

            I think that’s working Thanks!

  28. RICHARD says:

    The script works great. Sometimes an API returns data that the script places into columns instead of rows. For example, I get the following headers back.

    /478755/x1 /478755/x2 /478755/x3 /478755/x4 /478755/x5 /615424/x1

    I do not want 478755 to be in the header; it is basically the value of a row ID. I want 615424 to be on the next row. Is there a way to achieve this?

    1. trevor says:


      Can you point me to the URL that generates the JSON or copy-and-paste part of the JSON into a comment?

      JSON is a key-value pair format. For ImportJSON, the column headers are always created using the keys and the data always comes from the values.

      If the JSON you’re trying to retrieve uses the row IDs as the keys for the rows rather than the values, then you’ll see those in the column headers. Right now there’s no way to format this differently.


  29. Tim says:

    Hi Trevor, thanks a lot for your great library.
    I’d like to use it to extract the direct iTunes link out of an iTunes Search API result.

    The Search Results URL would be e.g.:

    So I’ve set up a spreadsheet with your script.
    works fine and extracts the entire search result into the spreadsheet. However, I’m only interested in the URL that’s located in

    trackViewUrl”: “″

    Unfortunately, whenever I’m trying to use a query to only extract this part, I’m getting a Parsing Error.

    Then I noticed that even when I paste the example code (see below) from your page, it creates a parsing error.

    =ImportJSON(“”, “/feed/entry/title,/feed/entry/content”, “noInherit,noTruncate,rawHeaders”)
    -> Error

    works fine.

    Do you have any ideas what I’m missing? Everything works except the use of the query path.

    Thanks a lot!

    1. trevor says:


      I suspect your problem is due to the format of the formula, rather than a problem with ImportJSON itself.

      When I copy and paste the test you pasted above into Google Spreadsheets, I get a parsing error. But this is because the double quotes being used are smart quotes rather than regular double quotes. Google Spreadsheets doesn’t recognize smart quotes as string delimiters, so throws a Parsing Error trying to parse the formula. This prevents ImportJSON from ever being called.

      If you convert your smart quotes to regular quotes, you should be able to use:

      =ImportJSON(""&A5&"&country=de&limit=1", "/results/trackViewUrl", "noInherit,noTruncate,rawHeaders")

      Try that and see if that works.


  30. Ralph says:

    Hi Trevor,

    I think you have a bug with null values. When a new row contains columns with a null value, you are inserting the non-null value from the previous row rather than null, as it should be.

    No biggie, as I changed my API to replace nulls with something else as a workaround.


    1. trevor says:


      Can you provide an example that demonstrates the problem so I can test it?

      Also, confirm whether the bug still exists if you use the “noInherit” option. By design, rows inherit the values of their parent rows, but they shouldn’t inherit the values of their sibling rows.



      1. Brian Lewis says:

        I’m having the same problem, and the first thing I tried was the noInherit option.

        JSON for reproduction:

          {
            "version": "1.0",
            "users": [
              {
                "user_id": "1002",
                "first_name": "Sarah",
                "last_name": "Supervisor",
                "title": "Supervisor",
                "last_login_date_utc": "2014-05-09T21:18:11.363"
              },
              {
                "user_id": "1003",
                "first_name": "Edward ",
                "last_name": "Employee",
                "title": null,
                "last_login_date_utc": null
              }
            ]
          }

        The row with user_id = 1003 consistently shows a title of “Supervisor” and a last login date of 2014-05-09T21:18:11.363, where both values should be null.

        1. Brian Lewis says:

          On line 286, why are you checking for i > 0? Shouldn’t we check to see whether data[state.rowIndex] has a value, even on the first pass?

          I took out the i > 0 && and not only did my data start appearing correctly, but I got back a first row of values I didn’t know I was missing.

          My current line 286 now says:

                  if (data[state.rowIndex]) {

          I don’t know what else I may have messed up by removing that check, but at least in my single situation I now consistently get the values I was expecting.

          1. Diamond in the rough says:

            Just confirming that I also fixed the same problem using your Line 286 code, so thanks very much for posting. I was encountering it with’s API.

            to be specific. (Not the profile, but I am assuming not everyone will know of a Twitch streamer to insert into the username section.)

            These columns (Follows User Type, Follows User Logo, Follows User Bio) were being misformatted, or excluded, if they had null, causing the entire array to not line up consistently.

            Thanks again!

          2. Kees Romkes says:

            confirmed! Works with pipedrive CRM now ;-)

  31. Rabah Rahil says:

    Hi I am trying to install the script to a spreadsheet, but cannot find it in the add-ons library. How do I install? I cannot find script manager as it has been deprecated in light of add-ons. Any thoughts? Thanks!

    1. trevor says:

      Until I have time to figure out how to convert the script to an add-on, you’ll have to install the script manually.

      To do this, open the script editor using the Tools > Script Editor menu item. Then copy-and-paste the ImportJSON code from here and use File > Save to save the script. You should then be able to use ImportJSON within your spreadsheet.

  32. Brad says:

    Hi Trevor,

    Awesome script. It works flawlessly! Thank you so much for your continued contribution to the community.

    I was just wondering if there is an option (presumably in ImportJSONAdvanced()) that allows for the return of only the first value of an array? I have scoured the documentation and found a hint that it may require row==0 in transformFunc somewhere, but can’t make heads or tails of it.

    My end-requirement is to return the first value of “/return/ANC/sellorders/price” from the API “″.

    At the moment, I use the function =ImportCSV(, /return/ANC/sellorders/price, “noHeaders”) but this returns 100 values – I simply need the first.

    Thanks again for such an awesome function.


    1. trevor says:


      If you are using the new Google Spreadsheets, try using the ARRAY_CONSTRAIN function.

      If you are using the old Google Spreadsheets, then you’ll have to create a custom function which calls ImportJSON. Something like this should work:

      function ImportJSONFirstCell(url, query, parseOptions) {
         var result = ImportJSON(url, query, parseOptions);
         return [ result[0][0] ];
      }


    2. Joanna M says:

      I know it is now quite some time after you posted the message, but I think this might be useful info for others looking through this thread.

      You can use google sheets function INDEX() to select a specific cell from the result returned by ImportJSON(). If your result is a single column with lots of rows, and you want the second value (for illustration purposes) you would use:

      =INDEX(ImportJSON(url, query, parseOptions),2,1)

      1. Kyle Pennell says:

        Just wanted to say thanks for this, this helped. Parsing arrays is a bit tricky even though there are many options.

  33. statdude says:

    Receiving the error “exceeded maximum execution time” on some calls. What gives?


    1. trevor says:

      This can occur when the resource you’re connecting to is taking a long time to respond.

  34. telefrancisco says:

    Regarding updating the data in Google Spreadsheets, just to mention that Google Drive time-based triggers do not work.
    ImportJSON, as said earlier, is not a standalone function.

    Regarding the new Google Spreadsheets: you can update ImportJSON every minute by configuring the spreadsheet options to recalculate every minute and using NOW() instead of the deprecated GOOGLECLOCK() (which, in fact, updated the spreadsheet every minute, so there is no loss of functionality on this topic in the new spreadsheets).

  35. Daniel Seijo says:

    I think I have some trouble with return format, because Spreadsheets treat cells with importJSON as string, not number.

    I.e. I have a column with results 1, 30, 5, 30, 2 from ImportJSON, then I run a query “select A1:A4 order by A desc” and obtain: 5, 3, 30, 2, 1.

    With same query but typing manually I get 30, 5, 3, 2, 1.

    I tried to force the cells to numeric format, but it’s the same. Any idea?

    Thank you!

    1. trevor says:

      Where are you doing the SELECT statement?

      I just did a test in Google Spreadsheets using a YouTube JSON feed and the results were coming back as numbers. Or, at least all of Google’s internal functions were treating them as numbers. So it may be a problem with executing queries against the spreadsheet.

      You could try forcing the numbers by modifying the defaultTransform_ function. To force every value to a float, at the end you could put:

      if (!isNaN(parseFloat(data[row][column]))) {
         data[row][column] = parseFloat(data[row][column]);
      }

      Check the documentation here for a more robust version of parseFloat that you may want to use if your data contains string values that start with number characters.
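As an illustration of such a stricter check, this sketch (strictToNumber_ is a hypothetical name, not part of ImportJSON) only converts values that are entirely numeric, so strings that merely start with digits stay as text:

```javascript
// Sketch: convert a value to a number only if the whole string is
// numeric, so "30 items" stays text while "30" becomes 30.
// strictToNumber_ is a hypothetical helper, not part of ImportJSON.
function strictToNumber_(value) {
  if (typeof value === 'string' && /^-?\d+(\.\d+)?$/.test(value.trim())) {
    return parseFloat(value);
  }
  return value;
}
```

This avoids parseFloat’s habit of converting leading digits of mixed strings, which could otherwise corrupt text cells.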

      Let me know what you find out. If none of these work, post a link to a sample Google Spreadsheet that demonstrates the problem.

      1. frank says:

        I tried to add the option above, but I didn’t get any number out of my ImportJSON function.

        Is there anything else I could try?

        1. trevor says:

          Not that I can think of. The code I gave should ensure the values being returned are formatted as numbers.

          Where are you doing a SELECT statement? Are you trying to use the Google Visualization Query Language, or is this from another client?

          Make sure you don’t have any other data in the column besides the header that could force the column to be treated as a string.

          1. frank says:

            I just copied the ImportJSON code as a new script, and then use ImportJSON on the tab where I want the data. I don’t use any other program to import. I’m just a noob at this :)

          2. frank says:

            The column is just used for numbers; the rows have some text, but the columns I want to use are just numbers.

          3. frank says:

            In a strange way it worked. When I changed some code to properly format it in my sheet and script, it now has numbers which I can calculate with.

            Thank you very much for your great work :)

  36. John says:

    Hi Trevor,
    Great work. Really nice. I was curious if I am compliant with the GNU license GPL-3.0 with the change I made below in my copy of your script in order to get this script to work with what I needed. I simply added a header object directly to the UrlFetchApp so I could authenticate with the JSON feed I needed to pull into the spreadsheet. It works great. No issues. My question is if I am technically allowed to make this modification to a copy of the script in my spreadsheet based on the GNU GPL 3.0 license?

    function ImportJSONAdvanced(url, query, options, includeFunc, transformFunc) {
      var jsondata = UrlFetchApp.fetch(url, {
        headers: {
          'Authorization': 'apikey ' +

    1. trevor says:


      If this is for your own internal use, you definitely are compliant with the license.

      If you are thinking about distributing your changes, check out an overview of the GPL 3.0 license here or here for a detailed breakdown of the new changes in the 3.0 version.

      The simplest way to remain compliant is to include your modified source with your distribution, note the changes you made and keep the GPL 3.0 license.


      1. John says:

        Thanks Trevor.
        I linked to you in my blog post.
        Thanks again for the great google sheets function.

  37. Fabio Montefuscolo says:

    It’s really awesome!

  38. Raffaele Negro says:

    Hi Trevor,
    I can’t see the ImportJSON formula in my cells even though I added it in the editor, nor do I see it in my Tools submenu. Any thoughts?


    1. trevor says:


      Are you entering the ImportJSON formula into your cells?

      ImportJSON is a custom function, not an add-on. It won’t appear on any menus within Google Spreadsheets. Nor does Google Spreadsheets show it as an auto-complete suggestion when you’re typing formulas in cells. You need to use the documentation at the top of this page to use it within your spreadsheet.


  39. Petar says:

    Hello Trevor,

    I’m trying to import data from a game API to a Google Spreadsheet, but I constantly get the error: The coordinates or dimensions of the range are invalid. (line 19, file “Code”)
    I’m new at this and would very much appreciate any help.

    function readJSON() {
      var ss = SpreadsheetApp.getActiveSpreadsheet();
      var ss = SpreadsheetApp.openById("1taOtpenF4-VQez2Pq4-ZpctRtiuagMk_ha14ZGnHcsU");
      var sheet = ss.getActiveSheet();
      var startRow = 1; // Change if needed

      var url = ""; // Paste your JSON URL here
      var response = UrlFetchApp.fetch(url); // get feed
      var dataAll = JSON.parse(response.getContentText());
      var dataSet = dataAll;

      var rows = [];

      for (i = 0; i < dataSet.length; i++) {
        data = dataSet[i];
        rows.push([,, data.age, data.is_online, data.alive, data.ban, data.level, data.exp, data.strengh]); // your JSON entities here
        dataRange = sheet.getRange(1, 1, rows.length, 3); // 3 denotes total number of entities


    1. trevor says:


      Try asking your question on the Google Docs Help Forum.

      I’m sorry, but I’m not an expert at Google Spreadsheets. I wrote the ImportJSON for a specific project and open sourced it for others to use. My experience with Google Spreadsheets is limited to what I needed to learn for this project.


  40. Jumbo Panda says:

    Hey Trevor – Awesome snippet! What a time saver…

    Question on slowing down the API calls when I use this in a Google Docs worksheet to pull in Flurry data. I have a worksheet with 10+ =importJSON(blah) cells in it. Seems they all fire at once and I exceed the 1 API call per second Flurry limit.

    Any suggestions? Sleep function perhaps, but where?


    1. trevor says:


      I’d create a wrapper function that sleeps before executing, e.g.:

      function DelayedImportJSON(url, query, parseOptions, delay, random) {
          Utilities.sleep((random ? delay * Math.random() : delay) * 1000);

          return ImportJSON(url, query, parseOptions);
      }

      This would allow you to sleep a specific number of seconds or a random number of seconds up to a maximum number of seconds.

      To use this, just add a different number after each import statement:

      DelayedImportJSON('http://myurl/', '', '', 1)
      DelayedImportJSON('http://myurl/', '', '', 2)
      DelayedImportJSON('http://myurl/', '', '', 3)

      The statements should then get executed with time in between each one.


      1. Jumbo Panda says:

        Works great! Many thanks.

        Owe you a beer next time you’re in Seattle…


      2. San4x says:

        Hi Trevor,

        Nice elegant solution, thanks!

        It works fine, but I do get “failure notices” from Google Apps Scripts. No idea if they’re really failures since the DelayedImportJSON script seems to work.

        The Error Message from Google:

        Cannot convert NaN to (class). (line 18, file “DelayedJSON”)

        And here’s line 17 and 18 from the script:

        17. function DelayedImportJSON(url, query, parseOptions, delay, random) {
        18.     Utilities.sleep((random ? delay * Math.random() : delay) * 1000);

        I suspect the value passed to the Utilities.sleep function needs to be an integer. I do feed delay with integers only, starting at 1.

  41. James says:


    This script is just what I was looking for and I have it working. But I’ve run into a problem, possibly caused by the json format of the API call I am making.

    The json I am loading can be seen at*:*&start=1&rows=0&profile=facets

    The call I am using is =ImportJSON(“*:*&start=1&rows=0&profile=facets”, “/facets/name,/facets/fields/label,/facets/fields/count”, “noTruncate”) which can be seen at

    But what I want to do is just list, for example, all the label/count pairs within name:LANGUAGE, as currently only the very first entry is being displayed.

    Is this possible?

    Many thanks, James

    1. trevor says:


      Apologies for the delay in responding. The problem looks to be two-fold.

      First, there is a bug in the ImportJSON library so not all the rows from your JSON file are being returned. I’ve documented it as Issue 8, but don’t have time right now to look into it. Based on my current schedule, it may be several months before I’ll have time to do another round of bug fixes.

      The other issue is that there is no way to filter based on the values of what is returned. So even if the bug was fixed, you would get all the rows of your data (including all of the name:LANGUAGE ones – not just the first one). The query parameter lets you filter columns, but there’s currently no way to filter rows.

      Sorry I couldn’t be more helpful.


  42. ratman720 says:

    Hey Trevor, the script is phenomenal and works wonders for me. Thanks so much for the effort. I’m trying to work through a spreadsheet of cryptocoin data from Cryptsy, and the API get functions simply aren’t kind. The trouble I am having is I only want the first entry in the orderbook data. Is there any way to get ImportJSON to do this? Advice would be greatly appreciated.

    this is my function for now.


    I get all the orderbook data but I’m only interested in the first row.

    1. trevor says:

      To only show a subset of rows from the ImportJSON result, use the ARRAY_CONSTRAIN function from the new Google Spreadsheets:

      =ARRAY_CONSTRAIN(ImportJSON(concatenate("",A1),concatenate("/return/",left(B1,len(B1)-4),"/sellorders/price"),"noTruncate,noheaders,noInherit"), 1, 20)

      The second parameter is the number of rows to display and the third parameter is the number of columns to display. Set the number of columns to be equal to or greater than the number of columns you’re fetching.

      1. ratman720 says:

        Nice my big sheet of doom is in the old sheets and I have needed an excuse to switch over. Thanks for the fast reply, I had no idea that function existed.

  43. Tim says:

    I installed the script and tried to run ImportJSON, but it says “SyntaxError: Empty JSON string (line 168, file “Code”)”.

    1. trevor says:

      Check the URL that you are passing into ImportJSON and confirm you are getting data back.

      If the data is coming back when you request it directly in the browser, then post the parameters for the ImportJSON function you are using here, and I’ll see if I can figure it out. If you have a sample Google Spreadsheet showing the problem, that’s ideal.

      1. Thiago says:

        Got the same error. Any solution?

        1. Jay says:

          Thiago did you ever figure this out? I’m getting the exact same error.

  44. Justin says:

    Hi Trevor – Thanks for creating this great tool. I’ve been trying to use it with the Edmunds API (vehicle data) to populate a google spreadsheet with all makes/models and some specifications of each vehicle. However ImportJSON is only pulling in one model from each make (e.g. Acura is a make that has several different models, but only the first model is being pulled in to the spreadsheet). I think the issue has something to do with the way that recursion is set up because at the bottom of the google spreadsheet a few additional models are brought in but it seems to stop importing when it gets back up to a make that only has one model listed (e.g. Tesla). Basically, I want column A to contain all makes and column B to contain all models that exist. Do you have any idea why ImportJSON is not pulling the entire population of models in? Is it user error? Any help would be much appreciated.

    Here is an example URL:

    1. trevor says:


      There was a bug in the ImportJSON code in how it handled nested arrays. I just fixed it.

      Since the Script Library is now deprecated, you’ll have to install the code manually. To do so, open the script editor using the Tools > Script editor… menu item. Then copy and paste the new code from here.

      Let me know if this solves the problem. I only had time to do basic testing.


      1. Chris says:

        Hi Trevor,

        I appreciate your work on this very much!

        This fix addressed one issue I was having, but substituted another. Now values from nested arrays are showing up, but the first item in each array is missing. It’s like the index started at 1 rather than 0.

        Thank you in advance for any time you can spend on this one.


        1. Chris says:


          I was able to fix this problem by making a minor change to the ParseData_ function. I changed line 286 in version 1.2.1 to:

          if (i >= 0 && data[state.rowIndex]) {

          and it seems to have addressed the issue.

          Thank you!


          1. James says:

            Chris, thanks for this. I had exactly the same problem and it worked perfectly!

          2. Chris says:

            Same problem here.
            >= fixes it for me too

  45. Greg Wright says:

    Hi Trevor,
    I’m trying to access ZenDesk ticket information. We have API token authentication enabled.

    ZenDesk expects this:
    https://{domain} -u {username}/token:{API token}
    When I test via cURL, the format above works correctly.

    I’ve tried every variation of =ImportJSON that I can think of but I cannot pass authentication. Error: Couldn’t authenticate you

    Do you have any suggestions?

    1. trevor says:


      The current version of ImportJSON does not support password-protected JSON resources.

      However, I describe code in this comment which can be used to add basic authentication support using a hardcoded username & password. ZenDesk appears to use basic authentication, so this should work for you too.

      Hope this helps.


      1. Greg Wright says:

        I added the code you cited, but I still get the “error”:”Couldn’t authenticate you”. I tried a couple of variants with and without the “/token” syntax that ZenDesk asks for, but was not able to get it to work.

        By using the following syntax I am no longer getting the “error”:”Couldn’t authenticate you”:

        =ImportJSON("-u {username}/token:{token} https://{domain}")

        But now I get Error: SyntaxError: Empty JSON string (line 185).

        Line 185 = var object = JSON.parse(jsondata.getContentText());

        Any other suggestions? Have you ever tried connecting to ZenDesk with ImportJSON?


        1. trevor says:


          Can you post the code you used?

          Make sure your username is “{username}%2Ftoken” and your password is “{token}”, as this is what your curl arguments are indicating. If you’re not familiar with curl, check out the manpage, which explains the arguments and their formats.

          Note that curl is a completely different program from ImportJSON and none of the curl options work with ImportJSON, nor with URLs in general. So passing in “-u {username}/token:{token} https://{domain}” is essentially meaningless. You’re passing an invalid URL, so you should be getting an error.

          I haven’t used ZenDesk before with ImportJSON, but based on the Security and Authentication section of the Core API, it should work fine with the code for basic authentication.
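
          For reference, Basic authentication just base64-encodes “username:password” into an Authorization header. A sketch of that encoding (shown in Node-style plain JavaScript using Buffer so it is easy to test; in Apps Script the equivalent call is Utilities.base64Encode):

          ```javascript
          // Build the value of the HTTP Authorization header for Basic authentication.
          // basicAuthHeader is a hypothetical helper, shown only to illustrate the scheme.
          function basicAuthHeader(username, password) {
            var encoded = Buffer.from(username + ':' + password).toString('base64');
            return 'Basic ' + encoded;
          }
          ```

          Under the ZenDesk token scheme described above, the username would be “{username}/token” and the password would be the API token.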

          If you’re still having problems, you may have to hire a contractor on Elance or oDesk to help connect. You may need to authenticate using OAuth2, which is also supported by ZenDesk, but that is outside the scope of what I can help you with. You can read about OAuth2 support in UrlFetchApp, the library used by ImportJSON. It requires custom scripting.

          Good luck.


  46. frank says:

    I have a problem with an import job. I tried to import an API from a website, but it gave me an error.

    It says
    error: SyntaxError: Unterminated object literal (line 140, file “importjson3″)

    I have no idea what is going wrong in this job. Most pages tell me that a \ is missing.

    1. trevor says:


      This error most likely comes from a missing closing brace within the JSON you’re trying to parse. You’ll have to fix the JSON you’re importing for that.


      1. frank says:

        OK, so if I understand, I have to do something about the data that’s going to be imported,
        and not with the import script itself?

        1. trevor says:

          Correct. That is the most likely problem, since “Unterminated object literal” is an error that comes from Google’s JSON parser.

          You can check the validity of your JSON by copying the URL to the JSON, or the full JSON that you get when you access the URL in your browser, into this tool or this tool.
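
          A programmatic equivalent of those validators is simply to attempt a parse. A quick sketch (isValidJson is a hypothetical helper):

          ```javascript
          // Returns true when the text parses as JSON; on failure, the caught error's
          // message describes what is malformed (for example, a missing closing brace).
          function isValidJson(text) {
            try {
              JSON.parse(text);
              return true;
            } catch (e) {
              return false;
            }
          }
          ```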


          1. frank says:

            I’m still struggling with the data. I couldn’t find out what’s wrong with it.
            I’m trying to import this URL.

            It always worked before, but one day I got the error above.

  47. Jim e says:

    I’ve noticed the script seems to have issues when values have empty strings as such

    Both rows will import as having “value” under “key” despite one being empty. I’ve been looking through the code but I’m not certain where the fix may lie.

    1. Jim e says:

      I see what I was doing wrong, and now it’s working exactly as intended. Thank you.

      1. Stephen says:

        How did you fix this/what were you doing wrong? Thanks in advance.

      2. Greg Pearl says:

        I too am interested in how you corrected this issue.

  48. Silver says:

    I’m having an issue with the script not importing ALL of the values for a specific set…


    I’m using the function like so:

    The first part yields the address given above. However, only 2 of the 3 characterId values are being displayed. Specifically it is leaving out the FIRST item in the “characters” element….

    Any idea what the issue is?

  49. John says:


    I love that you’ve put up this script, however I’m having a random issue with my spreadsheet.

    When I use it to import values into my sheet I occasionally get the following:
    Error: SyntaxError. Empty JSON string (line 131).

    I’ve found that if I modify the URL slightly, for example, change http to https and then back again to http, it works. But it means I have to do it for each of the multiple cells I’m importing data for.

    Any ideas on what could be causing Sheets to suddenly find an error and then resolve it upon edit?

    1. Renzo says:

      I have the same issue. For some reason the import stops at this line. When I change the URL or, e.g., the options, it runs through.

      Someone already has a solution for this?

      The script is great BTW!

  50. Jakke says:

    Hey, I love the script, perfect to import what I want from the WoW Armory. However, since I am loading rather big amounts of info (650 rows, 5 columns, almost 3000 cells), I tend to get the famous “Use utilities.sleep(1000)” error often.

    Is there any way that I can tell ImportJSON to not refresh the information it fetches every time, and instead add a button to my custom menu that allows me to update it? It would make it a lot easier for both the Google servers and for the use we have for the spreadsheet (very sluggish until you have had it open for some minutes).

  51. Andy says:

    I love this. I have a feeling it can do something more advanced I want to see, but I’m new to scripting and having trouble with some of the lingo.

    How do I use this to do a series of json imports by inserting different {client} variables into a url formatted like “{client}/search?term={item}”

    I want to find a way to insert the list of {client1}/search?term={item}, {client2}/search?term={item}, {client3}/search?term={item}…

  52. Mike says:

    Does the google sheet make a local call?
    That is, if the endpoint is in my network and not publicly available, is it reasonable to expect results? Or does the endpoint have to be public?

  53. Mickael says:


    It looks really useful, two questions:

    1. Do you know how to import this script onto the NEW google spreadsheets? I created the script, but then none of the functions appear on the new spreadsheets… do we need to do anything else?

    2. Do you know the query that I’d have to run here, , to get the symbol just for the LSE exch for example?

    Many thanks in advance!!!

    1. Mickael says:

      I managed to see the function in the new spreadsheet, however, a basic try to import it with:


      and I’m getting:
      Error: Syntax Error. Unexpected token: Y (line 166)

      I saw that what’s coming back in the browser starts with Yahoo( and there’s a closing bracket at the end as well). Do you know how to remove that part so that we just have real JSON?
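
      That response is JSONP: plain JSON wrapped in a JavaScript callback call. ImportJSON fetches the URL itself, so the stripping would have to happen inside a customized copy of the script, before its JSON.parse call. A sketch of the stripping step alone (stripJsonp is a hypothetical helper):

      ```javascript
      // Remove a JSONP wrapper such as "Yahoo.Finance.Callback({...});" so the
      // remainder parses as plain JSON. Text without a wrapper is returned unchanged.
      function stripJsonp(text) {
        var match = text.match(/^[\w.$]+\(([\s\S]*)\)\s*;?\s*$/);
        return match ? match[1] : text;
      }
      ```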

  54. Maximaal says:

    Hi, getting errors.
    Does this not work with the NEW Google spreadsheets?
    Does someone have a fix for this script?

  55. Alex says:

    Hi, I’ve got a problem with that import tool.
    When I import this:
    instead of 0 values, it copies the value from the row before.
    Could you let me know what to do?
    Also, when I try to get an inner query, it only reads the second one and forgets about the first.

  56. Henry says:

    Hello, great script and it works well. I wanted to use it with a database I’ve started on. I can use HTTP requests, but it uses HTTPS.

    I added the username/password script that you posted here, but it didn’t seem to work. Here is the request format:

    Which is discussed here:

    Getting this to work would be a great tool to be able to access the database in my spreadsheet and standalone apps that use the same database.

    Thanks for your great work and answering so many questions,

  57. Frank Braker says:

    Hello. I’m trying to use the JSON output from a mongoDB (via MongoHub). This produces something like this (where each record is contained in a pair of braces).

    { "time": "01:28:10 AM",
    "milliseconds_since_epoch": 1422408490495,
    "date": "01-28-2015" }
    { "time": "01:28:10 AM",
    "milliseconds_since_epoch": 1422408490495,
    "date": "01-28-2015" }

    Is there any way your script can parse this (i.e. do the options support this format)?

    1. Frank Braker says:

      Okay – I got it by surrounding the whole file with a leading [ and trailing ], and adding commas at the end of the surrounding {…} pair. Luckily this is easy because each record is newline terminated. Easy to do with vi using regular expressions.
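
      The same repair can be scripted instead of done by hand in vi. A sketch under the assumption that the export holds one complete {…} record per line (wrapRecords is a hypothetical helper):

      ```javascript
      // Turn newline-delimited JSON records into one JSON array: join the non-empty
      // lines with commas and wrap the result in square brackets.
      function wrapRecords(text) {
        var records = text.split('\n').filter(function (line) {
          return line.trim() !== '';
        });
        return '[' + records.join(',') + ']';
      }
      ```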

  58. emin says:

    Hi friends, how can I get the live sell orders with this code? lasttrade was working awesome, but I cannot find a way to read the sell orders, either just one or all of them. An example please!

    echo print_r($obj->{'return'}->{'markets'}->{'DOGE'}->{'lasttradeprice'}."\n", true);

  59. sarath says:

    Hi, is it possible to recreate the JSON from the spreadsheet? So that we can make it a DB sync app for low-security applications. After editing on the spreadsheet, generate JSON and make a PUT request to the URL.

    1. trevor says:

      Not with this library. You’d have to write a different library to do that, or find another one already written.

      If you have other JavaScript code that needs to read your Google Spreadsheet, there’s other JavaScript libraries that read from Google Spreadsheets directly. Miso is a decent one.

  60. Benjamin says:

    Hey Trevor,

    First of all, great work. This is so useful for our business, and we really appreciate your expertise, time, and attention to helping the community by creating ImportJSON.

    I’m dealing with paginated JSON. I read through the comments here and saw that someone else asked about this, and this was your response:

    Would you be so kind as to provide a more specific example of how this might be accomplished? I’m afraid I have very little experience with building scripts.

  61. Benjamin says:


    I’m following the steps here:

    It seems that I can’t delete rows of data that are imported via ImportJSON. If I delete the first row in a Google Spreadsheet (the one that contains the formula) it messes up all of the rows below it. If I try to delete any row besides the first, it simply won’t let me delete it – I presume it’s because it’s connected to the JSON URL that I specified.

    Is there a way around this? Or, do you think this predicament will change after implementing a script to access the paginated JSON that I need as I mentioned above?

  62. Benjamin says:

    Okay, I’m using this script here:

    I added a loop to line 94 which is accessing the paginated JSON. The problem is, I’m not sure how to append/concatenate the results, so it only returns the first page to my spreadsheet.

  63. Philippe Siguret says:

    Your script is great!!!
    I’ve tried it with lots of public JSON feeds and it works great.
    Then I tried to use it with a POST request, and that’s another matter.

    the request I have to use is :
    Query the API at :
    – Make a POST request to my API with these three parameters:
    name : The full name to search
    domain : The domain to search against
    token : My token

    I don’t know how to do it. It connects to the server, but does not send the name/domain/token.

    Here’s the formula I’ve tested:

    I’ve tried for a few hours, but I do not know how to do it.

    Thanks a lot if you answer !

  64. Tony says:

    Is it possible to pass through tokens via the header?

    Ideally, I would like to do something like this:

    var auth_token = 'tokendfhjklsdfhjkl3w4asdfjldsrf';
    var url = "",
        options = {
          "headers": {
            "X-API-KEY": auth_token
          }
        };

  65. James says:

    Hi, I’m using this function a lot to pull in stats and data from a public API that I promote –

    It works brilliantly but I’m having trouble when I try to use it too many times in one sheet. I need a way to call the function sequentially with a small time delay.

    As an example see

    The first 4 columns are populated from a single call to the Wikipedia API returning data on 500 landmarks. The 5th column runs a separate call per row to find the number of results the Europeana API will return for each landmark, so 501 calls in total.

    Do you have any tips how I might get this to work more elegantly, for example by only calling the function once the cell above has completed. If it takes 10 minutes to complete loading that’s not an issue to me!

    Thanks, James

    1. trevor says:


      You can try the solution in this comment.

      I also proposed a different solution in this comment that didn’t seem to work as well.

      But the other thing you can try is to write your own JavaScript function that calls ImportJSONAdvanced multiple times with different parameters and concatenates the results. This would then run in a single thread and the sleep function would work as intended.
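
      The concatenation step of such a wrapper can be sketched in plain JavaScript (concatResults is a hypothetical helper; each input is assumed to be the two-dimensional array ImportJSON returns, with the header row first):

      ```javascript
      // Merge several ImportJSON-style results into one table: keep the header row
      // from the first result and append only the data rows from the rest.
      function concatResults(results) {
        var combined = [];
        for (var i = 0; i < results.length; i++) {
          var rows = (i === 0) ? results[i] : results[i].slice(1);
          combined = combined.concat(rows);
        }
        return combined;
      }
      ```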

      Alternatively, if you control the API, you could add a function that lets you pass in multiple parameters and get an array of results back. This would let you reduce the number of calls dramatically.


      1. James says:

        Thanks Trevor. I can see the logic of the first option but it’s still proving problematic. Some cells work fine, but many return “Error. Internal error executing the custom function”. I think it’s just the sheer number of calls. I wonder if having each cell call the next one when it has loaded would work? Or even ten at a time. Not quite sure how to do that though!

        1. trevor says:

          Yes, having each cell call the next one when it has loaded should work.

          The problem is likely that Google Spreadsheet is running each of the formulas asynchronously, so they all start getting requested at once.

          One approach would be to experiment with sleep times that are a multiple of the row number (e.g. row * 100ms would load 10 rows a second), so the requests are staggered.
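
          The staggering works out to a one-line calculation (staggeredDelayMs is a hypothetical helper; a 100 ms step means roughly ten requests start per second):

          ```javascript
          // Delay, in milliseconds, before the request for a given row fires.
          // With the default 100 ms step, row 10 waits one second, row 20 two, and so on.
          function staggeredDelayMs(row, stepMs) {
            if (stepMs === undefined) {
              stepMs = 100;
            }
            return row * stepMs;
          }
          ```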

  66. Zubair says:

    I am trying to import a JSON link, but the problem is that it is importing only one of the tree entries; it has like 5 other rows below it. Basically, I just want to import a certain number of rows and columns. How?

  67. Silveress says:

    Hi there!

    I have an array like: [ [item1], [item2]… ]
    However, when running through it, it omits item1; other than that it is working fine.

    JSON sample:


  68. Sky says:

    Hey, man. Thanks for the way cool script. I wrote up a little blog post on how we just used it for a scientific data system called ScienceBase –


    1. trevor says:

      Glad you found the script useful. Nice blog post!

  69. Shiva Shinde says:

    Awesome blog, I really appreciate it.

    =ImportJSON("http://jiraserver/rest/api/2/filter/favourite", "/1.0 GA", "noInherit, noTruncate")

    gives an ERROR! (SyntaxError: Empty JSON string (line 165))

    I have seen a couple of users get the same error above. My URL works only when I am logged in on JIRA (private network); is that a cause?

    My JSON response to http://jiraserver/rest/api/2/filter/favourite is below:

    [@note: Removed JSON so comment doesn't overwhelm the page. -- Trevor]

    Please let me know what could be the cause of this error?

    1. trevor says:

      Yes, the problem is that Google doesn’t have access to the data. You may be logged into JIRA, but Google’s servers are not.

      Currently, ImportJSON doesn’t have support for passing in username and password, but you can add it depending on how the server is set up for authentication. Read this comment for instructions how.

      JIRA may also have a way to pass a token in on the URL for authentication. Read up on the JIRA API to see how their REST API is authenticated.

      If those don’t work, your other choice is to upload the JSON you are receiving to a public site that Google can access. This is, of course, a manual process and not ideal.

      1. Shiva says:

        Sorry, I couldn’t follow the comment link you posted. Please post the link. Once again, thanks a lot for getting me through this. I will make my JIRA REST API resource URL public.

        1. trevor says:

          Sorry, forgot a quote. The comment can be found at:

          1. Shiva says:

            Looks like the only option left is this: “If those don’t work, your other choice is to upload the JSON you are receiving to a public site that Google can access. This is, of course, a manual process and not ideal.”

            I tried your script for authentication, but it is authenticating JIRA, not my private network (server). This is the problem we are discussing.

            The manual process looks fine for now, but my actual task is quite big, so writing manual code may not be a good thing to do. Please let me know if there is any other way, other than the manual one.

          2. trevor says:

            If you are running a server on a private network not connected to the Internet, then there is no way to get Google Spreadsheets to access your data. The Google servers need to be able to connect to the server containing your data to retrieve the data.

            If you cannot make your server available on the Internet, then you’ll need to manually upload your data somewhere.

            Also, note that the authentication code I linked to only works for Basic authentication. If Basic authentication is not turned on for the server you are accessing, then that code won’t work.

  70. NightroadAX says:

    I’m having a bit of trouble when I try to pull a list of items from a json file, the results always seem to be missing the first item. JSON View for this first item always contains a [0] element. (e.g. itemList.items[0].name). I’m guessing this is a pretty simple/common thing but just couldn’t find a solution. Any advice would be greatly appreciated!

    1. trevor says:

      I’m not actively supporting this library right now, but you can check out the answer to a previous issue that seems related here. Hope this helps!

  71. Ben says:

    Hi, Awesome script but I am having trouble converting this into a table format

    Anyone have any ideas on how to do this?

    I have spent a lot of time trying to figure this out!

    Thank you :-D

    1. Ben says:

      I have now figured out how to display this in a table on a Google spreadsheet. Now I have issues with the True and False booleans, as they do not display correctly… actually they do not display at all, just blanks… Also, the biggest issue is that I cannot authenticate my Zendesk.

      I tried using the code in a previous post but still cannot get this to work.

      Has anyone come across this issue before?

      many thanks :-D

      1. Will says:

        Hey, this is a truly useful script – been looking for something to do this for weeks!
        Though I’m also having issues with the True and False values.
        I find that True appears, while False is left blank.

        Still looking for a solution.

  72. FastZ says:

    Hello and thanks for the ImportJSON. I have a question:
    I have a query: ImportJSON(URL, buy + "," + sell, "noHeaders");. As you can see, I am trying to get 2 values (“/pair/buy” and “/pair/sell”), but they are showing in 2 different cells, like this:
    [ buy][sell]
    How can I place the result in one cell like this?
    [ buy / sell]

    1. trevor says:

      No, there is no way to combine values into a single cell using ImportJSON.

      You can write your own function that calls ImportJSONAdvanced and combines the cells programmatically, though.
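
      The combining step of that custom function could look like this (joinColumns is a hypothetical helper; it assumes two equal-length single-value columns, such as the buy and sell prices):

      ```javascript
      // Merge two columns of values into one column of "a / b" strings, the shape
      // a custom function returns so the result occupies a single spreadsheet column.
      function joinColumns(colA, colB, separator) {
        if (separator === undefined) {
          separator = ' / ';
        }
        return colA.map(function (value, i) {
          return [value + separator + colB[i]];
        });
      }
      ```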

  73. Jim T says:

    Hi Trevor! I’m using your script a lot in my sheet, and love it. However, I did once ask for a menu button that lets me manually refresh the data.

    Since I now have my own custom menu, is there any way I can add it there instead? I lose the ability to refresh since my menu overrides yours (which it must, so I don’t lose tons of handy features I need for sorting my sheets).

    Also, is there any way to make it not update/refresh the info unless I manually call for it to do so? I am hitting the Google limit when the Blizzard Armory API (for World of Warcraft) gets issues; it seems like it is trying to pull the info many times, and another script I’ve got updates every time ImportJSON updates its info. It wreaks havoc, so I’ve got a backup and a backup-backup sheet on other users to be sure I get the info in case something goes wrong.

    Wouldn’t mind an email if you got the time, as I am travelling a bit these days and can’t check around on all these websites ;)

    Jim, Norway

    1. Bernio says:

      Hi Trevor,

      Like Jim, I’m using your script a lot. It’s incredibly useful. But now something’s wrong with google spreadsheet.

      At first it reads the values, but if we change something in the URL, it is not automatically recalculated as it was before. In the ImportJSON cell the message #error! (internal error) appears.

      Anything to help would be great.

      Thank you very much

  74. frank says:

    Hi there,

    Still using this great script. Some JSON feeds give errors, but overall it works great.

    I now have the problem that when I try to import a JSON feed, it only gives me back a lot of columns on one row. Is there any option to say that it must break the row and go on to the next row?

    This is the JSON I want to import.

    I think I must pass an option, but I don’t know which one and where.

    1. trevor says:

      Unfortunately, the JSON feed you want to import doesn’t have the structure that ImportJSON was designed for. There’s no easy way to force ImportJSON to read it.

      The structure is fairly straightforward. Your best bet is to have someone code a specific import for this structure.

  75. Michael Perdue says:

    Hello Trevor,

    I have been trying for a couple of days to find a way to download JSON data into a database table (either Stata, Access, Google, or anything similar).
    Earlier today I installed your code for importing JSON with Google. I ran across one problem. When I run a command to fetch data, I run into the following error:
    SyntaxError: Unexpected token: < (line 170, file "")

    line 170 reads:
    var object = JSON.parse(jsondata.getContentText());
    and is from the ImportJSON script you provided.

    The command I entered is the following:
    ImportJSON(";, "actions/acted_at/committee", "noInherit,noTruncate,rawHeaders")

    If you can offer me any advice in this endeavor I would greatly appreciate it. The data I need is in bulk, and there is a lot of it:

    Best regards,

    1. trevor says:


      Your problem is that the URL you are trying to import doesn’t contain JSON data:

      This URL points to a directory listing of CSV and YAML files.

      If you want to import this data into a spreadsheet, all you need to do is download the CSV file, which most spreadsheet programs support. If you want to import into Google Spreadsheets, you can use the built-in ImportData function.

      In general, though, if you get an error on line 170, then it means the JSON URL is formatted incorrectly. This can be because the URL doesn’t point to a JSON source, the JSON is incorrectly formatted or the JSON source is inaccessible.


  76. Mario Muñoz says:

    Hi, I have a problem: I have JSON that is divided across different URLs. Can I read more than one?



    1. trevor says:

      If you want to read more than one, you have to use multiple ImportJSON statements.

      Alternatively, you could write code that uses ImportJSONAdvanced to read each URL and concatenate the results together.

  77. TonyM says:

    Hi Trevor

    I’ve been following your posts and I am overwhelmed.
    I tried using the ImportJSON code, but I keep getting “unknown function” every time I try pasting or writing a formula.

    What can be wrong?


    1. trevor says:

      Have you copied the ImportJSON code into Script Editor for your spreadsheet (via the Tools > Script Editor menu item)?

      You won’t be able to use ImportJSON in the spreadsheet until you have the code copied over. If you’ve copied the code into the script editor and it’s still giving you an unknown function, I don’t know what the problem is.

      1. TonyM says:

        Thanks Trevor for the quick response
        I followed the steps just like you stated, but it’s still not working. Could it be that Google is changing something?

        This is the doc I’m working with:

        1. trevor says:

          I just pasted a function from my test worksheet into yours and it seems to work fine. What are you entering?

          1. TonyM says:

            Thanks Trevor

            I inserted the code below at line 128:

            // TonyM – ImportJSON secured REST JSON

            function ImportJSONBasicAuthentication(url, query, parseOptions, username, password) {
              var username = 'tm_user';
              var password = 'tm_paswrd';
              var fetchOptions = {
                "headers": {
                  "Authorization": 'Basic ' + Utilities.base64Encode(username + ':' + password)
                },
                muteHttpExceptions: true
              };

              return ImportJSONAdvanced(url, fetchOptions, query, parseOptions, includeXPath_, defaultTransform_);
            }


            =ImportJSON("", "SELECT%20customerNumber%20FROM%20Customer", "", "tm_user", "tm_paswrd")

            What am I doing wrong?

        2. trevor says:

          A couple problems with what you’re doing:

          1. Use ImportJSONBasicAuthentication as the function

          You’ve defined a separate custom function called ImportJSONBasicAuthentication. You need to use this instead of ImportJSON in your spreadsheet:

          =ImportJSONBasicAuthentication("", "SELECT%20customerNumber%20FROM%20Customer", "", "tm_user", "tm_paswrd")

          2. Remove var ‘username’ & var ‘password’

          In the ImportJSONBasicAuthentication function you define, you have username and password as parameters, but then you override them by defining local parameters with the same name. Remove these to allow the variables to come through the function call.

          3. Remove “SELECT …” as parameter

          The ‘query’ parameter is not a SQL query. It is a comma-separated list of JSON paths to import. See the examples at the top of this page.

          Even if you do all these things, this may not solve your problem. Trying to access the URL directly returns the result:

          903 - Internal system error.

          This means your basic authentication is not likely to work, since directly accessing a URL protected by basic authentication should result in a 401 Not Authorized (or, before that, a dialog should pop up on the browser).

          The service appears to be written poorly, since there is no 903 HTTP response code (an internal system error should be a 500-level code).

          I’d recommend contacting the developers of the API you’re trying to access to see how/if you can access it in the way you want.

          I also haven’t tested the basic authentication code you’ve added, so can’t confirm this works. Good luck.

          1. TonyM says:


            I’ve sent you a mail with the link to the Google sheet, and the right API.

            Please let me know what I am doing wrong.

  78. Alan Crookes says:


    I’m trying to get child elements from a JSON array but keep getting #REF!.
    Not sure if I’m doing it right but what I have is

    =importjson("", "/fixture_history/all/0/1", "noInherit")

    The original JSON data has nested data and I am using your script 1.2.1.

    Also, can I use more than one “parseOption”?

    1. trevor says:

      The problem is that your query ("/fixture_history/all/0/1") is referencing array indexes and ImportJSON flattens those into a single value. For this data, there is no easy fix for this. ImportJSON isn’t designed to pick elements out of arrays within JSON data.

      As far as using more than one parseOption, yes, that’s possible. Just use a comma to separate each option.
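
      If you’re curious how a comma-separated option string like that can be checked, here is a rough sketch in plain JavaScript, modeled loosely on the library’s internal hasOption_ helper (illustrative only; the real implementation may differ):

```javascript
// Rough sketch of checking one option inside a comma-separated
// parseOptions string such as "noInherit,rawHeaders".
// (Illustrative only; ImportJSON's internal helper may differ.)
function hasOption(options, option) {
  // Split on commas and test membership, case-insensitively
  return options.toLowerCase().split(",").indexOf(option.toLowerCase()) >= 0;
}

console.log(hasOption("noInherit,rawHeaders", "rawHeaders")); // true
console.log(hasOption("noInherit,rawHeaders", "noTruncate")); // false
```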

      1. Alan Crookes says:

        OK, thanks. So if there are nested elements, what parameter should I put into the query to get all the values in one?

        I have tried using ("/fixture_history/") and I am still getting the #REF!.

        Am I right in saying that since this is nested it can’t be displayed?

        Thanks for the reply :)

        1. trevor says:

          Using ‘/fixture_history/’ should work. The following is working fine for me:

          =ImportJSON("", "/fixture_history/", "noInherit,rawHeaders")

          If this isn’t working in your version, I don’t know what the problem could be. But it should work.

  79. Matej Petr says:


    I’m trying to fetch some data from reddit using function below.

    =ImportJSON("", "/data/children/data/score", "")

    I want it to read only ‘score’ column, but the function also imports ‘score_hidden’ column. Is there a way to fix it?


    1. trevor says:

      No, there’s no way currently to fix that. The query path is looking for paths that start with the given path, and both ‘score_hidden’ and ‘score’ start with the same path.
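
      To see why both columns match, note that the query acts as a path prefix. Here is a standalone sketch of that behavior (pathStartsWith is a hypothetical helper for illustration, not ImportJSON’s actual function):

```javascript
// Why querying "/data/children/data/score" also imports "score_hidden":
// the query is treated as a path *prefix*. (pathStartsWith is a
// hypothetical helper for illustration, not the library's code.)
function pathStartsWith(path, query) {
  return path.indexOf(query) === 0;
}

console.log(pathStartsWith("/data/children/data/score", "/data/children/data/score"));        // true
console.log(pathStartsWith("/data/children/data/score_hidden", "/data/children/data/score")); // true as well
console.log(pathStartsWith("/data/children/data/title", "/data/children/data/score"));        // false
```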

      1. Matej Petr says:

        Thanks for quick response.

        I managed to fix it. I modified your code a little.

        function ImportJSON(url, query, options) {
          var imp = ImportJSONAdvanced(url, query, options, includeXPath_, defaultTransform_);
          if (query.indexOf("score") > -1) {
            return getCol(imp, 0);
          }
          return imp;
        }

        getCol returns the first column of the 2D array.

  80. Dwight says:

    Hi Trevor! Love your script! However, when I run it, it seems to be consistently omitting the first row of data. Do you know why that’d be?

    1. trevor says:

      See the solution posted in issue 8 on GitHub. This seems to work for some people, but I haven’t had time to test it and merge it into the main code.

  81. James says:

    Hi, I keep on coming back to this script and use it across many sheets. But each time I want to use it I have to add it as a new script on that sheet. I had a quick look into this and it seems you could publish this as an add-on and then it would be simple for anyone to add. Maybe I’m wrong, but there’s more info at


    1. trevor says:

      Thanks. I’m extremely busy right now, but I’ll see what I can do. It’s unlikely to be for a couple weeks. I’ll update the top of the page here and leave another reply when it’s done.

  82. Andy Morris says:

    I know this is probably simple, but I don’t understand JSON or paths and am trying to do a little hack. That is the JSON URL. There are “columns”, which are the column labels, and “records”. I used =ImportJSON("", "", ""), which lists the column labels in separate rows instead of labels across columns over the records. I have a way to hack this within Excel, but how can I use this function to view everything properly?

    1. Andy Morris says:

      O.k., I did find a way to get what I wanted. But I think this is a janky workaround, so I wouldn’t mind knowing the answer to my question.
      First paste this:
      =TRANSPOSE(ImportJSON("", "/columns/label", "noHeaders"))
      then below it:
      =ImportJSON("", "/records", "noHeaders")

      1. trevor says:

        That seems like a reasonable solution.

        The URL you pointed to is basically trying to take record data and format it as JSON. The approach taken isn’t how you’d normally use JSON data, which is usually meant to represent hierarchical object trees rather than data rows.

        In general, flat structured record data like this is better represented using CSV. CSV is a more compact format that better reflects the structure of the data. Additional header rows can be added to represent data types.

        If the data was formatted as CSV, you’d use the built-in ImportData() function for your Google spreadsheet.

        But given what you’re working with, your solution is reasonable.

  83. Chad Specter says:

    Hello. First of all thanks for this wonderful script. So I am trying to import data from a service that requires a bearer token. In curl I use something like this

    -H “Authorization: Bearer token_id”

    to send the header. How do I construct the call in Sheets using your script? I tried out the ImportJSONBasicAuthentication function, but it wasn’t working. Any help appreciated.


    1. trevor says:

      It may be possible to do this with ImportJSONAdvanced by creating a custom function in a script. UrlFetchApp supports arbitrary headers, so you could pass the token using your Authorization HTTP header.

      Try adding this function into the ImportJSON codebase:

      function ImportJSONWithToken(url, query, parseOptions, token) {
        var fetchOptions = {
          "headers" : {
            "Authorization" : token
          },
          "muteHttpExceptions" : true
        };
        return ImportJSONAdvanced(url, fetchOptions, query, parseOptions, includeXPath_, defaultTransform_);
      }

      I didn’t have time to test this. But let me know if this works.

      1. Chad Specter says:

        Hi Trevor,

        Thanks for the quick response. I think it’s working, but the API I am pinging is responding with XML by default. I’ll figure it out though. Great script.

      2. Kees Romkes says:

        This works, but change the code into:

        function ImportJSONWithToken(url, query, parseOptions, token) {
          var fetchOptions = {
            "headers" : {
              "Authorization" : "Bearer " + token
            },
            "muteHttpExceptions" : true
          };
          return ImportJSONAdvanced(url, fetchOptions, query, parseOptions, includeXPath_, defaultTransform_);
        }

        In case you’re using ‘Bearer’ authorization as well (e.g. importing Asana data).

  84. Tavis Elliott says:

    Hi Trevor, excellent script!

    I’m using it to import data from the Guild Wars 2 API into a Google spreadsheet. Most of the API returns only IDs for items. I would like the names, but no problem: I then add another import that concatenates the IDs together to ask for them, like:,66924,43766

    My problem: The GW2 team has decided that the order of the JSON returned is random (probably for performance on their end), and there’s no way I can make them order the IDs in the returned JSON the same as I requested.

    I think I’m looking at needing to write my own function that uses ImportJSONAdvanced() to take the array and then sort the results, but any bright ideas you might have would be most appreciated!

    1. trevor says:

      You may be able to use the SORT function from Google Spreadsheet to do the sorting after ImportJSON has finished importing the data.

  85. Alex says:

    Thanks for the great work Trevor, keep it up!

  86. Andy Morris says:

    Maybe I missed this. But how often does this pull fresh data (refresh) from the API link? Will it do it at all when sheet is closed or only when sheet is open and tab is active? Will it only refresh if I edit cell that the IMPORTJSON is in? Just trying to figure out how fresh my data is. I have been using this but can’t easily tell.

    1. trevor says:

      Sorry, I don’t know the answers to your questions. That depends on the current implementation of Google Spreadsheets. Try searching or asking on the Google Docs Help forum.

  87. Philip Nye says:

    Hi Trevor,

    Firstly, thanks for creating this tool – very useful.

    I’m currently experiencing a problem, though, which I was hoping you might have some insight into.

    I have a Google Sheet that makes four ImportJSON calls.

    It pulls in data that is new each day. It’s worked fine for months, but as of October 20 has ceased to work – when I check the spreadsheet now it always displays October 20′s data.

    Here are the things I’ve tried, and the results:

    1. Reloading the page without using the cache (Ctrl + F5) – no change
    2. Loading the spreadsheet in Firefox (I mainly use Chrome) – the same as with Chrome; 20 October data
    3. Reauthorising the ImportJSON code – I did this by trying to run the ImportJSON code in Script Editor, as I think I had to the first time I used it. Doing so obviously throws an error, as there’s no URL from which to carry out the import, but I was prompted to authorise the code (which surprised me, as, like I say, I thought I’d authorised it when I first started using it). In any case, this didn’t make any difference.
    4. Using the exact same ImportJSON code in a new spreadsheet – in this case, I did get up-to-date data i.e. fresh 30 October data. So obviously I could just recreate the dashboard that I’ve made in Sheets, but I’d rather not do that if it’s not necessary (and given I’d be concerned the same issue could arise again).

    If you had any thoughts on other things to try then that would be much appreciated – it’s a great tool, so I’m missing being able to use it!

    Thank you.

    Philip Nye

    1. trevor says:


      That sounds like a problem with Google Spreadsheets. Maybe they released a new version. Either way, it’s outside my expertise. If ImportJSON works in a new spreadsheet, then you’ve eliminated the code as the problem. Good luck on solving it.


      1. Herbert Huyo says:

        Hi Philip.

        I stumbled upon your blog today because I was searching for a way to get the auto-update feature with your IMPORTJSON script. However, I discovered that I have a script that was based on IMPORTXML in Google Sheets.

        I tried substituting IMPORTXML with IMPORTJSON, then BINGO… my sheet auto-updates from time to time…

        Below is the script I have substituted:

        function ImportJSONupdate() {
          var queryString = Math.random();

          var PHP = '=IMPORTJSON("' + SpreadsheetApp.getActiveSheet().getRange('A16').getValue() + '?' + queryString + '")';
          var CNY = '=IMPORTJSON("' + SpreadsheetApp.getActiveSheet().getRange('A19').getValue() + '?' + queryString + '")';
          // (presumably followed by calls that write PHP and CNY back into
          // cells, e.g. setFormula(); those lines were omitted in the comment)
        }

        Hope this helps… Once again, thank you… I like your IMPORTJSON since Google Sheets doesn’t have that function…

    2. Dan says:

      I’ve had the same problem, except I’ve only really been using the function since then. I think it worked for a day and then stopped refreshing.

      A cheat that I’ve found to manually refresh the data is to insert a column. Everything then recalculates. Same when you then remove that column.

      I can’t rewrite the spreadsheet again as it’s massive and took me 3 days to build. I’m going to have to keep manually refreshing it this way unless someone comes up with a bright idea.

      Thanks for the script though Trevor. As somebody who doesn’t know how to code, this has allowed me to pull API data that would otherwise be unavailable to me.

      1. Philip Nye says:

        Thank you both, Trevor and Dan, for the very speedy replies, and for what you suggest.

        As it happens, I’d seen these replies earlier today and was thinking I’d give your suggestion a try, Dan. But now I’ve come to try it, my spreadsheet has updated for the first time in a fortnight! Will definitely keep your suggestion in my pocket for if and when it happens again.

        Has it started automatically updating for you too, Dan?


        1. Dan says:

          No, I’m afraid it hasn’t. It seems that it’s trying to though, as now quite often when I reopen my spreadsheets there are errors where it has failed to pull the data.

          I’m running up to 70 ImportJSON calls on a single sheet due to the way the API I’m pulling data from deals with individual records separately.

          The errors disappear when I go back to my old trick of inserting a column. It’s a shame though because I’m sharing the sheet with other people and they’ll be regularly opening it to errors. I’m also automatically recording data at a set time each day. If there are errors at this time, it doesn’t record the data :(

          1. Philip Nye says:

            That’s a shame. Mine – touch wood – seems just to have started working fine since about a week ago without me having done anything.

            Thanks for the column adding tip, Dan, will remember that if I start having problems again. I just hope yours starts working too!

  88. Vincent REHM says:

    With =ImportJSONBasicAuthentication(=concat(",name,total-price,shipping_address,tags,note,shipping_lines,fulfillment_status&page=",A2)), it seems that some null cells are replaced by the previous non-null cell. How can I fix that? Regards, Vincent REHM

    1. trevor says:

      Use the noInherit option:

      =ImportJSONBasicAuthentication(CONCAT(",name,total-price,shipping_address,tags,note,shipping_lines,fulfillment_status&page=",A2), "", "noInherit")

  89. Tom says:

    Hello Trevor,

    Thanks so much for this!

    I have a problem in my Google spreadsheet. I use your tool to import item data for the game RuneScape. The game has a simple API for each item to get information about it; an example is located here:

    I use a custom function to change the item ID right before the .json in the URL to change which item’s price I want to import. My function only returns the last value from the “Daily” portion of the JSON data. Here is my custom function:

    The problem is that sometimes I get an “Error Loading Data…” in the sheet; from what I’ve read this may be caused by having a large import range. What’s strange is I can make the call from a different cell and it will work just fine… Any idea what might be causing this, or any way to fix it?

    Thanks so much,

  90. stuart gano says:

    Hi, I am calling the above API into a spreadsheet and I am unsure of how to do the last step. I am using =ImportJSON("", "/sponsors/name", "noHeaders"),

    but I don’t want all of the values (4), I only want the first element of the array. Any way to limit this further?


  91. FanJunLee says:

    Can anyone help me?
    I would like to import data from the free WG API for the game World of Warships.

    I tested it with my in-game name. Here is the output:


    Status Meta Count Data 503519284 Nickname
    ok 1 FanJunLee

    I would like to have as output only my nickname – noHeaders doesn’t work.

    How can I fix it?

    1. trevor says:

      You have to use a query path to select a single field along with using “noHeaders”. The query path for the ‘nickname’ field for this API includes your account ID: /data/503519284/nickname.

      Thus, the full ImportJSON would be:

      =ImportJSON("", "/data/503519284/nickname", "noHeaders")

  92. Jason says:

    Love this library! Saves me so much time. Any progress with JSONpath functionality?


    1. trevor says:

      No. I don’t have time to do active development on it. I’m still looking for someone to take it over. If I find someone, I’ll let them know people have been asking about the JSONPath functionality.

  93. Paul Griswold says:

    Thanks for providing this awesome resource. I’m attempting to use it to pull inventory lists from several sites we sell products on like Etsy & Storenvy.

    The problem I’m having, and other folks have mentioned this as well, is that Storenvy only returns 20 records and then at the bottom of what’s returned is:

    pagination: {
      next_url: "[my_api_key]&newer_than_id=726453",
      current_newest_id: 726453,
      current_oldest_id: 72992
    }

    I read your post about using ImportJSONAdvanced in a loop to handle this, but to be totally honest, that’s way beyond my abilities. At best I can tinker with other people’s code, but creating a loop to deal with paginated data is over my head.

    Do you have any suggestions or resources you’d recommend?

    Thanks again for creating this!


    1. trevor says:


      Try looking on or Upwork for a JavaScript programmer you can hire to create a script that will solve your issue.

      Good luck.


      1. Paul Griswold says:

        Ah, I’m not quite ready for that.

        ATM my solution is – since the last thing I get is the URL for the next page of data, I’ve created a new page that calls importJSON and points to the cell containing the URL of the next page. I end up with multiple pages instead of one, but it works for now.

        Again Trevor, thank you very much for providing this!

  94. Dan says:

    Had a major problem this morning pulling API info that normally works with no issues.

    SyntaxError: String contains control character (line 169).

    Trying to pull as follows
    =ImportJSON("", "/data/nationdata/", "noInherit,noTruncate")


    1. Dan says:

      Also, when it did work, it wouldn’t pull the #1 ranked nation. Any idea why?

      Would appreciate any help, particularly in solving the immediate issue of it not pulling anything.

    2. Dan says:

      OK, the main issue is now fixed. There was a dodgy character in one nation name out of the 4000 or so pulled by this API. I got the game creator to fix the nation name.

      I have a bit of an issue with names that have apostrophes in them too. It doesn’t pull the data for them. I can live with it though.

  95. Jeune says:

    There appears to be an issue where a backslash “\” causes the script to fail to parse the data. Is there a fix for that or where in the script would this need to be accounted for? Even better question, why is it having an issue with that character. The API is using it as an escape character before an apostrophe.


    1. trevor says:

      The problem is most likely because apostrophes shouldn’t be escaped in JSON, so an escaped apostrophe is considered incorrectly formatted JSON. You can see the formal JSON spec here.

      ImportJSON uses JSON.parse() to parse the JSON. You could write your own parser that was more lenient, but it’s probably better to get the API to send correctly formatted JSON.
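
      A standalone illustration of that strictness, plus the kind of lenient pre-processing you could do if you can’t fix the API (the replace() step is a workaround I’m suggesting, not part of ImportJSON):

```javascript
// JSON.parse rejects \' because JSON only allows a fixed set of escapes.
var malformed = '{ "name": "O\\\'Brien" }'; // the text contains \' (invalid JSON)

var failed = false;
try {
  JSON.parse(malformed);
} catch (e) {
  failed = true; // SyntaxError: the escaped apostrophe is not valid JSON
}

// Lenient workaround: strip the stray backslash before parsing.
var repaired = malformed.replace(/\\'/g, "'");
var obj = JSON.parse(repaired);
console.log(failed, obj.name); // true "O'Brien"
```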

  96. Leonardo says:

    Hi, my English is not good. Therefore, it will be objective.

    I would like to retrieve this link:|Seattle&destinations=San+Francisco|Victoria+BC&mode=bicycling&language=fr-FR

    What to do?

    Thank you so much.

  97. Herbert Huyo says:

    Thank you for this wonderful IMPORTJSON formula… You have helped me a lot… Now it’s my time to help with the auto-update issues in the new Google Sheets. For those who want their IMPORTJSON auto-updated in Sheets, this is what I did to have it auto-update every minute… I just added the new script below in my script editor, and set it to trigger every minute… Hope this helps…

    function ImportJSONupdate() {
      var queryString = Math.random();

      var PHP = '=IMPORTJSON("' + SpreadsheetApp.getActiveSheet().getRange('A16').getValue() + '?' + queryString + '")';
      var CNY = '=IMPORTJSON("' + SpreadsheetApp.getActiveSheet().getRange('A19').getValue() + '?' + queryString + '")';
      // (presumably followed by calls that write PHP and CNY back into
      // cells, e.g. setFormula(); those lines were omitted in the comment)
    }

    1. Nescaf says:

      Hi Herbert. This script is appearing with an `Illegal character. (line 8, file “ImportJSON”)` error – do you know why this would be the case?

  98. Rob says:

    Having a weird issue where I can no longer see ImportJSON when I type an = sign in my sheet. I have v1.2.1 with the script named, and I copied and pasted the raw code from GitHub. Any ideas?

  99. Vincent REHM says:

    Hello, is it possible to use the POST option with basic authentication? Vincent

    1. trevor says:

      See this comment for details on how to modify the code for basic authentication. You can do something similar with the ImportJSONViaPost function.

  100. Al says:


    Thanks so much for your script.

    I’ve successfully used it to import data to my sheet from a json feed but the values which should be showing as integers seem to have been imported as strings and as such I can’t aggregate them with a function like sum().

    Please help.

    1. trevor says:

      By default, ImportJSON imports all the data as strings.

      You could try forcing the numbers by modifying the defaultTransform_ function. To force every value to a float, at the end you could put:

      if (!isNaN(parseFloat(data[row][column]))) {
        data[row][column] = parseFloat(data[row][column]);
      }
  101. Alex says:

    I have some trouble:
    one API gives me 5 rows,
    but in Google Sheets I see only 4 rows (from the second to the fifth).

    How do I display the first row too?

    1. Red says:

      Hello I am having a very similar problem. I think your reply was awaiting moderation when I replied (below) so I didn’t see it.

      If you figure this out, please let me know. I will do the same. Thank you.

    2. San says:

      Check line 286 or find

      if (i > 0 && data[state.rowIndex]) {

      Replace with:

        if (i > -1 && data[state.rowIndex] && i < value.length - 1) {

      1. trevor says:

        So this has been a tricky line of code. The proposed code:

        if (i > -1 && data[state.rowIndex] && i < value.length - 1) {

        is actually the same as:

        if (data[state.rowIndex] && i < value.length - 1) {

        since i is initialized to 0 and can never be less than 0.

        This solution changes the exception from being the first element in the array to the last element. That may work for some people. Others have proposed:

        if (data[state.rowIndex]) {

        Which definitely works for some problems. I haven’t merged this solution into the main codebase because there was clearly some problem I was solving by putting i > 0, and while taking that out fixes a bug for some people, it may create another bug for others.

        Good luck!

  102. Red says:

    Hello. Firstly, this even existing is a life saver. Part of what I’m doing involves over 200 jsons which look like this:

    {
      "termsOfUse" : "Please use this data feed responsibly, as it can consume significant amounts of server resources if called repeatedly. The feed requires that you be signed into your member account to use. If you're trying to access it programmatically, you may POST your username and API Token to this page in the format",
      "id" : "10676651",
      "state" : "Finished",
      "name" : "Game 1",
      "numberOfTurns" : "6",
      "players" : [ {
        "id" : "9650427823",
        "name" : "Oh",
        "email" : "...@...",
        "isAI" : "False",
        "humanTurnedIntoAI" : "False",
        "hasCommittedOrders" : "False",
        "color" : "#ff4700",
        "state" : "SurrenderAccepted"
      }, {
        "id" : "486201411",
        "name" : "ps",
        "email" : "...@...",
        "isAI" : "False",
        "humanTurnedIntoAI" : "False",
        "hasCommittedOrders" : "False",
        "color" : "#606060",
        "state" : "Won"
      } ]
    }

    For each one, I am trying to pull /players/name but it never returns the first player (in this case, “Oh”). Some of the jsons I want to import from have 4 or 6 players and every time it returns all players except the first and I’m not sure why. I’d appreciate any help. Thanks.

  103. Adam says:

    Hey Trevor,

    As everyone else is saying – awesome script.

    I’m trying to pull Facebook API data with this and the JSON looks something like this:

    "data": [ {
      "actions": [
        { "action_type": "link_click", "value": 37 },
        { "action_type": "offsite_conversion.fb_pixel_add_to_cart", "value": 9 },
        { "action_type": "offsite_conversion.fb_pixel_purchase", "value": 1 },
        { "action_type": "offsite_conversion.fb_pixel_view_content", "value": 85 },
        { "action_type": "post_like", "value": 20 },
        { "action_type": "page_engagement", "value": 57 },
        { "action_type": "post_engagement", "value": 57 },
        { "action_type": "offsite_conversion", "value": 95 }
      ],
      "ad_id": "6041647382313",
      "date_start": "2016-03-27",
      "date_stop": "2016-03-30"
    } ]

    Now, I want to pull out of here the value from the “offsite_conversion.fb_pixel_add_to_cart” field.

    I tried the following:


    (where my URL is in A1,) but it didn’t seem to like that.

    Also tried a bunch of other combinations which didn’t work.

    Any tips would be greatly appreciated.


    1. trevor says:

      Your problem is that “offsite_conversion.fb_pixel_add_to_cart” isn’t a field, it’s a value. You would need to use:


      To pull the array of action types, then use a VLOOKUP to find the row whose ‘action_type’ property has the value “offsite_conversion.fb_pixel_add_to_cart” and look up the ‘value’ property in the same row.

      1. Adam says:

        Hey Trevor –

        Getting an ‘invalid argument’ (line 164) using the following call:,adset_id,ad_id,spend,action_values,actions&level=ad&limit=1000&ids=6044683652113,6044683651913,6044622109313,6044621654113,6044620260513,6044619872713,6044590007713,6044354868113,6044354847313,6044349570713,6044218836713,6044217976113,6044175821913&filtering={%27field%27:%27action_type%27,%27operator%27:%27CONTAIN%27,%27value%27:%27offsite_conversion.fb_pixel_add_to_cart%27},{%27field%27:%27action_type%27,%27operator%27:%27CONTAIN%27,%27value%27:%27offsite_conversion.fb_pixel_purchase%27}]&action_attribution_windows=28d_click&time_range={%27since%27:%272016-5-5%27,%27until%27:%272016-5-10%27}&action_report_time=conversion&p=5&access_token=[TOKEN]

        Any ideas? Char limit?

        1. trevor says:

          Not sure. The URL looks encoded enough.

          You can test the character limit by progressively cutting down the URL and seeing if the error goes away (the request doesn’t have to return valid data, it only has to get through line 164).

          You might also want to test chopping off bits of the URL and progressively adding them back in to see if there’s a special character that’s tripping it up. For instance, chop off 25% of the length until you stop getting the error on line 164, then add back in the length you just chopped off and chop off 25% of that. Keep going until you’re removing one character at a time and find the character causing the problem.

          The only other thing I can think of off the top of my head looking at the URL is to consider encoding the colons, in case the Google fetch code is mistaking these as part of the protocol.

          Good luck.

          1. Adam says:

            Hey Trevor –

            Just for your knowledge and anyone else that might encounter this problem –

            The function seems not to like curly brackets, { and }. I encoded them in the URL as %7B and %7D respectively and that seemed to do the trick.

            Thanks again for all your help.
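
            For anyone hitting the same issue, encodeURIComponent does this encoding automatically (a general JavaScript sketch, not ImportJSON-specific; you would build the parameter value before passing the URL to ImportJSON):

```javascript
// Percent-encoding a URL parameter value so curly brackets (and other
// reserved characters) don't trip up the fetch.
var filter = "{'field':'action_type','operator':'CONTAIN'}";
var encoded = encodeURIComponent(filter);

console.log(encodeURIComponent("{")); // "%7B"
console.log(encodeURIComponent("}")); // "%7D"
console.log(encoded.indexOf("{"));    // -1 (no raw braces remain)
```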

  104. Bram Flowers says:

    Hey, Trevor,
    Thanks so much for the amazing library here. Any idea why the entire JSON isn’t being parsed from this link:

    If I put into a JSON parser, I get all the results, but on google sheets, I am missing all of the eV2 values and only Hole 18 gives me all the stats.

    Not using any queries or parameters, just simple:

  105. Obed Vazquez says:

    Dude, I want to suggest an edit. I’m not sure what you are doing in there, but on line 403 you are ignoring 0s:

    what if I need the 0s to appear?

    Instead of:

    if (!data[row][column]) {
      if (row < 2 || hasOption_(options, "noInherit")) {
        data[row][column] = "";
      } else {
        data[row][column] = data[row-1][column];
      }
    }

    I’m changing it to:

    if (!data[row][column]) {
      if (row < 2 || hasOption_(options, "noInherit")) {
        if (data[row][column] != 0)
          data[row][column] = "";
      } else {
        data[row][column] = data[row-1][column];
      }
    }

    But I’m not sure at all of what I’m doing so, I hope you can help me because this is provisional, but I hope that I don’t mess the whole thing up.

    1. trevor says:

      I no longer use tests for falsy values in my code for this reason. If you want to correct this code, you need to do it at the outer conditional:

      if (data[row][column] == null) {
        if (row < 2 || hasOption_(options, "noInherit")) {
          data[row][column] = "";
        } else {
          data[row][column] = data[row-1][column];
        }
      }

      However, note that since the ‘data’ is being parsed from JSON, any zeros stored as JSON strings will come through as "0", and the test !data[row][column] evaluates to false for them (since "0" is truthy, while the number 0 is falsy). The inheriting bug only bites when the zeros are actual JSON numbers.
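
      The truthiness behavior behind all this, as a standalone check:

```javascript
// Falsy-value pitfalls behind the inheriting bug: numeric 0 is falsy,
// but the string "0" is truthy, so !value treats them differently.
console.log(!0);        // true  (numeric zero looks "empty" to the test)
console.log(!"0");      // false (string zero passes the test)
console.log(!"");       // true  (empty string is also falsy)
console.log(0 != null); // true  (an explicit null check keeps numeric zeros)
```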

      1. Lawrence says:

        Two years later… But I hope you can help.

        I’m currently trying to use your code to import this ( JSON.

        All works like a dream, except for a problem which I sense is related to this.

        Particularly noticeable in the “Links” column: anything with a value of 0 does not print “0” or “ ”, but instead prints the value of the column above. I end up with long lists of supposed “5” links, instead of the zeros.

        Could you offer me a fix – I’m pretty useless at programming so any help would be much appreciated!

        1. trevor says:

          I’m no longer maintaining this code. I recommend checking out the updated code at the new GitHub repo. The issue might already be fixed over there. Good luck!

  106. Tom says:

    When I use this formula, I’m getting an error. I’m trying to get the county for a zip.

    Any Idea? I’m using your script with other JSON APIs with no issue. It’s just this particular one that’s giving me an error.

    =PROPER(ImportJSON(CONCATENATE(""), "/county","noHeaders"))

    SyntaxError: Unexpected token: ( (line 132).

    Thank you for your wonderful work!

  107. ralph r says:

    Hey, nice, but when I drop in the code all I get is a blank cell.

  108. Kelly says:


    Does this hold true for GeoJSON?

    1. Kim says:

      It does! I use the query language that I learned in ArcMap to grab the geographic information, where I take fsURL = baseURL + query and use it as my base for the Google Sheets code.

      Here is an example query from one of the results, with a geographic filter and selected fields, pulling data from the county into a spreadsheet for a town that I work for.,gis.DBO.Parcel_Polygon.PARCEL_ID,gis.DBO.Parcel_Polygon.ACREAGE&Geometry=504053.940454141,2234489.05280089,511367.038413671,2238150.89345791&geometryType=esriGeometryEnvelopee&f=json

      His code is amazing.

  109. Rob says:

    Hi there – I’m trying to use ImportJSONWithToken function you suggested above to use a Google Sheet to access my Clash of Clans data. I added in the ImportJSONWithToken code:

    function ImportJSONWithToken(url, query, parseOptions, token) {
      var fetchOptions = {
        "headers" : {
          "Authorization: Bearer " : token
        },
        muteHttpExceptions: true
      };
      return ImportJSONAdvanced(url, fetchOptions, query, parseOptions, includeXPath_, defaultTransform_);
    }

    The call I’m making is to get the members of my clan (/members), I’ve put my clan ID in B1, and my Developer API token in B2


    When I use ImportJSONWithToken, it tells me “Error
    Attribute provided with invalid value: Header:Authorization: Bearer (line 164)”

    According to the Developer page, the curl for the data I’m trying to access is:

    curl -X GET --header 'Accept: application/json' --header "authorization: Bearer " '[CLAN ID]'

    Any ideas on how to successfully pass the authorization bearer?

  110. Michael says:

    Is there a way to limit the update cycle of the JSON data? My source allows for a limited number of fetches per day, so I wanted to put it on an hourly refresh cycle.

  111. Giuseppe says:

    Really awesome! Thank you very much!

    I’m trying to read data from an API that sits behind AD credentials.
    Can the script also send credentials to the API?

    Thanks in advance,

  112. Marlon says:

    Nice script! This is what I needed, but after importing the script and trying it, I’m getting a formula parse error using the Google JSON data below: ,7.3553838

    Every time I run the script there’s an error: DNS error: http://undefined (line 164, file “Code”)

    Here’s the formula I used: =ImportJSON(",7.3553838")

    Can you anyone help me on this please?

  113. Dídac says:

    Good afternoon. I have a problem: I’m using this in a Google Drive spreadsheet, but I need it to update only every 1 or 2 hours. It updates live, and Drive takes time to recalculate it. Can somebody help me? Thank you.

  114. shahrul says:

    Hello there. If anyone still needs to update automatically every minute, you just need to create a new script, paste this code, then click run:

    function importJSONupdate() {
      var queryString = Math.random();

      var cellFunction1 = '=IMPORTJSON("' + SpreadsheetApp.getActiveSheet().getRange('A203').getValue() + '?' + queryString + '","' + SpreadsheetApp.getActiveSheet().getRange('B203').getValue() + '","noInherit,noTruncate")';

      SpreadsheetApp.getActiveSheet().getRange('A1').setValue(cellFunction1);
    }

    A1 = leave it blank
    A203 = your JSON URL (my data is huge; you can put it in any cell you want)
    B203 = /

    Then it works: =IMPORTJSON("blablabla") will appear in your A1, and you make it run with a trigger every minute. (Sorry for my bad English.)

  115. baco says:

    After some hours trying to find a way to make an HTTP request with headers from a spreadsheet cell, I finally found a way:

    Add the following lines to the script (v1.2.1) at line 108:

    if (postOptions["headers"] != null) {
      postOptions["headers"] = JSON.parse(postOptions["headers"]);
    }

    Then, in a cell, use the ImportJSONViaPost function like this:


  116. yo says:

    How can I do an exact match in the query?

  117. Dieter says:

    Hi Trevor

    Many thanks for this from me too.

    I’m having quite some trouble trying to get the sheet to update automatically. I’ve followed many of the suggestions mentioned in the forum but nothing seems to work. Here are my observations:

    1) Setting a trigger doesn’t seem to fire.
    2) When I try to use NOW() as an additional parameter, I get an error that says “This function is not allowed to reference a cell with NOW(), RAND(), or RANDBETWEEN()”.
    3) Trying shahrul’s suggestion of dynamically creating the call from another cell, I get the error “You do not have permission to call setValue”.

    Furthermore, I would need to run the automated update regularly (Every 10 seconds) and it seems most solutions will only run every minute. Does anybody have any suggestions or a working copy that I could have a look at please?

    1. Joseph Ward says:

      Hey everyone, I figured out how to get it to automatically update every minute.

      First, pick a cell somewhere in your spreadsheet that says “http”.
      Next, write a script that every minute clears that cell and sets the value back to “http”.
      Next, set a trigger to run the script every minute.
      Finally, go to the cell where you are using ImportJSON and, in your URL, replace http with “&Sheet1!O1&” (including the quotes), replacing Sheet1!O1 with the cell that says “http”.

      Hope this helps!

      If you found this helpful, consider sending me a small tip with bitcoin

      Or if you want me to do it for you, send me an email and I’ll do it for $5.
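
      The trick Joseph describes can be sketched as a short script. The sheet name "Sheet1" and cell O1 are the placeholders from his description; adjust them to match your sheet. Because the formula concatenates the cell into its URL argument, clearing and restoring the cell makes the argument change, which forces ImportJSON to recalculate:

```javascript
// Clears the helper cell that the ImportJSON formula concatenates into its
// URL, then writes "http" back, so the custom function sees a changed
// argument and re-fetches. Attach this to a once-per-minute time trigger.
function forceImportJsonRefresh() {
  var cell = SpreadsheetApp.getActiveSpreadsheet()
      .getSheetByName("Sheet1")   // placeholder sheet name
      .getRange("O1");            // placeholder cell holding "http"
  cell.clearContent();
  SpreadsheetApp.flush();  // commit the clear before restoring the value
  cell.setValue("http");
}
```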

  118. Pavel says:

    Greetings. I really liked your script, but I have a question: can it be ported to Microsoft Excel? Excel uses VBA, and I haven’t found anything like your script for Excel.

  119. John Park says:

    I’ve had quite a few issues automatically refreshing the JSON data. I have a separate sheet which collects all my JSON data in order to feed into other worksheets. The only way I could refresh it was to add a row at the top of the JSON sheet, then delete that same row – this forces the script to grab the data. Use a time trigger to execute this script (I run mine every minute).

    Here’s a copy of my script, change the values in square brackets:

    function RefreshJSONData() {

    // The code below creates a new row, then deletes it, in order to refresh the JSON data

    var ss = SpreadsheetApp.getActiveSpreadsheet();
    var sheet = ss.getSheetByName("[SHEET NAME WHERE JSON DATA IS OUTPUTTED TO]");

    // This inserts a row before the first row position
    sheet.insertRowBefore(1);

    // This will delete the first row
    sheet.deleteRow(1);
    }

  120. Avinash says:

    Thank you very much …!
    This is what I was looking for :)

  121. slavi says:

    Hi, how can I import this chart via =ImportJSONViaPost()?

    Thanks in advance! :)

    1. trevor says:

      The maintenance of this project has been transferred to Brad Jasper. For the latest version of the code or questions about how to use it, visit the new project page on GitHub.

14 pings

  1. The training spreadsheet, part 4 | Neil's Ironman Blog says:

    […] a custom script to do this. This is much easier than it sounds. I got all of this from here:, which is a brilliant overview of it. If you don’t want to read it, do […]

  2. How to use triggers with functions that require parameters from the spreadsheet? | Question and Answer says:

    […] I found a simple and great solution to retrieve JSON into a Google spreadsheet. […]

  3. Scraping the Ofsted website. Part 2 – front-end (Javascript and D3) | Philip Nye, freelance journalist - data, Hackney, public policy and more says:

    […] it doesn’t contain an option for importing JSON data – not out of the box anyway. Helpfully, someone has written a Google Apps Script that adds an ImportJSON function to Sheets with minimal […]

  4. Import JSON do Arkuszy Google w 5 minut - Michał Bryś says:

    […] Google. Dane z pliku JSON zaimportujemy do tabeli. Wykorzystałem do tego celu Google Apps Script (importJSON oraz […]

  5. How answering questions proves authenticity in marketing - Christopher S. Penn Blog says:

    […] accessible ONLY by JSON, but that’s outside the scope of the blog post. You’d have to build the custom ImportJSON function from Trevor Lohrbeer to do […]

  6. Converting JSON to Google Sheet | Aug's Blog says:

    […] ImportJSON at Fast Fedora […]

  7. Estadísticas de la Wikipedia como alternativa a Google Trends says:

    […] forma podremos por ejemplo cargar los datos en una hoja de cálculo de Drive (usando la librería import-json) para hacer nuestra propia […]

  8. Pulling policing data into google sheets | says:

    […] links to api calls to pull in boundary data as well as other stuff. It uses a script called importJSON which means you can query the data from the police api directly and have it appear nicely in a […]

  9. Use the search API to get useful information about GOV.UK content | Data at GDS says:

    […] you’ve used the ImportXML function to scrape page data in Google Sheets, you’ll find ImportJSON quite similar. It’s a custom script that’s not built in or supported by Google. It works well […]

  10. IMPORTJSON not pulling data after first time | Mentalist Nuno says:

    […] Using ImportJSON script – […]

  11. IMPORTJSON Google Script giving SyntaxError says:

    […] been using this IMPORTJSON script with lots of success. I installed the JSON API WordPress plugin to create my own JSON […]

  12. Live Updated Cryptocurrency Investment Tracking Spreadsheet - CoinFabrik Blog says:

    […] uses a script based on FastFedora Project which can be optionally downloaded […]

  13. How to Import Coin Market Cap data using JSON - Novel Tech Aust says:

    […] and another on how to put it into your google sheet and how to use it ( This can be used in any Google Sheets document to get import any JSON data. That should be all […]
