In this guide, we will learn how to connect to a GraphQL API in Airtable using the Data Fetcher extension. GraphQL is a query language for APIs created by Facebook that is growing in popularity. We will connect to an open-source GraphQL countries API to import country data into Airtable. You can use the same approach to connect to other GraphQL APIs in Airtable, such as the Shopify Admin API or the GitHub API.
Install Data Fetcher from the Airtable marketplace. Data Fetcher is a free Airtable extension that lets you connect to any JSON or GraphQL API. After the extension launches, sign up for a free Data Fetcher account by entering a password and clicking 'Sign up for free'.
On the home screen of the Data Fetcher extension, click 'Create your first request'. Requests in Data Fetcher are how you import data into or send data from your Airtable base.
On the create request screen in Data Fetcher, for Application, select 'Custom'.
Enter a Name for your request, e.g. 'Import Countries'.
Set the request method to POST. GraphQL APIs always require the request method to be POST.
Enter the following URL. This is the base URL for the countries GraphQL API we are using.
Add a GraphQL request body by entering the GraphQL query you want to use, as well as any variables.
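As a sketch of what Data Fetcher sends for this step, the snippet below builds the POST body for a countries query. The endpoint URL and the field names (code, name, capital, currency) are assumptions based on the freely hosted open-source countries API; check your own API's schema before reusing them.

```python
import json

# Assumed endpoint for the open-source countries GraphQL API
# (the freely hosted instance at https://countries.trevorblades.com/).
URL = "https://countries.trevorblades.com/"

# A sample query pulling a few fields for each country. The field names
# are taken from the countries API schema and are illustrative.
QUERY = """
query GetCountries {
  countries {
    code
    name
    capital
    currency
  }
}
"""

# GraphQL APIs expect the query (plus any variables) as a JSON body in a
# POST request, which is why the request method above must be POST.
body = json.dumps({"query": QUERY, "variables": {}})
print(body)
```

The `variables` object is empty here; if your query declares variables, their values go in that object.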
Select the Output Table & View you want to import the data into.
Click 'Save & Run'.
The request will run and the Response field mapping modal will open. This is where you set how the available fields in the GraphQL API response should map to fields in the output table. For each field name in the response, you can either import or filter it. For an imported field, you can choose whether to map it to an existing field or create a new field. You can also set the field type for each new field.
Once you are happy with the mapping, click 'Save & Run'.
Data Fetcher will create any new fields in the output table, then run the request and import the data from the GraphQL API into Airtable.
If there is a unique id field in the GraphQL API response, you can import that and set it as the Update based on field in the 'Advanced settings'. This means even if you move records around in Airtable, Data Fetcher will still be able to match up data from the API response with records in the output table.
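The matching behavior described above can be sketched in Python: records are matched on the unique id field rather than on row position, so reordering rows in Airtable does not break the sync. The record shapes and field names here are illustrative, not Data Fetcher's actual internals.

```python
# Existing Airtable records; note the order differs from the API response.
existing_records = [
    {"id": "DE", "name": "Germany", "capital": "Bonn"},  # stale value
    {"id": "FR", "name": "France", "capital": "Paris"},
]

# Fresh data from the GraphQL API response.
api_response = [
    {"id": "FR", "name": "France", "capital": "Paris"},
    {"id": "DE", "name": "Germany", "capital": "Berlin"},
    {"id": "ES", "name": "Spain", "capital": "Madrid"},  # not yet in Airtable
]

# With "id" as the Update based on field: update matches, create the rest.
by_id = {rec["id"]: rec for rec in existing_records}
for item in api_response:
    if item["id"] in by_id:
        by_id[item["id"]].update(item)  # update the matched record in place
    else:
        existing_records.append(item)   # create a new record
        by_id[item["id"]] = item

print(existing_records)
```

Without a unique id to match on, an updater could only pair records by position, which breaks as soon as rows are reordered.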
At the moment, we have to click 'Run' to import the data from the GraphQL API. We can use Data Fetcher's scheduled requests feature to automatically import the data every 15 minutes, hour, day, etc.
In Data Fetcher, scroll to Schedule and click 'Upgrade'.
A new tab will open where you can select a plan and enter your payment details to upgrade.
Return to the Data Fetcher extension and click 'I've done this'.
Under Schedule click '+ Authorize'.
A pop-up will open and prompt you to authorize the Airtable bases you want Data Fetcher to have access to.
By selecting 'All current and future bases in all current and future workspaces' you'll avoid issues with any unauthorized bases in the future.
Click 'Grant access'.
In the Data Fetcher interface, the 'Schedule this request' toggle will now be switched on.
Select how often you want the request to run, e.g. 'Every 15 mins', then click 'Save'. The request will now run on this schedule and automatically import the GraphQL API data.
If your GraphQL endpoint uses offset or cursor pagination to return all the data, you can set this up in Data Fetcher.
First, enter your GraphQL query (including pagination arguments) and variables into our GraphQL to JSON body converter to convert them to JSON.
Back in the Data Fetcher extension, add this as a JSON request body, rather than a GraphQL one.
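As a sketch, the JSON request body for a paginated query looks like the one built below. The query shape and the `offset`/`limit` argument names are assumptions; use whatever pagination arguments your GraphQL API actually defines.

```python
import json

# A paginated GraphQL query converted to a JSON request body. The
# GetItems query and its offset/limit arguments are illustrative.
body = {
    "query": (
        "query GetItems($offset: Int!, $limit: Int!) {"
        "  items(offset: $offset, limit: $limit) { id name }"
        "}"
    ),
    "variables": {"offset": 0, "limit": 50},
}

# This is the JSON body to paste into Data Fetcher. The JSONPath
# expressions "$.variables.offset" and "$.variables.limit" point at the
# two values Data Fetcher rewrites on each page.
print(json.dumps(body, indent=2))
```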
Scroll down to the bottom of the screen and click "Advanced settings". Then scroll down to Pagination and select "Offset body". Enter the Offset path and Limit path for the GraphQL pagination variables within the JSON. These use JSONPath syntax.
In this example, our Offset path would be "$.variables.offset" and our Limit path would be "$.variables.limit".
Enter the Limit value. This is how many items should be returned in each page. In this example, we'll use 50.
Now when we run the request, the offset/limit values will be replaced with the correct values each time.
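The paging loop can be sketched as follows: the same JSON body is re-sent with the offset variable advanced each time, until the API returns fewer items than the limit. `fetch_page` here is a stand-in for the real HTTP call, backed by a fake dataset.

```python
# Pretend the API holds 120 items in total.
DATA = [{"id": i} for i in range(120)]

def fetch_page(body):
    """Stand-in for POSTing the JSON body to the GraphQL endpoint."""
    offset = body["variables"]["offset"]
    limit = body["variables"]["limit"]
    return DATA[offset:offset + limit]

body = {"query": "query GetItems { items { id } }",  # placeholder query
        "variables": {"offset": 0, "limit": 50}}

all_items = []
while True:
    page = fetch_page(body)
    all_items.extend(page)
    if len(page) < body["variables"]["limit"]:
        break  # a short page means we have reached the end
    body["variables"]["offset"] += body["variables"]["limit"]  # next page

print(len(all_items))  # 120
```

With a limit of 50, this fetches pages of 50, 50, and 20 items, stopping after the short final page.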