How to Import AWS S3 Data to Airtable with No-Code

May 24, 2023 · Rosie Threlfall

In this simple guide, we'll run through how to import AWS S3 Data to Airtable without the need for any code.

AWS (Amazon Web Services) is a cloud computing platform by Amazon. AWS provides various scalable services, including computing power, storage, databases, networking and machine learning. By using AWS, businesses and individuals can easily access and utilize computing resources on-demand, paying only for what they use.

We'll be importing data from an AWS service called S3 (Amazon Simple Storage Service), a cloud-based storage service similar to Dropbox or iCloud.

Add Data Fetcher to Airtable

We'll be using Data Fetcher to import AWS S3 Data to Airtable. Data Fetcher is a powerful Airtable extension that enables you to import many different types of data from APIs or websites into Airtable.

Add Data Fetcher to Airtable via the Airtable marketplace. Sign up for a Data Fetcher account by entering a password and clicking 'Sign up for free'. Alternatively, you can use your Google login to create a new account. If you already have a Data Fetcher account, use the 'Have an account?' text in the bottom left of the screen to log in.

Data Fetcher Sign Up

Import AWS S3 Data to Airtable

Data Fetcher requests enable you to import data into your Airtable base. To begin, click 'Create your first request' on the home screen of the Data Fetcher extension.

Create your first request in Data Fetcher

For Application, select 'Custom' to set up the AWS Airtable integration.

Select the Custom application in Data Fetcher

For this demo, we'll use an example CSV file of UK government jobs data, which you can find here (or use your own CSV file). This needs to be uploaded to your S3 bucket. You can follow these instructions if you are unsure of this process.
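If you'd rather script the upload than use the AWS console, here's a minimal sketch using boto3, the AWS SDK for Python. The bucket and file names are just the demo values from this guide, so swap in your own.

```python
# Minimal sketch: upload a local CSV to an S3 bucket with boto3 (the AWS SDK for Python).
# The bucket name and file name below are the demo values; replace them with your own.
import boto3

s3 = boto3.client("s3")  # picks up credentials from your AWS config or environment

s3.upload_file(
    Filename="jobs.csv",                # local path to the CSV
    Bucket="jobs-bucket-data-fetcher",  # your S3 bucket name
    Key="jobs.csv",                     # object key (file name) inside the bucket
)
```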

We will then use the AWS S3 API to import our CSV file into Airtable.

The URL to get the file using the API is: https://s3.amazonaws.com/jobs-bucket-data-fetcher/jobs.csv

`jobs-bucket-data-fetcher` is our S3 bucket name, so replace it with your own bucket name.

`jobs.csv` is our file name, so replace it with your own file name. If your file sits inside a folder, you'll also need to add the folder name to the URL before the file name.
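To make the pattern explicit, here's a small sketch that builds the same path-style URL from a bucket name and object key. Note that buckets outside us-east-1 may need a regional endpoint (for example s3.eu-west-2.amazonaws.com) instead of s3.amazonaws.com.

```python
# Sketch: build the path-style S3 URL from a bucket name and object key.
# If your file sits inside a folder, include the folder in the key, e.g. "2023/jobs.csv".
def s3_object_url(bucket: str, key: str) -> str:
    return f"https://s3.amazonaws.com/{bucket}/{key}"

print(s3_object_url("jobs-bucket-data-fetcher", "jobs.csv"))
# https://s3.amazonaws.com/jobs-bucket-data-fetcher/jobs.csv
```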

Enter your URL into the URL field in Data Fetcher.

AWS-S3-Data-to-Airtable-1.jpg

Under Authorization, select 'AWS Signature V4' and enter your AWS Access key and Secret key. For Region, enter the region your S3 bucket is in; ours is 'us-east-1'.

You can find out how to access your AWS keys here. Alternatively, you can use a root user access key. However, it's better security practice to use an IAM user with the minimum permissions required.

This authorization method will also work with any other AWS API. 
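For a sense of what AWS Signature V4 authorization is doing, here's a rough equivalent in Python using boto3, which signs every request with Signature V4 for you. The keys, region, bucket and file name are all placeholders.

```python
# Rough equivalent of the request Data Fetcher makes: boto3 signs the call
# with AWS Signature V4 automatically. All credentials and names are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
    region_name="us-east-1",  # the region your bucket is in
)

obj = s3.get_object(Bucket="jobs-bucket-data-fetcher", Key="jobs.csv")
csv_text = obj["Body"].read().decode("utf-8")
print(csv_text[:200])  # preview the first few rows
```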

Give your request a name, such as 'Import AWS S3 Data' and click 'Save & Run'.

AWS-S3-Data-to-Airtable-2.jpg

Data Fetcher will run the request to AWS, and the Response field mapping modal will open. This is where we can select which fields from the AWS file to import into Airtable, and we can choose how they map to our output table. 

You can select 'Filter all' to clear all the pre-selected fields, then use the 'Find field' search bar to locate the fields you want to import.

Once you are happy, click 'Save & Run'.

AWS-S3-Data-to-Airtable-3.jpg

Data Fetcher will create any new fields in the output table. It will then run the request to AWS and import the data from the CSV file into Airtable. You'll now be able to see the data in your output table.

AWS-S3-Data-to-Airtable-4.jpg
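Data Fetcher handles this whole step for you, but if you're curious what it amounts to, here's a rough sketch: parse the CSV rows and create records through the Airtable REST API. The base ID, table name and token below are placeholders, and the Airtable API accepts at most 10 records per create request.

```python
# Rough sketch of the import step Data Fetcher performs for you: turn CSV rows
# into Airtable records via the Airtable REST API. Base ID, table name and token
# are placeholders; csv_text is the file contents downloaded from S3 above.
import csv
import io
import requests

AIRTABLE_TOKEN = "YOUR_AIRTABLE_TOKEN"
BASE_ID = "appXXXXXXXXXXXXXX"
TABLE_NAME = "Jobs"

rows = list(csv.DictReader(io.StringIO(csv_text)))

url = f"https://api.airtable.com/v0/{BASE_ID}/{TABLE_NAME}"
headers = {"Authorization": f"Bearer {AIRTABLE_TOKEN}"}

# Airtable accepts at most 10 records per create request, so send in batches.
for i in range(0, len(rows), 10):
    batch = rows[i : i + 10]
    payload = {"records": [{"fields": row} for row in batch]}
    requests.post(url, headers=headers, json=payload).raise_for_status()
```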

Automatically Import AWS S3 Data to Airtable

Currently, you'd need to run the request manually each time you want to import new data from your AWS S3 file. With Data Fetcher's scheduling feature, you can instead run the AWS Airtable integration automatically on a chosen schedule.

Scheduling is a paid feature, so in Data Fetcher, scroll to Schedule and click 'Upgrade'.

schedule options

Choose the plan that suits you the best and enter your payment details.

data fetcher upgrade pricing plans.png

In Data Fetcher, click 'I've done this'.

Schedule Data Fetcher

Under Schedule click '+ Authorize'.

Schedule Data Fetcher

A window will now open where you'll need to authorize the Airtable bases you want Data Fetcher to access.

By selecting 'All current and future bases in all current and future workspaces', you'll also avoid having to authorize bases again in the future.

Click 'Grant access'.

schedule-authorize-bases.jpg

In Data Fetcher, you'll see that 'Schedule this request' is now toggled on.

Select a schedule for the AWS Airtable integration to run. You can choose intervals of 'Minutes', 'Hours', 'Days' or 'Months', then click 'Save'.

Schedule this request

If changes are made to the file stored in AWS S3, they will now be imported into Airtable automatically on this schedule.
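For comparison, a do-it-yourself version of this would mean re-running the whole download-and-import script yourself at a fixed interval. The sketch below assumes a sync() function wrapping the steps sketched earlier; Data Fetcher's scheduler does the equivalent in the cloud without you having to keep a script running.

```python
# DIY sketch of a schedule: re-run the S3-to-Airtable sync at a fixed interval.
import time

def sync():
    # Placeholder: download the CSV from S3 and push the rows to Airtable,
    # as sketched in the earlier code blocks.
    ...

INTERVAL_SECONDS = 60 * 60  # e.g. run hourly

while True:
    sync()
    time.sleep(INTERVAL_SECONDS)
```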
