Write Files from EC2 to S3 Programmatically

Tucker Clinton
5 min read · Dec 24, 2020

This is a step-by-step tutorial of how to write files from AWS EC2 to S3 programmatically using EC2, S3, IAM, and Python.

We’ll take the following steps to setup a scenario to explore accessing S3 programmatically and writing files to a bucket in the CLI.

  1. Create an S3 bucket
  2. Create an EC2 instance
  3. Create an IAM role
  4. Apply role to EC2
  5. Write python code on EC2
  6. Inspect the file

1. Create an S3 Bucket

  • Navigate to S3 and create a bucket
  • Give the bucket a name (my-ec2-files-project), leave versioning, tags, and encryption at their defaults, and keep the “block all public access” setting enabled.
  • Click Create Bucket; the new bucket will appear in the S3 Dashboard

2. Create an EC2 Instance

  • Go to the EC2 Dashboard and click Launch instance
  • Select the Amazon Linux 2 AMI as the server type
  • Select t2.micro as the instance type and click Review and Launch. Click Launch on the next screen.
  • The popup prompt will ask you to create a key pair, which we will need to do to ssh into our EC2 instance. Create and name the key pair and then download it and launch your instance. I named the key pair “S3EC2FilesProjectKeyPair” for this instance.
  • Navigate back to the EC2 dashboard to view your instance.
  • The instance will be assigned a default security group that allows ssh access from any IP. (To restrict ssh to your own IP, edit the security group’s inbound rules.) In the next step, we’ll create an IAM role that grants the instance access to S3.

3. Create an IAM Role

  • Navigate to IAM Dashboard, click on Roles on the left and then Create Role
  • Select AWS Service under type of trusted entity and choose EC2 as the use case. Click Next: Permissions
  • Search the policies for S3 and choose AmazonS3FullAccess. This allows the instance to have full access to the S3 service. Typically we would choose the most restrictive access that works for our use case, but we will choose all access for this project.
  • Skip the Tags by clicking Next
  • Name the IAM role (I’ll name it EC2-S3-ProjectAccess).
  • Click “Create Role”
  • In the next Step, apply this IAM role to our EC2 instance

4. Apply Role to EC2 Instance

  • To ssh into the instance, select it in the EC2 Dashboard and click “Connect” to view the connection instructions
  • Open an SSH client.
  • Locate the private key file. The key used to launch this instance is S3EC2FilesProjectKeyPair.pem
  • Run this command, if necessary, to ensure the key is not publicly viewable.
chmod 400 S3EC2FilesProjectKeyPair.pem
  • Connect to our instance using its Public DNS:
ssh -i "S3EC2FilesProjectKeyPair.pem" ec2-user@ec2-34-207-190-235.compute-1.amazonaws.com
  • Before adding the role, try to access S3 without adding the role. Run the following command:
$ aws s3 ls
  • We will get the following error: “Unable to locate credentials. You can configure credentials by running ‘aws configure’.”
  • Instead of setting up static credentials with aws configure, we can add a role
  • In the EC2 Dashboard select our instance and click the “Actions” dropdown. In the “Security” selection dropdown, select “Modify IAM role”
  • Find the EC2-S3-ProjectAccess role we just created and click Save.
  • After adding the role, confirm the role works. Rerun the $ aws s3 ls command in the ssh terminal to make sure we do not get the same “unable to locate credentials” error message.
The my-ec2-files-project S3 bucket that we created in step 1 is listed.

5. Write Python Code on the Instance

  • Install Python 3 on the EC2 instance (Amazon Linux 2 instances ship with Python 2, so for this project we will install Python 3)
  • In the terminal, run the following command to see which versions are available
$ sudo yum list | grep python3
  • Python 3 is available, so install it
$ sudo yum install python3
  • To access S3 programmatically, install packages with pip. We’ll use a package called boto3. Install it with command
$ sudo pip3 install boto3
  • Create an empty python file
$ touch my_script.py
  • Next, add logging code to the file. Open the file with vim or nano (I use vim)
vi my_script.py
  • Update the file (press “i” to insert) with the following.
import boto3
from datetime import datetime

cli = boto3.client('s3')
cli.put_object(
    Body='The time now is ' + str(datetime.now()),
    Bucket='my-ec2-files-project',
    Key='ec2.txt')
  • This will write the date and time to the ec2.txt file. Save and exit vim by pressing “esc”, then typing “:wq” and pressing “enter”
  • The boto3 package is used to write a text file called “ec2.txt” to S3 with the current time in a string. From the terminal in your EC2 instance, run the script we just wrote with the command
python3 my_script.py

6. Inspect the File

  • Navigate to S3 and click the “my-ec2-files-project” bucket
  • Select the file with the name we gave it in our script (ec2.txt) and select Download in the Actions dropdown
  • Open the file in a text editor and it should display the contents of what we told python to write in the file

In this project, we created an S3 Bucket, an EC2 instance, and an IAM Role, then assigned the role to the EC2 instance. After installing Python 3, we wrote a script that prompts our EC2 instance to write contents that we defined (the date and time) to a file in our S3 bucket. Opening the .txt file confirms that our Python code is working correctly.
