The AWS CLI and AWS SDKs are essential tools for interacting with AWS services from the command line and programmatically. The AWS CLI provides a command-line interface for managing AWS resources, while the AWS SDKs let developers integrate AWS services into their applications using programming languages such as Python, Java, and Node.js.
In this guide, we will:
- Install and configure the AWS CLI.
- Set up the AWS SDK for Python (Boto3).
- Demonstrate AWS CLI commands by creating and deleting an S3 bucket.
- Perform the same operations using Boto3 in Python.
Prerequisites for AWS CLI & SDK
- An AWS account with programmatic access enabled.
- Basic knowledge of AWS services, especially S3 (Simple Storage Service).
- A Debian system with Python 3.x installed (for the AWS SDK demo).
Step 1: Installing AWS CLI on Debian
1.1 Update System Packages
Before installing AWS CLI, ensure that all system packages are up to date. This helps prevent compatibility issues and ensures you have the latest security updates.
Run the following command to update your package lists and upgrade existing packages: `sudo apt update && sudo apt upgrade -y`

1.2 Download AWS CLI Installer
AWS provides a precompiled installer for AWS CLI. Use curl to download it from the official AWS website.
Run the following command to download the installation package: `curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"`
1.3 Extract and Install AWS CLI
Since the AWS CLI installer is packaged as a .zip file, you need to extract it before installation. If you do not have unzip installed, install it first.
Run the following command to install unzip: `sudo apt install unzip -y`

Now, extract the AWS CLI package: `unzip awscliv2.zip`
Finally, install AWS CLI by running: `sudo ./aws/install`

1.4 Verify Installation
To check if AWS CLI has been installed successfully, verify the version: `aws --version`

You should see an output similar to: aws-cli/2.x.x Python/x.x.x Linux/x86_64
Step 2: Configuring AWS CLI
Once AWS CLI is installed, it must be configured with your AWS credentials to interact with AWS services.
Run the following command to start the configuration process: `aws configure`

You will be prompted to enter:
- AWS Access Key ID: Your AWS access key (found in the AWS IAM console under Security Credentials).

- AWS Secret Access Key: Your AWS secret key.
- Default region name: The AWS region you want to use (e.g., us-east-1).
- Default output format: The output format for CLI commands (choose json, table, or text).
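Behind the scenes, `aws configure` saves these values in two plain-text files in your home directory. The entries below use placeholder values; your files will contain your own key, region, and output format.
The ~/.aws/credentials file holds the access keys:
[default]
aws_access_key_id = <your-access-key-id>
aws_secret_access_key = <your-secret-access-key>
The ~/.aws/config file holds the region and output format:
[default]
region = us-east-1
output = json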
To verify that AWS CLI is configured correctly, run: `aws sts get-caller-identity`

This command retrieves details about your IAM identity and should return your AWS account details.
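The response is a small JSON document similar to the following (the identifiers below are placeholders; you will see your own account number and ARN):
{
    "UserId": "AIDAEXAMPLEUSERID",
    "Account": "123456789012",
    "Arn": "arn:aws:iam::123456789012:user/your-user-name"
}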
Step 3: Installing AWS SDK (Boto3) on Debian
3.1 Install Python & Pip
Boto3, the AWS SDK for Python, requires Python and pip (Python’s package manager). Install them by running: `sudo apt install python3 python3-pip -y`

To confirm the installation, check the versions:
- `python3 --version`
- `pip3 --version`
3.2 Install Boto3 SDK
Once Python and pip are installed, install Boto3 using pip: `pip3 install boto3`
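Note: on newer Debian releases (Debian 12 and later), pip may refuse to install packages system-wide because the Python environment is marked as externally managed. If you see that error, install Boto3 inside a virtual environment instead (the path ~/boto3-env below is just an example; you may also need `sudo apt install python3-venv -y` first):
python3 -m venv ~/boto3-env
source ~/boto3-env/bin/activate
pip install boto3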

3.3 Verify Installation
To check if Boto3 is installed correctly, run the following Python command: `python3 -c "import boto3; print(boto3.__version__)"`

If installed successfully, this will print the installed version of Boto3.
Step 4: AWS CLI Demo – Creating and Deleting an S3 Bucket
4.1 Create an S3 Bucket
To store objects in AWS S3, you first need to create a bucket. Each bucket name must be unique across all AWS accounts.
Run the following command to create an S3 bucket: `aws s3 mb s3://my-demo-bucket-12345891243`
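If the bucket is created successfully, the CLI prints a confirmation similar to: make_bucket: my-demo-bucket-12345891243. By default the bucket is created in the region you set with `aws configure`; you can target another region by appending the `--region` option to the command.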

To verify that the bucket was created, list all buckets: `aws s3 ls`

4.2 Upload a File to the Bucket
Create a sample text file that we will upload to the S3 bucket: `echo "Hello AWS S3" > demo.txt`

Now, upload the file to the newly created S3 bucket: `aws s3 cp demo.txt s3://my-demo-bucket-12345891243/`

To confirm the file was uploaded, list the contents of the bucket: `aws s3 ls s3://my-demo-bucket-12345891243/`

4.3 Delete an S3 Bucket
To delete an S3 bucket, you must first remove all objects inside it.
Remove the uploaded file: `aws s3 rm s3://my-demo-bucket-12345891243/demo.txt`

Once empty, delete the bucket: `aws s3 rb s3://my-demo-bucket-12345891243`
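Alternatively, `aws s3 rb s3://my-demo-bucket-12345891243 --force` removes any remaining objects and then deletes the bucket in a single command.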

To confirm the deletion, list all buckets again: `aws s3 ls`

Step 5: Boto3 Demo – Creating and Deleting an S3 Bucket
5.1 Create an S3 Bucket
The following Python script creates an S3 bucket using Boto3:
import boto3
s3 = boto3.client('s3')
bucket_name = "my-demo-bucket-12345891243"
s3.create_bucket(Bucket=bucket_name)
print(f"Bucket {bucket_name} created successfully!")
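Note that create_bucket with no extra arguments only succeeds when the client's region is us-east-1; for any other region, S3 expects an explicit location constraint. A minimal sketch, assuming eu-west-1 purely as an example region:
import boto3
# Create the bucket in a specific region by passing a location constraint
s3 = boto3.client('s3', region_name='eu-west-1')
s3.create_bucket(
    Bucket="my-demo-bucket-12345891243",
    CreateBucketConfiguration={'LocationConstraint': 'eu-west-1'}
)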

5.2 Upload a File to the Bucket
To upload a file using Boto3, use this script:
import boto3
# Define bucket name
bucket_name = "my-demo-bucket-12345891243"
# Create an S3 client
s3 = boto3.client('s3')
# Upload the file to S3
s3.upload_file('demo.txt', bucket_name, 'demo.txt')
print(f"File 'demo.txt' uploaded successfully to {bucket_name}!")

5.3 List Objects in the Bucket
This script lists all objects stored in the specified S3 bucket. If the bucket is empty, it notifies the user accordingly.
import boto3
# Initialize S3 client
s3 = boto3.client('s3')
# Define bucket name
bucket_name = "my-demo-bucket-12345891243"
# List objects in the bucket
response = s3.list_objects_v2(Bucket=bucket_name)
# Check if the bucket contains any objects
if 'Contents' in response:
    print(f"Objects in {bucket_name}:")
    for obj in response['Contents']:
        print(obj['Key'])
else:
    print(f"Bucket '{bucket_name}' is empty.")
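Keep in mind that list_objects_v2 returns at most 1,000 keys per call. For buckets with more objects, a paginator handles the continuation tokens for you; a minimal sketch:
import boto3
s3 = boto3.client('s3')
bucket_name = "my-demo-bucket-12345891243"
# Iterate over every page of results; each page holds up to 1,000 keys
paginator = s3.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket=bucket_name):
    for obj in page.get('Contents', []):
        print(obj['Key'])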

5.4 Delete an S3 Bucket
Before deleting an S3 bucket, you must remove all objects inside it. The following script deletes every object in the bucket and then deletes the bucket itself.
import boto3
# Initialize S3 client
s3 = boto3.client('s3')
# Define bucket name
bucket_name = "my-demo-bucket-12345891243"
# List all objects in the bucket
response = s3.list_objects_v2(Bucket=bucket_name)
# Check if the bucket contains any objects and delete them
if 'Contents' in response:
    print(f"Deleting all objects in {bucket_name}...")
    for obj in response['Contents']:
        s3.delete_object(Bucket=bucket_name, Key=obj['Key'])
        print(f"Deleted: {obj['Key']}")
else:
    print(f"No objects to delete in {bucket_name}.")
# Delete the S3 bucket
s3.delete_bucket(Bucket=bucket_name)
print(f"Bucket '{bucket_name}' deleted successfully!")
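If versioning has ever been enabled on the bucket, the delete_object calls above only add delete markers and the older versions remain, so delete_bucket will still fail. In that case the Boto3 resource interface can clear every version first; a minimal sketch:
import boto3
bucket = boto3.resource('s3').Bucket("my-demo-bucket-12345891243")
# Remove every object version and delete marker, then the bucket itself
bucket.object_versions.delete()
bucket.delete()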

Step 6: Extended AWS SDK Operations
6.1 Automating EC2 Instance Management with the SDK
Launch an EC2 instance:
import boto3
ec2 = boto3.resource('ec2')
instance = ec2.create_instances(
    ImageId='ami-08b5b3a93ed654d19',
    MinCount=1,
    MaxCount=1,
    InstanceType='t2.micro'
)[0]
print(f"EC2 instance {instance.id} launched successfully!")
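The ImageId above is specific to one region, so replace it with an AMI that exists in your configured region. If you want the script to wait until the instance is ready and then print its public IP, the resource object exposes a waiter; a minimal sketch (the instance ID below is a placeholder):
import boto3
ec2 = boto3.resource('ec2')
# Look the instance up by the ID printed by the launch script above
instance = ec2.Instance('i-0123456789abcdef0')
instance.wait_until_running()  # blocks until the instance state is 'running'
instance.reload()              # refresh attributes such as the public IP
# public_ip_address is None if the subnet does not assign public IPs
print(f"Instance {instance.id} is running at {instance.public_ip_address}")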

List all EC2 instances:
import boto3
ec2 = boto3.client('ec2')
response = ec2.describe_instances()
for reservation in response['Reservations']:
    for instance in reservation['Instances']:
        print(f"Instance ID: {instance['InstanceId']}, State: {instance['State']['Name']}")

Terminate an EC2 instance:
import boto3
ec2 = boto3.client('ec2')
instance_id = 'i-05f2004f3d9b991b0'
ec2.terminate_instances(InstanceIds=[instance_id])
print(f"Instance {instance_id} terminated successfully.")

Replace the instance ID with the ID of the instance you want to terminate.
6.2 Managing IAM Users and Policies
Create an IAM user and attach a policy:
import boto3
iam = boto3.client('iam')
user = iam.create_user(UserName='newUser')
policy_arn = 'arn:aws:iam::aws:policy/AmazonS3FullAccess'
iam.attach_user_policy(UserName='newUser', PolicyArn=policy_arn)
print(f"User {user['User']['UserName']} created with S3 access!")
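To clean up after the demo, detach the policy before deleting the user; IAM refuses to delete a user that still has attached policies. A minimal sketch:
import boto3
iam = boto3.client('iam')
# Detach the managed policy first, then delete the user
iam.detach_user_policy(
    UserName='newUser',
    PolicyArn='arn:aws:iam::aws:policy/AmazonS3FullAccess'
)
iam.delete_user(UserName='newUser')
print("User 'newUser' deleted.")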

6.3 Automating Backups with S3 Versioning using the SDK
Enable versioning on an S3 bucket:
import boto3
s3 = boto3.client('s3')
bucket_name = "boto3-sdk-testing"
s3.put_bucket_versioning(
    Bucket=bucket_name,
    VersioningConfiguration={'Status': 'Enabled'}
)
print(f"Versioning enabled on bucket {bucket_name}!")
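To confirm the change, read the bucket's versioning configuration back:
import boto3
s3 = boto3.client('s3')
response = s3.get_bucket_versioning(Bucket="boto3-sdk-testing")
# 'Status' is absent if versioning has never been configured on the bucket
print(response.get('Status', 'Versioning not enabled'))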

Best Practices for AWS CLI & AWS SDK
To ensure secure and efficient use of AWS services, follow these recommended best practices. They help protect your environment, maintain compliance, and reduce potential security risks.
- Use IAM Roles instead of hardcoded credentials for better security (see the sketch after this list).
- Enable Multi-Factor Authentication (MFA) for AWS accounts.
- Follow the Least Privilege Principle when granting AWS permissions.
- Monitor API calls using AWS CloudTrail for auditing.
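As an illustration of the first practice: when your code runs on an EC2 instance (or in Lambda) with an IAM role attached, Boto3 picks up temporary credentials automatically, so no keys appear in the code. On a workstation, prefer a named profile over pasting keys into scripts. A minimal sketch; the profile name 'dev' is an example:
import boto3
# With an IAM role attached to the instance, no credentials are needed in code
s3 = boto3.client('s3')
# On a local machine, select a named profile instead of hardcoding keys
session = boto3.Session(profile_name='dev')
s3_dev = session.client('s3')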
AWS Macie
Running wide scans on S3 data can get expensive. Elite Cloud helps you scope Macie jobs intelligently—so you secure sensitive data without scanning the entire bucket unnecessarily.
Connect with us to optimize security monitoring and cost.
Conclusion
This guide covered installing and configuring AWS CLI and Boto3 on Debian. We demonstrated how to create and delete an S3 bucket using both AWS CLI and Python. Mastering these tools enables efficient cloud management and automation of AWS services.
FAQs
What is the difference between AWS CLI and AWS SDK?
AWS CLI is a tool to manage AWS services via terminal commands, while the AWS SDK allows developers to write code in languages like Python (Boto3) to interact with AWS programmatically.
How do I install AWS CLI on Debian?
You install it using curl and unzip, then run the AWS installer script. After that, run `aws --version` to verify it is installed correctly.
What is Boto3 SDK and why is it useful?
Boto3 is the Python SDK for AWS. It lets you write scripts to automate tasks like creating S3 buckets, launching EC2 instances, and managing IAM users.
Can I perform the same AWS tasks with both CLI and SDK?
Yes, actions like creating S3 buckets or managing EC2 can be done using both AWS CLI commands and Boto3 scripts, depending on your preference or workflow.
What are some security best practices when using AWS tools?
Avoid hardcoded credentials, use IAM roles, enable MFA, apply least privilege permissions, and monitor usage via AWS CloudTrail for better security.