Here is a complete, working example of a .gitlab-ci.yml file that meets your requirements.
I have chosen a Python base image for this example because it makes installing the AWS CLI (via pip) extremely straightforward and fast for a test scenario.
Copy and paste this into your repository's CI/CD configuration file.
```yaml
stages:
  - connectivity_test

aws_s3_list_test:
  stage: connectivity_test
  # Using a standard python image makes installing the CLI via pip easy
  image: python:3.11-slim
  before_script:
    # 1. Update pip and install the AWS CLI
    - pip install --upgrade pip
    - pip install awscli
    - aws --version  # Verify installation
  script:
    # 2. Run the AWS command; the CLI reads the env vars automatically
    - echo "Attempting to list S3 buckets..."
    - aws s3 ls
```
For the script above to work, do not hardcode your credentials in the YAML file. You must set them as environment variables in the GitLab UI.
- Go to your GitLab Project.
- Navigate to Settings > CI/CD.
- Expand the Variables section.
- Add the following three variables (a scripted alternative using the GitLab API follows the table):
| Key | Value Example | Note |
|---|---|---|
| `AWS_ACCESS_KEY_ID` | `AKIAIOSFODNN7EXAMPLE` | Make sure to check "Mask variable" |
| `AWS_SECRET_ACCESS_KEY` | `wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY` | Check "Mask variable" to hide it in logs |
| `AWS_DEFAULT_REGION` | `us-east-1` | Required for the CLI to know where to look |
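If you prefer to script this step instead of clicking through the UI, the same variables can be created via GitLab's REST API. This is a minimal sketch: the project ID, access token, and credential values are placeholders you must replace.

```bash
# Sketch: create the three CI/CD variables via the GitLab API.
# <project_id> and <your_access_token> are placeholders.
GITLAB_API="https://gitlab.com/api/v4/projects/<project_id>/variables"

curl --request POST "$GITLAB_API" \
  --header "PRIVATE-TOKEN: <your_access_token>" \
  --form "key=AWS_ACCESS_KEY_ID" \
  --form "value=AKIAIOSFODNN7EXAMPLE" \
  --form "masked=true"

curl --request POST "$GITLAB_API" \
  --header "PRIVATE-TOKEN: <your_access_token>" \
  --form "key=AWS_SECRET_ACCESS_KEY" \
  --form "value=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY" \
  --form "masked=true"

curl --request POST "$GITLAB_API" \
  --header "PRIVATE-TOKEN: <your_access_token>" \
  --form "key=AWS_DEFAULT_REGION" \
  --form "value=us-east-1"
```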
Note on Security: GitLab automatically injects these variables into the pipeline runner's environment. The AWS CLI is built to look for these specific variable names automatically, so you don't need to explicitly pass them in the script command (e.g., aws s3 ls just works).
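As an optional sanity check, you can add one more line to the `script` section. It is not part of the original example, but it confirms which AWS account and IAM identity the injected credentials resolve to, before the job touches S3 at all:

```bash
# Prints the AWS account ID and caller ARN; requires no S3 permissions,
# so it separates credential problems from permission problems.
aws sts get-caller-identity
```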
If this pipeline runs frequently, installing the CLI every time is inefficient. A more optimized approach uses the official AWS Docker image, which has the CLI pre-installed.
```yaml
stages:
  - connectivity_test

aws_s3_list_test:
  stage: connectivity_test
  # Official AWS image - no installation steps needed.
  # The entrypoint must be overridden (nested under image:) because
  # this image defaults to running the 'aws' command immediately.
  image:
    name: amazon/aws-cli:latest
    entrypoint: [""]
  script:
    - aws --version
    - aws s3 ls
```
- "Unable to locate credentials": Double-check that your variable names in GitLab Settings match exactly (
AWS_ACCESS_KEY_ID, etc.). - "Access Denied": The IAM User associated with the keys does not have the
s3:ListAllMyBucketspermission in AWS. - Protected Branches: By default, GitLab variables might be set to "Protected." If you are running this test on a feature branch (not
mainormaster), uncheck the "Protected" flag on the variables in the settings.
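For the "Access Denied" case, here is a minimal sketch of granting the single required permission from a machine with IAM admin access. The user name and policy name are placeholders, not values from your setup:

```bash
# Sketch: attach a minimal inline policy so 'aws s3 ls' succeeds.
# 'gitlab-ci-test' and 'AllowListAllBuckets' are placeholder names.
aws iam put-user-policy \
  --user-name gitlab-ci-test \
  --policy-name AllowListAllBuckets \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "*"
    }]
  }'
```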
Would you like me to write a Terraform snippet or AWS IAM policy to generate the specific user permissions needed for this test?
1. List all S3 Buckets

```bash
aws s3 ls
```

2. List SSM Parameters by Path Prefix

```bash
aws ssm get-parameters-by-path \
  --path "/your/path/prefix/" \
  --recursive \
  --with-decryption \
  --query "Parameters[].{Name:Name, Value:Value}" \
  --output table
```

3. List Subnets for a specific VPC ID (replace `vpc-0123456789abcdef0` with your actual VPC ID)

```bash
aws ec2 describe-subnets \
  --filters Name=vpc-id,Values=vpc-0123456789abcdef0 \
  --query "Subnets[].{ID:SubnetId, CIDR:CidrBlock, AZ:AvailabilityZone, Name:Tags[?Key=='Name'].Value|[0]}" \
  --output table
```
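If you want all three checks inside the pipeline job from earlier, a sketch of the combined job might look like this (the SSM path and VPC ID are placeholders; YAML folded scalars keep the long commands readable):

```yaml
aws_connectivity_test:
  stage: connectivity_test
  image:
    name: amazon/aws-cli:latest
    entrypoint: [""]
  script:
    # Check 1: list buckets
    - aws s3 ls
    # Check 2: list SSM parameters under a placeholder path prefix
    - >
      aws ssm get-parameters-by-path
      --path "/your/path/prefix/"
      --recursive --with-decryption
      --query "Parameters[].{Name:Name, Value:Value}"
      --output table
    # Check 3: list subnets for a placeholder VPC ID
    - >
      aws ec2 describe-subnets
      --filters Name=vpc-id,Values=vpc-0123456789abcdef0
      --query "Subnets[].{ID:SubnetId, CIDR:CidrBlock, AZ:AvailabilityZone, Name:Tags[?Key=='Name'].Value|[0]}"
      --output table
```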