@rexwhitten
Created January 15, 2026 18:36

GitLab
Here is a complete, working example of a .gitlab-ci.yml file that meets your requirements.

I have chosen a Python base image for this example because it makes installing the AWS CLI (via pip) extremely straightforward and fast for a test scenario.

1. The Configuration (.gitlab-ci.yml)

Copy and paste this into your repository's CI/CD configuration file.

```yaml
stages:
  - connectivity_test

aws_s3_list_test:
  stage: connectivity_test
  # Using a standard Python image makes installing the CLI via pip easy
  image: python:3.11-slim

  before_script:
    # 1. Upgrade pip and install the AWS CLI
    - pip install --upgrade pip
    - pip install awscli
    - aws --version  # Verify installation

  script:
    # 2. Run the AWS command using the injected environment variables
    - echo "Attempting to list S3 buckets..."
    - aws s3 ls
```

2. Required Setup: GitLab CI/CD Variables

For the script above to work, do not hardcode your credentials in the YAML file. Instead, set them as environment variables in the GitLab UI.

  1. Go to your GitLab Project.
  2. Navigate to Settings > CI/CD.
  3. Expand the Variables section.
  4. Add the following three variables:
| Key | Example Value | Note |
| --- | --- | --- |
| `AWS_ACCESS_KEY_ID` | `AKIAIOSFODNN7EXAMPLE` | Check "Mask variable" to hide it in logs |
| `AWS_SECRET_ACCESS_KEY` | `wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY` | Check "Mask variable" to hide it in logs |
| `AWS_DEFAULT_REGION` | `us-east-1` | Required so the CLI knows which region to target |

Note on Security: GitLab automatically injects these variables into the pipeline runner's environment. The AWS CLI is built to look for these specific variable names automatically, so you don't need to explicitly pass them in the script command (e.g., aws s3 ls just works).
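If you want the job to fail fast with a readable message when one of these variables was forgotten, a small pre-flight check can run before any AWS call. This is an illustrative sketch, not part of the pipeline above: the variable names are the real ones the AWS CLI reads, while the inline values are the sample credentials from the table, set here only so the snippet is self-contained (in CI they come from the GitLab variables).

```shell
# Fail fast if any required credential variable is missing.
# In CI these are injected by GitLab; the inline assignments below use
# the documentation sample credentials only to make this runnable.
AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
AWS_SECRET_ACCESS_KEY="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
AWS_DEFAULT_REGION="us-east-1"

for name in AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_DEFAULT_REGION; do
  # Indirectly expand $name to check the variable it names
  eval "value=\"\${$name}\""
  if [ -z "$value" ]; then
    echo "FATAL: $name is not set" >&2
    exit 1
  fi
done
echo "All AWS credential variables are present"
```

In the real pipeline you would drop the three inline assignments and keep only the loop in `before_script`.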


3. A "Better" Way (Optimization)

If this pipeline runs frequently, installing the CLI every time is inefficient. A more optimized approach uses the official AWS Docker image, which has the CLI pre-installed.

```yaml
stages:
  - connectivity_test

aws_s3_list_test:
  stage: connectivity_test
  # Official AWS image - no installation steps needed
  image: amazon/aws-cli:latest

  # We need to override the entrypoint because this image
  # defaults to running the 'aws' command immediately
  entrypoint: [""]

  script:
    - aws --version
    - aws s3 ls
```

Troubleshooting Tips

  • "Unable to locate credentials": Double-check that your variable names in GitLab Settings match exactly (AWS_ACCESS_KEY_ID, etc.).
  • "Access Denied": The IAM User associated with the keys does not have the s3:ListAllMyBuckets permission in AWS.
  • Protected Branches: By default, GitLab variables might be set to "Protected." If you are running this test on a feature branch (not main or master), uncheck the "Protected" flag on the variables in the settings.
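For the "Access Denied" case, the minimal policy the IAM user needs is tiny. The sketch below only writes the policy out and syntax-checks it locally (using `python3` as a convenient JSON validator); attaching it to the user, e.g. via `aws iam put-user-policy`, is left to your IAM workflow. The `Sid` and file path are illustrative.

```shell
# Minimal IAM policy for `aws s3 ls`: listing buckets requires only the
# s3:ListAllMyBuckets action. Nothing here is sent to AWS; we just emit
# the JSON and validate its syntax.
cat > /tmp/list-buckets-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowListingAllBuckets",
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "*"
    }
  ]
}
EOF
python3 -m json.tool /tmp/list-buckets-policy.json > /dev/null \
  && echo "policy JSON is valid"
```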

Would you like me to write a Terraform snippet or AWS IAM policy to generate the specific user permissions needed for this test?

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ECRLogin",
      "Effect": "Allow",
      "Action": ["ecr:GetAuthorizationToken"],
      "Resource": "*"
    },
    {
      "Sid": "ECRPushImages",
      "Effect": "Allow",
      "Action": [
        "ecr:CompleteLayerUpload",
        "ecr:UploadLayerPart",
        "ecr:InitiateLayerUpload",
        "ecr:BatchCheckLayerAvailability",
        "ecr:PutImage"
      ],
      "Resource": "arn:aws:ecr:YOUR_REGION:YOUR_ACCOUNT_ID:repository/YOUR_ECR_REPO_NAME"
    },
    {
      "Sid": "S3PushArtifacts",
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::YOUR_TARGET_BUCKET_NAME/cmapp/dev/*"
    },
    {
      "Sid": "SSMUpdateParameters",
      "Effect": "Allow",
      "Action": ["ssm:PutParameter", "ssm:GetParameter"],
      "Resource": "arn:aws:ssm:YOUR_REGION:YOUR_ACCOUNT_ID:parameter/cmapp/dev/*"
    }
  ]
}
```

@rexwhitten (Author) commented:
```yaml
# Define the logic once here
.aws_auth_template:
  stage: audit
  image: amazon/aws-cli:latest
  entrypoint: [""]  # override the image's default 'aws' entrypoint (see above)
  before_script:
    - echo "Assuming AWS Role..."
    # (Insert your assume role export command here)

# Then just list the jobs simply
audit_s3:
  extends: .aws_auth_template
  script:
    - aws s3 ls

audit_vpc:
  extends: .aws_auth_template
  script:
    - aws ec2 describe-vpcs
```
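One common way to fill the "assume role export command" placeholder in the template above is `aws sts assume-role` plus three exports. The sketch below substitutes a canned sample response for the real call so the parsing step is runnable without AWS access; the role ARN, session name, and credential values are all illustrative.

```shell
# In CI, CREDS would come from the real call:
#   CREDS=$(aws sts assume-role --role-arn "$ROLE_ARN" \
#     --role-session-name gitlab-ci --query Credentials --output json)
# A canned sample response stands in here so the parsing is visible:
CREDS='{"AccessKeyId":"ASIAEXAMPLE","SecretAccessKey":"secretEXAMPLE","SessionToken":"tokenEXAMPLE"}'

# Extract one field from the JSON credentials blob
json_field() {
  printf '%s' "$CREDS" | python3 -c "import json,sys; print(json.load(sys.stdin)['$1'])"
}

export AWS_ACCESS_KEY_ID="$(json_field AccessKeyId)"
export AWS_SECRET_ACCESS_KEY="$(json_field SecretAccessKey)"
export AWS_SESSION_TOKEN="$(json_field SessionToken)"
echo "Assumed role; access key id: $AWS_ACCESS_KEY_ID"
```

Note that temporary credentials from STS require `AWS_SESSION_TOKEN` in addition to the two keys used by the long-lived credentials earlier in this document.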
