Track & Audit AWS CLI Commands with Azure DevOps

Who ran it? Captured automatically in the logs.
Command status? Success or failure, recorded instantly.
This Azure DevOps pipeline dynamically executes AWS CLI commands and logs who triggered each command, when it ran, and whether it succeeded to a CSV file hosted in S3. With automated logging you gain visibility, transparency, and auditability across your cloud operations.
Every run is tracked: no more guessing.
YouTube link: https://youtu.be/mLIYsHW3qNE
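Each run appends one CSV row: executor, timestamp, command, result. The snippet below is a standalone sketch of how that row is assembled, using the same quoting as the pipeline script ("Pradeep R" and "aws s3 ls" are placeholder values, not real run data):

```shell
#!/bin/sh
# Mirror the pipeline's LOG_ENTRY construction outside Azure DevOps.
EXECUTOR="\"Pradeep R\""                      # placeholder for $(Build.QueuedBy)
TIMESTAMP=$(date -u +"%Y-%m-%dT%H:%M:%SZ")    # ISO 8601 UTC, as in the pipeline
CLI_COMMAND="aws s3 ls"                       # placeholder for the pipeline parameter
RESULT="success"

# Executor and command are wrapped in literal quotes so spaces inside
# them stay within a single CSV field.
LOG_ENTRY="$EXECUTOR,$TIMESTAMP,\"$CLI_COMMAND\",$RESULT"
echo "$LOG_ENTRY"
```

The embedded quotes matter: without them, a display name like "Pradeep R" would still parse as one field, but only by luck; any comma in the command would silently shift every column after it.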
trigger: none

pool:
  vmImage: 'ubuntu-latest'

variables:
  AWS_REGION: 'ap-south-1'
  S3_BUCKET: 'aws-cli-tracker'
  LOG_FILE: 'cli-execution.csv'

parameters:
  - name: awsCliCommand
    displayName: "Enter AWS CLI Command"
    type: string
    default: "aws s3 ls"

steps:
  - task: AWSShellScript@1
    inputs:
      awsCredentials: 'Pradeep-AWS'
      regionName: $(AWS_REGION)
      scriptType: 'inline'
      inlineScript: |
        # Capture who queued the run; fall back if the variable is empty
        if [ -n "$(Build.QueuedBy)" ]; then
          EXECUTOR="\"$(Build.QueuedBy)\""   # quoted so names with spaces stay one CSV field
        else
          EXECUTOR="\"Unknown\""
        fi
        echo "QueuedBy: $(Build.QueuedBy)"

        TIMESTAMP=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
        CLI_COMMAND="${{ parameters.awsCliCommand }}"

        # Run the command, capturing stdout and stderr separately
        OUTPUT_FILE="aws_output.txt"
        ERROR_FILE="aws_error.txt"
        if $CLI_COMMAND > "$OUTPUT_FILE" 2> "$ERROR_FILE"; then
          RESULT="success"
          cat "$OUTPUT_FILE"
        else
          RESULT="fail"
          cat "$ERROR_FILE"
        fi

        # Build the CSV row: executor,timestamp,command,result
        LOG_ENTRY="$EXECUTOR,$TIMESTAMP,\"$CLI_COMMAND\",$RESULT"

        # Download the existing CSV log from S3, or start a new one
        aws s3 cp "s3://$S3_BUCKET/$LOG_FILE" "$LOG_FILE" || touch "$LOG_FILE"

        # Append the new row and upload the updated log back to S3
        echo "$LOG_ENTRY" >> "$LOG_FILE"
        aws s3 cp "$LOG_FILE" "s3://$S3_BUCKET/$LOG_FILE"
    displayName: 'Run AWS CLI and Upload Logs to S3'
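After a few runs, the accumulated log can be audited locally. A minimal sketch: pull the CSV (e.g. with `aws s3 cp s3://aws-cli-tracker/cli-execution.csv .`) and tally outcomes per result. The two sample rows below are fabricated so the snippet runs on its own:

```shell
#!/bin/sh
# Fabricated sample rows in the pipeline's format: executor,timestamp,command,result
cat > cli-execution.csv <<'EOF'
"Pradeep R",2025-01-01T10:00:00Z,"aws s3 ls",success
"Pradeep R",2025-01-01T10:05:00Z,"aws ec2 describe-instances",fail
EOF

# Tally the result column (last comma-separated field). Note: a command
# containing a literal comma would throw off this naive split; it is
# good enough for quick audits of simple commands.
awk -F',' '{ counts[$NF]++ } END { for (r in counts) print r, counts[r] }' cli-execution.csv
```

The same one-liner extends naturally, for example grouping by `$1` instead of `$NF` to see which user ran the most commands.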