Moving on with my journey towards the cloud, following the roadmap from LTC, my next challenge was to create a bash script that lets me upload files into a cloud storage solution. For this, I worked with an AWS S3 bucket and the AWS CLI.
One of the first things I did was to lay out the steps the program needs to take to achieve the goal.
- Get command line arguments.
- Check if the given argument file path exists.
- If the file doesn’t exist, throw an error.
- Use the AWS CLI to upload the file to a specific bucket.
- Display an upload confirmation.
With that outline in mind, I started working.
The first script
The first thing to do was to create a .sh file. To test it out, I did a basic hello world.
#!/bin/bash
echo "Hello World!"
Every bash script should start with #!/bin/bash; this shebang line tells the system which interpreter to use when the script is executed.
Running the script printed Hello World! in the terminal. Nice.
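For reference, one way to run a script like this is to make the file executable first. A minimal sketch, assuming the file is saved as clouduploader.sh (just a placeholder name for this post):
chmod +x clouduploader.sh   # make the script executable
./clouduploader.sh          # prints: Hello World!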
Getting command line arguments in bash
To get a command line argument in bash, we use the built-in variable $1, which gives us the first argument passed in. If we have more arguments, we simply use $2 and so on.
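As a quick illustration, here is a tiny sketch (the labels are my own, matching the two arguments this script will eventually need) that just echoes its first two arguments:
#!/bin/bash
# print the first two command line arguments
echo "Source file: $1"
echo "Destination: $2"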
As per the requirements for this script, I needed two command line arguments and a way to check whether the required number of arguments was passed in.
Thankfully, there is a built-in variable called $# which contains the number of arguments passed to the script. Combining this with an if statement, I did the following.
if [ $# -lt 2 ]
then
    echo "Usage:"
    echo "clouduploader <source_file> <destination>"
    exit 1   # stop here so the script doesn't continue without its arguments
fi
A few things to note:
- The whitespace inside the brackets of an if statement is not optional.
- if..else blocks end with fi.
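With this check in place, running the script without enough arguments (again assuming it's saved as clouduploader.sh) simply prints the usage message and stops:
Usage:
clouduploader <source_file> <destination>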
Checking if the file exists
The next step was to check if the file exists or not.
For this, I worked with bash for loops, while also making use of another built-in variable called $@, which contains all the parameters passed to the script. This allowed me to pass any number of arguments and check if all of them are valid file paths.
for argument in "$@"
do
    # the last argument is the destination, not a local file, so stop before checking it
    if [ "$argument" == "$BASH_ARGV" ]
    then
        break
    fi
    if [ ! -f "$argument" ]
    then
        echo "File does not exist."
        exit 1
    fi
done
Another useful built-in variable is $BASH_ARGV. As per the bash manual:
BASH_ARGV
An array variable containing all of the parameters in the current bash execution call stack. The final parameter of the last subroutine call is at the top of the stack; the first parameter of the initial call is at the bottom. When a subroutine is executed, the parameters supplied are pushed onto BASH_ARGV.
The second if in the loop checks whether the file doesn’t exist with [ ! -f "$argument" ].
Things to note:
- for loops start with do and end with done.
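With this validation in place, the script can be called with any number of source files followed by the destination, for example (the file and bucket names here are made up):
./clouduploader.sh notes.txt report.pdf s3://my-bucket/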
Handling the file upload
Now that the file paths are valid, we can proceed with the file upload using the AWS CLI.
for argument in "$@"
do
    # reaching the last argument (the destination) means every file was uploaded
    if [ "$argument" == "$BASH_ARGV" ]
    then
        echo "Upload complete!"
        exit 0
    fi
    aws s3 cp "$argument" "$S3URI"
    # $? holds the exit status of the previous command; non-zero means the upload failed
    if [ $? -ne 0 ]
    then
        echo "Error uploading file."
        exit 1
    fi
done
In this last bit, I’m looping through all the passed-in parameters and uploading each file to the S3 bucket with the aws s3 cp command.
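One detail not shown above is where $S3URI comes from. Here is a minimal sketch, assuming the destination is simply the last command line argument (my assumption, not necessarily how the final script defines it):
# Hypothetical: take the destination (e.g. s3://my-bucket/) from the last argument
# ${!#} expands to the last positional parameter
S3URI="${!#}"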
For the last bit of the script, to handle possible upload errors, I used the built-in variable $?, which holds the exit status of the previous command, to check whether the upload was successful.
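As a side note, an equivalent and arguably more idiomatic way to do this is to test the command directly instead of inspecting $? afterwards; this is just a style alternative, not what my script does:
# same effect: the if tests the exit status of aws s3 cp directly
if ! aws s3 cp "$argument" "$S3URI"
then
    echo "Error uploading file."
    exit 1
fi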
Closing thoughts
This relatively simple bash script made me realize the interesting things we can do with bash scripting: here we streamlined a multi-file upload to an Amazon S3 bucket using the AWS CLI, and I’m now much more aware of the possibilities.
Down into an automation rabbit hole with bash.
Hope to see you again on my next blog.