Author: Seth Fuller

AWS Certified!

I’m happy to announce that I am now a Certified AWS SysOps Associate! Over the past few months I’ve been studying intensely for the exam, several hours a day. The upside is that I now hold a wealth of practical knowledge about using AWS across a broad range of services. The drawback is that I’ve not had time to work on my personal AWS projects, but I intend to pick those back up now. I’ve also already started studying for the DevOps Professional exam, which should nicely complement my hands-on usage of AWS, since it requires a much deeper practical knowledge of the platform. Here I go!

WordPress Continued: How to Migrate Assets to AWS S3 And CloudFront

So in my WordPress experiment, one of my goals has been to create a fully scalable WordPress site that can be scaled up (more resources per host) and/or scaled out (more hosts). To do this, our core WordPress hosts must be stateless: my stored images, for example, cannot differ from one host to another, or users will get inconsistent experiences. The solution is to store all of our assets (CSS, images, media, etc.) in Amazon S3 and distribute those assets using Amazon CloudFront. That way, when a user browses a page containing an image, the image is served through CloudFront rather than by the web server. This means that I can add as many WordPress hosts as I need without worrying about an inconsistent browsing experience for my users. Not only that, but it will significantly speed up my site’s load time.

Fortunately the setup for this is fairly straightforward and very low cost. The W3 Total Cache plugin we will use is free, and there is only a small fee for using S3 and CloudFront. You can check out the pricing details on the Amazon Web Services website.

Here’s how to set it all up.

  1. Ensure that you have an AWS account. You can sign up on the AWS website.
  2. Next, we need to create our S3 Bucket. Once logged in to AWS, click Services at the top and select S3.
  3. Click the ‘Create Bucket’ button. Name the bucket the same as your domain name (e.g. yourdomain.com). Submit changes.
  4. Go to Services -> IAM. Here we will create the user that will be used to upload our WordPress content to our Amazon S3 bucket.
  5. Click ‘Users’ in the left sidebar, then the blue ‘Create Users’ button. Enter your desired username into the first field and click ‘Create.’
  6. On the next screen, copy and/or download the user access ID and secret key to a desired location. These credentials are crucial for WordPress to access Amazon Web Services. I usually save my credentials in a secure S3 bucket.
  7. On the next screen, click the user just created, then click the ‘Permissions’ tab. Click the ‘Attach Policy’ button. Search for and select ‘AmazonS3FullAccess’ and ‘CloudFrontFullAccess’ and then click the ‘Attach Policy’ button. The IAM user is now ready.
  8. Next we need to create our CloudFront distribution. Select Services -> CloudFront. Click the ‘Create Distribution’ button, and select ‘Get Started’ under the Web section.
  9. Click your mouse inside the Origin Domain Name field. Select the S3 bucket that you created in Step 3. You can leave the other settings unchanged.
  10. Click the ‘Create Distribution’ button at the bottom of the form. Now our S3 and CloudFront settings are complete.
  11. Login to your WordPress dashboard and install the W3 Total Cache plugin.
  12. In your dashboard’s left sidebar, hover your mouse over ‘Performance’ and click ‘General Settings.’
  13. Scroll down to the section labeled ‘CDN.’ Check the ‘Enable’ checkbox. In the ‘CDN Type’ dropdown, select ‘Amazon CloudFront’ under Origin Push. Do NOT select ‘Amazon CloudFront’ under ‘Origin Pull.’ Click the ‘Save All Settings’ button.
  14. A warning message will appear about missing settings. Click the associated button to be taken to the settings page.
  15. In the ‘Configuration’ box, enter the Access ID and Secret Key you acquired in Step 6. The Bucket field should be your domain name, which should be the same S3 bucket name you created in Step 3.
  16. In the ‘Replace site’s hostname with’ field, you will need the subdomain of your CloudFront distribution. To get this, go back to CloudFront as you did in Step 8. You will see your CloudFront distribution listed in the table. One of the columns is ‘Domain Name’ and it will look something like dxxxxxxxxxxxxx.cloudfront.net. Copy the part before .cloudfront.net and paste it into the ‘Replace site’s hostname….’ field.
  17. Click the ‘Test S3 upload & CloudFront distribution’ button. You will see ‘Test passed’ if your settings are correct. If you get an error, review the steps provided to ensure they were followed properly.
  18. Click the ‘Save all settings button.’
  19. On the same page in the General section, check the desired boxes. You want to host attachments, includes, theme files, etc. in order for your site to be fully scalable and cloud-enhanced for speed.
  20. Click the buttons ‘Upload attachments’, ‘Upload includes files,’ etc. to upload your WordPress files to S3.
  21. Congratulations, your WordPress assets are now being served on the cloud. You can verify this by checking the URLs of your post images, which should now be from CloudFront.
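For anyone who prefers the command line, the console steps above can be sketched with the AWS CLI. This assumes the CLI is installed and configured with sufficient credentials; the bucket name and IAM user name below are placeholders, not values from this site:

```shell
# Placeholders: substitute your own domain/bucket name and IAM user name.
BUCKET="yourdomain.com"

# Step 3: create the S3 bucket.
aws s3api create-bucket --bucket "$BUCKET"

# Steps 4-7: create the IAM user, its access keys, and attach the two managed policies.
aws iam create-user --user-name wp-total-cache
aws iam create-access-key --user-name wp-total-cache
aws iam attach-user-policy --user-name wp-total-cache \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam attach-user-policy --user-name wp-total-cache \
    --policy-arn arn:aws:iam::aws:policy/CloudFrontFullAccess

# Steps 8-10: create the CloudFront distribution with the bucket as its origin.
aws cloudfront create-distribution --origin-domain-name "$BUCKET.s3.amazonaws.com"
```

The create-access-key call prints the Access ID and Secret Key that W3 Total Cache needs in Step 15, so save its output somewhere secure.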
Subnet and VPC Gotchas When Using An Elastic Load Balancer

So I’ve successfully migrated this site to a load balanced environment using an Elastic Load Balancer. There are a couple of “gotchas” I ran into while doing this, and I want to jot them down before I forget:

  1. I wanted to create multiple subnets for a load balanced environment across availability zones, but since my original subnet was the same size as my VPC (its CIDR block covered the VPC’s entire IP range), I had no IP ranges left to create new subnets. So to have multiple subnets, one must create subnets smaller than the VPC IP range. I ended up having to create an entirely new VPC with smaller subnets.
  2. I wanted to have each of my two web servers in a different availability zone. To do this, one simply launches each instance into a different subnet, since a subnet lives in exactly one availability zone. After I got my new VPC and subnets created, I launched two EC2 instances using an AMI of the original WordPress EC2 host. I needed to SSH into each of them to change the database host endpoint (I also had to recreate my DB host in the new VPC). I was able to SSH into one host, but not the other. After some troubleshooting, I realized that one of the subnets did not have an Internet Gateway in its route table. It turns out that when creating a new VPC, AWS automatically assigns an Internet Gateway route to the first subnet you create, but not to any additional subnets you create afterwards. For those you must add the Internet Gateway to the Route Table manually.
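Both gotchas can be avoided up front when building the VPC from the command line. A sketch with the AWS CLI, using illustrative CIDR blocks (a /16 VPC carved into /24 subnets) and availability zone names that are placeholders for your region:

```shell
# Illustrative CIDRs and AZ names -- adjust for your own region and sizing.
VPC_ID=$(aws ec2 create-vpc --cidr-block 10.0.0.0/16 \
    --query Vpc.VpcId --output text)

# Gotcha 1: subnets must be smaller than the VPC block to fit more than one.
SUBNET_A=$(aws ec2 create-subnet --vpc-id "$VPC_ID" \
    --cidr-block 10.0.1.0/24 --availability-zone us-east-1a \
    --query Subnet.SubnetId --output text)
SUBNET_B=$(aws ec2 create-subnet --vpc-id "$VPC_ID" \
    --cidr-block 10.0.2.0/24 --availability-zone us-east-1b \
    --query Subnet.SubnetId --output text)

# Gotcha 2: add the Internet Gateway route explicitly and associate it
# with BOTH subnets, rather than relying on any default behavior.
IGW_ID=$(aws ec2 create-internet-gateway \
    --query InternetGateway.InternetGatewayId --output text)
aws ec2 attach-internet-gateway --internet-gateway-id "$IGW_ID" --vpc-id "$VPC_ID"
RT_ID=$(aws ec2 create-route-table --vpc-id "$VPC_ID" \
    --query RouteTable.RouteTableId --output text)
aws ec2 create-route --route-table-id "$RT_ID" \
    --destination-cidr-block 0.0.0.0/0 --gateway-id "$IGW_ID"
aws ec2 associate-route-table --route-table-id "$RT_ID" --subnet-id "$SUBNET_A"
aws ec2 associate-route-table --route-table-id "$RT_ID" --subnet-id "$SUBNET_B"
```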
Oops! Creating AMIs Shuts Down Services

So this afternoon I was playing around with creating AMIs. After I created an AMI based on the EC2 instance hosting this website, the website went down. When I logged into the instance I found that Apache wasn’t running. After some quick searching, I found out that during the AMI creation process, the instance is rebooted and services are stopped unless otherwise specified. And I had made the rookie mistake of not setting the run levels for the httpd service so that Apache would start up automatically after a reboot. Lesson learned.
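On the sysvinit-era Amazon Linux this site runs on, the fix is a chkconfig one-liner; alternatively, the AMI can be created without a reboot at all, at some risk to filesystem consistency. A sketch, where the instance ID and image name are placeholders:

```shell
# Ensure Apache starts automatically at boot, so it comes back up after
# the reboot that AMI creation triggers (sysvinit-era Amazon Linux).
chkconfig httpd on

# Or skip the shutdown entirely when creating the image.
# The instance ID and image name are placeholders.
aws ec2 create-image --instance-id i-0123456789abcdef0 \
    --name "wordpress-web-base" --no-reboot
```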

AWS Blogs I’m Reading

I haven’t had a chance the past few days to work very much on my personal AWS environment, but I wanted to post some Cloud Computing related blogs/sites I’ve started reading which are useful for getting up to speed and following the industry:

A Cloud Guru Blog – A Cloud Guru is arguably the top site for AWS exam preparation. This blog covers related topics as well as industry news.

AWS on Stack Overflow – This link filters all AWS posts on Stack Overflow. Useful for seeing what sorts of AWS-related problems folks are working through.

Cloud Academy Blog – Cloud Academy is another cloud training site. Their blog covers related topics.

The Netflix Tech Blog – The Netflix Tech Blog is perhaps the best way to follow real-world scenarios in cloud computing. Netflix infrastructure is hosted with Amazon Web Services.

New AWS Feature: EC2 Screenshot

I just learned today that AWS has a new feature for Amazon EC2. It’s called Instance Console Screenshot. From the AWS website:

Instance Console Screenshot provides an on-demand screenshot of the instance console, conveying valuable debug information. This capability is particularly useful when diagnosing instances that have become unreachable via RDP (Windows) or SSH (Linux) due to in-progress software updates, VM Import issues, or other blocking system events. Screenshots can be viewed in the AWS console or accessed via the AWS API or CLI, with both Linux and Windows instances supported.

Certainly not as useful as a live interactive console, but very helpful in debugging non-responsive or unreachable instances.
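The screenshot can also be pulled from the CLI; the image comes back base64-encoded, so pipe it through a decoder to get a viewable file. The instance ID below is a placeholder:

```shell
# Grab a console screenshot of an unreachable instance (ID is a placeholder)
# and decode the base64 payload into a JPEG you can open locally.
aws ec2 get-console-screenshot --instance-id i-0123456789abcdef0 \
    --query ImageData --output text | base64 --decode > console.jpg
```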

Installing WordPress

One of the most recommended exercises for learning AWS is the installation of a working WordPress site. This exercise appealed to me in particular because I have a strong background in WordPress development, so it is a technology very familiar to me. I also decided that as part of this exercise I would use Route 53 to register a new domain name, and then point my new domain to my WordPress installation. A number of the steps I implemented used this helpful article as a guide.

I attempted to set up my WordPress installation via a script run during the initial EC2 Instance setup. It didn’t work the first time, but after tweaking it some, here is what I came up with.

#!/bin/bash
yum -y update
# Install Apache, then PHP, then MySQL
yum -y install httpd
service httpd start
yum -y install php php-mysql
service httpd restart
yum -y install mysql-server
service mysqld start
# Create the WordPress database (dbname is a placeholder)
mysqladmin -u root create dbname
# Download and unpack WordPress
cd /var/www/html
wget https://wordpress.org/latest.tar.gz
tar -xzvf latest.tar.gz
mv wordpress website_directory
cd website_directory
mv wp-config-sample.php wp-config.php

There are some challenges with this script that need to be worked out:

  1. It doesn’t automatically secure the MySQL root account.
  2. It doesn’t set the correct WordPress ownership and directory permissions.
  3. It installs the MySQL server on the same host as the WordPress web server. This means that if you auto-scale these instances, each instance will point to a separate database, which of course is not workable. So this script should be considered an exercise only.
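Challenge 3, for example, boils down to pointing every web host’s wp-config.php at one shared database endpoint instead of localhost. A minimal sketch, run here against a stand-in wp-config.php; the endpoint is a placeholder for your actual database host:

```shell
# Stand-in wp-config.php for demonstration; on a real install, run the
# sed line inside the WordPress directory against the real file instead.
cat > wp-config.php <<'EOF'
define('DB_NAME', 'dbname');
define('DB_HOST', 'localhost');
EOF

# Point DB_HOST at the shared database endpoint (placeholder value).
sed -i "s/define('DB_HOST', '[^']*')/define('DB_HOST', 'db.example.internal')/" wp-config.php
```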

Here are some commands from various web articles I ended up using in my installation:

Set Correct WordPress permissions
find . -type d -exec chmod 755 {} \;
find . -type f -exec chmod 644 {} \;

If setting the hostname after the WP install, you need to manually change the hostname in wp-config.php

Apache should own the html directory contents
chown -R apache:apache *

Change AllowOverride None to AllowOverride All in the <Directory "/var/www/html"> section of /etc/httpd/conf/httpd.conf
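That edit can also be scripted. Demonstrated below on a throwaway copy of the relevant stanza; on a real host, point the sed command at /etc/httpd/conf/httpd.conf instead:

```shell
# Throwaway copy of the relevant httpd.conf stanza, for demonstration only.
cat > httpd.conf <<'EOF'
<Directory "/var/www/html">
    AllowOverride None
</Directory>
EOF

# Enable .htaccess processing (WordPress permalinks depend on it).
sed -i 's/AllowOverride None/AllowOverride All/' httpd.conf
```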

MySQL Status
mysqladmin -u root -p status

List Databases
mysql --user=your-user-name --password=your-password
mysql> show databases;

Secure Database Server
mysql_secure_installation

Create Database
mysql> CREATE DATABASE databasename;

I ended up installing a MySQL database server on a separate EC2 instance so that my environment could be scalable. I plan to write about that soon, as well as my exercise in replicating this environment in CloudFormation.

First Post

So as the blog title suggests, this blog is more or less an experiment and a log of my journey into the world of cloud computing. Currently I’m on course to take the AWS SysOps exam in a couple of months. Meanwhile I am using various study sources to prepare, including Cloud Academy and Linux Academy, as well as a book titled “Amazon Web Services In Action,” which seems very promising. Even though I’m taking the SysOps exam first, I’m going through the Solutions Architect and Developer courses as well in order to get more rounded exposure to AWS. I’m also spending a lot of time on real working examples, like this blog, which is running on the AWS Cloud platform and utilizes the Amazon EC2, VPC, and Route 53 services. Learning the material is essential, but putting that knowledge into practice has been crucial for me in more fully understanding how to use AWS.