Automate FileMaker Cloud's wake up and bedtime schedule

Updated: Feb 8, 2019



One of the best features of cloud computing is the ability to manage costs by powering off instances when they are not needed and powering them back on when they are. Unfortunately, on Amazon Web Services this is currently a mostly manual process.

Now that FileMaker Cloud has officially joined the ranks of enterprise-grade servers on AWS, many FileMaker developers may be looking for strategies for automating the startup and shutdown of those server instances according to a schedule. The trick is that, to power a server on and off remotely, you really need to run commands from a second machine that can "push" those virtual power buttons.
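Those "virtual power buttons" boil down to two AWS CLI calls. Here is a minimal sketch, assuming the AWS CLI is installed and configured on that second machine; the instance ID is a hypothetical placeholder you would replace with your own:

```shell
# Sketch of the EC2 "power buttons", assuming a configured AWS CLI.
INSTANCE_ID="i-0123456789abcdef0"   # placeholder -- substitute your instance's ID

STOP_CMD="aws ec2 stop-instances --instance-ids $INSTANCE_ID"
START_CMD="aws ec2 start-instances --instance-ids $INSTANCE_ID"

# Echoed rather than executed so the sketch is safe to run as-is;
# drop the 'echo' to actually push the buttons.
echo "$STOP_CMD"
echo "$START_CMD"
```

A stopped instance accrues storage charges only, not compute charges, which is where the savings come from. The rest of this article is about running these same calls on a schedule from inside your VPC.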


AWS Data Pipeline logo

Data Pipeline to the rescue

While it is possible to start up and shut down FileMaker Cloud by running AWS CLI scripts from a machine or service outside your Virtual Private Cloud (VPC) environment, doing so requires managing (and safeguarding) ".pem" access keys, and may introduce external dependencies or security concerns that you would rather avoid. One surefire alternative is to leverage AWS Data Pipeline services.

AWS Data Pipeline offers an affordable, reliable, and more secure way of executing those same CLI commands from inside your own VPC with no external dependencies. Here's a worry-free way to schedule the stopping and starting of FileMaker Cloud instances on AWS at pre-determined times.

What's involved

In essence, Data Pipelines work by launching EC2 instances for the duration of the scheduled action and then terminating them upon task completion. Configuring AWS Data Pipeline to execute scheduled activities requires three basic steps:

Step 1. Create a Resource Role

Define an AWS resource role for use by Data Pipeline instances. This is a global setting, so you only need to do it once.
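For reference, the resource role is the one assumed by the EC2 instances a pipeline launches, so its trust relationship names the EC2 service. A minimal sketch of the trust policy document (the exact role name you choose is up to you):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```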

Step 2. Create a Security Policy

Add a Managed Policy allowing your Data Pipeline Resource Role to start and stop other EC2 instances and to write logs to S3, then attach it to the Resource Role you created in Step 1. This is also a global, one-time setup.

Step 3. Create your Power On/Power Off Data Pipelines

Create your custom startup and shutdown schedules. Your Data Pipelines may be modified, activated/deactivated, and even run on demand as often as you wish, but mostly you will just "set them and forget them".
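Under the hood, each pipeline you build in the console can be exported as a JSON definition. The following is a sketch of what a nightly shutdown pipeline might look like, with a hypothetical instance ID, S3 log bucket, and role names standing in for your own:

```json
{
  "objects": [
    {
      "id": "Default",
      "name": "Default",
      "scheduleType": "cron",
      "schedule": { "ref": "NightlySchedule" },
      "role": "DataPipelineDefaultRole",
      "resourceRole": "DataPipelineResourceRole",
      "pipelineLogUri": "s3://your-log-bucket/datapipeline-logs/"
    },
    {
      "id": "NightlySchedule",
      "name": "NightlySchedule",
      "type": "Schedule",
      "period": "1 days",
      "startDateTime": "2019-02-09T01:00:00"
    },
    {
      "id": "WorkerInstance",
      "name": "WorkerInstance",
      "type": "Ec2Resource",
      "instanceType": "t2.micro",
      "terminateAfter": "15 Minutes"
    },
    {
      "id": "StopFileMakerCloud",
      "name": "StopFileMakerCloud",
      "type": "ShellCommandActivity",
      "runsOn": { "ref": "WorkerInstance" },
      "command": "aws ec2 stop-instances --instance-ids i-0123456789abcdef0"
    }
  ]
}
```

The Ec2Resource object is the short-lived worker instance described above; "terminateAfter" guarantees it shuts itself down even if the activity hangs. A matching startup pipeline would simply swap in "start-instances" and a different schedule.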

What's required

  • Region-Specific Pipelines: While Policies and Roles are global settings, the actual Data Pipelines are region specific, so step 3 must be repeated for each region in which you plan to use this service.

  • Selection of a supported Region: AWS Data Pipeline is not universally available. As of this writing, Data Pipeline services are only available in five (5) of the fourteen (14) AWS geographic Regions (Oregon, N. Virginia, Ireland, Sydney and Tokyo). That's fine for now, since FileMaker Cloud is currently available in only two (2) of those Regions (N. Virginia and Oregon). However, if you intend to use Data Pipelines for scheduling FileMaker Server instances, or any other EC2 instance, keep in mind that only those AWS Regions are currently supported.

Recurring Costs

AWS Data Pipeline is billed based on how often your activities are run. For scheduled startup and shutdown sequences, AWS Data Pipeline is very economical, costing (at current prices) no more than $12/year per Pipeline. Factoring in costs for log archiving to AWS S3, a typical pair of startup and shutdown Pipelines is not likely to cost more than $25/year per Region. If you are currently running one or more FileMaker Cloud instances 24/7 in either Region, the savings from powering them down during off-hours, along with the added reliability and security, will more than offset the $2 or so per month.

How-to guide

Follow the steps outlined in these videos, or download the document attached to the graphic below.


 

Step 1. How to create a Resource Role for Data Pipeline


 

Step 2. How to create a Security Policy for Data Pipeline and attach it to your Resource Role


To follow the steps in this second video, you will need to copy the JSON expression below and paste it into the body of your new "Policy Document".

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:*",
        "ec2:Describe*",
        "ec2:Start*",
        "ec2:RunInstances",
        "ec2:Stop*",
        "datapipeline:*",
        "cloudwatch:*"
      ],
      "Resource": [
        "*"
      ]
    }
  ]
}