
Automate FileMaker Cloud's wake-up and bedtime schedule

Updated: Feb 8, 2019



One of the best features of cloud computing is the ability to manage costs by powering instances off when they are not needed and back on when they are. Unfortunately, on Amazon Web Services this is currently a mostly manual process.

Now that FileMaker Cloud has officially joined the ranks of enterprise-grade servers on AWS, many FileMaker developers may be looking for strategies for automating the startup and shutdown of those server instances according to a schedule. The trick is that, to power a server on and off remotely, you really need to run commands from a second machine that can "push" those virtual power buttons.



Data Pipeline to the rescue

While it is possible to start up and shut down FileMaker Cloud by running AWS CLI scripts from a machine or service outside your Virtual Private Cloud (VPC) environment, doing so requires managing (and safeguarding) ".pem" access keys, and may introduce external dependencies or security concerns that you would rather avoid. One surefire alternative is to leverage AWS Data Pipeline services.

AWS Data Pipeline offers an affordable, reliable, and more secure way of executing those same CLI commands from inside your own VPC with no external dependencies. Here's a worry-free way to schedule the stopping and starting of FileMaker Cloud instances on AWS at pre-determined times.

What's involved

In essence, Data Pipelines work by launching EC2 instances for the duration of the scheduled action and then terminating them upon task completion. Configuring AWS Data Pipeline to execute scheduled activities requires three basic steps:

Step 1. Create a Resource Role

Define an AWS Resource Role for use by Data Pipeline instances. This is a global setting, so you only need to do it once.
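If you prefer scripting this step over the console walkthrough shown later, a roughly equivalent AWS CLI sketch might look like the following. The role and instance profile names are illustrative, and the trust policy simply lets EC2 instances assume the role:

# Create the role with a trust policy for EC2 (names are illustrative)
aws iam create-role --role-name FMCloudPipelineResourceRole \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": {"Service": "ec2.amazonaws.com"},
      "Action": "sts:AssumeRole"
    }]
  }'

# Wrap the role in an instance profile of the same name
aws iam create-instance-profile --instance-profile-name FMCloudPipelineResourceRole
aws iam add-role-to-instance-profile --instance-profile-name FMCloudPipelineResourceRole --role-name FMCloudPipelineResourceRole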

Step 2. Create a Security Policy

Create a Managed Policy that allows your Data Pipeline Resource Role to start and stop other EC2 instances and to write logs to S3, then attach it to the Resource Role you created in Step 1. This is also a global, one-time setup.

Step 3. Create your Power On/Power Off Data Pipelines

Create your custom startup and shutdown schedules. Your Data Pipelines may be modified, activated/deactivated, and even run on demand as often as you wish, but mostly you will just "set them and forget them".
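Once created, a Pipeline can also be activated and deactivated from the AWS CLI for those occasional on-demand runs. A minimal sketch, in which the pipeline ID is a placeholder you can look up with list-pipelines:

# Find your Pipeline IDs (they look like "df-...")
aws datapipeline list-pipelines --region us-west-2

# Kick off a Pipeline on demand, or take it out of service
aws datapipeline activate-pipeline --pipeline-id df-0123456789ABCDEF --region us-west-2
aws datapipeline deactivate-pipeline --pipeline-id df-0123456789ABCDEF --region us-west-2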

What's required

  • Region-Specific Pipelines: While Policies and Roles are global settings, the actual Data Pipelines are region-specific, so Step 3 must be repeated for each Region in which you plan to use this service.

  • Selection of a supported Region: AWS Data Pipeline is not universally available. As of this writing, Data Pipeline services are only available in five (5) of the fourteen (14) AWS geographic Regions (Oregon, N. Virginia, Ireland, Sydney and Tokyo). That's ok for now, since FileMaker Cloud is currently available in only two (2) of those Regions (N. Virginia and Oregon). However, if you intend to use Data Pipelines for scheduling FileMaker Server instances, or any other EC2 instance, know that only certain AWS Regions are currently supported.

Recurring Costs

AWS Data Pipeline is billed based on how often your activities run. For scheduled startup and shutdown sequences, AWS Data Pipeline is very economical, costing (at current prices) no more than $12/year per Pipeline. Factoring in costs for log archiving to AWS S3, a typical pair of startup and shutdown Pipelines is not likely to cost more than $25/year per Region. If you have one or more FileMaker Cloud instances running 24/7 in either Region, the savings, reliability and security more than offset the $2 or so that you will be paying per month: at an illustrative EC2 rate of $0.25/hour, for example, an instance that sleeps 12 hours a day saves roughly $90/month.

How-to guide

Follow the steps outlined in these videos, or download the document attached to the graphic below.


 

Step 1. How to create a Resource Role for Data Pipeline


 

Step 2. How to create a Security Policy for Data Pipeline and attach it to your Resource Role


To follow the steps in this second video, you will need to copy the JSON expression below and paste it into the body of your new "Policy Document".

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:*",
        "ec2:Describe*",
        "ec2:Start*",
        "ec2:RunInstances",
        "ec2:Stop*",
        "datapipeline:*",
        "cloudwatch:*"
      ],
      "Resource": [
        "*"
      ]
    }
  ]
}
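If you would rather script this step than click through the console, saving the JSON above to a file and running something like the following should have the same effect. The policy and role names are illustrative, and 123456789012 stands in for your AWS account ID:

# Create the Managed Policy from the JSON document above
aws iam create-policy --policy-name FMCloudPipelinePolicy --policy-document file://datapipeline-policy.json

# Attach it to the Resource Role created in Step 1
aws iam attach-role-policy --role-name FMCloudPipelineResourceRole --policy-arn arn:aws:iam::123456789012:policy/FMCloudPipelinePolicy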

 

Step 3. How to create startup and shutdown Pipelines


To create start and stop schedules as illustrated in this third video, you will need to create two separate Pipelines, and attach the following AWS command line scripts:

Startup CLI command:

aws ec2 start-instances --instance-ids i-12345a1b2defg6789 --region us-west-2

Shutdown CLI command:

aws ec2 stop-instances --instance-ids i-12345a1b2defg6789 --region us-west-2

In these examples, "i-12345a1b2defg6789" refers to your FileMaker Cloud's instance ID (you can find this under "Instances" in your AWS console).

Also, "us-west-2" refers to your particular AWS Region. For now, this will be either "us-east-1" (N. Virginia) or "us-west-2" (Oregon), as these are the only two Regions where FileMaker Cloud can currently be instantiated.
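If you would rather not hunt through the console for the instance ID, a query like the sketch below lists instance IDs alongside their Name tags (the Region is just an example):

# List instance IDs and Name tags in a Region
aws ec2 describe-instances --region us-west-2 \
  --query "Reservations[].Instances[].[InstanceId, Tags[?Key=='Name'].Value | [0]]" \
  --output table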

To start and stop more than one FileMaker Cloud instance per Pipeline, simply add a second command identical to the first except for the instance ID, separating the commands with a semicolon (";") and a carriage return. You can continue to add commands for as many instance IDs as you wish, and all servers will be acted upon according to your schedule.
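For example, a single startup Pipeline that wakes two servers (both instance IDs below are placeholders) would contain:

aws ec2 start-instances --instance-ids i-12345a1b2defg6789 --region us-west-2;
aws ec2 start-instances --instance-ids i-98765z9y8xwvu4321 --region us-west-2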

Note: Times are all UTC, so you must add/subtract according to your time zone; for example, a 9:00 am Eastern Daylight Time startup must be scheduled for 13:00 UTC. Also, be sure to click "Custom" under "IAM roles" and select the EC2 Resource Role that you created in Step 1.

Next, you will need to set up an S3 bucket to which your Data Pipelines can save execution logs. Although this is not required, logging is always a good idea. In the US, it's possible to create a single S3 bucket that can be shared across all US Regions; see the AWS S3 documentation for step-by-step instructions on creating a bucket.
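From the CLI, creating the bucket is a one-liner; the bucket name below is a placeholder and must be globally unique:

# Create a log bucket (placeholder name; bucket names are globally unique)
aws s3 mb s3://my-fmcloud-pipeline-logs --region us-east-1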

Finally, you will need to select the correct Resource Role you created in Step 1 and apply it to your Pipelines.

 

Caveats and Gotchas

  • IP Addresses: With or without Data Pipeline, stopping and starting FileMaker Cloud instances will cause their IP addresses to change each time they start up. The only way to avoid this is static IP addressing (Elastic IP addressing in AWS terms), which FileMaker Cloud does not currently support officially. If you currently point to hosted files on AWS servers by host IP address, consider switching to DNS names instead (see the Route 53 sketch after this list).

  • Instance IDs: The CLI commands to start and stop FileMaker Cloud all point to your servers by their AWS Instance ID. There are four (4) scenarios under which your FileMaker Cloud Instance ID will change: 1) an EC2 upgrade/downgrade, 2) a Storage upgrade, 3) a server "Refresh", or 4) restoring from a backup. In each of these cases, the old EC2 instance is terminated and a brand-new EC2 instance (and therefore a brand-new Instance ID) is generated. Whenever this happens, you must go back to your Data Pipelines and update the start and stop commands with the new Instance ID.

  • Daylight Saving Time: If your country observes daylight saving time changes, know that the CLI commands to start and stop FileMaker Cloud do not automatically adjust for them. Unless you manually modify the startup and shutdown schedules the day prior, you may find your FileMaker Cloud instances starting and stopping at the incorrect time whenever daylight saving time begins or ends.

  • Times are approximate: Data Pipelines launch EC2 instances for the purpose of executing your CLI commands, and it takes a few minutes for any EC2 instance to spin up. Likewise, it takes a few minutes for your FileMaker Cloud server to start or stop. What this means is that your scheduled tasks will take full effect within a few minutes of the time you specified. For example, if you scheduled a server to start up at 9 am, you can expect the server to be up and running sometime between 9:00 and 9:05. If you need to ensure a system is available or shut down by a specific time, give yourself a little extra margin for Data Pipeline to finish its job.
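Regarding the first caveat above: if your DNS zone is hosted in Route 53, re-pointing a record at a server's new IP can be done with a single UPSERT call. A minimal sketch, in which the hosted zone ID, record name, and IP address are all placeholders:

# Re-point an A record at a new public IP (all values are placeholders)
aws route53 change-resource-record-sets --hosted-zone-id Z1234567890ABC \
  --change-batch '{
    "Changes": [{
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "fmserver.example.com",
        "Type": "A",
        "TTL": 300,
        "ResourceRecords": [{"Value": "203.0.113.10"}]
      }
    }]
  }'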
