Written by Polaris
Published May 10, 2016

Ship it to Amazon Web Services the easy way

Would you like to boost your Jenkins CI? Learn how to create simple pipelines from Jenkins to Amazon Web Services (AWS) using the Job DSL Plugin, and how to manage Jenkins jobs from source code!

Jenkins

Polaris is a cross-functional team located in Kraków. They deliver various web solutions for the Polaris media group. The biggest stakeholder of the group is Adresseavisen, the oldest newspaper in Norway.

Jenkins is probably the most popular open-source automation server in IT. It is well known for its hundreds of plugins supporting the full process of building software. There are already many plugins for creating pipelines, and since the Jenkins 2.0 release there is even built-in support for them. Still, none of them was good enough for us: what we needed and were looking for was a simple, clean solution for deploying into the Amazon cloud.

Continuous Delivery is these days the right approach to creating software, especially now that everything moves into clouds such as AWS or Heroku. So we decided to figure out our own solution. In this article we describe how we did it, hoping you will benefit from it.

Why should we care about the pipeline workflow?

Here are just the main advantages:

  • Faster time to market, since the package is built only once.
  • Lower-risk releases: the same package guarantees the binaries have not changed during the process and are identical on every machine.
  • Higher productivity and efficiency for developers, testers, and ops through process automation.
  • Higher product quality: frequent deployment reduces the number of bugs.
  • More frequent feedback, so the team can make sure it is building the right product.

Job-dsl-plugin for Jenkins

The job-dsl-plugin, which allows scripting Jenkins jobs in the Groovy language, turned out to be a blessing for us. The plugin generates an XML job definition from Groovy DSL scripts using GroovyScriptEngine. It also solves the important problem of storing the job definitions created in Jenkins.

With this plugin, all job configurations can be stored in a single Git repository in a readable format (Groovy rather than XML). Basically, everything that previously had to be clicked through in the Jenkins UI is now code. There is no need to repeat yourself when creating similar jobs for each environment, such as ‘development’, ‘integration’, ‘acceptance’, or ‘production’.

Even restoring all of them, after a mistake or when setting up a new Jenkins machine, is a matter of minutes. Since a job definition is a script, everything can be done without touching the Jenkins UI.
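The per-environment point can be sketched in Job DSL itself. A minimal, hypothetical example (the job names and the deploy script are illustrative assumptions, not our actual setup) that generates one deploy job per environment from a single template:

```groovy
// One template, four generated jobs - names and deploy.sh are illustrative
['development', 'integration', 'acceptance', 'production'].each { env ->
    job("myApp-deploy-${env}") {
        steps {
            // the same deploy script, parameterized by the target environment
            shell("./deploy.sh ${env}")
        }
    }
}
```

Adding a fifth environment is then a one-word change instead of clicking through the UI again.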

Main DSL Plugin features:

  • Creating all Jenkins job items from code
  • Full access to the shell command line
  • Access to Jenkins environment variables in the DSL
  • Reading/listing files from your job workspace
  • Using other Jenkins plugins in a programmatic way
  • Running a DSL script locally without pushing it to Jenkins
  • Generating a job config.xml without having to fire up Jenkins
  • Defining custom/extended DSL methods in Groovy
  • Using external libraries from the classpath
  • All the benefits of the Groovy language
  • A rich API viewer

Example of a very simple job definition:

def gitUrl = 'git://github.com/jenkinsci/job-dsl-plugin.git'

job('projectName') {

   scm {
     git(gitUrl)
   }

   triggers {
     scm('*/15 * * * *')
   }

   steps {
     maven('clean install')
     maven('sonar:sonar')
     shell('cleanup.sh')
     shell('ruby script.rb')
   }
}

 

Pipeline Workflow with AWS Elastic Beanstalk

Since the DSL plugin allows you to run any script from the command line, a job definition can mix Ruby, Bash, or any other scripts. For deploying to Amazon we therefore use the AWS Command Line Interface, which gives full access to the Amazon Web Services API. During deploys we use AWS S3 to store temporary package versions, and Elastic Beanstalk as a full-stack environment for our application.

General process of deploying a package from Jenkins into AWS:

  1. Build the package locally with Jenkins.
  2. Generate a new build version ID using the Delivery Pipeline Plugin.
  3. Upload the new application version to AWS S3.
  4. On Elastic Beanstalk, create the new version of the application from the previously uploaded package.

Repeat for each AWS environment (dev, int, prod):

  1. Update the Elastic Beanstalk environment with the new version of the application.
  2. Wait until the environment update is finished, indicated by the health check icon turning green.
  3. Run integration tests.
  4. Run post-build actions such as “Archive Build”, “Publish Mails”, or “Save Test Report”.
  5. Check the environment’s health.
  6. If everything succeeded, trigger the next job with the given version ID.
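The steps above can be tied together in the shell step of a Jenkins job. A minimal sketch, assuming `PIPELINE_VERSION` is exported by the Delivery Pipeline Plugin and that the application name, bucket, and environment name (all hypothetical here) match your AWS setup:

```shell
#!/bin/bash
set -euo pipefail

APP="myApp"                        # hypothetical application name
BUCKET="bucketName"                # hypothetical S3 bucket
ENV_NAME="myAppProd"               # hypothetical Beanstalk environment
VERSION="${PIPELINE_VERSION:-v1}"  # build version ID from the pipeline

deploy_version() {
    # 1. Upload the freshly built package to S3 under a versioned key
    aws s3 cp package.zip "s3://${BUCKET}/${APP}-${VERSION}.zip"

    # 2. Register the uploaded package as a new application version
    aws elasticbeanstalk create-application-version \
        --application-name "${APP}" \
        --version-label "${VERSION}" \
        --source-bundle "S3Bucket=${BUCKET},S3Key=${APP}-${VERSION}.zip"

    # 3. Point the environment at the new version
    aws elasticbeanstalk update-environment \
        --environment-name "${ENV_NAME}" \
        --version-label "${VERSION}"
}
```

Calling `deploy_version` once per stage, with `ENV_NAME` switched between dev, int, and prod, covers the repeat-for-each-environment loop.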

AWS CLI commands needed:

Copy built package into the S3 Bucket

aws s3 cp package.zip s3://bucketName/package.zip

Delete the application’s version on Elastic Beanstalk if it already exists

aws elasticbeanstalk delete-application-version \
--application-name myApp \
--version-label v1 \
--delete-source-bundle

Create the new version of the application on Elastic Beanstalk from the S3 file

aws elasticbeanstalk create-application-version \
--application-name myApp \
--version-label v1 \
--source-bundle S3Bucket="bucketName",S3Key="package.zip"

Update the environment with the new version on Elastic Beanstalk

aws elasticbeanstalk update-environment \
--environment-name myAppProd \
--version-label v1

Check the environment’s health

aws elasticbeanstalk describe-environments \
--environment-names myAppProd
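The “wait until the update is finished” step from the per-environment loop can be automated by polling this same command. A sketch, with the environment name, polling interval, and timeout as assumptions:

```shell
#!/bin/bash

ENV_NAME="myAppProd"  # hypothetical Beanstalk environment

wait_for_green() {
    # Poll up to 30 times (roughly 10 minutes) until the environment is Green
    for _ in $(seq 1 30); do
        health=$(aws elasticbeanstalk describe-environments \
            --environment-names "${ENV_NAME}" \
            --query "Environments[0].Health" --output text)
        if [ "${health}" = "Green" ]; then
            return 0
        fi
        echo "Environment health is ${health}, waiting..."
        sleep 20
    done
    echo "Environment did not turn Green in time" >&2
    return 1
}
```

Running integration tests only after `wait_for_green` succeeds avoids testing against a half-updated environment.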

“Factory Job” in Jenkins

In our solution we developed a simple Jenkins job that creates new jobs, or updates existing ones, whenever it is run. First, it fetches a project with all the DSL definitions from our Git repository. Then the Job DSL plugin processes each Groovy file from a given path, and the new Jenkins jobs are created.
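The factory job itself can also be described in Job DSL, using the plugin’s `dsl` build step. A sketch, where the repository URL and the file layout are assumptions:

```groovy
// Seed job: regenerates all other jobs from the Git repository
job('seed-job') {
    scm {
        // repository holding all Groovy job definitions (URL is hypothetical)
        git('https://example.com/team/jenkins-jobs.git')
    }
    steps {
        dsl {
            external('jobs/**/*.groovy') // process every DSL file under this path
            removeAction('DISABLE')      // disable jobs whose scripts were removed
        }
    }
}
```

With `removeAction('DISABLE')`, deleting a script from the repository disables the corresponding job instead of leaving it orphaned.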

Example project with a simple pipeline definition:

https://bitbucket.org/szczepaniak_mateusz/jenkins-pipeline-example/src

Conclusion:

The Job DSL plugin is a great extension and we highly recommend it to every team working with Jenkins. It can be used to create any job that automates work, and it dramatically speeds up the creation of new jobs.

In our case it was a pipeline workflow for remote deployment, but it can also be used to update a local environment or local configuration files. The example project can be extended with many more features, such as removing unused versions, saving successful versions, or running integration tests. For us, it is currently the best solution for doing Continuous Delivery with Amazon and for managing Jenkins jobs.

Setup needed for Jenkins:

  • Job DSL Plugin
  • Delivery Pipeline Plugin, for generating the pipeline version environment variable
  • AWS CLI with configured credentials in Jenkins

Useful links:

Real World Examples: https://github.com/jenkinsci/job-dsl-plugin/wiki/Real-World-Examples

Job DSL Plugin: https://github.com/jenkinsci/job-dsl-plugin/wiki

Delivery Pipeline Plugin: https://wiki.jenkins-ci.org/display/JENKINS/Delivery+Pipeline+Plugin

AWS CLI: http://docs.aws.amazon.com/cli/latest/reference/

AWS CLI ElasticBeanstalk: http://docs.aws.amazon.com/cli/latest/reference/elasticbeanstalk/
