Automated Deployment & Continuous Integration

NathanRidley

I don't know if this post is meant to educate so much as to allow me to feel smug and self-satisfied about setting up a nice automated deployment scenario, but maybe it will give some of you some ideas if you're not doing this sort of thing already. I've set myself up a really cool continuous integration/automated deployment system, using free tools and some custom ones. Even though I'm developing in .NET and deploying on Windows servers, you can still do this sort of thing with your platform-of-choice equivalent.

Basically, I have three websites that are different facets of the same application. One is the public-facing website with the members' area and so forth. The next is a pure REST API, hosted on its own subdomain. The third is my own personal admin dashboard for monitoring and maintaining the system on the go, hosted under yet another domain name. On top of that I have a primary background worker process which is designed to self-update when I have a new version to deploy, and there's also a Windows service that detects crashed background processes and restarts them if necessary, though hopefully that's an extreme rarity. Finally there's the updater application, which performs the actual background worker update and also applies updates when configuration files have changed.
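That watchdog piece boils down to a heartbeat check. Here's a minimal sketch of the idea in TypeScript/node; his real thing is a .NET Windows service, and the heartbeat-file convention, paths and timings here are all assumptions for illustration:

```typescript
// Hypothetical watchdog: relaunch the worker if its heartbeat file goes stale.
// (The actual implementation is a Windows service; this is only the core idea.)
import { spawn } from "child_process";
import { statSync } from "fs";

const HEARTBEAT_FILE = "C:\\worker\\heartbeat"; // assumed: worker touches this periodically
const WORKER_EXE = "C:\\worker\\Worker.exe";    // hypothetical path
const STALE_MS = 60_000;                        // treat the worker as dead after 60s of silence

function heartbeatAge(): number {
  try {
    return Date.now() - statSync(HEARTBEAT_FILE).mtimeMs;
  } catch {
    return Infinity; // no heartbeat file yet: treat as dead
  }
}

setInterval(() => {
  if (heartbeatAge() > STALE_MS) {
    console.log("worker looks dead, relaunching");
    spawn(WORKER_EXE, [], { detached: true, stdio: "ignore" }).unref();
  }
}, 10_000);
```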

All of this automatically builds and deploys with a single push to my master git branch. Because my worker processes are designed to scale out to multiple EC2 servers en masse, I deploy prebuilt updates, packaged as zip files, to S3, where my workers and updaters on each EC2 instance can look for them. Usually they don't have to though, because they're each subscribed to a Redis channel, so I can push out an update notification the moment I complete a build/deploy; then they know to shut down, run the updater application and restart.
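To make the notify-plus-poll pattern concrete, here's a minimal sketch in TypeScript/node (his actual tools are .NET; the bucket name, channel name and key prefix are invented). The worker subscribes to the Redis channel as the fast path and occasionally lists the S3 bucket as a backstop; because each filename carries a timestamp, lexicographic order matches chronological order:

```typescript
// Sketch of the notify/poll update pattern. Assumes the "redis" (v4) and
// "@aws-sdk/client-s3" packages; bucket, channel and prefix names are made up.
import { createClient } from "redis";
import { S3Client, ListObjectsV2Command } from "@aws-sdk/client-s3";

const BUCKET = "my-deploy-bucket";       // hypothetical
const CHANNEL = "deploy-notifications";  // hypothetical

const s3 = new S3Client({ region: "us-east-1" });

// Fallback path: list the bucket and pick the newest package; timestamped
// filenames mean the lexicographically largest key is the latest build.
async function latestPackageKey(): Promise<string | undefined> {
  const res = await s3.send(new ListObjectsV2Command({ Bucket: BUCKET, Prefix: "worker-" }));
  return res.Contents?.map((o) => o.Key!).sort().pop();
}

// Worker side: subscribe for instant notifications, poll occasionally as a backstop.
async function workerLoop() {
  const sub = createClient();
  await sub.connect();
  await sub.subscribe(CHANNEL, (packageKey) => {
    console.log(`new build announced: ${packageKey}; finishing work, then updating`);
    // ...drain in-flight work, download the updater, exec it, exit...
  });
  setInterval(async () => {
    const key = await latestPackageKey();
    if (key) console.log(`newest package currently on S3: ${key}`);
  }, 5 * 60_000);
}

// Build-server side: announce a freshly uploaded package (step 3 below).
async function announce(packageKey: string) {
  const pub = createClient();
  await pub.connect();
  await pub.publish(CHANNEL, packageKey);
  await pub.quit();
}

workerLoop();
```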

The process flows like this:

1) I push to my master branch in my private GitHub repository.

2) My build server is running TeamCity (a freemium continuous integration tool), which picks up my changes from the git repo, then rebuilds my web apps and background apps to a build directory and updates config files with production settings and so forth.

3) I have a custom tool I whipped up that I invoke as part of TeamCity's build process. It zips up each web app and background app that I need to deploy, appends a timestamp to the filename of each package and uploads them to an S3 bucket, then publishes a notification to the Redis channel I mentioned earlier.

4) My worker processes, running on an arbitrary number of servers, each independently detect the new version (via the Redis notification or by an occasional automated check of the S3 bucket's contents), allow internal processes to finish cleanly, then download the latest version of the updater, which they then execute and quit. The updater (again, independently on each EC2 instance) then updates the worker binaries, relaunches the newly-updated worker process and quits, completing the background worker self-update process.

5) On my web server (again, an EC2 instance) I have a simple node.js-driven service running which listens for HTTP requests (authenticated, of course) indicating that it should redeploy a given web application (specified in the HTTP request). Each production app has alternating A and B folders, so when a deploy is requested, the service downloads and unzips the specified package into whichever of those two folders is not currently in use, then updates the web server to point at the new folder for that web application, essentially meaning I can deploy a new version of the website without ever interrupting any users on the site. So, upon completing steps 2-4 above, the TeamCity build process uses cURL to call this web deploy service, once for each of the three web applications. (A sketch of such a listener follows below.)
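Here is a minimal sketch of what such an A/B deploy listener could look like. To be clear, this is not his actual code: the endpoint, the token check, the `aws`/`unzip` shell-outs and the symlink flip are all assumptions, and on his Windows box the "repoint" step would presumably be an IIS site-path change rather than a symlink:

```typescript
// Hypothetical A/B deploy listener. Layout assumed: /srv/sites/<app>/{A,B,current},
// where "current" is a symlink the web server follows.
import * as http from "http";
import { execFileSync } from "child_process";
import * as fs from "fs";
import * as path from "path";

const SITES_ROOT = "/srv/sites";        // hypothetical layout
const TOKEN = process.env.DEPLOY_TOKEN; // shared secret for the "authenticated" request

http.createServer((req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  const app = url.searchParams.get("app");
  if (url.pathname !== "/deploy" || !app || url.searchParams.get("token") !== TOKEN) {
    res.writeHead(403).end();
    return;
  }

  const root = path.join(SITES_ROOT, app);
  // "current" points at either A or B; deploy into the idle one.
  const active = fs.readlinkSync(path.join(root, "current")); // "A" or "B"
  const target = active === "A" ? "B" : "A";

  // Fetch and unpack the package into the idle folder (CLI tools and key name assumed).
  execFileSync("aws", ["s3", "cp", `s3://my-deploy-bucket/${app}-latest.zip`, "/tmp/pkg.zip"]);
  execFileSync("unzip", ["-o", "/tmp/pkg.zip", "-d", path.join(root, target)]);

  // Repoint the web server at the freshly deployed folder.
  fs.unlinkSync(path.join(root, "current"));
  fs.symlinkSync(target, path.join(root, "current"));

  res.writeHead(200).end(`deployed ${app} to ${target}\n`);
}).listen(8080);
```

TeamCity would then trigger it with something like `curl "http://deploy-host:8080/deploy?app=website&token=..."`, once per application.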

As a result of all of this, a single push to my master branch in my git repository causes any number of background processes to automatically shut down cleanly, update and relaunch, and each of my web applications to update seamlessly without interrupting anyone using them.

Is good, yes?

(Now expecting someone to come along and make this seem like child's play. Mahzkrieg? dchuk?)
 


Yeah, pretty cool indeed. I'm now playing around with Ansible; I tried Puppet before, but didn't like it. I've also used Jenkins for CI before, but TeamCity looks pretty sweet too!
 
Pretty cool, and thanks for that. Always nice seeing how other people set up their devel environments. I have something similar myself, albeit a bit different. Same concept though, and it gets the same job done. The setup is like:

IDE --> NETWORK --> MULTIPLE SYSTEMS

In the middle is the network (think my own GitHub), which stores all the packages, components, released upgrades, developer accounts, access controls, licensing system, ionCube encoding, etc.

On the left, I have my IDE, which looks something like:

[screenshot: the custom IDE, with a categorized component list in the right panel and a toolbar of standard components on the left]


Upon opening it, I just get a little login box, which logs me into the network; from there I have access to all packages available to me. Open a package and it'll download a "table of contents" for it from the network and display it in a nice categorized list, as you see in the right panel. No code is stored on the local computer; instead, upon opening a component (i.e. a library file) within the IDE, the code is automatically pulled from the network, and the file is locked so other developers can't modify it. This just allows better organization, plus standardization.
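His network is proprietary, but the open-a-component flow he describes maps onto a simple lock-then-fetch protocol. A purely hypothetical sketch (none of these endpoints exist anywhere; they only illustrate the idea):

```typescript
// Hypothetical open-a-component flow: take an exclusive edit lock, then pull
// the source from the network. Endpoints and auth scheme are invented.
async function openComponent(baseUrl: string, sessionToken: string, componentId: string) {
  // Acquire the lock first so two developers can't edit the same file.
  const lock = await fetch(`${baseUrl}/components/${componentId}/lock`, {
    method: "POST",
    headers: { Authorization: `Bearer ${sessionToken}` },
  });
  if (lock.status === 409) throw new Error("component is locked by another developer");

  // Pull the code itself; nothing is persisted to the local disk.
  const res = await fetch(`${baseUrl}/components/${componentId}`, {
    headers: { Authorization: `Bearer ${sessionToken}` },
  });
  return res.text(); // source lives only in the editor buffer
}
```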

There's a standardized set of components, as you can see in the left toolbar, just to provide efficient development plus standardization across the team. For example, in my systems you'll see the same tables everywhere with AJAX pagination / search / sort, row overlays, etc. If it's simple data, one of those takes about 90 seconds to add. Although there are standard components, there aren't many restrictions, and you still have the flexibility to open up a blank file and start typing code.

When you create a new package, a devel environment is automatically set up for it. As you're developing, all changes are automatically saved on the devel system; when you save a file, the contents are instantly sent to the network and saved in the devel system. Again, all code stays on the network, and never on your local computer.

Once done, go to the "Publish Package" menu, hit the button, and all components within the package get bundled up into my own format, automatically encoded with ionCube, and from there the package can be installed on any system with one click.

You then create upgrade points as needed, and once one is defined, all changes to the package are automatically tracked. Automated version control, basically. You just continue on with development, and once the upgrade is done, hit one button: all unit tests are automatically executed, all changes are bundled, and the upgrade is instantly available to all customers.

Then the individual end-user systems all have a "Core Framework" package installed by default. This just includes the template engine, AJAX library, HTTP request handler, theme files, administrator management features, etc. Then, depending on the product purchased, they'll have different packages installed on their individual system.

And there's quite a bit more, but I'm done typing... but... yeah...

inb4 "inefficient, he uses Homesite". Heh, you can even see Homesite in that screenshot.
 
Kiopa_Matt, your system seems neat, but what about if you want to develop without an internet connection? Seems you're pretty much screwed.
 
True, you need internet access to even open the IDE and log in to the network, but really, how often are you behind your computer without internet access? If I can have high-speed internet while touring the smallest villages in Thailand with a 3G aircard, I don't see the problem.

The other main issue is branching / merging, which this doesn't really support. You can't create a separate branch of a package, then merge it weeks later. That's because this was developed under more of an agile concept: if you branch a package, within a few weeks the libraries you're using will already be outdated, kind of thing. That, and it's meant for smaller 1-5 man development teams, not 500 developers getting their hands into a package at the same time.

Who knows, maybe I'll even manage to get this bad boy out to the public some time as an open-source devel tool. Time will tell!
 
To be fair, a lot of what I'm developing doesn't really make much sense without an internet connection (because of databases and third-party resources), so if Kiopa_Matt is the same, being without a connection is going to seriously hamstring him anyway, even if he did have all the files hosted locally.
 
I see what you're saying, but all of my projects also have online production databases, required APIs, etc. That's why, for developing locally, I have a development database with the same schema and can mock out API responses.
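For anyone wanting the same setup, the mocking half is just a seam where the real client can be swapped for canned responses. A tiny illustrative sketch (names, URLs and response shapes invented):

```typescript
// One way to mock third-party API responses for local development (illustrative only).
type Fetcher = (url: string) => Promise<unknown>;

// Production: hit the real API. Development: return canned data with the same shape.
const realFetch: Fetcher = async (url) => (await fetch(url)).json();
const mockFetch: Fetcher = async (url) => {
  if (url.includes("/rates")) return { usd: 1.0, eur: 0.92 }; // canned response
  throw new Error(`no mock for ${url}`);
};

const apiFetch: Fetcher = process.env.NODE_ENV === "production" ? realFetch : mockFetch;

// Callers don't know or care which implementation they got.
apiFetch("https://api.example.com/rates").then(console.log);
```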