Archive for the ‘version control’ Category

MikeCI and Codesion Integration

September 13, 2010

Here at MikeCI we’re in the process of integrating with Codesion’s hosted SCM service, allowing users to set up a Codesion repository – either Subversion or Git – with a couple of mouse clicks.

Codesion Integration

We are delighted to have started the integration and we’re looking forward to future developments with Codesion and the SaaS community.

With SaaS specialists offering professional services at affordable prices, we are starting on a journey to offer a consortium of services across your entire stack, allowing small businesses and SMEs to benefit from enterprise-level services at a fraction of the cost.

We feel that joining forces with other SaaS providers allows you to get the biggest bang for your buck.

As we look to complete the integration with Codesion, we are excited about the partnership we are creating.

Both Codesion and ourselves are excited by the potential new offering: you’ll be able to access award-winning services from different providers in what is essentially a partner program.

Our tech lead Rob Knowles shed some light on the integration and let us know what he is hoping to achieve over the long term.

Rob says: “Complementary SaaS providers are a great way to help get referrals as well as offer a great service to customers. This sign-up process gives SMEs a simple setup in just a couple of clicks”.

With the cloud becoming the development environment of the future, it is no surprise that developers are placing more of their stack in the cloud. Rob adds: “As stacks become increasingly complex, with developers creating more sophisticated builds, it makes sense to utilise the cloud accordingly: you can write your code in the cloud, save it, compile it, test it, bug track, and deploy, simply by using hosted services.”

We’ll keep you up to date with the progress and look forward to full integration in the upcoming weeks.

Try MikeCI Today For FREE with a 30-day trial.

Underdoing The Competition

September 6, 2010

Hosted, low-cost, internet-based services for software developers have been in existence since the formative years of the web itself. In their earliest incarnation, such services would usually involve obtaining some real estate on an Apache web server, accessible via FTP, with the ability to upload a web application and invoke some server-side technology (Perl/CGI hacking anyone?) to create the latest and greatest dot-com experimental idea for a business. Or maybe just a simple website.

Zoom forward a few years and these services start to encompass functionality that provides a platform for the activity of development itself. For example, hosted version control, issue tracking and task management applications. Now smaller web-based development shops, who are already used to renting infrastructure for their production environments from a hosting company, start to use these complementary services – particularly when team members are not co-located. In fact, developers soon discover that these tools promote a model that supports geographically dispersed teams, and these ideas really begin to take root, like much technology these days, amongst the open source software community. The likes of SourceForge, for example, begin to acquire many thousands of projects, both large and small, due in part to providing accessible, web-based collaboration services for development. For private projects, pay-for services such as those provided by CvsDude (now Codesion) and Collabnet start to combine the power of the ideas taking root in the open source world with the industrial strength security and data protection required by larger IT businesses.

And here we are today, with a proliferation of Software-as-a-Service (SaaS) companies that offer a full range of services along the application lifecycle management continuum. Certain types of services lend themselves well to the SaaS model – the barrier to entry is relatively low, especially when the tool or service is already web-enabled. An example is hosted bug tracking solutions. These are implicitly suited to the multi-tenant application model used by most SaaS businesses. They have a very predictable resource consumption profile and a security model that easily supports multiple users. Data protection and recovery can usually be handled using industry standard practices for RDBMS management. There are a lot of businesses that offer these types of services.


When we look to other services along the ALM continuum, hosted SaaS offerings are few and far between, an example being services that support Continuous Integration. Hopefully, by now, all thinking software engineers believe that CI is integral to the practice of modern software development. In fact, I’d go further: anyone who thinks otherwise is not fit to call themselves a (thinking) software engineer. You’ve no excuse, really, and if you don’t embrace its benefits then CI is likely to be imposed on you anyway.

So, it’s 2009, you have your hosted low-cost SCM system, you’ve got your cheap web-based bug tracker and your rented deployment environment (still Apache, or maybe Google App Engine?). You search for “hosted continuous integration” and hmmm…no dice. “Why does nobody do this stuff?” you ask. “Didn’t I read somewhere that it’s ‘integral to the practice of modern software development’?” I know the answer – a simple reason really – a hosted CI service is a damn hard thing to deliver.

For a hosted solution to be low-cost and truly multi-tenant it needs to be very efficient with resources. This usually involves sharing such resources amongst many users, a model which does not easily lend itself to CI. Secondly, a hosted solution must be secure, especially with respect to user data. User data in the world of CI includes source code, build metrics and the built artefacts themselves. Reconciling resource optimization with data security is the biggest challenge here. To illustrate this point, if such a solution provided every user with a dedicated, isolated, physical environment for their builds it would need to incorporate the associated costs into the offering. This would quickly place the service into the realms of ‘expensive luxury’ for our small agile team example. As a comparison, consider Codesion’s ‘Team Edition’ for hosted Subversion which starts at $6.99/month. To be competitive, the CI provider has to get very clever with shared resource optimization while still ensuring data protection and security for its users.

In addition to data protection and resource management, there are additional security concerns relating to what a build can be permitted to do in such an environment. This is rarely a concern for an on-premise CI server. Want to scan for available ports, open particular sockets or start and stop certain daemon processes? No problemo. However, I’m afraid that in the shared real-estate scenario there are some necessary limitations on what build tools might be able to do. This is something of a compromise that a user has to take on the chin if they wish to use a low-cost hosted solution. If their build process requires a bunch of ‘non-orthodox’ things to happen, they need to understand that these types of advanced builds will never play well in a shared environment. But…if your project conforms to the standard lifecycle of ‘prepare-compile-test-package’ then this model fits much better and probably covers the majority of day-to-day software builds. This is just simple Pareto-principle logic really – a low-cost, shared hosted solution can only realistically cover the 80% well and assume that it’s not the right choice anyway for the remaining 20%.
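
To make that concrete: for a Maven project, that standard lifecycle maps onto a single, vanilla command (the project directory below is just a placeholder), which is exactly the kind of build a shared, hosted environment can run safely:

~/myproject$ mvn clean package
# Runs the default lifecycle up to 'package':
#   validate -> compile -> test -> package
# leaving the deployable JAR/WAR in target/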

So we’ve articulated some of the challenges of establishing a hosted CI offering, but what are the potential benefits that such a solution might provide to end-users? These may seem obvious, but are worth stating:

  • Zero-cost initial implementation for CI infrastructure.
  • Lowers the barrier to entry for CI: simply log in, configure your project(s) and use it.
  • Significantly reduced software and hardware maintenance costs. Less software hosted internally means less hardware required, which also translates into a reduced burden of patches and patch support complexity.
  • Reduced staffing (sorry guys/gals).
  • Better use of skills/resources.
  • Pay for what you use and no more. Subscription models for SaaS typically allow for cancellation given a month’s notice. Try-before-buy is very common too.

One final consideration is the range of features that hosted solutions offer when compared with the wide variety of open source and proprietary CI servers available today. There are some very slick options out there that you can download, install, configure and use. At the moment, the available low-cost multi-tenant hosted CI solutions will not win in a feature beauty contest against these tools. However, in my humble opinion this new breed of services offers something quite different, something that sets them apart. To bend a phrase from Bill Clinton: “It’s because they are hosted, stupid!”. Personally, I’m a firm believer in the idea that you should (to borrow a concept from 37signals), ‘under-do the competition’. Hosted CI should solve the simple problems and leave the hairy, difficult, nasty problems to everyone else. Instead of one-upping, it should one-down. Instead of outdoing, it should under-do. Hosted CI should stick to what’s truly essential.

Kiln Repository Support

May 25, 2010

Today we are pleased to announce support for projects hosted within Kiln On-Demand repositories.

Kiln Logo

For those of you who are unfamiliar with Kiln, it is based upon the popular open source Distributed Version Control System (DVCS) Mercurial.

We have created a short screencast that demonstrates this new feature using the Java Petstore (built using Maven) as an example project.

You can also find additional information about our support for Kiln in our new, comprehensive, online user guide.

Continuous Integration for Agile Project Managers (Part 1)

March 30, 2010

Any Agile Project Manager worth his salt should be aware of the term ‘Continuous Integration’ (often shortened to ‘CI’). But what is it, and how is it done?

This series of short blog articles aims to answer these two questions, so you can start your next project, or re-configure an existing project, armed with the necessary understanding about this key practice within agile software delivery.

Background

The basic premise of CI is pretty straightforward. An agile team needs a repeatable and reliable method to create a build of the software under development. Why so? Well, if it’s not already obvious, you may want to revisit the principles behind the Agile Manifesto. Within them you will notice a number of references to ‘working software’, and the foundation of any working software is a stable, tested build.

Recipe for CI

So how does CI help to create this build? Let’s list the essential ingredients that we need:

  1. Source Code Control – in a typical agile project, developers turn User Stories into source code, in whatever programming language(s) the project is using. Once their work is at an appropriate level of completeness, they check in or commit their work to the source code control (a.k.a. version control) system; for example, Subversion.
  2. Build Tool – if the source code needs to be compiled (e.g. Java or C++) then we will need tooling to support that. Modern Integrated Development Environments (IDEs), such as Eclipse or Visual Studio, are able to perform this task as developers save source code files. But if we want to build the software independently of an IDE in an automated fashion, say on a server environment, we need an additional tool to do this. Examples of this type of tool are Ant, Maven, Rake and Make. These tools can also package a binary output from the build. For example, with Java projects this might be a JAR or WAR file – the deployable unit that represents the application being developed.
  3. Test Tools – as part of the build process, in addition to compilation and the creation of binary outputs, we should also verify that (at minimum) the unit tests pass. For example, in Java these are often written using the JUnit automated unit testing framework. The tools in (2) often natively support the running of such tests, so they should always be executed during a build. In addition to unit testing, there are numerous other quality checks we can perform and status reports CI can produce. I’ll cover these in detail in a subsequent part to this series.
  4. Schedule or Trigger – we might want to create our build according to a schedule (e.g. ‘every afternoon’) or when there is a change in the state of the project source code. In the latter case we can set up a simple rule that triggers a build whenever a developer changes the state of the source code by committing his/her changes, as outlined in (1). This has the effect of ensuring that your team’s work is continuously integrated to produce a stable build, and, as you may have guessed, is where this practice gets its name from.
  5. Notifications – the team needs to know when a build fails, so it can respond and fix the issue. There are lots of ways to notify a team these days – instant messaging, Twitter, etc. – but the most common by far is still email.
Continuous Integration Recipe


The tool that wires these five elements together is a Continuous Integration Server. It interacts with the source control system to obtain the latest revision of the code, launches the build tool (which also runs the unit tests) and notifies us of any failures. And it does this according to a schedule or a state-change-based trigger. A CI server often also provides a web-based interface that allows a team to review the status, metrics and data associated with each build.
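
To make the wiring concrete, here is a minimal sketch (in shell) of the loop a CI server automates on each trigger. The repository URL and email address are hypothetical placeholders, and a real server does a great deal more around workspace management, metrics and reporting:

svn checkout http://svn.example.com/myproject/trunk workspace || exit 1
cd workspace
if mvn clean test; then
  echo "Build and unit tests passed."
else
  # Ingredient 5: notify the team when the build breaks
  echo "The latest commit broke the build." | mail -s "Build FAILED" team@example.com
fi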

CI Server options

There is a pretty overwhelming choice of available tools in this space. Some are open source, some proprietary. I don’t have time to go into all the available options here, unfortunately. However, there is a handy feature comparison matrix available here. Of course, it would be remiss of me not to mention our own hosted service, which allows you to get started with CI in no time at all, without having to be an ‘expert’ user.

Test Reports generated by Mike

In the next part of this series, I’ll delve deeper into how you can use CI to enforce software quality within your team during the various stages of the development process.

Bootstrap your Agile Java Startup Infrastructure in < 30 minutes

March 25, 2010

So, you have a great idea for the next web-based killer application. You have assembled a small but geographically dispersed team with the requisite amount of Agile Java development-fu to get the job done. What you need now is some hosted infrastructure so you can all get to work effectively.

In less than 30 minutes.

Step 1 – Get some IDE

Get Eclipse Galileo (JEE version). Hopefully your pipe is fat enough (~190 MB download). Fire it up and use the update sites to obtain m2eclipse and subclipse (we’re assuming SVN for a repository, but you could use Git). Install Tomcat and configure it as a server.

Step 2 – Get some Scaffolding

Within Eclipse go to File > New > Other to open the ‘Select a wizard’ dialog. From the scrollbox, select Maven Project and then click Next. Move through the wizard to select an archetype of an appropriate type (e.g. AppFuse). Click Finish. Validate that you can build and deploy your app.
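
If you prefer the command line to the wizard, the same scaffolding can be generated with the Maven archetype plugin. The groupId and artifactId below are placeholders, and the stock webapp archetype stands in for whichever archetype you actually choose:

~/work$ mvn archetype:generate -DinteractiveMode=false \
    -DgroupId=com.example.killerapp -DartifactId=killerapp \
    -DarchetypeGroupId=org.apache.maven.archetypes \
    -DarchetypeArtifactId=maven-archetype-webapp
~/work$ cd killerapp && mvn package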

Step 3 – Get some Version Control

While you are waiting for your build to complete, pick a hosted version control provider. There are a number who provide low-cost or even free hosted Subversion for private projects, typically with a trial period. Here is a list. Once signed up with a clean repository, use subclipse to share the scaffolding app created in (2).

Step 4 – Get some Project Tracking and Wiki

The majority of providers listed in (3) also offer something in the area of hosted task/issue tracking apps that often have sufficient wiki capabilities. Alternatively you can try those who specialise in this area, such as FogBugz or Agilo. We use Agilo on Mike. And you might want to get Basecamp. We use that too.

Step 5 – Get some CI

You could roll your own on Amazon EC2, but that isn’t happening in 30 minutes or 30 hours, probably! Hmmm…Oh I almost forgot – you could use our excellent hosted service ;).

Step 6 – Get some BRM

What’s a BRM? It’s a Binary Repository Manager. If you’re using Maven, I’d recommend you use one. The only hosted one I’m aware of is provided by Artifactory Online.
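
As a rough sketch of what wiring one in looks like, pointing Maven at a hosted repository manager is a one-off change to ~/.m2/settings.xml. The URL below is a made-up placeholder for whatever your provider gives you:

 <settings>
  <mirrors>
   <mirror>
    <id>hosted-brm</id>
    <!-- route all dependency resolution through the hosted manager -->
    <mirrorOf>*</mirrorOf>
    <url>https://yourcompany.example.com/artifactory/repo</url>
   </mirror>
  </mirrors>
 </settings>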

Right. We are good to go. What are you waiting for?

All The Young (Ex) Dudes

February 9, 2010

As we near the end of beta here at Mike HQ, I’d just like to publicly say a big thank you to all our participants who have helped to shape and improve our platform over the past few months – your feedback has been invaluable.

We are currently in the final phases of testing the version of our platform that will form the basis of our commercial offering. Our acceptance testing workflow involves consuming services provided by other organisations to best simulate a real-world usage scenario for our platform. The primary third-party service we use is hosted version control. In fact, it is only reasonable to state that Mike has a strong dependency on the existence of such services. It is the first link in the chain of what we here at Mike HQ refer to as the ‘hosted ALM continuum’ – the suite of co-operating and complementary hosted services that provide agile teams with a fully outsourced, web-enabled development ‘stack’. Disclaimer: it is most definitely in our interest to promote hosted version control solutions, as they are an enabler for the use of our own platform.

So, what do we use for testing?

Well, at present we support Subversion repositories that are accessible over HTTP(S). To simulate repositories that do not require authentication for read access we use Google Code. For those who are unfamiliar with Google Code (there can’t be that many of you, surely?) it provides a free collaborative development environment for open source projects, and provides each project with its own Subversion repository. Thanks, Google.

However, our main scenario is retrieving (or updating) source code from repositories that do require authentication and also provide a secure transport using HTTPS. After surveying the landscape, we decided to trial a service offered by Codesion. At the point we signed up (last year) they were known as CVSDude and they have recently re-branded themselves under a new name. We did like the old name – it has allowed us to indulge in some office banter during our acceptance testing phases, which, let’s face it, are often not the most exciting aspects of software engineering. I won’t bore you with our banter though, as it probably falls into the camp of ‘you had to be there’ to seem even remotely funny.

Codesion web site

Setting up a free 30-day trial on Codesion was a cinch:

  1. We swiftly signed up via their website, http://codesion.com/
  2. We created a new project, and added the Subversion service
  3. We created our users, groups, roles (they have a bunch of defaults), and assigned them to our project.
  4. We cut-n-pasted the SVN URL from the project page into our SVN import command (see the sketch below) and we were done.
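
For the record, step 4 amounts to something like the following, where the URL is a stand-in for the one Codesion displays on the project page:

~/work/test-project$ svn import . https://svn.example.com/yourproject/trunk -m "Initial import"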

At this point we had the data we needed to test our platform – our test fixtures are a range of Java projects of different flavours. A side-effect was that it definitely gave us a view of what a slick SaaS sign-up process and after-sales care look like – something for us to aim for with our own offering. Since we started using them we’ve had zero problems. In some of our test cases we hit the repository repeatedly and it gives us the same reliable service every time.

We’d have no hesitation in recommending Codesion if you are looking for a low-cost, industrial-grade hosted solution for Subversion. But, if you are reading guys, we did slightly prefer the old name….sorry ;-).

Working with Custom Maven Archetypes (Part 2)

January 26, 2010

In part 1 of this series of blog entries I demonstrated how you can quickly create a custom Maven archetype. This nice feature of Maven allows you to produce template or skeleton projects, which are great for bootstrapping your development efforts. In this second part of the series, I’ll show you how to ‘production-ize’ your archetype, which involves the following steps:

  • Add your archetype to version control
  • Update the appropriate metadata elements in your archetype’s POM
  • Release your archetype using the maven-release-plugin

Step 1 – Add your archetype to version control

I decided to use Git as the VCS (version control system) for my archetype. This is in part due to the fact that we will soon be releasing a new version of Mike that supports projects hosted on the popular ‘Social coding’ site GitHub. GitHub offers free project hosting for ‘public’ (AKA open source) projects.

So, let’s get down to business. First off, you must have the Git client installed, of course. If you are on a flavour of *nix this is a cinch, but there is tooling support for other OSes, including TortoiseGit from the creators of the popular TortoiseSVN client. As there are already plenty of tutorials about using Git, I’m not going to replicate all the steps here. Try Git for the lazy for a good intro if you are time-poor.

First up, I navigate to a directory, initialise my Git repository and add my fledgling archetype:


~/foo$ cd mikeci-archetype-springmvc-webapp
~/foo/mikeci-archetype-springmvc-webapp$ git init
Initialized empty Git repository in /home/leggetta/foo/mikeci-archetype-springmvc-webapp/.git/
~/foo/mikeci-archetype-springmvc-webapp$ git add .
~/foo/mikeci-archetype-springmvc-webapp$ git commit -m "Initial commit"
Created initial commit fad815f: Initial commit
 13 files changed, 513 insertions(+), 0 deletions(-)
 create mode 100644 pom.xml
 create mode 100644 src/main/resources/META-INF/maven/archetype-metadata.xml
 create mode 100644 src/main/resources/META-INF/maven/archetype.xml
[...]

Now that I have my archetype added to a local Git repo, I want to share it via GitHub. This obviously requires a GitHub account and you also need to ensure you have added a public key to provide you with the requisite privileges to ‘push’ your changes. Once you’ve set up a bare repository on GitHub, you can execute the following commands:


~/foo/mikeci-archetype-springmvc-webapp$ git remote add origin git@github.com:amleggett/mikeci-archetype-springmvc-webapp.git
~/foo/mikeci-archetype-springmvc-webapp$ git push origin master
Counting objects: 34, done.
Compressing objects: 100% (22/22), done.
Writing objects: 100% (34/34), 21.23 KiB, done.
Total 34 (delta 1), reused 0 (delta 0)
To git@github.com:amleggett/mikeci-archetype-springmvc-webapp.git
 * [new branch]      master -> master

What did I just do? The command git remote add origin [url] adds the location of the remote GitHub repository to my local repository configuration and names it ‘origin’. When I type git push origin master, this sends or ‘pushes’ my local changes on the ‘master’ branch to the ‘origin’ server. The ‘master’ branch is the one created by default when I initialised the repository above.
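
If you want to sanity-check the configuration, git remote -v lists each remote along with the URLs it will fetch from and push to:

~/foo/mikeci-archetype-springmvc-webapp$ git remote -v
origin  git@github.com:amleggett/mikeci-archetype-springmvc-webapp.git (fetch)
origin  git@github.com:amleggett/mikeci-archetype-springmvc-webapp.git (push)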

Step 2 – Updating the POM

For Maven to function effectively, you should always ensure that you include project VCS information in your POM file. Now that we’ve added the archetype to a Git repository we can include the appropriate <scm> configuration:


  <scm>
   <connection>
   scm:git:ssh://github.com/amleggett/${artifactId}.git
   </connection>
   <developerConnection>
   scm:git:ssh://git@github.com/amleggett/${artifactId}.git
   </developerConnection>
   <url>
   http://github.com/amleggett/${artifactId}
   </url>
  </scm>

It’s important to understand the meaning of each of the child elements of <scm>. The <connection> element defines a read-only URL and the <developerConnection> element a read+write URL. For both of these elements the URL must adhere to the following convention:


 scm:<scm implementation>:<scm implementation-specific path>

Finally, the <url> element content should point to a browsable location, and for me this is the GitHub repository home page. Note that in all cases I’m using an interpolated value, ${artifactId}, which resolves to my project’s artifactId.

One handy tip is that you can verify this configuration by using the maven-scm-plugin. This plugin provides vendor-independent access to common VCS commands through a set of command mappings for the configured VCS. The validate goal should confirm all is well:


~/foo/mikeci-archetype-springmvc-webapp$ mvn scm:validate
[INFO] Scanning for projects...
[INFO] Searching repository for plugin with prefix: 'scm'.
[INFO] --------------------------------------------------------------
[INFO] Building MikeCI Spring-Mvc web application archetype
[INFO]    task-segment: [scm:validate] (aggregator-style)
[INFO] --------------------------------------------------------------
[INFO] Preparing scm:validate
[INFO] No goals needed for project - skipping
[INFO] [scm:validate {execution: default-cli}]
[INFO] connectionUrl scm connection string is valid.
[INFO] project.scm.connection scm connection string is valid.
[INFO] project.scm.developerConnection scm connection string is valid.
[INFO] --------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] --------------------------------------------------------------

We also have to update the POM to tell Maven (or rather, the maven-deploy-plugin) where to deploy snapshot and released versions of our archetype. For the time being, I’m just going to specify my local filesystem as this destination, but in a real-world example this would most likely point to a location appropriate for a Maven repository manager, such as Nexus or Artifactory:


 <distributionManagement>
  <repository>
   <id>release-repo</id>
   <url>file:///home/leggetta/foo/release-repository</url>
  </repository>
  <snapshotRepository>
   <id>snapshot-repo</id>
   <url>file:///home/leggetta/foo/snapshot-repository</url>
  </snapshotRepository>
 </distributionManagement>

Once satisfied with the POM modifications, I commit to my local Git repo and then push the changes to GitHub.

Step 3 – Releasing the archetype

So, I’m now almost ready to create my first early release of the archetype. I can accomplish this using the maven-release-plugin. This plugin exposes two major goals – prepare and perform. The prepare goal does some pre-flight checking by running a build and verifying all is well before promoting the version in the POM and creating a tag of the release. The perform goal then checks out this tag, builds the project and deploys the resulting artefact to the Maven <repository> specified in your <distributionManagement> section.

It’s always a good idea to use the most recent version of the maven-release-plugin, which is currently 2.0-beta-9. This will require a further POM modification and Git commit+push:


 <plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-release-plugin</artifactId>
  <version>2.0-beta-9</version>
 </plugin> 

Next, run the release:prepare goal. By default this runs interactively, prompting you in your shell to provide info about the release version and subsequent development version:


:~/foo/mikeci-archetype-springmvc-webapp$ mvn release:prepare
[INFO] Scanning for projects...
[INFO] --------------------------------------------------------------
[INFO] Building MikeCI Spring-Mvc web application archetype
[INFO]    task-segment: [release:prepare] (aggregator-style)
[INFO] --------------------------------------------------------------
[INFO] [release:prepare {execution: default-cli}]
[INFO] Verifying that there are no local modifications...
[...]
[INFO] Checking dependencies and plugins for snapshots ...
What is the release version for "MikeCI Spring-Mvc web application archetype"? (com.mikeci:mikeci-archetype-springmvc-webapp) 0.1.2: : 
[...]
[INFO] Release preparation complete.
[INFO] --------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] --------------------------------------------------------------
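
Once preparation completes, run the perform goal to check out the freshly created tag, build it and deploy the artefacts to the repository configured in <distributionManagement>:

~/foo/mikeci-archetype-springmvc-webapp$ mvn release:perform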

If it all goes smoothly, you should have something akin to the following in your remote repository:


~/foo$ ls -1 release-repository/com/mikeci/mikeci-archetype-springmvc-webapp/0.1.2/
mikeci-archetype-springmvc-webapp-0.1.2.jar
mikeci-archetype-springmvc-webapp-0.1.2.jar.md5
mikeci-archetype-springmvc-webapp-0.1.2.jar.sha1
mikeci-archetype-springmvc-webapp-0.1.2.pom
mikeci-archetype-springmvc-webapp-0.1.2.pom.md5
mikeci-archetype-springmvc-webapp-0.1.2.pom.sha1
mikeci-archetype-springmvc-webapp-0.1.2-sources.jar
mikeci-archetype-springmvc-webapp-0.1.2-sources.jar.md5
mikeci-archetype-springmvc-webapp-0.1.2-sources.jar.sha1

So, to summarise, I now have the appropriate configuration management in place to make changes to my archetype and release it to a Maven repository.
In the next part of this series, I’ll look into the different ways you can integrate your archetype into the development process.

Why DVCS won’t kill Subversion in 2010

January 21, 2010

At Mike HQ we are currently implementing support for GitHub as this has been requested by a number of our private beta participants. We like Git and are currently undergoing an internal debate/argument as to whether we should switch (no pun intended) to a DVCS from Subversion for our own source code management.

I was going to write something that discussed why, although we might decide to use Git ourselves, it isn’t suitable for everyone – I firmly believe that Subversion usage will continue to thrive in 2010.

However, after searching around, I discovered a comment on this blog entry which more or less summed up my own thoughts. I’m simply going to reproduce it here verbatim (courtesy of clr_lite, whoever you are):

  • distributed version control is not for everyone
  • too many people are enamored of a tool or something because it’s new
  • every organization has it’s own situation and needs
  • getting a full repository and allowing people to work in silos without collaboration is not necessarily a good thing
  • git addresses the needs of linux kernel development, with many contibutors funnelling to a gatekeeper
  • some development shops benefit from a locking checkout model which forces developers to communicate and plan
  • subversion has a lot of users and knowledge pool; this can be important for some situations
  • distributed model has it’s plusses; getting all the history, changes, diffs, etc, while offline can be real helpful, but it all depends on the nature of the development and the developers
  • no one tool or process is ‘right’ for everyone

Well put, I thought.