Posts Tagged ‘development tools’

Continuous Integration for Agile Project Managers (Part 2)

April 27, 2010

In part 1 of this series, I described the essential elements of Continuous Integration within the context of agile development and briefly discussed the software options for a CI server. I explained that the building blocks of CI are a version control system and a build management tool. The former creates the foundation, by giving the CI process access to the latest copy of project source code. The latter then takes the source code and, in most scenarios, transforms it into the deployable binary artefact that represents the software application being developed.

It is useful to consider this transformational process as a series of steps or life-cycle phases. By analyzing our build in this conceptual way, we can better understand how to ‘bind’ or attach actions to each distinct life-cycle phase. Why is this important? Well, using this simple idea I can demonstrate how you can hardwire quality checks into your CI process. Performing such checks, and publishing their results, means that you can really start to take advantage of the significant benefits CI can bring to your agile projects.

I actually prefer the term ‘quality gate’ as opposed to ‘quality check’ as it better fits the idea that you must pass through one gate before entering another. In each case, if we are unable to pass through the gate, we can choose to fail the build and our CI system should dispatch a notification to us. In addition, a good CI process will publish the metrics and ‘violations’ concerning each gate, so that they can be reviewed in detail after each build has occurred.

Build Life-Cycle Phases and Quality Gates

Phase 1 – Resolve component dependencies

Often our projects will have dependencies on other internal or third-party components. This is very common, for example, in the Java and Ruby programming language worlds, where the re-use of packaged binary artefacts (JARs for the former and Gems for the latter) is considered best practice. One key attribute of a released component is its version (e.g. 2.1.1) and it is here we can apply our first simple gate – ensuring that we are using the correct versions of our dependent components.
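
If your build tool is Maven, for example, one way to express this gate is with the maven-enforcer-plugin. The snippet below is only a sketch; the rule chosen here (failing the build when any dependency is still an unreleased SNAPSHOT) is just one possible interpretation of ‘correct versions’:

<!-- Sketch only: goes inside <build><plugins> of the project POM -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>enforce-dependency-versions</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <!-- Fail the build if any dependency resolves to a SNAPSHOT version -->
          <requireReleaseDeps>
            <message>Builds must not depend on unreleased SNAPSHOT components.</message>
          </requireReleaseDeps>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>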

Phase 2 – Static Analysis of the source code

Source code inspection is (or should be) an integral part of your software development process. Often this requires human beings(!) who possess an understanding of the software requirements and application design to perform formal code reviews. However, we can automate much of this using tooling integrated into the build process. Some examples of the inspections we can perform are listed below, followed by a sketch of how they might be wired into a build:

  • Adherence to organisational coding standards (e.g. all source files must start with a copyright statement, all functions should be commented).
  • Finding duplicated, or copied and pasted, code statements.
  • Finding garden-variety programming mistakes.
  • Adherence to architectural design rules.
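
As a rough illustration, and assuming a Java project built with Maven, the first three checks could be bound to the build with the Checkstyle and PMD plugins; treat the configuration below as a sketch rather than a drop-in recipe:

<!-- Sketch only: goes inside <build><plugins> of the project POM -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-checkstyle-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <!-- 'check' fails the build when coding-standard violations are found -->
        <goal>check</goal>
      </goals>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-pmd-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <!-- 'check' flags common programming mistakes, 'cpd-check' flags copy-and-paste -->
        <goal>check</goal>
        <goal>cpd-check</goal>
      </goals>
    </execution>
  </executions>
</plugin>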

Phase 3 – Compilation of the source code

If you are writing software using a programming language that needs to be compiled, such as Java or C++, this is obviously a very basic and essential gate to pass through.
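
For a Maven-built Java project this gate needs no special wiring, although it is common to pin the language level explicitly so that compilation fails (or passes) identically on every machine; the version numbers below are just placeholders:

<!-- Sketch only: goes inside <build><plugins> of the project POM -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <!-- Compile against a fixed language level on the CI server and on every developer machine -->
    <source>1.6</source>
    <target>1.6</target>
  </configuration>
</plugin>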

Phase 4 – Execution of unit tests and test coverage analysis

We should always seek to execute the unit tests. Indeed, for some build management tools, this is the default setting and we can impose a gate that fails the build if unit tests do not pass.

The ability to write robust unit tests that can be repeatedly executed in an automated fashion, in different environments, is one of the hallmarks of a good software developer IMHO. Wiring the execution of these tests into the CI process via the build tool helps us to ensure that the tests do not contain dependencies on a developer’s local environment – a mistake more junior team members often make. However, how can we determine that the tests are actually doing something useful, and exercising appropriate functions or routines within the application code? This is accomplished using a code coverage analysis tool (such as Rcov), which can highlight those areas of the code-base untroubled by testing. Gates can be set as coarse-grained thresholds, e.g. trigger a build failure if less than 80% of all application code is being tested, or more granular configurations can be applied.
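
For a Java project the rough equivalent of Rcov is a tool such as Cobertura. As a sketch, and with the 80% figure purely an example threshold, a coverage gate in a Maven build might look something like this:

<!-- Sketch only: goes inside <build><plugins> of the project POM -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>cobertura-maven-plugin</artifactId>
  <configuration>
    <check>
      <!-- Fail the build if overall line coverage drops below 80% -->
      <totalLineRate>80</totalLineRate>
      <haltOnFailure>true</haltOnFailure>
    </check>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>check</goal>
      </goals>
    </execution>
  </executions>
</plugin>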

Phase 5 – Packaging the application

Although we normally don’t attribute a quality gate to this phase, it is often useful during the CI process to embed information into the packaged application for the purposes of traceability and provenance. For example, this can help identify the source code origin or ‘tag’ of an application deployed to a testing environment when defects are found.
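
In a Maven build, one simple approach is to stamp the packaged artefact’s manifest with version and revision details. The sketch below assumes a ${buildNumber} property supplied by something like the buildnumber-maven-plugin or the CI server itself:

<!-- Sketch only: goes inside <build><plugins> of the project POM -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <configuration>
    <archive>
      <manifestEntries>
        <!-- Traceability information baked into META-INF/MANIFEST.MF -->
        <Implementation-Version>${project.version}</Implementation-Version>
        <!-- ${buildNumber} is an assumed property, e.g. an SCM revision exposed by buildnumber-maven-plugin -->
        <SCM-Revision>${buildNumber}</SCM-Revision>
      </manifestEntries>
    </archive>
  </configuration>
</plugin>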

[Image: Quality Gates and Lifecycle Phases]

Summary

Hopefully, I’ve demonstrated how CI, in addition to continuously integrating the source code developed by your team, can also provide a valuable feedback loop with respect to the quality of the software being created. In the next part, we’ll look into some options for applying ‘functional’ quality gates within the CI process using advanced testing automation.

Working with Custom Maven Archetypes (Part 3)

February 25, 2010

In part 1 and part 2 of this series I demonstrated how to create a custom archetype and release it to a Maven repository. In this final part we’ll look at what you need to do to integrate it into your development process. This will involve the following steps:

  • Uploading the archetype and its associated metadata to a Maven Repository Manager.
  • Configuring an IDE to use the archetype.
  • Generating a skeleton project from the archetype.

Step 1 – Upload your archetype

In part 2 we covered releasing and deploying the archetype. For reasons of brevity I simply demonstrated deploying a release to the local file system, but if we wish to share our archetype we must deploy it to a remote repository that can be accessed by other developers. A remote Maven repository, in its simplest form, can be served up using an HTTP server such as Apache or Nginx. However, these days I would recommend that you use a Maven Repository Manager (MRM) instead, as these tools are purpose-built for serving (and deploying) Maven artefacts. There are basically three options for your MRM – Nexus, Artifactory or Archiva. A feature matrix comparison is available here. All are available in open-source flavours, and Nexus and Artifactory in particular are great tools. However, Artifactory is currently the only one that supports a cloud-based service option which, as you might expect, integrates very well with our hosted Continuous Integration service. This allows you to provision yourself a fully-fledged MRM in very short order.

So, how do we add our archetype to the repository? This is a simple process using Artifactory’s built-in Artifact Deployer, which allows you to upload a file and supply its Maven GAV co-ordinates.

[Image: Artifactory Web-based Deployer]
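
Alternatively, if you prefer to deploy from the Maven build itself rather than through the web UI, the archetype’s POM can point its distributionManagement at the same repository. This is only a sketch – the repository id is arbitrary, and the matching credentials would live in a <server> entry in your settings.xml:

<distributionManagement>
  <repository>
    <!-- 'mikeci-releases' is an arbitrary id; pair it with a <server> entry of the same id in settings.xml -->
    <id>mikeci-releases</id>
    <url>http://mikeci.artifactoryonline.com/mikeci/libs-releases</url>
  </repository>
</distributionManagement>

Running mvn deploy during the release would then publish the artefact to the libs-releases repository.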

Next, we need to add some additional metadata about our archetype in the form of a ‘catalog’:


<?xml version="1.0" encoding="UTF-8"?>
<archetype-catalog>
  <archetypes>
    <archetype>
      <groupId>com.mikeci</groupId>
      <artifactId>mikeci-archetype-springmvc-webapp</artifactId>
      <version>0.1.4</version>
      <repository>http://mikeci.artifactoryonline.com/mikeci/libs-releases</repository>
      <description>Mike CI archetype for creating a Spring-Mvc web application.</description>
    </archetype>
  </archetypes>
</archetype-catalog>

This file, which contains information about all of the archetypes that live within the repository, should ideally be placed at the root of the Maven repository. We can simply add it to our ‘libs-releases’ repository using Artifactory’s REST API:


curl -u username:password -f -X PUT --data-binary @archetype-catalog.xml "http://mikeci.artifactoryonline.com/mikeci/libs-releases/archetype-catalog.xml"

Step 2 – Configure your IDE

Now that our archetype is deployed remotely, we can start to use it from within our IDE – in my case, Eclipse. To get good Maven integration inside Eclipse, you really should be using the latest release (0.10.0) of m2eclipse. Once m2eclipse is installed, it provides a handy feature that allows you to add and remove archetype catalogs. You will need to add your deployed archetype catalog to the list of catalogs accessible from within m2eclipse. This ensures that you can access your custom archetypes when you run the Create a Maven Project wizard in Eclipse, as we will see shortly.

Choose the menu item, Window>Preferences, to open the Preferences dialog and drill down to the Maven>Archetypes preferences, as shown.


[Image: Setting archetype preferences]


Click Add Remote Catalog to bring up the Remote Archetype Catalog dialog. In the Catalog File text box, enter the path to your remote catalog file and in the Description text box, enter a name for your catalog:

[Image: Adding the catalog]

Step 3 – Generate your project

You should now be ready to generate a skeleton Maven project inside Eclipse. Choose the menu item, File>New>Other, to open the Select a wizard dialog. From the scrollbox, select Maven Project and then click Next. Follow the wizard to configure your project location. Eventually, the wizard allows you to select the archetype from which to generate your Maven project. From the Catalog drop-down list, select your custom catalog. Then locate and select your archetype:

[Image: Select the archetype]

Click Next and enter the GAV values for your new project. Et voilà – you should have just created a skeleton project based upon your custom archetype using a slick IDE wizard.

Pretty impressive, don’t you think?

Could your next development environment be in the cloud?

November 17, 2009

Your new project has been given the green light. You need to get your team up and running quickly. Could a cloud-based development environment be the answer? This blog discusses some of the options and issues for moving to a hosted development environment.

One of the first tasks for any Agile project team is to establish a robust, reliable set of development tools and associated infrastructure. Generally, on any new development endeavour, the following needs to be put in place:

  • Locate or create a source code repository and import your skeleton project.
  • Locate or create a continuous integration server to build your source code, run your tests and notify the team via email of any failures.
  • Locate or create a task/user story tracking server and/or issue management tool and add your project to it.
  • Locate or create your test environments for developer smoke, integration, and functional testing.

Now you might be lucky and have much of this infrastructure in place, in which case adding some extra users, a new project and an associated set of build jobs might be relatively straightforward. However, if it isn’t, you will need to allocate a decent chunk of time to setting things up yourself. This may involve procurement of hardware, installation of an appropriate OS, installation of the relevant applications, providing secure access to project team members and so on. This gets more complex if the team is distributed and the infrastructure must be accessible beyond a LAN.

Often these development tools are open source, which means that while the cost of acquisition for the software may be zero, the ongoing maintenance and support will probably require specialist knowledge. Any time your team spends doing this is time that could be spent progressing the project.

With either option, server space and administration time are required and there are obviously costs associated with this; those costs may be disproportionately large for small and medium-sized organisations.

In the same way that SaaS offerings for email and collaboration suites (such as Google Apps) have, over the past few years, sought to turn these services into low-cost, click-and-go commodities, there are now equivalent hosted options available for Agile development infrastructure.

Hosted version control solutions have been available for a while and the market has expanded rapidly over the past few years. Collabnet (the people behind Subversion) and CVSdude are probably the best known. Both companies have expanded their offerings, and not only provide version control but are bundled with, or integrate with, a host of other tools. These now compete with newer entrants like Beanstalk, Assembla and Unfuddle (which also does Git hosting). GitHub itself has seen huge growth over the past year, with over 400 new users and 1,000 projects being added each day.

Coming from the other direction are suppliers of development collaboration and productivity tools. Atlassian is probably the biggest player here, and its hosted JIRA Studio suite, based around the very popular JIRA issue tracking tool, was launched in May 2008. FogBugz is another alternative, based around an integrated set of project management, wiki, issue management and helpdesk tools.

Each vendor has a different focus, with JIRA Studio and Collabnet’s TeamForge perhaps the most fully featured for development teams. Both offer a very comprehensive stack and are moving towards the idea of an Application Lifecycle Management (ALM) suite. It will be interesting to see how these platforms develop, and how the traditional enterprise tool vendors (IBM Rational, Microsoft and Borland) respond.

If a platform sounds too restrictive and you want to “pick and mix” your own set of tools, you can. There are also some great tools which focus on a specific area – Basecamp for project management and Lighthouse for issue management are a couple of well-known examples worth looking at. Most of these tools have open APIs that enable you to integrate easily with others, so getting together an integrated set of tooling is easily achievable.

My advice here is to be clear about what you want from these tools. Some are very feature-rich and developer-centric, while others do a great job of providing a clean and simple process and interface. So which tool suits you will depend on your project, your team, and your organisation. What is clear, though, is that there is growth in this sector, increased competition and greater integration between the providers. All of this can only be good for those who are happy to outsource their development environments – increased choice and competition against a backdrop of decreasing hosting costs.

Moving on from managing your project to testing it: the use of externalized environments that allow teams to deploy a release of a web-based application and run functional tests against it is trickier. Depending on the nature of the application and its associated runtime dependencies, this may require the creation of a bespoke environment. However, recent developments in cloud computing should soon make this much easier. Google’s App Engine, for example, allows you to run Java (and Python) applications on their infrastructure. So if App Engine is your production target, creating a test environment that is a clone of production should be a relatively straightforward activity. More recently, in August this year, SpringSource launched their Cloud Foundry, which allows you to rapidly deploy a test (and also production) environment for your Java web application with a few mouse clicks, and Microsoft have also weighed in with Azure. Both Google and Microsoft are promising tighter integration with the IDE, and I’ll be watching these platforms closely.

One area which is less mature is hosted continuous integration. There are currently only a small number of pioneering providers in this space, which may surprise some, as the practice of continuous integration is at the heart of the Agile development process. The SaaS multi-tenant application model does not fit easily with the requirements of continuous and often complex software builds. Building is a computing-resource-intensive activity, especially for programming languages such as Java, and this will inevitably impact the cost of such a service to the end-user. Mike CI is one of these pioneers and there is a good analysis of the others here.

Now I’m not saying that trusting your code to a third party is a simple decision to make. There are often legal, security and organisational hurdles to consider. It isn’t for everyone, and for many large corporates it might be a step too far. But for many people, the cost, convenience and management overhead of maintaining it all yourself do not stack up. Your team is in place to write great software. For your next project, I’d recommend that you seriously consider using these low-cost, on-demand hosted services.