Awari is an ancient game originating in Africa, played on 12 holes in the ground (called houses) split into 2 rows of 6. Designed for two players, each player takes one row as their territory. Each house starts with 4 seeds in it.
A round in the game goes as follows:
The player picks up the seeds from one of their houses. They sow the seeds one by one in an anti-clockwise direction around the board.
The number of seeds in the last sown house dictates what happens next:
If the last house does not contain 2 or 3 seeds, the go finishes and it's the opponent's turn.
If the last house is one of your opponent's and it now contains 2 or 3 seeds, you capture those seeds.
If you capture, you may then look at the next-to-last sown house; if it satisfies the same criteria, you capture those seeds too.
If you sow so many seeds that they wrap around the board (more than 11), you do not sow a seed into the house you picked up from.
If all of an opponent's houses are empty, the current player must make a move that gives the opponent seeds. If no such move is possible, the current player captures all the seeds in their own territory, ending the game.
Winning
The game is over when:
one player has captured 25 or more seeds, or
each player has captured 24 seeds (a draw).
If both players agree that the game has been reduced to an endless cycle, each player captures the seeds on their side of the board.
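The sowing and capture rules above can be sketched in a few lines of code. This is only an illustrative model, not a standard implementation: the board layout (a 12-element array, houses 0-5 belonging to the mover and 6-11 to the opponent) and the method name are assumptions made for this example.

```ruby
# Illustrative sketch of Awari's sowing and capture rules.
# Assumed layout: houses 0-5 = mover's row, 6-11 = opponent's row.
def play(board, from)
  board = board.dup
  seeds, board[from] = board[from], 0
  pos = from
  seeds.times do
    pos = (pos + 1) % 12
    pos = (pos + 1) % 12 if pos == from # with 12+ seeds, skip the origin house
    board[pos] += 1
  end
  captured = 0
  # Capture runs of 2 or 3 seeds, walking back from the last-sown house,
  # but only while we remain in the opponent's territory (houses 6-11).
  while pos >= 6 && [2, 3].include?(board[pos])
    captured += board[pos]
    board[pos] = 0
    pos -= 1
  end
  [board, captured]
end
```

For example, sowing from house 2 on the opening board drops the last seed into opponent house 6, which then holds 5 seeds, so nothing is captured and the turn passes.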
I keep coming across story cards written like this:
This is usually a telltale sign that the why stack was not popped. The value (the "In order to" clause) has been written at the lowest possible level of abstraction, the same level as the feature (the "I want" clause). The card does not tell us why this feature is being built.
If you find yourself writing cards like this, take it as a chance to find out a bit more about why you are adding this feature. Hopefully you will uncover a more meaningful value that you can write on the card, or perhaps discover that you don't really need this feature after all.
Your Cucumber features are living documentation. The world is evolving around them and without exposure they tend to rot.
Evolving language
When features are written we use a snapshot of the domain language at a specific point in time. This ubiquitous language changes and grows outside the codebase and is influenced by more than just developers. When do we refactor features to reflect these changes?
Taints
Since the features are in the codebase it becomes the developers'/QAs' responsibility to maintain them. When writing the features for the first time we have lots of discussion. But at a later date the feature language may be tweaked by a developer to allow reuse, or some new technical constraint means they want a special step added. This is where taints can sneak into the language.
Quality
It’s amazing what a difference it makes when people know something they are writing will be published and read by others. Exposing the features can help increase their quality.
Barriers to entry
It's great that the features sit close to the code, but there is a barrier in how you gain access to that code. While source control is second nature to developers, it's something that gets in the way for non-technical people.
Allowing features to breathe
When someone wants to know about the behaviour of a feature they should be able to turn to the features regardless of technical expertise. This engenders discussion, which helps bring about changes to the features to better reflect the ubiquitous language and remove taints. The easier it is to access the features, the more likely they are to be the first port of call.
So my advice is to expose your Cucumber features to your team (and if possible your users, as RSpec has done with Relish), allowing them to browse and question them.
Relish
To expose features I currently use Relish, a web-based browser for features.
Relish has a command-line gem for pushing features up to the website. I use a server-side post-receive git hook so that every push also pushes the features to Relish.
.git/hooks/post-receive
#!/bin/sh
relish push --project air --organization breathe
Through various sessions at Agile2010 and the London agile testing meetings, Gojko Adzic and others have been converging on a common language for talking about acceptance testing and its different phases. In doing so we have fleshed out a lifecycle for acceptance testing.
Seeing a wider picture of the lifecycle helped me gain a better scope of how Cucumber Features are born. I thought I would share a real example demonstrating how the acceptance testing lifecycle (can) flow at Songkick.com.
Example
Business goal
Increase the ratio of people signing up
This gives us something to measure and learn from to assess the success of the features built.
User stories – MMFs
We generate ideas through discussion and brainstorming to achieve this business goal. These are captured as Minimal Marketable Features which have our business goal as their value:
In order to increase signups
I want visitors to signup through their facebook accounts
We break the minimal marketable features into user stories, trying to break the stories into the smallest possible units.
Successful signup through Facebook
In order to sign up with as little effort as possible
As a non-member
I want to signup through my facebook account
Failed signup through Facebook
In order to sign up with as little effort as possible
As a non-member
I want to know what I can do to correct errors preventing signup
Key examples
We have a card with the story narrative representing a token for conversation. We use this card to start discussions with QAs, developers and UX exploring the requirements through concrete examples. Playing the what-if game.
Through this we generate a list of scenarios:
Scenario: Successful signup through facebook
Scenario: Failing due to already having a facebook account connected
Scenario: Facebook is unavailable
Note that while we express these as scenarios, it is not implied that all of them will be tested at the acceptance-test level. For scaling/performance reasons a developer may push a scenario down to lower-level tests.
Specification with examples
Pairing with a QA we take the cards and our exploration notes and codify them into Cucumber features.
Feature: Visitor signs up through Facebook
In order to reduce friction in signing up
As a non-member
I want to signup through my facebook account
Scenario: Successful signup through facebook
Given ...
When ...
Then ...
Scenario: Failing due to already having a facebook account connected
Given ...
When ...
Then ...
Scenario: Failed due to Facebook being unavailable
Given ...
When ...
Then ...
Note that there is not a 1-1 mapping between a Feature and a User Story.
Literal automation: Mapping Feature down to step definitions
The developer turns the feature into an executable test at the same time as writing the feature. When using Cucumber this consists of writing the step definitions.
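Under the hood, a step definition is essentially a regular expression paired with a block: Cucumber matches each plain-English step against the registered patterns and invokes the matching block with the captured groups. The following is a toy sketch of that mechanism, not Cucumber's real internals, and the step wording is a hypothetical one for the signup feature above:

```ruby
# Toy sketch of regex-to-block step matching; NOT Cucumber's actual internals.
STEP_DEFINITIONS = {}

def define_step(pattern, &block)
  STEP_DEFINITIONS[pattern] = block
end

def run_step(text)
  STEP_DEFINITIONS.each do |pattern, block|
    match = pattern.match(text)
    # Pass the regex captures into the block, just as Cucumber does.
    return block.call(*match.captures) if match
  end
  raise "Undefined step: #{text}"
end

# A hypothetical step definition for the signup feature above.
define_step(/^a visitor signs up through (\w+)$/) do |provider|
  "signed up via #{provider}"
end
```

Running `run_step("a visitor signs up through facebook")` dispatches to the block with `"facebook"` as the captured argument; an unmatched step raises an "Undefined step" error, mirroring Cucumber's behaviour.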
Continuous validation
The feature is continuously run on every commit (using a continuous integration server).
Living documentation
Exposing features for all to read. Using a web-based system such as Relish, the features are easily browsable by everyone, yet they live in the codebase.
I was recently interviewed about Cucumber and whether it solves the problem of getting business people writing specifications.
You can read the article on SDTimes:
Cucumber puts plain English on requirements
I got a little bit carried away in answering some of the questions posed and Alex Handy was kind enough to post all my responses in detail:
Peeling Cucumber
What useful metrics are we missing that our tests could provide and what should we be recording?
Recording Test Builds
You're using a continuous integration server, right? Running all your tests on every check-in to your source control repository. The CI environment represents our pipeline, through which all code needs to flow. It tends to be the place where all of the tests are run before the code flows into the outside world, hence it is a perfect environment to start capturing detailed metrics about all of our tests. It's also not the end of the world if we add a little extra time to the test build in order to capture these metrics.
Mining Metrics from Test Builds
What interesting things can we discover? Here are some suggestions:
Failure rates
Areas of your product which are prone to failure/bugs, and tests which might be fragile. Perhaps highlighting areas QAs should focus extra attention on.
Flickering tests
Tests which frequently alternate between failing and passing.
Fragile Tests
An all-or-nothing feature where either all of the tests fail or none do.
Never failing tests
Tests which have never failed. Do we need to run them all the time? Are they now redundant?
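Given recorded pass/fail histories per test, these metrics fall out of simple aggregation. A minimal sketch, assuming a per-test history represented as an array of symbols (the data shape is an assumption; this is not CukeMax's actual code):

```ruby
# Sketch of deriving failure-rate and "flicker" metrics from a recorded
# pass/fail history. The history format (e.g. [:pass, :fail, :pass]) is
# assumed for illustration only.
def test_metrics(history)
  failures = history.count(:fail)
  # A "flicker" is a flip between consecutive runs (pass->fail or fail->pass).
  flips = history.each_cons(2).count { |a, b| a != b }
  {
    failure_rate: failures.to_f / history.size,
    flickers: flips,               # a high value suggests a flickering test
    never_failed: failures.zero?   # candidate for running less often
  }
end
```

A history of `[:pass, :fail, :pass, :fail]` yields a failure rate of 0.5 and 3 flickers, flagging it as a likely flickering test.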
Kent Beck has some additional ideas; let's copy him and pretend to look smart.
Intelligent Selection of the Tests to Run
Kent Beck wrote a tool called JUnit Max which is a plugin for Eclipse and JUnit which helps programmers stay focused on coding by running tests intelligently.
“Max fails fast, running the tests most likely to fail first.”
One of the key principles behind this tool is that:
“Tests that failed recently are more likely to fail than tests which have never failed.”
Super Fast Feedback
If we prioritise the tests that failed recently, and those recorded as likely to fail, we increase the chance that a failure occurs early in the test build. The shorter the distance between pushing the code and knowing there is a failure, the better.
One problem this helps alleviate is when a test fails 99% of the way through the build: to know your fix worked, you have to sit and wait for the entire build to run.
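One way to implement that ordering, purely as a sketch (this is not how JUnit Max or CukeMax actually do it), is to sort tests by the timestamp of their last recorded failure, with never-failed tests going last:

```ruby
# Sketch: run the most recently failed tests first; tests with no recorded
# failure go last. `last_failed_at` maps test name => timestamp and is an
# assumed data shape for this illustration.
def prioritise(tests, last_failed_at)
  # Negating the timestamp sorts most-recent-failure first; tests that have
  # never failed get -Infinity, so its negation sorts them to the back.
  tests.sort_by { |test| -(last_failed_at[test] || -Float::INFINITY) }
end
```

With `{"a" => 100, "b" => 50}`, test "a" (failed most recently) runs first and the never-failed "c" runs last.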
CukeMax (alpha-1)
CukeMax is a project that aims to:
Provide a web service to record Cucumber test builds
Provide a web based interface to uncover juicy metrics about your tests.
Feed recorded metrics back into the running of tests, prioritising those most likely to fail.
Cool stuff
CukeMax is intended to be used when you run your tests on your CI server. While this initial version only supports Cucumber, there is no reason why it cannot be expanded to other test tools such as RSpec. I'm already using this for my own projects and I have a special version working at Songkick.com HQ.
Wanna Play?
You can browse around an example of the web interface at CukeMax - www.cukemax.com
Want to be one of the first guinea pigs to try out CukeMax? Let me know.
The client tool will be leaked slowly into the world to ensure we can balance server load.
What's next?
All I can say is that there is a lot of activity around this project, with some exciting tools in the pipeline.
Also, Matt Wynne has been working on some similar ideas and we are discussing whether we can combine our thoughts.
At Songkick.com we have developed a number of patterns to make it easier to write Cucumber features. I thought I would start sharing some of those patterns here. So here is the first one:
Implicit Reference Pattern
Make use of implicit references to previously discussed topics to produce scenarios which are easier to read and write, while avoiding highly coupled steps.
Problem
Storing and relying on state in a step definition can make it hard to reuse, so state is often avoided. This can lead to scenarios like the following:
Given there is an Artist named "XXs"
And I visit the page for the Artist named "XXs"
This leaves us with:
Verbose steps which are not natural to read.
Extra noise, purely for identity (repeating the name).
Solution
Map implicit references in the language to the objects being discussed.
Accept that we have to store state but do so in an encapsulated way where the feature language is the only thing needed within the step definition to provide a direct mapping to the relevant object from the state.
What we are aiming for is a scenario like this:
Given there is an Artist named "XXs"
When I visit the page for the Artist
Domain model class names are meaningful to everyone
In our features when we talk about something in our domain we refer to its class name and we use the correct capitalization.
Non-technical people still understand what the references mean while allowing us to simplify identifying the model in a snippet of feature text.
This leads to steps such as:
Given the Artist
Given the AdminUser
Implementation
Somewhere to store stuff
We store all the state in a specialised hash. This has a special find_thing method which performs some validation and gives us nice error messages if we try to access incorrect types or non-existent objects.
class StuffContainer < Hash
  def find_thing(opts)
    expected_type = opts[:type]
    name = opts[:name]
    thing = self[name]
    raise("Unable to find any object in stuff[] with the name '#{name}' that you asked for, boss. I could however offer you one of the following: #{self.to_s}") unless thing
    raise("That thing you asked for, it appears to be a #{thing.class.name} when you asked for a #{expected_type.name}") unless thing.is_a?(expected_type)
    thing
  end

  def to_s
    result = ["#{self.length} items in total:"]
    self.each do |key, thing|
      result << "the #{thing.class.name} \"#{key}\""
    end
    result.join("\n")
  end
end
Recording the subjects under discussion
In order to record references to created Models we extend Factory Girl’s ‘Factory’ method which is used for all our model creations.
We will record created models in steps like these:
Given /^there is (?:one|an|a|another) ([^ ]+) named "([^"]+)"$/ do |entity_type, name|
  attributes = { :name => name }
  entity = Factory(entity_type.underscore.to_sym, attributes)
end
Resolving Implicit references
Starting with the step definition:
When /^I (?:view|visit|go to) the page for (#{IDRE})$/ do |entity|
  visit model_path(identified_model(entity))
end
IDRE is the ID regexp, which provides a way of identifying a model and is reused in many steps.
IDRE = /(?:the(?: first | last | )(?:[^ ]+)|the (?:[^ ]+) "(?:[^"]+)"|"(?:[^"]+)")/
The identified_model method turns an English string into a Model. It provides a number of ways of referencing a model.
the Artist
the first Artist
the last Artist
the Artist "Jude"
The identified_model method:
(The key case we are focusing on in this example is where we have no identity, just the type of the model: the branch that matches "the Artist" and delegates to implicit_model.)
def identified_model(str)
  case str
  when /^the (first|last) ([^ ]+)$/
    klass = safe_constantize($2)
    return klass.__send__($1.to_sym)
  when /^the ([^ ]+) "(.+)"$/
    klass = safe_constantize($1)
    instance = stuff.find_thing(:type => klass, :name => $2)
    instance.reload
    return instance
  when /^the ([^ ]+)$/
    return implicit_model($1)
  when /^"(.+)"$/
    instance = stuff[$1]
    instance.reload if instance
    return instance
  end
  raise "No such instance: '#{str}'.\n Current stuff: #{stuff.to_s}"
end
The implicit_model method
def implicit_model(str)
  klass = safe_constantize(str)
  raise "expected only one #{klass.name}" if klass.count > 1
  raise "expected one #{klass.name} to exist" if klass.count == 0
  klass.first
end
Notice that to avoid ambiguity we require that only one model of the specified type exists.
The full StuffManagment module, putting it all together:

module StuffManagment
  def map_to_id_attribute(type_of_thing)
    {
      :user => :username,
      :concert => :title,
    }[type_of_thing] || :name
  end

  class StuffContainer < Hash
    def find_thing(opts)
      expected_type = opts[:type]
      name = opts[:name]
      thing = self[name]
      raise("Unable to find any object in stuff[] with the name '#{name}' that you asked for, boss. I could however offer you one of the following: #{self.to_s}") unless thing
      raise("That thing you asked for, it appears to be a #{thing.class.name} when you asked for a #{expected_type.name}") unless thing.is_a?(expected_type)
      thing
    end

    def to_s
      result = ["#{self.length} items in total:"]
      self.each do |key, thing|
        result << "the #{thing.class.name} \"#{key}\""
      end
      result.join("\n")
    end
  end

  def clear_stuff
    @stuff = StuffContainer.new
  end

  def stuff
    return @stuff if @stuff
    clear_stuff
  end

  SEARCH_MODULES = ['']

  def identified_model(str)
    case str
    when /^the (first|last) ([^ ]+)$/
      klass = safe_constantize($2)
      return klass.__send__($1.to_sym)
    when /^the ([^ ]+) "(.+)"$/
      klass = safe_constantize($1)
      instance = stuff.find_thing(:type => klass, :name => $2)
      instance.reload
      return instance
    when /^the ([^ ]+)$/
      return implicit_model($1)
    when /^"(.+)"$/
      instance = stuff[$1]
      instance.reload if instance
      return instance
    end
    raise "No such instance: '#{str}'.\n Current stuff: #{stuff.to_s}"
  end

  def implicit_model(str)
    klass = safe_constantize(str)
    raise "expected only one #{klass.name}" if klass.count > 1
    raise "expected one #{klass.name} to exist" if klass.count == 0
    klass.first
  end

  def safe_constantize(str)
    recorded_exception = nil
    SEARCH_MODULES.each do |mod|
      begin
        return "#{mod}::#{str}".constantize
      rescue NameError => e
        recorded_exception = e
      end
    end
    error_message = "\"#{str}\" does not appear to be a valid object in the domain. Did you mean \"#{str.classify}\"?" +
      "\nDetailed error message:\n#{recorded_exception.message}"
    raise NameError.new(error_message)
  end
end
World(StuffManagment)
I recently created an adapter in Cucumber which provides support for writing step definitions in Javascript. So as a Javascript programmer you can test your code with Cucumber without having to write any Ruby.
Feature: Fibonacci
In order to calculate super fast fibonacci series
As a Javascriptist
I want to use Javascript for that
@fibonacci
Scenario Outline: Series
When I ask Javascript to calculate fibonacci up to <n>
Then it should give me <series>
Examples:
| n | series |
| 1 | [] |
| 2 | [1, 1] |
| 3 | [1, 1, 2] |
| 4 | [1, 1, 2, 3] |
| 6 | [1, 1, 2, 3, 5] |
| 9 | [1, 1, 2, 3, 5, 8] |
| 100 | [1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89] |
The Step Definitions
Before(['@fibonacci'], function() {
  fibResult = 0;
});

When(/^I ask Javascript to calculate fibonacci up to (\d+)$/, function(n) {
  assertEqual(0, fibResult)
  fibResult = fibonacciSeries(n);
});

Then(/^it should give me (\[.*\])$/, function(expectedResult) {
  assertEqual(expectedResult, fibResult)
});
I have tried to make the Javascript API as close to the Cucumber Ruby API as possible. However, it currently does not support a couple of things the Ruby version is capable of: calling step definitions from within step definitions with multiline arguments, and line reporting on step definitions.
Loading your Javascript code into the World
The most important difference to note in the Javascript API compared with the Ruby one is how we load code into the World so that it is in scope within the step definitions.
This Javascript Cucumber adapter represents an experiment to see how well we can use Cucumber through Javascript and V8. I would love to hear ideas and feedback on the Javascript API.