After some initial experiments with BDD using Cucumber, our team has decided to adopt a full BDD and TDD approach for our current sprint. It's slightly experimental at the moment as we work out the best ways of working. Once we are done we will retrospect and decide whether to adopt the approach going forward. Initial feelings from our first couple of days are that it is going well. Here's what we have done so far...
Sprint Planning
After deciding to give BDD and TDD a try for the sprint, the first step was sprint planning. Fortunately for this sprint we have a story that is pretty well understood, but sufficiently challenging to be of value in evaluating the approach. The story basically consists of taking some existing data, retrieving additional data from a new remote system and delivering the combination of the two to a remote destination (basic decoration of existing content).
Our sprint planning consisted of two parts:
- a high level technical discussion of the story and agreement of the general architectural structure and approach
- the writing of Cucumber 'Feature' files containing all of the acceptance scenarios for the story
These two parts went well and we created a good set of acceptance scenarios, but we then found ourselves struggling to get our heads around how to go about implementing them. Our reasoning was that rather than writing out individual technical tasks, we would just create one card for each scenario. These scenario cards would then form our sprint backlog.
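To give a flavour of what came out of the session, a simplified happy-path scenario along these lines might appear in the feature file (the wording and step names here are illustrative for this post, not our actual scenarios):

    Feature: Decorate existing content with additional data

      Scenario: Deliver decorated content for a single item
        Given an existing piece of content
        And the remote system holds additional data for that content
        When the content is delivered to the destination
        Then the delivered content includes the additional data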
However, the first scenario was a 'happy path' one and we were thinking that in order to implement it we would have to build the entire solution; the remaining scenarios would then just be tiny additions. This didn't feel right, in that we would have one scenario stuck as 'in progress' for days and then all of the others moving within a few hours right at the end of the sprint. Our next thought was therefore to break this scenario down into a number of technical tasks. As it turns out, this wasn't the correct approach, and it soon became clear that by doing this we were missing the benefits of BDD and TDD: we were defining the detailed solution in the planning session, rather than letting it emerge from the implementation of the tests.
We decided to take a break and think over our approach. It then dawned on us that we were still thinking in a non-TDD mindset. To pass the first scenario we didn't need to implement the whole solution, just enough to make the BDD test pass. As we moved on to each subsequent scenario we could implement more of the functionality, add depth and so on. With this in mind we decided to start the sprint and inspect and adapt as we went.
Setup and Configuration
The first step in the sprint was to add Cucumber support to our project. Given that we are using BDD for acceptance tests, we decided to add this support to our functional test suite. Unit-level TDD will continue to be done using our existing testing tools (JUnit, Hamcrest and Mockito).
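For anyone unfamiliar with that combination, a trivial unit test using those three tools together looks something like this (the test is purely an illustration of the toolchain, not one of our real tests):

    import static org.hamcrest.MatcherAssert.assertThat;
    import static org.hamcrest.Matchers.is;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.verify;
    import static org.mockito.Mockito.when;

    import java.util.List;
    import org.junit.Test;

    public class ToolchainExampleTest {

        @Test
        @SuppressWarnings("unchecked")
        public void illustratesJUnitHamcrestAndMockitoTogether() {
            // Mockito provides the stubbed collaborator...
            List<String> additionalData = mock(List.class);
            when(additionalData.get(0)).thenReturn("extra-field");

            // ...Hamcrest provides the readable assertion style...
            assertThat(additionalData.get(0), is("extra-field"));

            // ...and Mockito can also verify the interaction took place.
            verify(additionalData).get(0);
        }
    }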
Our functional tests are written in Java and are built and executed using Maven. We therefore added a dependency on cuke4duke, which runs Cucumber on top of the JRuby platform. This setup also involved adding the maven-cuke4duke-plugin to the build, which runs the Cucumber 'Features' within a Maven test phase.
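Roughly speaking, the pom.xml additions look something like the sketch below; the artifact ids, version and plugin goal are placeholders from memory of the cuke4duke setup, so check them against the cuke4duke documentation for the release you are using:

    <!-- Test dependency that brings in the cuke4duke/JRuby glue
         (version is a placeholder). -->
    <dependency>
      <groupId>cuke4duke</groupId>
      <artifactId>cuke4duke</artifactId>
      <version>0.4.4</version>
      <scope>test</scope>
    </dependency>

    <!-- Build plugin that runs the 'Features' during a test phase
         (artifactId, version and goal name are placeholders). -->
    <plugin>
      <groupId>cuke4duke</groupId>
      <artifactId>maven-cuke4duke-plugin</artifactId>
      <version>0.4.4</version>
      <executions>
        <execution>
          <phase>integration-test</phase>
          <goals>
            <goal>cucumber</goal>
          </goals>
        </execution>
      </executions>
    </plugin>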
Aside: Why did we pick Cucumber (which is a Ruby framework) when we are doing a Java project? There were a number of reasons, but the main ones were:
- We liked the HTML report format
- A number of the team are already using Cucumber on external projects (in Ruby and Scala) so there was an existing familiarity
- It seems to keep out of the way and be less invasive than many of the Java-based alternatives
With this set up, we added the Features file that we built during sprint planning into the appropriate features directory and created step definitions with pending methods for each of the Given/When/Then cases.
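Those skeleton step definitions look roughly like the sketch below. The cuke4duke annotation packages are quoted from memory and the class and step wording are illustrative, so treat the details as approximate rather than a copy of our real code:

    // Annotation packages are from memory of the cuke4duke API; check them
    // against the version declared in your pom.
    import cuke4duke.annotation.Pending;
    import cuke4duke.annotation.I18n.EN.Given;
    import cuke4duke.annotation.I18n.EN.When;
    import cuke4duke.annotation.I18n.EN.Then;

    public class DecorationSteps {

        @Given("^an existing piece of content$")
        @Pending
        public void anExistingPieceOfContent() {
            // will be filled in as the first scenario is driven out
        }

        // ...plus a step for the 'And' line, omitted here for brevity.

        @When("^the content is delivered to the destination$")
        @Pending
        public void theContentIsDeliveredToTheDestination() {
        }

        @Then("^the delivered content includes the additional data$")
        @Pending
        public void theDeliveredContentIncludesTheAdditionalData() {
        }
    }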
The First Scenario
The first scenario was to set up some existing content and some additional content, decorate the existing content with the additional content and then verify that the content delivered at the destination had been correctly decorated. The steps for Cucumber were written accordingly and the assertions added to the 'Then' case. Now it was time to implement.
This was where our TDD thinking really kicked in. Previously we would have started building the whole solution. Instead we just implemented the elements of the destination required to allow us to verify the decorated content. Given that we were only testing with one piece of content, we simply hardcoded the return of a fixed set of additional data. Test passed.
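As an illustration of how little code that first green actually required, the 'implementation' at this point amounts to something like the following (class and method names are made up for this post, not our real production code):

    import java.util.Collections;
    import java.util.Map;

    // Hypothetical first-pass implementation: just enough behaviour to make
    // the first scenario pass. Names are illustrative only.
    public class AdditionalDataService {

        public Map<String, String> fetchAdditionalDataFor(String contentId) {
            // Deliberately hardcoded: the only scenario so far exercises one
            // piece of content, so a fixed set of additional data is enough.
            return Collections.singletonMap("additionalField", "fixed value");
        }
    }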
Now, we know that this is not the full solution and that we have plenty more work to do, but we have a green on our test report. We have 8 scenarios to complete and the story is not done until they all pass, so what's wrong with this approach? Our next scenario tests with a different set of content and additional data, which the hardcoding breaks, so we can then work on the next step of delivering different sets of additional data to the destination. In this case we will probably still hardcode the data that is delivered based on the content, but progress will be made. Later scenarios will then add the calls to the new remote system to get actual data. By the end of the scenarios we will have built the entire solution, driven solely by tests.
Areas of Uncertainty
Even though things are going well, we still have a number of areas of uncertainty that we need to resolve by the end of the sprint. These include:
- Whether we can successfully complete the story in this way
- Whether this approach impacts our productivity
- Whether the team members are able to adapt to working in this TDD style
- Whether the business will see the value in this approach
- Whether we can get the BAs and the POs to start thinking of the stories as BDD scenarios
We hope to resolve these as the sprint progresses. I do think that initially productivity will be reduced due to the time it takes to learn a new approach and become proficient in it. However, while productivity may dip for a short time, I believe that quality will go up due to the test-driven nature of the work and that we will be more certain of building the correct solution. It is my hope that the team members will see the value of this approach and will opt to continue with it for a few more sprints, giving everyone time to become proficient. We will then be able to evaluate its success or failure more objectively.
To help the business understand the value of this approach (assuming, of course, that it is successful) I plan to include a quick overview of BDD in the Sprint Review next week. I will give a short presentation on the approach we took, show the scenarios that we created during planning, talk through how we implemented them and show the Cucumber report that is produced. It will be interesting to see the reactions.
I'll post a follow-up at the end of the sprint to let you know how we got on.