Agile Testing with Lisa Crispin
Providing Practical Agile Testing Guidance
"We're on a mission..."

“We’re on a mission…”

Most of us have more we’d like to do than time to do it, even if we’re on an experienced agile team working at a sustainable pace. A couple of us who work on our iOS team wanted to try automating some UI regression tests. The programmers write their code using TDD with good test coverage, but UI regression testing was all done manually and tended to be a bit hit-or-miss. However, we were stretched thin on the testing side, and that idea didn’t get to the top of the priority list.

Luckily, a highly experienced tester, JoEllen Carter (@testingmojo), joined our team. Soon after that, when we were regression testing an iOS release, JoEllen found a showstopper regression failure. Automating regression tests through the UI, so that failures like that could be caught sooner, became a much more compelling idea.

Getting started

The developer who is also the iOS product owner championed the automation effort, and knew some testers in our company’s Toronto office who were successfully automating mobile UI tests. He set up a meeting so we could pick their brains. They recommended using Apple’s built-in test automation tools in Instruments, along with the tuneup.js library, and kindly sent us a book, examples of their own tests, and other helpful documentation.

The PO helped us install and build the iOS code, and spike some rudimentary automation with Instruments. We agreed on some simple scripts to try, and he put stories for those in the backlog. Since the main focus of our team is our web-based SaaS product, and we testers wear other hats including helping with customer support, it was hard to find time to devote to the iOS test automation. JoEllen suggested blocking out an hour every day to pair on it. This proved key to getting traction.

Pairing FTW

You know how it’s good to be the dumbest person in the room? This worked for me as I watched JoEllen fearlessly use the record function in Instruments as well as the logging feature to see how the iOS app pages were laid out and how to refer to the different elements. Every time we paired, we made good progress, starting with the simplest of scripts and incrementally adding to them. Demands on our time for more business-critical work sometimes meant we let the iOS automation effort slide, but we’ve experimented with trying a different time of day when distractions are fewer.
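For readers curious what those early scripts looked like, here is a minimal sketch in the UI Automation JavaScript style we were using with tuneup.js. It only runs inside Instruments (not a plain JavaScript engine), and the screen and element names here are invented for illustration, not taken from our app:

```javascript
// Runs only inside Instruments' UI Automation instrument.
// Element names below are hypothetical.
#import "tuneup.js"

test("Login screen is reachable", function(target, app) {
  // Dump the element tree to the log -- the logging feature JoEllen
  // used to see how each page was laid out and how to refer to elements.
  app.mainWindow().logElementTree();

  // Refer to an element by name, assert it exists, then interact with it.
  var loginButton = app.mainWindow().buttons()["Log In"];
  assertNotNull(loginButton, "Expected a Log In button on the first screen");
  loginButton.tap();
});
```

Starting with a tiny script like this, then incrementally adding steps and assertions, is exactly the rhythm our pairing sessions followed.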

Obstacles to overcome

We had a few scripts written when the element names changed unexpectedly due to updates to support iOS 8. We were discouraged because we’d had no warning about it from the programmers. They work in a different office two time zones away, and though we have a daily standup and are on chat all day, we testers are usually focused on other products. We discussed this with the PO, and agreed it was time to get our test scripts into CI, so that if we weren’t warned about big changes, they’d at least be immediately obvious to everyone. We wrote more stories for the steps needed.

Some obstacles to making the UI automation scripts part of CI seemed tough. For example, we needed to figure out how to run the tests from the command line rather than from within Instruments. After one pairing session of Internet searching, reading docs, and trial and error, we nailed it! By “we” I mean JoEllen, with me contributing information from searches and a bit of shell-command expertise. We also got help at one point with a JavaScript question from a pair of programmers who work on our web-based app.
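For anyone facing the same hurdle, the command-line invocation looks roughly like this sketch. The device name, app bundle, script path, and results path are all placeholders, and the exact Automation template name or path varies by Xcode version, so treat this as a starting point rather than a recipe:

```shell
# Run a UI Automation script from the command line instead of the
# Instruments GUI. All names and paths here are placeholders.
instruments -w "iPhone 5s (8.1 Simulator)" \
  -t Automation \
  MyApp.app \
  -e UIASCRIPT tests/smoke-test.js \
  -e UIARESULTSPATH results/
```

Once a test runs this way, wiring it into a CI job is mostly a matter of checking the exit status and archiving the results directory.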

Overcoming my own weenieness

I have personally experienced a lot of fear and trepidation about the UI test automation effort. I don’t know JavaScript, I’m a newbie at iOS (even as a user) and mobile app testing in general, and I always find test automation hard even though I also really enjoy doing it. But I finally forced myself to ‘drive’ and felt much better actually trying things. We have a plan for refactoring to make the scripts easier to run and maintain, and of course more tasks to do to get them into CI.

Lessons learned

I can’t advise you on the best techniques for iOS UI test automation, but I can share some general takeaways from this experience so far:

  • Has anyone else in the company already succeeded with a similar effort? Ask them for help – they’re probably happy to share their experiences.
  • Pairing brings much better chances for success than working alone, for so many reasons.
  • Put time on the calendar every day to work on an effort like this, even if it’s only one hour (that is also good advice if you want to write a book!). Experiment with different times of day. Right after lunch has been a good time for us, before getting pulled back into the routine, and before afternoon meetings.
  • Bring up obstacles during stand-ups, and set up meetings to discuss them with people who can help.
  • Write stories for automation tasks to make them visible.

I still have some misgivings about our automation effort. I’d like for the programmers to be more involved. After all, test automation is coding, and they are the people who code all day every day, rather than one hour or so a day at best. However, we have lots of support from the PO who is also a programmer, and we’re moving towards making this automation a part of the existing test automation already in the product’s CI.

If your team has a tough testing problem to solve, try pairing to spike a solution. Reflect and adapt as you go! Oh, and hiring excellent experienced testers like JoEllen is also a good idea, but don’t be trying to poach her away from our team.



Agile Testing Days is a wonderful experience: a chance to join a community of professionals who are passionate about delivering software with the best possible quality and value for customers.

Janet Gregory and I are co-facilitating a tutorial/workshop where we ask participants to bring their toughest agile testing challenges so that we can work on them together. The theme of Agile Testing Days is the future of testing. We can’t predict the future, but we could all be doing a better job right now! You’ll leave our day-long workshop with new tools in your toolbox and experiments to try with your own team to forge new frontiers in agile testing and software quality.

We’ll have not only unicorns, but donkeys and dragons – watch the videos to learn more about our workshop! You can also see what went on in my workshop last year, though we will have some different group exercises and a slightly different format this year – plus, Janet and I will be pairing, which is always better!


Moar Agile Testing

More (Agile Testing)

(More Agile) Testing

Janet Gregory and I, along with more than 40 contributors and many helpful reviewers (it takes a village), have finished our new book. We delve into many areas that are new or more important since our first book Agile Testing was published. Please visit our book website to see the giant mind map of the entire book, plus bios of all our contributors and one of the chapters, Using Models to Help Plan!

Drawings explaining the business rules for a feature

In my previous post I wrote about a great experience collaborating with teammates on a difficult new feature. The conversations around this continued this week when we were all in the office. Many of our remaining questions were answered when the programmers walked us through a whiteboard drawing of how the new algorithm should work. As we talked, we updated and improved the drawings. We took pictures to attach to the pertinent story.

Not only is a picture worth a thousand words, but talking with a marker in hand and a whiteboard nearby (or their virtual equivalents) greatly enhances communication.

I can’t say it enough: it takes the whole team to build quality into a software product, and testing is an integral part of software development along with coding, design and many other activities. I’d like to illustrate this yet again with my experience a couple days ago.

Some of my teammates at our summer picnic - we have fun working AND playing!

Recently our team has been brainstorming and experimenting with ways to make our production site more reliable and robust. Part of this effort involved controlling the rate at which the front end client “pings” the server when the server is under a heavy load or there is a problem with our hosting service. Friday morning, after our standup, the front end team decided on a new algorithm for “backing off” the ping interval when necessary and then restoring it to normal when appropriate.

Naturally I had questions about how we would test this, not only to ensure the correct new behavior, but to make sure other scenarios weren’t adversely affected. For example, the client should behave differently when it detects that the user’s connection has gone offline. There are many risks, and the impact of incorrect behavior can be severe.

I was working from home that day. Later in the morning, the programmer pair working on the story asked if we could meet via Skype. They also had asked another tester in the office to join us. The programmers explained three conditions in which the new pinger behavior should kick in, and how we could simulate each one.

The other tester asked if there were a way we could control the pinger interval for testing. For example, sometimes the ping interval should go up to five minutes. That’s a long time to wait when testing. We discussed some ideas on how to do this. The programmers started typing examples of commands we might be able to run in the browser developer tools console to control the pinger interval, as well as to get other useful information while testing. We came up with a good strategy.
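To make the idea concrete, here is a minimal sketch of a back-off policy like the one described: increase the interval while pings fail, cap it at five minutes, restore the normal cadence on success, and expose a testing hook so a tester can shrink the interval from the console instead of waiting five real minutes. None of these names or numbers come from the actual product code; they are invented for illustration.

```javascript
// Hypothetical sketch of a ping back-off policy. All names, intervals,
// and the doubling strategy are assumptions for illustration only.
const BASE_INTERVAL_MS = 5 * 1000;      // normal ping cadence: 5 seconds
const MAX_INTERVAL_MS = 5 * 60 * 1000;  // back off to at most 5 minutes

function createPinger() {
  let interval = BASE_INTERVAL_MS;
  return {
    currentInterval() { return interval; },
    onPingFailed() {
      // Exponential back-off, capped at the maximum.
      interval = Math.min(interval * 2, MAX_INTERVAL_MS);
    },
    onPingSucceeded() {
      // Restore the normal cadence once the server responds again.
      interval = BASE_INTERVAL_MS;
    },
    // Testing hook of the kind we asked the programmers for: lets a
    // tester set the interval from the browser developer tools console.
    overrideInterval(ms) { interval = ms; },
  };
}
```

The testing hook is the part that mattered most to us as testers: without it, verifying the five-minute cap means five minutes of waiting per scenario.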

Note that this conversation took place before they started writing the actual code. While they started writing the code using TDD, the other tester and I started creating given-when-then style test scenarios on a wiki page. We started with the happy path, then thought of more scenarios and questions to discuss with the programmers. By anticipating edge cases and discussing them, we’ll end up with more robust code much more quickly than if we waited until the code was complete to start thinking about testing it end to end.
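The given-when-then scenarios we drafted on the wiki started out something like this sketch. The wording and conditions here are invented for illustration; the real scenarios (and the real trigger conditions) belong to the product team:

```gherkin
# Invented examples in given-when-then style, not the team's real scenarios.
Scenario: Happy path - server healthy
  Given the client is pinging at the normal interval
  When a ping succeeds
  Then the next ping is scheduled at the normal interval

Scenario: Server under heavy load
  Given the client is pinging at the normal interval
  When a ping fails because the server is under heavy load
  Then the ping interval is increased
  And the interval never exceeds five minutes

Scenario: User's connection goes offline
  Given the client detects that the user's connection is offline
  When the back-off conditions would otherwise apply
  Then the client follows its offline behavior instead of backing off
```

Starting from the happy path and then adding edge cases like the offline one is what surfaced most of our follow-up questions for the programmers.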

There are so many stories in this vein in Janet Gregory‘s and my new book, More Agile Testing: Learning Journeys for the Whole Team. We’ll have more information about the book on our book website soon, including introductions to our more than 40 contributors. The book will be available October 1. It takes a village to write a useful book on agile testing, and it takes a whole team to build quality into a software product!




I was so excited to see A Coach’s Guide to Agile Testing from Samantha Laing and Karen Greaves of Growing Agile. I don’t know of any other book to guide coaches in helping agile teams learn good ways to succeed with testing and build quality into their software product. Here are my immediate thoughts after reading it.

Growing Agile: Coach’s Guide to Agile Testing

The book is useful even for coaches/trainers who don’t themselves have a lot of knowledge of or experience in testing. The authors use the “4 Cs” from Sharon Bowman’s Training from the Back of the Room, and show how to apply them. Newbie coaches and presenters usually make the mistake of wanting to deliver a lot of information via lecture and slides. Bowman’s techniques have helped me make my own training classes and workshops much better learning experiences for participants, so I’m glad to see them incorporated here.

It’s so helpful that the guide includes a training kit, and advice on what supplies are best, even the best markers to use. I liked the idea of sending participants photos of the workshop, though I might ask up front if everyone is comfortable with having photos taken.

Sam and Karen give options on how to schedule and organize a class or a shorter workshop. I’m betting that readers will find this incredibly helpful.

The visuals in the book are terrific. We know how well pictures help convey concepts!

One thing that bothered me about Training from the Back of the Room was that I’m not comfortable asking participants to do some of the exercises. I’m an introvert, and I find it hard to ask people to do something I wouldn’t feel good doing. The exercises here are well within my own comfort level, which is great.

Most importantly, the content is spot on. It is SO wonderful to read a book whose authors truly grok the testing mindset. Karen and Sam understand what teams need to know about testing, and how coaches can help them learn those things.


I’m currently at Booster in beautiful Bergen. It’s a terrific conference, about 240 people, which is a perfect number. Everyone is so enthusiastic, the food is great, and the sessions are awesome. I’ll blog more about that.

Thanks Martin Burns for tweeting this photo!

I facilitated a workshop today on “changing your testing mindset”. I’ve uploaded the slide deck, and will be blogging more about that too. Many thanks to the workshop participants – we all learned a lot from each other, and some great ideas were generated!

I’m doing a day-long tutorial on “Changing your testing mindset”, and an “interactive session” at Belgium Testing Days in Bruges, Belgium, March 17 – 20. You can learn more about my tutorial (and see my adorable donkeys) in my introductory video.

What could be more wonderful than being in beautiful Bruges? Why, meeting up with a bunch of amazing testing practitioners for four days to share experiences and break new ground in testing.

Check out the program. You can use my discount code, FeahY8, for a 10% discount on registration! I hope to see you there.

I presented a session called “Developers who grok testing: Why I love them and how they mitigate risk” at CodeMash. We did a mini workshop, dividing into groups to brainstorm what devs need to learn about testing in order to grow towards grokking it. How can testers and developers communicate and collaborate better?

Take a look at the pictures to see the ideas. Participants also shared wonderful stories about how they got testers and developers collaborating for testing. For example, one team has testers and developers pair on a story. They do test automation and coding together. The story doesn’t have to move to a separate “testing” column or state, the pair does all testing activities together and then the story’s done!

Do you have any success stories for getting developers more involved in testing? Please share in the comments directly or via links to your own blog/articles. (Sorry the pictures are displaying in a dorky way but I don’t have time to fix!)



Thanks to the participants in today’s workshop! We tackled quite a few tough testing-related issues that many agile teams encounter.

Big visible outputs emerged!

The slides, along with pictures of all the outputs such as topics, impact maps, and brainwriting pages, are available here.

After each table group identified their most burning testing issues, they created SMART goals for each one and then brainstormed experiments that would address the issues. We tried impact mapping, brainwriting, mind mapping, and “if you had super powers”. In the afternoon, we had an enthusiastic round-table discussion and finished up with a Lean Coffee format.

One of the most interesting ideas I heard was using food to build trust. One participant said he brings food to meetings (at his own cost). Sharing snacks creates a social atmosphere, promotes conversation, and helps team members build more trust.

We had good conversations about encouraging whole-team responsibility for testing activities, incorporating testers onto cross-functional teams, letting teams jell, and finding ways to make story requirements clearer. Take a look at the ideas in the pictures. I hope some participants will try some of the experiments created, and blog about what they learned. I learned a lot, but I couldn’t take notes!