
Podcasts

Podcasts are a nice way to learn about new topics, pick up new ideas and hear the experiences of other testers and agile practitioners. I highly recommend the Testing Podcast site for informative interviews with a wide range of testing practitioners.

This interview with Michael Kircher of Software Engineering Radio was posted in June 2010 and can also be found on testingpodcast.com, along with some other podcasts I’ve done in the past. Michael and I cover several topics, ranging from the role of the tester on agile teams to test automation strategy, regression testing, and continuous integration. I really enjoyed it because we got to talk for over an hour (though the podcast itself is not that long).

4 comments on “Podcasts”

  1. Lisa, thanks for promoting testingpodcast.com. I hope people find it useful. Many new podcasts on software testing should be added to the site soon (and plenty are there already). If anybody is interested in helping out, just let me know.

  2. Listened to your SE Radio podcast today (it was what led me here to your blog!). I thought it was a great overview of agile as it relates to testers/QA. I’ll be pointing a few people towards it.

    At some point I would love to hear your insights on dealing with large external dependencies in automated regression tests. Say you’re testing a web front-end for a large retail system. It would talk to lots of enterprisey back-end systems, all outside your team’s influence. How do you automate that? Maybe there are some patterns out there, but I’ve failed to find them so far… [One such pattern is sketched after the comments.]

  3. Hi Pete, I’m glad you liked the podcast! Thanks for the feedback!

    I worked at eToys / KB Toys in 2002-2003; I’m not sure if that’s as large as you are thinking. It was indeed difficult to test things like verifying that the front-end inventory matched the actual warehouse inventory – there was integration between the two, but it never worked correctly. We implemented Oracle Enterprise to manage everything from the front end to the supply chain. We also had to test business intelligence stuff. Oracle Enterprise had a lot of problems at the time, which made everything more difficult. Also, we were not really an agile team. We did use some agile practices, such as customer tests to drive coding (but no unit-level TDD) and two-week iterations.

    So, how did we automate? We were fairly successful, given that there were almost no automated unit tests. We had Mercury WinRunner imposed upon us, so I hired a WinRunner expert to create and maintain some GUI regression tests. He did an awesome job and also trained the other testers on WinRunner. But there was still a lot of testing that was costly to automate in WinRunner.

    The website was developed in Tcl (believe it or not). I had heard Tcl was pretty good for scripting tests, so I bought a Tcl book and started writing test scripts in Tcl. When I had problems, the developers were willing to help me, since Tcl was what they were coding in (whereas they would never have helped with a WinRunner test!). I was fairly successful at automating some tests this way. [A minimal sketch of that kind of script appears after the comments.]

    We didn’t have any continuous integration. Nevertheless, we were able to keep up with testing. The dev team was fairly small, maybe 8 programmers, and we had 4 testers. The programmers, while they wouldn’t do TDD, were good coders and did a good job, and we were able to communicate closely with them. Of course, there was lots of chaos, long hours, and website crashes in the middle of the night. There were problems we could easily have solved, but the culture was such that people weren’t really interested in changing. Still, we were able to automate enough tests to keep up with development.
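One widely used pattern for the situation Pete describes above – enterprise back ends outside the team’s control – is to route every call to an external system through a thin seam in the front-end code and let the automated regression run replace that seam with a canned fake, so the tests stay deterministic. The Tcl sketch below uses hypothetical names throughout and is not taken from the thread.

    # Hypothetical seam: the storefront code calls back ends only through
    # thin procs like this one, never directly.
    proc inventory_stock_level {sku} {
        # In production this would call the real enterprise inventory system.
        error "real inventory back end not reachable from the test environment"
    }

    # In the automated regression run, the test harness redefines the seam
    # with canned data before any tests execute, so no enterprise system is
    # needed and the results are repeatable.
    proc inventory_stock_level {sku} {
        array set canned {SKU-123 7 SKU-999 0}
        if {[info exists canned($sku)]} {
            return $canned($sku)
        }
        return 0
    }

    # The code under test can now be exercised deterministically.
    puts "SKU-123 in stock: [inventory_stock_level SKU-123]"   ;# 7
    puts "SKU-999 in stock: [inventory_stock_level SKU-999]"   ;# 0

How practical this is depends on how much back-end behaviour the front end really relies on; the fake only has to be as rich as the tests need.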
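The Tcl scripts Lisa mentions aren’t shown in the thread, so this is only a minimal sketch of what a script in that spirit might look like, assuming Tcl’s standard http package; the URLs and expected page text are placeholders, not anything from the real site.

    # Smoke-check a few pages of the storefront. Assumes Tcl's standard
    # "http" package; URLs and expected strings are placeholders.
    package require http

    proc check_page {url expected} {
        set tok    [http::geturl $url -timeout 10000]
        set status [http::status $tok]
        set body   [http::data $tok]
        http::cleanup $tok

        if {$status ne "ok"} {
            puts "FAIL $url: request did not complete ($status)"
            return 0
        }
        if {[string first $expected $body] < 0} {
            puts "FAIL $url: expected text \"$expected\" not found"
            return 0
        }
        puts "PASS $url"
        return 1
    }

    # Tiny suite: page URL followed by text that should appear on it.
    set checks {
        "http://localhost:8080/catalog"       "Add to Cart"
        "http://localhost:8080/search?q=lego" "results"
    }

    set failures 0
    foreach {url expected} $checks {
        if {![check_page $url $expected]} { incr failures }
    }
    exit $failures

Scripts like this are cheap to string together into a nightly suite, and – as the comment above notes – writing them in the same language as the product meant the programmers could help when something broke.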

