Here’s my third post sharing the many insights I got at Agile Testing Days USA 2023! There was such a wide range of topics. I hope some of the takeaways I share in this series will inspire readers as well! If you haven’t already, please check out Part 1 and Part 2.
The featured image here ^^^ features us three Agile Testing Fellowship co-founders: Janet Gregory, José Diaz, and myself! It’s always great to get together and dream up new ways to help people learn about holistic testing.
The Age-Old Problem: Using Experiences, Age & Biases
Pete Walen and Janet Gregory addressed the “wisdom” we often hear that young people learn new technology faster than older people. They agreed this is true – because we older people have a “big data” problem! We already have so much knowledge in our brains that it can take longer to retrieve the fact we want.
Janet and Pete addressed many myths we hear so often, and they did research to sort out the facts. Do younger people do the job better? Older people have both crystallized and fluid abilities, plus the communication and collaboration experience to help build harmony on teams. Should older people step aside and make room for the new generation? There is no evidence that young people are blocked from jobs, or hired to replace retirees.
Are older workers grumpy and unable to get along with younger generations? Again, there’s no evidence to support that (though personally, I often feel grumpy!). There IS evidence that diversity helps creativity, innovation and performance.
Companies that let go of the employees who know the organization, understand the business, and have long experience with the software often see disastrous results. People over 50 years of age are 50% more likely to be made redundant. Companies who lay off long-time practitioners lose the age diversity that provides diverse perspectives, skills, knowledge and social connections.
Adding my thoughts here: Age bias is definitely a thing. Like other biases, it’s often unconscious. Having a diverse group making hiring and firing decisions is one way to offset bias and choose more wisely. Also, don’t let yourself make assumptions about a person just by how they look and what they do. I know I tend to assume everyone in the software profession has a similar background to me. In fact, each person has different obstacles to overcome. Which brings me to the next talk I want to share.
(This testing consultant of a certain age, yours truly, has a ton of experience to share! If your team is struggling to fit testing into the fast pace of agile development and continuous delivery, please get in touch with me.)
The Secret to My Success
Melissa Eaden’s keynote impacted me the most of any talk at the conference – and there were lots of excellent, engaging talks. I’ve known Melissa for quite a few years. We’ve paired up to give workshops and tutorials. As I discovered in this talk, there was so much I didn’t know about her. And a disclaimer, I may have mis-remembered or misinterpreted some of what she said.
Melissa had expectations shared by many of us when starting out in life. Check off the success list: Get a degree, get a job, buy a car, have a family. Success == happiness. But, Melissa faced almost insurmountable obstacles of poverty and trauma in her childhood. Only 16% of children survive poverty to become economically successful – and most of those have two parents. Melissa’s was a single parent family.
Melissa studied and worked hard. She built a low-paying tech job into a highly successful career. She achieved success far beyond anything her family could have imagined. And realized that success != happiness. She struggled with mental health issues.
Fortunately, she had an understanding manager who encouraged her to get help, which she did. She went into debug mode and asked herself questions. This helped her start having more good days than bad. It was time to make her own checklist for happiness. It includes boundaries, dependencies, diversity and inclusion biases, variable analysis and psychological safety.
Melissa encouraged us to take a chance on someone who doesn’t have all the so-called “success” boxes checked. One of my takeaways was to remember not to make assumptions about people or put them in a box where I think they fit. I need to have empathy.
Story Impact Checklist
Allison Lazarz and I teamed up to share her Story Impact Checklist (SIC). This is an easy-to-create tool that can be used in planning sessions to help make sure all testing activities for each story get done. It prevents those “facepalm” moments when a huge bug becomes painfully obvious just as you release a new change to customers. Teams can develop an SIC that helps them avoid last-minute scrambles to test some quality attribute that was forgotten. It helps teams keep a reliable cadence.
The idea is simple, and so beneficial. A team can use their checklist – created by them to fit their needs – to add testing-specific tasks to stories. Referring to the checklist during iteration planning and other planning sessions helps us remember to ask questions like: What types of testing do we need? Is good test data readily available? Do we need monitoring and alerting?
Ours was a combo session: a 30-minute talk, followed after a break by a 75-minute workshop. Our workshop participants read a case study and came up with a story impact checklist to help our imaginary (and yet, realistic) “ScrummyBears” team address the obstacles getting in the way of frequent, consistent releases. Then they brainstormed ways the SIC could help their own teams, and how they might get their teams to try it. Zoom in on the image here to see some of their great ideas! If you’d like a copy of the slides, please get in touch.
I have much more to share, and lots to do, so I’m not sure how soon I’ll get to more posts in this series. Please stay tuned!