Good morning, everyone. Happy Thursday to you :). STO-CON is heading into its final day, and I will say it again, it's been wonderful to have a software testing conference in my own backyard. It's also been fun to give tips and recommendations for places to go and things to see for people visiting from out of town. To that end, for those who are here today, if you are interested in going out to dinner locally this evening, I'm happy to make recommendations and tag along ;).
First thing this morning, we are starting off with a panel featuring Dave Haeffner, Mike Lyles, Smita Mishra and Damian Synadinos. The format is free form, driven by Q&A submitted by the audience, but the overall theme is (R)Evolution in software testing. Testing is evolving, and there is a sea change happening at the same time. We don't have a clear consensus as to what testing will be in the future, or who will be doing it. The whole "Testing is Dead" meme has proven not to be true. There will absolutely be testing, and there will be a lot of testing. The bigger question is "who will be doing it, and in what capacity?" This really doesn't surprise me, because I have been in software now for twenty-five years, and I can plainly say I do not test the same way today that I did twenty-five years ago. I use many of the same techniques, and principles relevant then are still relevant now, but the implementation and approach are different. Additionally, as I've learned and grown, I've extended my reach into other areas. Going forward, I would encourage any software tester not to just think "how can I improve my testing skills?" (please understand, that is very important, and I encourage everyone to improve those skills) but also to consider "how can I help add value to the organization beyond my testing ability?"
I've much appreciated the fact that the panelists are talking about adding value to their teams and also building capital within themselves. Damian emphasized the way that we speak, and the importance of understanding the agreements we make and the depth of those agreements. Smita emphasized developing your social capital, in the sense that you may be doing amazing work at your own company, but long term that will not help you if no one else knows who you are or what you are doing. Invest in yourself, develop your craft, and make sure that people can discover that. If you think I'm going to take this opportunity to say "start a blog, maintain it, learn in public, and share your ups and downs"... well, yeah :).
It's no surprise that Dave is being asked about automation and tooling, because, well, that's what he does and what he's known for. One of the things I appreciate about what Dave does is that he maintains an active newsletter sharing tips about what he has done and implemented, and how he's gotten around challenges and issues. I appreciate these bits (yes, I'm a subscriber) in that he exemplifies a lot of the sea change we are seeing. Dave doesn't just state that he knows what he is doing, he demonstrates it every week, at least as far as I am concerned. What's cool is that I am able to get access to what he has learned, consider it, see if it's something I can use, and then experiment with his findings myself. Dave, of course, got to field the "testing vs. checking" distinction. He used to think it was an unnecessary distinction, but as he's been exploring other industries, he's come to see that there is a more nuanced way to look at it. Automation speeds things up, it's a helper, but it's not going to think for itself. Automation and the checking element are helpful after the testing and discovery element has been completed and solidified. Damian makes the case that tools help us do things we need, but that tools will not always be appropriate for all people (and he gave a shout-out to my session... thanks, Damian :) ). For me, every programmer explores and tests to come up with their solution, and then runs checks to make sure their solution is still valid going forward (or at least, they should ;) ).
Damian also got to field a number of questions around metrics and their relevance, and in most cases, Damian would be happy to see most of the metrics disappear, because they are either useless at the benign end or dangerous in their manipulation at the more malignant end. The worst implementations of metrics are the ones applied not to measure processes, but to measure people, especially people against other people. Sadly, there are areas where people have to produce to an expected level. Think of salespeople needing to meet sales goals. Fair or not, that's a metric that is frequently applied and has significant ramifications for those people. As testers, we'd love to believe we are isolated from most metrics, but in fact, we are not. Were it up to me, I would encourage testers to develop realistic and interesting goals for themselves to learn and to stretch, and to evaluate them regularly. Fact is, none of us can learn everything and be experts at everything. I'd rather have a tester who focuses on the goals they are passionate about and does their best to develop some peripheral skills for broader application, but I don't want to put someone into a position where they are working on a goal that they hate. In that case, neither of us wins. They get frustrated and don't progress, and we don't get the benefit of their working on the things that they are excited about, and therefore willing to pour their energies into.
The irony of metrics is that everyone wants to see how close they are to complete automation, or whether they are automating enough. Dave thinks they are asking the wrong question if that is their focus. He'd encourage organizations to look at their defect list, their customer logs and access reports, and see where people are actually focusing their efforts and sharing their pain. If your automation efforts are not covering those areas, perhaps target your focus there. In other words, don't ask "are we automating all the things?" but "are we actually targeting our automation at the things that matter?" My guess is, the areas that are falling short are the areas that are harder to automate.
Outsourcing is a reality, and it's universal. Every one of us outsources something we do, so we should not be surprised that testing gets outsourced as well. Crowdsourcing is becoming more common and is an additional tool for organizations to use. As a coordinator and facilitator for Weekend Testing events, I know well the value of crowdsourcing: having many eyeballs on a product in a short amount of time provides a great deal of information, but without focus and targeting, the feedback you get back is likely to be just as unfocused and untargeted.
Many of the skills we learn do not come to us fully formed. Pardon the mental image, but a human mother does not give birth to a fully grown adult, ready and capable from the get go. Likewise, we don't become masters of any endeavor overnight, or with little effort. More often, we try, we experiment, we get frustrated, we try again, we do a little better, we move forward, sideways, and backwards, but each time, we learn and expand our experiments, until we get to a point where we have a natural facility.
Round Two extended out to the audience with ad-hoc questions, and the first question was about communicating through outsourcing. How do you have effective communication with off-shore participants? Smita points out there are two issues: the first is the conversation level, and the second is the time difference and isolation. It will never be the same as having the team all together in the same place. In these cases, communication needs to be clear, unambiguous and agreed upon. This reminds me of the reports I wrote a decade-plus ago that had to be reviewed by my leads, translated into Japanese, and then communicated back in the opposite direction. We had to spend a significant amount of time making sure that we were all communicating on the same wavelength. Often, that meant I had to rewrite and clarify my positions. Damian pointed out that location is not the only way that communication can go wrong, though over distance the lossiness is more pronounced. Reducing miscommunication comes from asking for clarity. In short, don't be afraid to ask. In my own work world, I joke that I am allowed "one completely stupid question per day" of everyone on my team. Yes, I actually word it like that. It's a bit of my self-deprecating humor, but it sends a message that I do not pretend to automatically know everything, and that I may not fully understand what is being asked or needed. The benefit is that it shows everyone else in my organization that they can approach me in the same manner. It's not foolproof, but it certainly helps.