OK, I'm finally home and able to gather my thoughts and put this all together.
Day 2 started bright and early with me getting to the World Trade Center Portland so I could set up the room where I'd be moderating. Since I was moderating the Usability track, that put me on the 2nd floor in the long grouping of three conference rooms combined into one big room. Again, moderating meant setting up the room, making sure the electronics worked, that the lavalier microphone worked as expected, and that everyone received feedback forms.
The first session of the day on Tuesday was the 2nd keynote, delivered by Harry Robinson of Microsoft (noted by many of the attendees... PNSQC has been known for the lack of Microsoft participation in the past, and this year there were several presenters from Microsoft. The times they are a-changin', as they say :) ). The keynote address was "Using Simple Automation to Test Complex Software", and focused on the challenges and methods used to test the launch and subsequent improvements of the Bing search engine (with lots of jokes later in the day about "binging" for things instead of, well, you know :) ). The talk focused on moving away from older record/playback and heavyweight-architecture tools (it felt good to hear that even Microsoft found the commercial tools to be a bit, well, bulky and heavy) and encouraged the continued use and expanded reach of exploratory testing.
In the Usability track, Matt Primrose presented his talk about how to use Kano Categories to Grade User Experiences. The Kano Model classifies product attributes based on how customers perceive them and their effects on customer satisfaction. The idea is to look at multiple features and separate them into three categories: Must Have, Desired, and Differentiator. In addition, the Kano Model allows for grading each item with regard to implementation (0.0 for not implemented, 0.5 for partially implemented, 1.0 for fully implemented, and 1.25 for implemented beyond the minimum requirement). I found both the talk and the follow-on discussion interesting, and I think this may well be an approach I will use in future projects.
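To make the grading a little more concrete, here's a minimal sketch of how those numbers might be tallied. Only the three categories and the 0.0/0.5/1.0/1.25 grades come from the presentation; the feature names and the idea of averaging the grades per category are my own illustration, not Matt's actual tooling or formula.

```python
# Hypothetical sketch of Kano-style grading: categorize features, assign an
# implementation grade, and summarize each category. The feature list and the
# per-category averaging are my own illustration, not from the talk itself.

from collections import defaultdict

# Grades from the talk: 0.0 not implemented, 0.5 partially implemented,
# 1.0 fully implemented, 1.25 implemented beyond the minimum requirement.
features = [
    ("Save document",      "Must Have",      1.0),
    ("Undo/redo",          "Must Have",      0.5),
    ("Keyboard shortcuts", "Desired",        1.0),
    ("Dark mode",          "Desired",        0.0),
    ("Offline sync",       "Differentiator", 1.25),
]

grades_by_category = defaultdict(list)
for name, category, grade in features:
    grades_by_category[category].append(grade)

for category in ("Must Have", "Desired", "Differentiator"):
    grades = grades_by_category[category]
    average = sum(grades) / len(grades)
    print(f"{category}: average implementation grade {average:.2f}")
```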
Kathleen Naughton gave the second presentation in the Usability track, about her efforts to improve printer installation success for consumers. Her presentation focused on the test lab that HP built with the express purpose of developing methods to lower the number of support calls from customers regarding printer installations (specifically wireless setups). Kathleen went into great detail about many of the issues found in the field and reported to HP customer service, and the steps they took to beef up their testing approach and methods to make the process of installing printers easier for consumers.
The afternoon sessions I moderated were all focused on test techniques, and we had a broad range of topics to consider. Mark Fink came all the way from Germany to share his techniques for visualizing software quality in large projects; the main idea is to use color and shape to create visual representations of software quality (and as a bonus, this is all part of an open source project that Mark hopes to make available within the next six months!). Brian Walker demonstrated methods to perform static code analysis and find bugs before builds are even finished, and then to carry those static analysis tools forward into the nightly builds. Ashish Khandelwal and Gunankar Tyagi came from India to present their talk on the Adaptive Application Security Testing Model and its implementation at McAfee; they described the challenges of implementing this model and the benefits they achieved with the approach. The final talk of the day was presented by Jean Hartmann and discussed how Microsoft implemented Large-Scale Integration Testing.
The evening included more poster paper presentations and the Rose City SPIN meeting with its special guest, Jonathan Bach. Jonathan presented a talk titled "My Crazy Plan for Responding to Change". Some may have noticed Jon and James Bach discussing an idea they called "thread-based testing", and this talk fleshed out a lot of those ideas and allowed the attendees to ask questions. It was a spirited give and take, with a number of attendees enthusiastic about the idea and some saying they were not quite so sure. I had to leave at 7:30 PM due to dinner reservations with a friend, but the debate was still going when I left, so it was definitely an exciting discussion for all the participants.
Day 3 was Workshop day, and in addition to moderating, I was able to participate in Michael Dedolph's workshop on "Intuitive Test Management". This was a very interesting, participatory session in which we had the chance to look at various risk categories and apply them to a real project, determine the probability of each risk occurring, and then determine methods for mitigating the risks (and minimizing the additional risks that mitigation can introduce into the current risk profile).
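To give a flavor of the kind of exercise this was, here's a minimal sketch of risk scoring using the classic "exposure = probability x impact" ranking. The risk names, the numbers, and the exposure formula itself are my own illustration of the general idea, not Michael's actual worksheet or method.

```python
# Hypothetical risk-scoring sketch: rank risks by exposure so mitigation
# effort goes to the biggest ones first. All values here are made up for
# illustration; they are not from the workshop materials.

risks = [
    # (description, probability 0..1, impact 1..10)
    ("Key tester leaves mid-project", 0.3, 8),
    ("Third-party API changes late",  0.5, 6),
    ("Test lab hardware shortage",    0.2, 4),
]

# Sort by exposure (probability x impact), highest first.
ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)

for description, probability, impact in ranked:
    exposure = probability * impact
    print(f"{description}: exposure {exposure:.1f}")
```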
At the end of the workshop, I helped break down the room, turned in the evaluation forms, and said my goodbyes to a number of new friends and people I hope I'll get the chance to work with again. I had a great time at PNSQC, and I am already making plans to participate again next year in a volunteer capacity so that I can do more and get more involved. Who knows, I may even present a paper next year :). Time will tell, but I'm certainly motivated to do so right now. In any event, I look forward to seeing everyone again next year, in one capacity or another.