I'm going to have to get aggressive if I'm going to make the end-of-month deadline. Therefore, I'm going to try to be more specific with these posts and discuss them in a way that lets me be more targeted and post more entries. It's becoming a matter of pride now! Anyway, on with more "Thirty Days of Testability".
Explore the output of your application's analytics. How can they help guide your testing?
I'll take a two-pronged approach with this. I'll talk a bit about my company's product and a bit about what I use on my own TESTHEAD blog.
With my company, we have two applications we use to monitor analytics. One is geared toward overall HTTP traffic and demographics (which pages get hit the most, which browsers are used the most, what times of day people most use the application, and how many parallel connections we are maintaining at any given time). That helps me decide what I should prioritize in my testing, as well as what level of load and other parameters I might want to consider.
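As a rough illustration of that idea, here's a minimal sketch of turning traffic data into test priorities. The records below are hypothetical, stand-ins for whatever your analytics tool exports; the point is that simple tallies of pages, browsers, and peak hours directly suggest where to focus coverage and what load to simulate.

```python
from collections import Counter

# Hypothetical parsed access-log records; in practice these would come
# from an analytics export or a log parser, not a hard-coded list.
records = [
    {"page": "/dashboard", "browser": "Chrome", "hour": 9},
    {"page": "/dashboard", "browser": "Firefox", "hour": 10},
    {"page": "/reports", "browser": "Chrome", "hour": 14},
    {"page": "/dashboard", "browser": "Chrome", "hour": 11},
    {"page": "/settings", "browser": "Safari", "hour": 9},
]

# Tally the dimensions the post mentions: page hits, browsers, time of day.
page_hits = Counter(r["page"] for r in records)
browsers = Counter(r["browser"] for r in records)
peak_hours = Counter(r["hour"] for r in records)

# The most-hit pages are the first candidates for deeper test coverage,
# the dominant browser drives cross-browser priorities, and the peak
# hour hints at a realistic concurrency level for load tests.
print(page_hits.most_common(3))
print(browsers.most_common(1))
print(peak_hours.most_common(1))
```

Nothing here is specific to any one analytics product; the same few counters work whether the source is raw web-server logs or a vendor dashboard's CSV export.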
The second tool we use is based on feature analytics, as in "what features in our product are our customers actually using?" We've had times in the past where a request to get something implemented was essential to closing a deal. From there, it often meant we would maintain a feature that mattered to a small set of users but represented a big part of our business. Sometimes, though, we would discover that an organization had demanded something, only for us to find later that their adoption rate for that feature was low to non-existent. That often prompted follow-up discussions to decide whether the feature was worth keeping. Perhaps we weren't making a compelling enough case for why it would be important to the organization that wanted it. Alternately, we often decided that the feature had such low adoption that turning it off would have little overall effect on the use of the product, and we would then gracefully deprecate that feature.
In short, getting familiar with your product's analytics can tell you a lot about what is being used, but also about what isn't.