
WTA08: Messing With The Smart Guys

Posted by Albert Gareev on Mar 16, 2011 | Categories: WTAmericas

The Mission

Weekend Testing Americas session No. 8 was initially planned as a mini-project: explore and map the product, deliver a test plan and test results. That’s already ambitious for 90 minutes of work! However, we took it over the edge – and succeeded.

Today’s session is dedicated to session-based exploratory test management. The product we test is located here: http://ribbit.cc/. Your goal is to explore the product and plan testing missions – charters. The test plan must be a single deliverable per group.
We suggest you spend the first 15 minutes organizing into teams, and then team leads can split the application into targets. Then you can break down the targets into charters within your groups. After that you’ll have 60 minutes to work on exploring, testing, and mapping charters. Any group that decides its mapping is done can continue testing for the rest of the allocated time. Groups don’t need to compete with each other; remember, at the end we will provide the result as a whole.
Report any bugs found in the common chat with the hashtag #bug.
Report any issues in the common chat with the hashtag #issue.
Report any charters you suggest in the *group* chat with the hashtag #charter.
Report any focus areas within a charter in the *group* chat with the hashtag #area.
Leads, you need to do some admin work as well: please create a mindmap (or at least an ordered list) of the charters and areas created by your group. It would be good if you collected the bug/issue reports too.
After the 60 minutes are over, leads need to present the results as a summary (fitting everything into 3-5 minutes).
Then we will discuss THE OVERALL result.

Now, let me digress here. “Weekend Testing Americas” does not mean it’s open to the western hemisphere only. Practically, it’s just the name of the chapter and a timezone. We have attendees from Europe, the UK, Israel, India… Most of these people have never seen each other in person and first “met” in one of the Weekend Testing sessions. In a paired testing exercise, testers need to get along and communicate effectively on the go. And this time the first challenge was about building a testing team.

The Challenges

While planning the session, Michael Larsen and I decided to allocate 15 minutes for the first part – team building. No, that wasn’t an intentional “trap”, but rather a constraint dictated by the overall timing. During the debriefing, however, many reported that organizing a team was the hardest part and that it impacted the activities that followed. Well, here’s the first lesson to say out loud.

Building a team is a critical part; take it seriously and support it by all means!

Let’s also remember that at Weekend Testing Americas the goal of every session is learning through practice. Yes, we also produce great, at times brilliant, test results, but those are a bonus. If something didn’t go as expected, that might be even better: understanding what you needed, or what you should have done to succeed, is much more important than a casual success from which you didn’t learn anything. We learn consciously; we are sharpening our testing skills.

..And we’re back to the session review.

The exploratory test plan is a single deliverable. Teams don’t need to compete; instead, they need to collaborate to reach better coverage.
Communication within the teams and between the leads was the second biggest challenge, but more of a “technical limitation” type: it is indeed hard to keep up with several chats at once. For the leads it was even harder, because they had the additional task of documenting the charters. Yet, I’m happy to say, all of them succeeded and presented rich documents at the end of the mission. Shmuel Gershon used his favorite Rapid Reporter, Justin Byers used XMind, and Ajay Balamurugadas used TypeWith.me.

And let me highlight a conclusion on the second challenge.

Communication plays a critical role in testing, and must be supported on all levels
– from administrative to technical.

Before I get to the third challenge encountered during the mission, I want to talk more about the product and the product owner.

Our special guest this time was Ben Simo, creator of the “Is There A Problem Here?” and “Questioning Software” websites, both dedicated to testing. His latest project, http://ribbit.cc/, is a quote server. Not just any quotes – quotes dedicated to software development and testing, along with online references to documents you can download and books you can buy. As a facilitator I couldn’t immerse myself in one of the testing teams, but I can share my thoughts on the product based on the testing I did while preparing for the session.

I felt comfortable using the product (well, except for the color scheme – “blood on your screen”, as it was tweeted). It is compact and rich, like an iPhone. I think testing experience as extensive as Ben Simo’s has given him great taste in functional and usability design. The application gives an instant response (AJAX) to on-screen user actions, and it’s truly Web 2.0, tightly integrated with social media networks (Facebook, Twitter). And it has its own little entertainment page, which makes missing content not so sad.
Ben also created API functionality that allows random quotes to be integrated as a widget on other sites.
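Ben didn’t share the API details as part of the session notes, so the snippet below is only an illustrative sketch of how such a widget integration typically looks. The endpoint path (/api/random) and the JSON fields (quote, author) are assumptions of mine, not the documented ribbit.cc API.

// Illustrative TypeScript sketch of a random-quote widget for a host page.
// The endpoint path and JSON shape are assumptions, not the actual ribbit.cc API.

interface QuoteResponse {
  quote: string;
  author: string;
}

// Fetch one random quote and render it into a placeholder element on the host page.
async function renderRandomQuote(containerId: string): Promise<void> {
  const container = document.getElementById(containerId);
  if (!container) {
    return; // host page has no widget placeholder
  }
  try {
    const response = await fetch("http://ribbit.cc/api/random"); // hypothetical endpoint
    const data: QuoteResponse = await response.json();
    container.textContent = `"${data.quote}" - ${data.author}`;
  } catch {
    container.textContent = "Quote unavailable.";
  }
}

// Usage on the host page: <div id="quote-widget"></div>
renderRandomQuote("quote-widget");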
Overall, this is a “must see” product for all developers and a remarkable contribution to the online testing community!

..And the final part in the session review.

As often happens, “simple outside” means “complex inside”. This inner complexity was the third major challenge reported, but only from the technical standpoint: in-depth testing required having certain tools ready. Nevertheless, the related charters were identified and reported as requiring further testing. And this is the last key point:

Tools don’t make the master!

References

WTA08 – Quoth the Developer, ‘Nevermore’!

Response document by Ben Simo


This work by Albert Gareev is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.