Tuesday 13 May 2014

Reflecting on my Software Testing World Cup Experience

Leading up to the competition:
When I first heard about the Software Testing World Cup, I was extremely excited. What a concept! A challenge for people to showcase their skills as testers? It's brilliant! Developers have development competitions, so I'm glad the testing community can embrace similar concepts.

We quickly assembled our team (@_eddegraaf, @drewbrend and @josh_assad. Unfortunately Josh ended up having a prior commitment and couldn't take part with us). We had a short initial meeting before the competition start date to talk about how we would develop our strategy. We built a mind map of the things we needed to take into account, as well as a plan for capturing all the information we'd need during the competition (besides bugs, which we knew we would be tracking in the provided HP software).

Unfortunately, once the date was announced I came to the realization that I would be away for the two weeks prior, on vacation and then for a speaking slot at STPCon New Orleans. The week after my return was shaping up to be a busy one, with the STWC falling on the Friday at the tail end of it. If I'm honest, I considered backing out because I was disappointed with the amount of pre-planning we'd be able to accomplish. I felt unprepared, and that upset me because I had built up grand ideals in my mind about how we would execute our test strategies during the STWC.
Then it struck me: in the field of software testing, we're often hit with unexpected events. How often are we thrown into a project at the last minute and asked to sign off on a project or feature? What about when we switch Agile teams and have to quickly ramp up on the new team's processes? Part of being a good tester means tackling challenges as they come to us.
With this in mind, I decided this was a challenge I needed to see through because it was a real-life representation of true software testing.


The day of the competition brought another unexpected roadblock - none of us were able to co-locate for the competition. Enter: my first experience with Google Hangouts. Wow! If you have to work remotely and need a convenient solution for video chat and desktop sharing with more than one person, USE GOOGLE HANGOUTS! It worked unbelievably well. I was also fortunate enough to have two machines at home (though both were OSX machines). This was super useful: I used one for watching the YouTube live stream to listen for important info and ask questions as needed, and for entering data into our shared Google docs and HP Agile Manager. I used the other for the software under test.

Our strategy:

We used shared Google docs to track the issues we found before entering them into HP Agile Manager. This let us very quickly see the items the other team members had found, as well as which platforms they had been tried on. We also had a Google doc to track the "areas" of the tool that needed to be tested (broken down as we observed them - e.g. installer, individual features, mobile integration, etc.). This showed us which areas each member was working on so we didn't hammer at the same areas, and it helped us structure our Exploratory Test sessions into reasonable sizes.

With about an hour left in the competition, one of our members began porting the list of issues over to a document for the beginnings of our Test Report. We also took that opportunity to organize the issues by priority (based on end-user impact) and chart the results. In the final half hour, we collectively decided what info to include in the executive summary and made a ship/no-ship decision on the product (by platform, since we had different versions on PC and OSX).

Things to improve on next time:
  • We inaccurately assumed the software under test would be a web application. We prepared by assembling lots of tools to use for performance/load testing, debugging proxies, accessibility, etc. In future, we should assemble a similar list, but be prepared for both online and offline applications.
  • Try to co-locate for the duration of the testing. Having access to whiteboards and a central network would have been far preferable to the online solution.
  • Be more prepared for multiple platforms. We got lucky having both PCs and Macs, but we ended up with only Android devices for testing the mobile integration. We should have had a better way of tracking the testing performed and the issues found on each platform.
  • Build a test report template ahead of time. We knew what types of info we wanted to include in the report, but we didn't have a document framework to plug the data into, so valuable time was lost on basic formatting.

Final thoughts:

As I stated above, I'm so happy that this competition took place, regardless of our final standings (EDIT: We ended up placing within the Top 10, and won special recognition for "Most Useful Test Report"). It was a really good learning experience which I feel will only further my skills as a tester, especially in the area of test strategy and design. I strongly encourage everyone to participate if the opportunity ever presents itself again.

Lastly, a huge thank you to the organizers, sponsors, judges and other participants. I know lots of people put a great deal of time into this event, and I hope that in future I can pay it back by volunteering my own time and experience. As always, engage me on Twitter (@graemeRharvey) or email me (graeme@iteststuff.ca) to chat about all things testing.

4 comments:

  1. Interesting. I would love to see a part 2 on this regarding the judges' grades, your placement, etc. You can see my postings on the STWC here: http://about98percentdone.blogspot.com/2014/05/software-test-world-cup-2014-part-1.html

    Part 2 includes our score and some analysis: http://about98percentdone.blogspot.com/2014/05/software-test-world-cup-2014-part-2.html

    Replies
    1. Hi JCD,

      I hadn't actually received any hard grades, but I prodded for some result info so I could do a similar follow-up to yours. I got the data, so I'll post back here and update the original article when I've posted the follow-up. You can expect it over the next few days.
      Thanks for the idea!
      Cheers,
      Graeme

    2. Did you ever get your data back? I'd love to see someone else's grades, purely as a comparison to see if the judges adjusted their grading/documentation/expectations over time. Obviously it's possible yours was judged right before/after ours, but it still might shift the expectations - not to mention that not all judges saw all the responses, so you might have gotten different judges. Anyway, I'm curious!

      - JCD

  2. Hey JCD,

    I didn't get granular enough data to compare the judges' grades. Our team decided we didn't want to get overly analytical with the judges' data - they had one heck of a tough job. We've chosen to look beyond the grades we were given and look to the notes: we want to use the STWC as a chance to improve ourselves as testers, not to worry about how we stacked up against the competition across the continent.
