As a dedicated tester and member of Rangle’s quality team, my job is not only to ensure that all the apps and platforms we build with our clients are functional and operating correctly, but also to stay current on the tools and technologies that enable my team and me to do our jobs better, faster, and smarter. Over the past couple of months, I committed to assessing a variety of testing tools, looking at a new tool every two weeks. That might sound like a lot, but far from being eccentric, I think this pace is going to become the norm in our constantly changing tech world.
Staying on top of new tooling allows you and your Quality team to get comfortable with new functionality as it’s developed, rather than being left behind as innovation brings better tools to our community. Being “evergreen” when it comes to new tools (that is, staying open-minded enough to adapt to new technologies as soon as they become widely available) will prevent you from being stuck with an outdated solution or tool when better ones are out there.
When assessing QA tools, whether you’re at a consultancy like Rangle or working on your own solutions at a product company, it’s key to start with a problem-oriented approach. It’s consistently the most effective, cheapest, and simplest way to address your issues, and the most reliable way to identify the right tools for the job.
The problem-first approach (rather than the tool-first approach) keeps you open-minded. You’ll think case by case, rather than being stuck in a “one-size-fits-all” mindset. Being aware of a breadth of tools also improves the overall quality of your assessments for your clients, whether they’re internal or external. For example, mobile testing in a private and secure device cloud is a great option when privacy is a strong concern, whereas concerns about cost and volume are better served by simulators and emulators.
Three steps for assessing new Quality tools
In the process of my deep-dive into the latest tooling, I’ve developed a simple methodology for assessing testing tools.
Step 1: Identify the tool
First off, you’re leading with value. What can this tool do for your organization that your existing tools cannot? To this end, I tend to prioritize tools that have proven records. For me, these are tools that have existed for some time, are popular in the QA community, were developed by our clients, or are highly recommended or requested by industry leaders.
I also prioritize open-source tools, as we’re dedicated to supporting the open-source community at Rangle. Open-source tools usually have a dedicated community behind them and come with recommended resources that add value. They’re free to use without commitment, and, unlike many paid tools, they don’t tend to have gatekeepers or built-in limitations. It’s also much easier to train team members on open-source tools than on paid ones, which is a key consideration for a consultancy like Rangle: a tool that’s compatible with our clients’ teams is much more likely to be adopted into their operations.
Step 2: Dive into the official documentation
The official documentation that developers create for their software is the source of truth, since it’s constantly updated (at least, it should be!) with the latest info, new features, and the technical issues that have been addressed or soon will be. If you’re assessing a paid tool, most vendors also offer dedicated tech support to answer questions or address issues.
Documentation can take a while to get through, so I have a specific methodology that I recommend to help speed the process along:
- Start with a high-level search of the tool to check reviews. Watch short demo videos, and (informally) ask coworkers and the community how they feel about the tool.
- Skim the documentation content, paying attention to how it’s organized. The most important info will likely be presented at the top of each section or menu.
- Create a checklist or questionnaire based on the problem you’re trying to solve and the feedback from your peers. Search the documentation for the information you need, and prepare your notes as internal QA documentation to share with your colleagues. This way, the knowledge is never lost, and no time is wasted in the future re-assessing tools that were already reviewed.
Step 3: Document your learning while you use the tool
The official documentation and training are a great place to start when you’re learning a new tool, but what’s even better is a tip sheet from a colleague that helps you pick up the tricks and best practices faster.
While I’m using the new tool to assess its functionality, I keep good notes (especially if I think the tool is good) to help my colleagues and our clients’ teams ramp up faster when the tool is implemented. To start, I usually practice with examples provided on the tool’s website. These examples are often designed to showcase the best features and functionalities and are useful for teaching the tool to others. The most established tools usually have certificates for training, plenty of videos, and a sophisticated e-learning site or GitHub repository to make practicing easier.
Once I’ve had a good look at these resources, I usually know whether the tool is right for Rangle and our clients, and can add it to our list of recommended tools, or flag it as the wrong fit.
The benefits of staying current on Quality tools
I’ve noticed that the more tools I assess, the easier it becomes to test new ones. Many tools share a structure or baseline, or present information in a similar manner. (This is also why vendors like to showcase the differentiators their competitors lack.)
But one of the biggest benefits of keeping up to date on tools is being aware of market trends as they happen. For example, I was one of the first people on my team to notice that vendors are moving towards compatibility with open-source end-to-end tools. This has a big impact on our ability to integrate new tools and can be important for our clients in cost savings and modernizing their suite of tools.
As for my own skills, I’ve noticed increased technical proficiency: each new tool I assessed was another opportunity to practice, write code, or debug alongside other developers and Quality team members. I’ve also become an internal expert and resource. I’ve created dozens of tool assessments in our internal hub for anybody to access, without overwhelming readers with unnecessary information, and my peers know to come to me with questions and recommendations for tooling on their client projects. If you’d like to hear more about our tool-review methodology and the benefits of staying on top of what’s new in testing, reach out to us to discuss trends in our ever-changing industry.