As testers, our job is to develop a deeper understanding of systems, products, ideas, and situations. That requires experiencing, exploring, and experimenting with the products we're testing—and applying analysis to them.
Some testers—especially those new to testing—may be intimidated by the notion of analysis and freeze like a deer in the headlights when presented with the idea of performing it. It's a fancy-sounding word! It's a big idea! It sounds mathy! Where to start? What do you do after that? What is analysis, anyway?
No problem. Here's how to get over some of the fear with a bit of analysis on analysis itself, and how to apply the same approaches to your job of using analysis to provide an X-ray view into your software.
Bootstrapping
When you analyze something, it's usually with the goal of understanding it for yourself and describing it for others. But therein lies a paradox: You perform analysis to understand something better, but how can you analyze something that you don't understand?
One way is not to fret, but simply start somewhere. Whatever you do and whatever you learn will suggest next steps. Good analysis work begins with pulling yourself up by the bootstraps, as you move from uncertainty toward knowledge.
Analysis is a bootstrapping process.
Getting started
For instance, you could start analyzing something by considering where it comes from. In the case of analysis itself, the word comes to English via Latin from the Greek analusis, which in turn derives from analuein, meaning "unloose."
That comes from two smaller Greek words: ana-, which means "up," and luein, which means "to loosen." So analysis literally means "loosening up," or "unpacking," which is just what I've done with the word itself.
Did that help? Maybe; maybe not. In his book Sensemaking in Organizations, Karl Weick retells the story of a little girl who, upon being told to be sure of her meaning before she spoke, said, "How will I know what I think until I see what I say?" So another potential starting point is to say something about analysis, and then try to see what you think.
Analysis is looking at stuff and figuring it out somehow, so people can understand it.
That sounds vague. Still, perhaps you can consider that description, try it out by applying it to an example, see what you think, and refine it. A fallible problem-solving approach that nonetheless helps you learn is called a heuristic.
Analysis is a heuristic process. (Getting started is a heuristic process, too.)
An example: Analyzing a box
A simple example: Imagine that I challenge you to look at a plain wooden box and analyze it, to figure out things about it. You are at the center of your context; you know some things already and, to rise to the challenge, you want and need to know more.
What you do next depends on the situation, what you have in front of you, and your perception of what I want.
Analysis is a process that happens within a context.
You might begin by simply looking at the box, but almost immediately, you would start to make choices about aspects to examine, questions to consider, and further steps. Each choice you make will depend in part on what happened as the result of previous choices.
Analysis is an exploratory process.
As you examine the box, you consider its shape and structure. You use different modes of observing the box: you might pick it up, heft it in your hand, turn it over, shake it to find out whether you could feel or hear things inside. You might attend to the wood, especially if you already know something about wood.
You might look for affordances—that is, means of interacting with the box and what might be inside it. You could consider the purposes to which people might put the box. In short, you would look at it in various ways and from various perspectives.
Analysis is a diversified process.
You open the box and discover something inside: a mechanism that makes music. You examine that more closely. You interact with the mechanism to see and hear it working. You might begin to consider special tools to extend or enable other kinds of observation.
Analysis is an iterative process of interaction and observation.
Based on what you've seen and heard so far, the possibilities for analysis explode. From here you could go in all kinds of directions. You could go online and research the parts of a music box (cylinder, teeth, comb, spring housing, etc.). You could discover relationships between mechanical musical instruments and computers (both used punch cards at various points in their histories). You could consider the physics of sound and the relationships between the lengths of the tines on the comb and the pitch of the musical notes.
Analysis is an open-ended research process.
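To give a concrete flavor of one such research direction: a music-box tine is essentially a small cantilever, and a standard simplified model of cantilever vibration says that, for a fixed material and thickness, pitch rises roughly with the inverse square of the tine's length. Here's a tiny sketch of that relationship in Python; the tine lengths and the reference frequency are invented purely for illustration.

```python
# Simplified model: for a cantilevered tine of fixed material and thickness,
# natural frequency scales roughly as 1 / length**2.
# The lengths and reference frequency below are hypothetical.

REFERENCE_LENGTH_MM = 20.0   # a made-up tine length sounding the reference note
REFERENCE_FREQ_HZ = 440.0    # concert A, used as an arbitrary anchor

for length_mm in (20.0, 15.0, 10.0):
    freq_hz = REFERENCE_FREQ_HZ * (REFERENCE_LENGTH_MM / length_mm) ** 2
    print(f"{length_mm:5.1f} mm tine -> about {freq_hz:6.1f} Hz")
```

Under that model, halving a tine's length raises its pitch by roughly two octaves: a small, checkable relationship of exactly the kind that this sort of open-ended research keeps turning up.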
Periodically, you'll check in with me to review your mission and your work. As your analysis deepens, you could consult with experts on machinery or music or history, and learn from them. As you apply your existing knowledge and learn more from your own investigations, you may contribute to what they know.
Analysis is typically a collaborative process.
You'll alternate between making observations and reporting on them, between spontaneous and deliberate activity. You'll direct and redirect your attention, zooming in and zooming out on details.
Analysis is a process of alternating, focusing, and defocusing.
To analyze something deeply and well, you'll keep track of your observations and questions. Throughout, you'll take notes, collect and collate data, make sketches or diagrams, map out relationships. You'll sift and refine your data, your mental models, and your representations of them. You'll notice patterns and affinities, and feed them back for further analysis.
Analysis feeds back on itself.
Applying analysis to software
Now that you've examined a box and gained a little experience with the process of analysis, let's try extending that experience to examining a software product. How might that be similar to examining a box? How might it be different?
Some people believe that to test and analyze software, you must start with a requirements document or specification. That's not a law, though; it's a heuristic. Just as with the box, it's okay to start wherever you like, analyzing whatever you've got, and letting that feed back into further analysis.
- Start with a spec, or a part of a product, or a conversation, or a survey of the whole product. (Bootstrapping)
- Try making sense of your mission and the things that would influence it; the information available to you; the nature of the product; the tools available to you; the project schedule. (Context)
- Take a tour of what the product exposes to its users, or tour a requirements document to learn about people's intentions and desires for it. (Exploration)
- Consider things about the product that might be important but that aren't functions—data, or the platforms that the product depends on, or the work that people want to perform with the product. (Diversifying)
- Experiment with the product; use it, and observe it in action. Tools can help to drive and probe the product (see the sketch after this list), but remember that a product is designed to help people get things done. Try doing those things; talk to and watch people who do them, too. Look for problems; investigate bugs. (Interaction and observation)
- Study the requirements, specifications, and plans for the product. Study its history, and the history of products like it. Look into the product's technical and business domain. (Research)
- Talk with managers, developers, testers, tech support people, and users to learn more. (Collaboration)
- Unlike a music box's parts, most of a software product's elements aren't directly visible. Try developing ideas about things that you don't see, but that are necessary for the visible features and functions to work. Consider tools to help make the invisible visible. (Focusing and defocusing)
- Every step of the way, catalog ideas about risk, collect data, sketch diagrams, make lists, create stories to inform further analysis. Visualize output from the product and pore over it. Loop around and assess your findings. Share your discoveries with colleagues and your testing clients, and determine where you want to go next. (Feedback)
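To make the "drive and probe" idea above a bit more concrete, here's a minimal sketch, in Python, of the kind of throwaway tool a tester might write. The endpoint, the probe inputs, and the details recorded are all hypothetical; the point is the loop of interacting, observing, recording, and summarizing, so that the results can feed back into further analysis.

```python
# A rough sketch of driving and probing a product with a small tool.
# The endpoint and probe inputs are hypothetical; adapt them to the
# product you're actually analyzing.

import json
import time
import urllib.parse
import urllib.request
from collections import Counter

BASE_URL = "http://localhost:8080/api/search"   # hypothetical product endpoint
PROBES = ["", "box", "BOX", "böx", "box " * 50, "<script>", "0", "-1"]

observations = []

for query in PROBES:
    url = BASE_URL + "?" + urllib.parse.urlencode({"q": query})
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=5) as response:
            status = response.status
            body = response.read()
    except Exception as exc:                     # failures are observations, too
        status, body = type(exc).__name__, b""
    elapsed = time.monotonic() - start

    # Record each interaction for later review and pattern-spotting.
    observations.append({
        "query": query,
        "status": status,
        "bytes": len(body),
        "seconds": round(elapsed, 3),
    })

# Summarize: which outcomes occurred, and how often? Surprises here become
# questions to feed back into the next round of analysis.
print(Counter(obs["status"] for obs in observations))
print(json.dumps(observations, indent=2, ensure_ascii=False))
```

None of this replaces the thinking; the tool just makes more of the product's behavior visible and gives you a record to pore over, share, and question.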
Analysis and testing
Given all that, here's a more elaborate description of analysis than my earlier one:
Analysis is a process of examining things in exploratory, diversified, iterative, organized ways so people can understand them better.
That's a deeper and more explicit description. It's also similar to the definition of testing used in Rapid Software Testing: evaluating a product by learning about it through experiencing, exploring, and experimenting.
Excellent testing and excellent analysis are intertwined; each supports and feeds back into the other. In testing, you analyze products and systems, problems, risks … and testing informs your analysis. You also apply analysis to analysis itself, to your approaches to performing analysis as you test software, and to your evaluation of the quality of your work.
Analysis is kind of a fractal process that way. It loops back on itself.
Getting started
Where to start? Simply start somewhere, and whatever you do and whatever you learn will suggest next steps. And remember: Good analysis work begins with pulling yourself up by the bootstraps.
Want to know more? During my STAREAST Virtual+ conference session, "X-Ray Vision for Testers: How to Analyze Things," I'll talk more about strategies and approaches you can use to identify and reason about the things that matter. The conference runs April 26-30, 2021.