A Day in the Life of a QA Leveling Up
That moment when you wonder if you're actually doing it right
It's 10 AM. You have a ticket assigned, a test environment that works... sometimes, and a Slack that never stops. You run your test cases, report what you found, and move on to the next one. Day after day.
And at some point, almost without noticing, you ask yourself that uncomfortable question:
Am I being a good QA... or am I just doing the job?
It's not a trick question. It's one of the most honest things you can ask yourself. And the reason it's worth asking isn't to beat yourself up — it's to understand where there's room to grow.
That's what we call opportunity areas. Not weaknesses. Not failures. Just the places you haven't gotten to yet.
This blog is built on three resources worth keeping close: The QA Blueprint, which covers the fundamentals of professional testing; XPath 101, a practical guide to understanding one of the core pillars of test automation; and The Locators Guide, a visual and hands-on companion to understanding how to locate any element on your web app — from CSS selectors to XPath. All three by Angry Tester. If you want to go deeper on any of the topics we'll cover, those are your starting points.
What happens on a regular Tuesday
Picture Marcus. QA with two years of experience, a sharp eye for bugs, knows the product better than half the dev team. But lately something feels off.
He reports a bug. The developer replies: "Can you give me more context?" He resends the ticket. The dev asks again. Three rounds of back-and-forth before they're on the same page.
Later, in the daily standup, someone asks about test coverage. He knows he has test cases, but no concrete numbers. He improvises the answer.
At the end of the day, his lead mentions it'd be great if he picked up some automation. He nods, but internally files it under "that's for the more technical QAs." Nobody ever told him that automation starts with understanding how to locate elements on screen — and that with the right resources, it's not nearly as intimidating as it sounds.
Sound familiar?
Marcus isn't bad at his job. Marcus has specific opportunity areas. And the difference between staying stuck and leveling up is recognizing them without making it a whole thing.
The most common opportunity areas (and why they matter)
1. Communication — the 70% of the job nobody teaches you
You can find the most critical bug of the sprint and still create more friction than value if you don't know how to communicate it. The report matters as much as the finding.
A good bug report isn't just "the button doesn't work." It's a clear title, a precise description, numbered steps to reproduce, the environment where it happens, and visual evidence. All in one message, without the dev needing to ask for more context.
And this goes beyond tickets. The way you write in Slack, in meetings, in PR comments — all of it shapes how people perceive you professionally. A QA who communicates well doesn't just find bugs: they make them actionable.
Think of it this way: if your report requires three rounds of questions before the dev can start working, the bug is also reproducing in your process, not just in the software.
The opportunity: Next time you send a message to a teammate, include the full context from the very first message. Don't make them ask. Clear title, numbered steps, expected result vs. actual result. Just that. It already makes a difference.
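To make that habit concrete, here's a minimal bug report template. Every detail in it (product area, build number, flow) is invented for illustration; adapt the fields to your own tracker:

```markdown
## [Checkout] "Pay now" button unresponsive on mobile Safari

**Environment:** staging, iOS 17, Safari, build 2.4.1
**Severity:** High

**Steps to reproduce:**
1. Log in with a test account
2. Add any item to the cart
3. Go to checkout and tap "Pay now"

**Expected result:** the payment form opens
**Actual result:** nothing happens; no error is shown

**Evidence:** screen recording attached
```

If the dev can start working from this without asking a single question, the report did its job.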
2. Documentation — if it's not written, it didn't happen
If you run a test and don't document it, technically it didn't happen. If you spotted a weird behavior but didn't formally report it, it's invisible to the team. And if six months from now someone tries to understand what was tested in that sprint, they won't find anything useful.
Test cases have structure for a reason: ID, title, description, preconditions, execution steps, expected result, and status. That's not bureaucracy. It's the difference between a team that can audit its own work and one that repeats the same mistakes sprint after sprint.
Good documentation is also your professional protection. When someone asks "was this tested?", you have the answer, the date, and the result.
The opportunity: Look back at your last five test cases. Do they have all the fields? Could someone who just joined the team and doesn't know the product execute them? If the answer is no to either question, that's your work right there.
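One way to make that review systematic is a tiny completeness check against the structure listed above (ID, title, description, preconditions, steps, expected result, status). A minimal sketch, not a real tool; the field names and the sample case are assumptions:

```python
# Required fields for a complete test case, per the structure above.
REQUIRED_FIELDS = {
    "id", "title", "description", "preconditions",
    "steps", "expected_result", "status",
}

def missing_fields(test_case: dict) -> set:
    """Return the required fields that are absent or empty."""
    return {f for f in REQUIRED_FIELDS if not test_case.get(f)}

# Hypothetical test case, deliberately missing two fields.
tc = {
    "id": "TC-042",
    "title": "Login with valid credentials",
    "steps": ["Open login page", "Enter valid credentials", "Submit"],
    "expected_result": "User lands on the dashboard",
    "status": "passed",
}

print(sorted(missing_fields(tc)))  # → ['description', 'preconditions']
```

Run it over your last five cases and the gaps stop being a feeling and become a list.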
3. Testing techniques — not everything is free exploration
A lot of QAs test by instinct. And instinct is valuable, especially when you know the product well. But it has a clear limit: instinct covers the scenarios you already imagined. Testing techniques cover the ones you didn't.
Equivalence Partitioning helps you divide the possible values of a field into groups that behave the same way, so you're not testing the same thing over and over. Boundary Value Analysis takes you straight to the edges — which is where most bugs live. Decision Tables let you map combinations of conditions systematically, without leaving any untested.
It's not about memorizing formulas. It's about having a system that helps you design test cases that actually matter, instead of testing things at random and hoping something turns up.
If you want to explore these techniques with more structure, The QA Blueprint has a dedicated section on test design that shows how to apply them in real contexts — not just in theory.
The opportunity: Next time you have an input field to test, stop before you execute. Identify the valid and invalid partitions and the boundary values. Document them as test cases before running the first one. You'll be surprised what you catch.
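As a sketch of what that looks like in practice, take a hypothetical "age" field that accepts values from 18 to 65 inclusive (both the field and the range are invented for illustration). Equivalence partitioning gives you three groups; boundary value analysis picks the edges of the valid one:

```python
def boundary_values(low: int, high: int) -> list[int]:
    """Classic boundary value analysis: each edge plus its neighbors."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

# Hypothetical rule: age must be between 18 and 65 inclusive.
LOW, HIGH = 18, 65

# Equivalence partitions: one representative value per group is enough,
# because every value inside a partition should behave the same way.
partitions = {
    "invalid_below": 10,   # any value < 18
    "valid": 40,           # any value in [18, 65]
    "invalid_above": 80,   # any value > 65
}

print(boundary_values(LOW, HIGH))  # → [17, 18, 19, 64, 65, 66]
```

Nine values instead of dozens of random ones, and the six boundary values are exactly where the off-by-one bugs hide.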
4. Understanding the full cycle — do you know where your work lives?
Testing doesn't start when a ticket gets assigned to you, and it doesn't end when you mark it "passed." There's planning, requirements analysis, test design, implementation, execution, reporting, and closure. Every phase matters, and each one has specific deliverables.
When you only participate in execution, your understanding of the system is incomplete. You don't know what design decisions were made before you got involved, or what risks were already identified. And that affects the quality of what you test.
When you understand the full cycle, you stop being the QA who shows up at the end to "approve or reject." You start asking questions in refinement sessions. You identify ambiguities in requirements before they become bugs. You add value long before there's anything to execute.
The opportunity: Ask yourself which phases of the testing cycle you currently participate in. If the answer is mostly "execution," there's a whole world before and after waiting for you. Start by asking to sit in on the next refinement session. Just listen. That alone shifts your perspective.
5. Automation and locators — it's not for the technical QAs, it's for you
This is where most QAs get stuck. "I do manual testing," "I don't know how to code," "that's for SDETs." And meanwhile, those same QAs are running the same three critical flows by hand, every sprint, week after week.
Automation doesn't replace manual testing. It complements it. It frees your time from repetitive verifications so you can focus on what actually requires human judgment: exploratory testing, edge cases, new flows.
But before you automate, there's something fundamental to understand: how to locate elements on screen. Every automation script needs to tell the framework where to click, where to type, what to verify. That's done through locators.
This is exactly where The Locators Guide becomes your best companion. The key insight it opens with says it all: if you can see it in the DOM, you can select it. HTML elements live inside a tree structure called the Document Object Model, and every single one of them can be reached — you just need to know how.
There are more ways to locate an element than most QAs realize. You can target it:

- By its ID (the # selector): precise, but use it carefully, as it may be tied to event listeners.
- By its class name (the . selector): chainable, but coupled to styles.
- By its tag name (div, input, span): very generic, rarely enough on its own.
- Through attribute selectors ([placeholder="Email"], [type="button"]): these let you use any property the element has.

You can also combine them, narrow them down using pseudo-class selectors like :nth-child(), :first-of-type, :not(), or :has(), and chain parent-child relationships using the descendant space or the direct child > operator.
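To make those selector types concrete, here's a hypothetical login form (the markup is invented for illustration) with one selector of each kind that would match it:

```html
<form class="login-form">
  <input id="email" class="field" type="email" placeholder="Email">
  <input id="password" class="field" type="password" placeholder="Password">
  <button type="submit">Sign in</button>
</form>

<!-- By ID:          #email                                   -->
<!-- By class:       .field            (matches both inputs)  -->
<!-- By tag:         input             (matches both inputs)  -->
<!-- By attribute:   [placeholder="Password"]                 -->
<!-- Pseudo-class:   .field:nth-child(2)                      -->
<!-- Descendant:     .login-form input                        -->
<!-- Direct child:   form > button                            -->
```

Paste markup like this into any page and try each selector in DevTools: the exercise of predicting what each one matches is where the learning happens.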
But here's the honest ranking: the best locator is a Test ID. A data-testid attribute added specifically for automation purposes. It's not tied to styles, not tied to event listeners, won't break when the layout changes, and guarantees a single match every time. The guide puts it plainly: talk to your devs to set up those IDs. It's the most impactful conversation you can have as a QA who wants to build stable automation.
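In markup, that conversation produces something as simple as this (the element and the ID value are invented for illustration):

```html
<!-- The dev adds an attribute that exists only for testing: -->
<button data-testid="login-submit" class="btn btn-primary">Sign in</button>

<!-- The locator ignores styles and layout entirely:  -->
<!-- CSS:    [data-testid="login-submit"]             -->
<!-- XPath:  //*[@data-testid="login-submit"]         -->
```

The classes can change with every redesign; the locator keeps working.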
For everything XPath — navigating the DOM hierarchy, using functions like contains() and starts-with(), combining conditions with AND and OR, and moving across the tree with ancestor::, parent::, or following-sibling:: — XPath 101 covers it from scratch, with practical examples and a direct writing style.
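As a taste of what that covers, here are a few XPath expressions against hypothetical markup (every tag, class, and text value below is an assumption for illustration):

```
//button[contains(@class, "submit")]                 any button whose class contains "submit"
//a[starts-with(@href, "https://")]                  links with an absolute https URL
//input[@type="email" and @required]                 combining conditions with and
//span[text()="Error"]/ancestor::form                climb from a message up to its form
//label[text()="Email"]/following-sibling::input     the input right after a label
```

The last two are what CSS alone can't do: moving up and sideways through the tree, not just down.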
The opportunity: Identify three test cases you run manually every week without fail. Those are your perfect candidates to automate first. Before writing a single line of code, open your browser's DevTools and inspect those elements. Practice building their locators: try an ID, a class, an attribute selector. Verify each one selects exactly what you're looking for — and nothing else. That's the real first step.
The point isn't to be perfect. It's to know where to grow.
Reviewing your opportunity areas isn't an exercise in self-criticism. It's a professional practice. The best QAs aren't the ones who never make mistakes — they're the ones who have clarity about what they know, what they don't, and what they're doing about it.
A day in the life of a QA who's leveling up doesn't look dramatically different from one who's plateaued. The difference is in those small decisions: writing the complete report, documenting the case even when nobody asks, studying a new testing technique even when it's not urgent, opening The Locators Guide or XPath 101 on a Tuesday afternoon for no particular reason other than wanting to understand a little better.
The question "am I being a good QA?" doesn't have a final answer. It has a direction.
And that direction is enough to get started.
📚 Resources mentioned
- The QA Blueprint — Angry Tester: fundamentals of professional testing, test design, and QA mindset.
- XPath 101 — Angry Tester: a practical guide to understanding and writing XPath locators from scratch.
- The Locators Guide — Angry Tester: a visual, hands-on guide to locating any element on your web app using CSS selectors, pseudo-classes, and XPath.
