Your constantly-updated definition of Qualitative Research and
collection of videos and articles.
What is Qualitative Research?
Qualitative research is the methodology researchers use to gain a deep, contextual understanding of users via non-numerical means and direct observation. Researchers focus on smaller user samples—e.g., in interviews—to reveal data such as user attitudes, behaviors and hidden factors: insights which guide better designs.
"To find ideas, find problems. To find problems, talk to people." -- Julie Zhou, Former VP of Product Design at Facebook
In this video, William Hudson, UX Strategist and Founder of Syntagm Ltd, discusses the differences between qualitative and quantitative research.
See how you can use qualitative research to expose hidden truths about users and iteratively shape better products.
Qualitative Research Focuses on the “Why”
Qualitative research is used extensively in user experience (UX) and user research. By doing qualitative research, you aim to gain narrowly focused but rich information about why users feel and think the ways they do. Unlike its more statistics-oriented “counterpart”, quantitative research, qualitative research can help expose hidden truths about your users’ motivations, hopes, needs, pain points and more to help you keep your project’s focus on track throughout development. UX professionals typically do qualitative research from early on in projects because the insights it reveals can alter product development dramatically and prevent costly design errors from arising later. Compare and contrast qualitative with quantitative research here:
Qualitative Research vs. Quantitative Research

You Aim to Determine
Qualitative – The “why”: to get behind how users approach their problems in their world.
Quantitative – The “what”, “where” & “when” of the users’ needs & problems: to help keep your project’s focus on track during development.

Methods
Qualitative – Often loosely structured (e.g., contextual inquiries), to learn why users behave how they do & explore their opinions.
Quantitative – Highly structured (e.g., surveys), to gather data about what users do & find patterns in large user groups.

Number of Representative Users
Qualitative – Typically 5 to 20.
Quantitative – Ideally 30+.

Level of Contact with Users
Qualitative – More direct & less remote (e.g., usability testing to examine users’ stress levels when they use your design).

Reliability of Findings
Qualitative – You need to take great care with handling non-numerical data (e.g., opinions), as your own opinions might influence findings.
Quantitative – Reliable, given enough test users, but bias remains an issue to monitor.
You can enjoy greater success through triangulation: different research methods have different strengths, so when you use multiple approaches you obtain a more complete picture. For example:
Analytics might show you where users drop off in a funnel (the “what”).
Qualitative methods such as interviews or usability tests can then reveal why users struggle at that step (the “why”).
A/B testing can validate whether addressing those pain points does indeed improve the experience (the “Does the solution work?” aspect).
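To make the quantitative side of this triangulation concrete, here is a minimal sketch (in Python, with hypothetical funnel steps and counts rather than real product data) of how you might compute where users drop off before following up with qualitative methods such as interviews:

```python
# Hypothetical funnel data: number of users who reached each step.
# Step names and counts are illustrative, not from a real product.
funnel = [
    ("View product", 1000),
    ("Add to cart", 620),
    ("Enter shipping details", 410),
    ("Payment", 240),
    ("Order confirmed", 210),
]

# Compute the drop-off rate between consecutive steps to find the step
# that loses the most users -- the "what" that qualitative research
# (e.g., interviews) can then explain.
for (step, users), (next_step, next_users) in zip(funnel, funnel[1:]):
    drop_off = 1 - next_users / users
    print(f"{step} -> {next_step}: {drop_off:.0%} drop-off")
```

The step with the largest drop-off is usually where qualitative follow-up, and later an A/B test of any fix, is most likely to pay off.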
Discover how to complement your research through triangulation, in this video with William Hudson.
Qualitative Research Methods You Can Use to Get Behind Your Users
You have a choice of many methods to help gain the clearest insights into your users’ world, which you might want to complement with quantitative research methods. In iterative processes such as user-centered design, you and your design team would use quantitative research to spot design problems, discover the reasons for these with qualitative research, make changes and then test your improved design on users again. The best method/s to pick will depend on the stage of your project and your objectives. Here are some:
Diary studies – You ask users to document their activities, interactions, etc., over a defined period. This empowers users to deliver context-rich information. Although such studies can be subjective—since in-the-moment human issues and emotions will inevitably influence users—they’re helpful tools to access generally authentic information.
Interviews – These are either structured, semi-structured, or ethnographic:
Structured – You ask every user the same specific questions and compare their responses across participants.
Semi-structured – You have a more free-flowing conversation with users, but still follow a prepared script loosely.
Ethnographic – You interview users in their environment to appreciate how they perform tasks and how they view aspects of those tasks.
Get your free template for “How to Structure a User Interview”
Usability testing – You observe users as they attempt tasks with your design. Sessions can be:
Moderated – A facilitator guides users through the session in real time and can ask follow-up questions.
Unmoderated – Users complete tasks remotely on their own, e.g., via an online testing tool that records their session.
Guerrilla – “Down-the-hall”/“down-and-dirty” testing on a small group of random users or colleagues.
Get your free template for “How to Plan a Usability Test”
User observation – You watch users get to grips with your design and note their actions, words and reactions as they attempt to perform tasks.
Qualitative research can be more or less structured depending on the method.
Qualitative Research and Usability Testing – How to Get Reliable Results
Some helpful points to remember are:
Participants – Select the number of test users carefully: typically around 5 per round of usability testing (you will often need more than one round), but for other types of qualitative research, such as contextual inquiry, you might need 10 or 20 participants; see the sketch after this list for why around 5 is a common rule of thumb. Observe the finer points, such as body language, and remember the difference between what users do and what they say they do.
Moderated vs. unmoderated – (Specific to usability testing) You can obtain the richest data from moderated studies, but these can involve considerable time and practice. You can usually conduct unmoderated studies more quickly and cheaply, but you should plan these carefully to ensure instructions are clear, etc.
Types of questions – You’ll learn far more by asking open-ended questions. Avoid leading users’ answers – ask about their experience during, say, the “search for deals” process rather than how easy it was. Try to frame questions so users respond honestly: i.e., so they don’t withhold grievances about their experience because they don’t want to seem impolite. Distorted feedback may also arise in guerrilla testing, as test users may be reluctant to sound negative or to discuss fine details if they lack time.
Location – Consider how users’ surroundings might affect their performance and responses. If, for example, users’ tasks involve running or traveling on a train, select the appropriate method (e.g., diary studies, so they can record aspects of their experience in the environment of a train carriage and the many factors impacting it).
Research Bias – Beware of bias when you conduct research, as it can be extremely difficult to spot. You may be unconsciously relying on assumptions and exercising confirmation bias, for example, where you pay attention to findings that confirm your expectations but ignore evidence that suggests otherwise.
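On the sample-size point above: a commonly cited model from Nielsen and Landauer estimates the share of usability problems found as 1 − (1 − p)^n, where p is the probability that a single user encounters a given problem (often quoted as around 31%) and n is the number of test users. The sketch below (Python; the 31% figure is an assumption from that research, not a universal constant) shows why around five users per round is a popular rule of thumb:

```python
# Problem-discovery model: proportion of usability problems found with
# n test users, assuming each user independently reveals a given
# problem with probability p. p = 0.31 is the often-quoted average
# from Nielsen & Landauer's studies, not a universal law.
def problems_found(n: int, p: float = 0.31) -> float:
    return 1 - (1 - p) ** n

for n in (1, 3, 5, 10, 15):
    print(f"{n:>2} users: ~{problems_found(n):.0%} of problems found")
# With p = 0.31, five users already surface roughly 84% of problems,
# which is why several small rounds of testing beat one large round.
```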
Another approach to getting reliable results is grounded theory. It helps reduce bias and discover your users’ true needs and behaviors. In this video, William Hudson, User Experience Strategist and Founder of Syntagm Ltd, explains more.
Overall, no single research method can help you answer all your questions. Nevertheless, the Nielsen Norman Group advises that if you only conduct one kind of user research, you should pick qualitative usability testing, since a small sample size can yield many cost- and project-saving insights. Always treat users and their data ethically. Finally, remember the importance of complementing qualitative methods with quantitative ones: you gain insights from the former; you test those using the latter.
How does qualitative research differ from quantitative research?
Qualitative and quantitative research serve complementary purposes in understanding human behavior. Qualitative research examines the “why” behind the actions and thoughts of people, using methods like interviews and observations to generate rich, descriptive insights about motivations, pain points, and emotions. On the other hand, quantitative research focuses on the “how many” and “how often” aspects, employing surveys and analytics to produce numerical data that can test patterns and validate hypotheses.
For example, qualitative research might reveal that users feel stressed during checkout, while quantitative data shows 40% abandon carts at step three. Together, the two research “traditions” complement each other: qualitative explains behaviors, while quantitative confirms their scale. Use qualitative methods to understand context and generate ideas, and quantitative methods to test and measure: the former uncovers meaning; the latter delivers measurable proof.
When should I use qualitative research in the design process?
Use qualitative research early, when you want to uncover user needs, motivations, and mental models before you commit to solutions. At the discovery stage, interviews and field studies help frame problems from the user perspective. Then, during ideation, qualitative feedback on prototypes shows whether concepts resonate. And in usability testing, it reveals why users struggle with navigation or content.
Qualitative research also works well when numbers alone cannot explain behavior; for example, analytics may show high drop-off, but interviews uncover frustration with confusing language. Use it when you need depth, context, and empathy, not statistics. It is good for helping you shape strategy and guide design decisions so your products solve real human problems rather than stand on the shaky ground of assumptions.
Explore mental models to secure a stronger grasp of how to tailor designs that meet with what users expect to find.
What are the strengths and weaknesses of qualitative research?
Strengths: Qualitative research delivers deep insights into motivations, emotions, and lived experiences. This research type helps you build empathy, reveals hidden needs, and uncovers unexpected problems. It is flexible, too, as researchers can adapt questions in real time. Plus, it is valuable for early design, concept testing, and uncovering pain points that quantitative data misses.
Weaknesses: Findings do not generalize easily because sample sizes are small. Data collection and analysis take time. Also, the risk of bias, through leading questions or selective analysis, is high. Lastly, findings may be difficult to “sell” to stakeholders who want numbers.
The key is to use qualitative insights to guide design direction and combine them with quantitative evidence for validation and scale.
What are the most common qualitative research methods in UX?
The most common methods include:
User interviews: One-on-one discussions that explore needs, motivations, and pain points.
Contextual inquiry: Observing people in their natural environment to see real behaviors.
Usability testing: Watching users interact with prototypes or products to uncover friction.
Diary studies: Asking participants to record their experiences over time.
Focus groups: Facilitated discussions with multiple users, best for exploring attitudes.
Each method serves different goals: interviews for depth, contextual inquiry for real-world context, usability tests for task insights, and diary studies for long-term patterns. It is wise to mix methods across the design cycle to capture a full view of the user experience.
As usability testing in particular forms such an essential part of UX design, enjoy our Master Class How to Get Started with Usability Testing with Cory Lebson, Principal User Experience Researcher with 20+ years of experience and author of The UX Careers Handbook.
How do I conduct a user interview properly?
To begin, define your goal: What do you want to learn? Write open-ended, neutral questions that encourage stories rather than yes/no answers. Recruit participants who reflect your target users, not just convenience samples. Build rapport by explaining the purpose, ensuring confidentiality, and asking warm-up questions.
During a user interview, listen actively, do not interrupt, and probe deeper with “why” and “tell me more.” Avoid leading or biased phrasing; let the participants guide the conversation. Record or take notes (with their permission) for later analysis. Always wrap up by thanking them and checking if they want to add anything. Afterwards, analyze transcripts to identify and extract patterns. Good interviewing requires empathy, patience, and discipline; getting insights means designers listen more than they speak.
Find out more about what goes into effective user interviews so you can gain more valuable insights from them.
How do I run a contextual inquiry or field study?
A contextual inquiry means observing people in their natural environment as they perform tasks. Begin by defining the scope: What behavior or process do you need to study? Recruit participants who represent your audience. Shadow them in context, whether in an office, a store, or a home. Ask them to “think aloud” as they work, explaining decisions, frustrations, and shortcuts.
Take notes, capture photos (with their consent), and focus on environment, tools, and workarounds. Do not intrude too much; observe first and then ask clarifying questions. Afterwards, consolidate notes into themes, workflows, and artifacts like affinity diagrams. The insights you can get into real workflows, constraints, and unspoken needs are “gems” that users might never mention in an interview but demonstrate naturally.
How do I choose the right qualitative method for my project?
Match the method to your research goal and timeline. Use interviews if you need rich stories about attitudes or motivations. Choose contextual inquiry if you want to see tasks in real environments. Pick usability testing when evaluating the ease of use of a design, but be sure to start testing as soon as you can. Go for diary studies if you want long-term behaviors.
Two factors to consider are participant availability and budget: interviews are quick; diary studies can take weeks. Also factor in which stage your product is at: early discovery favors interviews and field studies; do early design testing before building the first prototypes, via card sorting, tree testing, and first-click testing. Blend methods when possible to triangulate findings. Ask yourself: “Do I want to know what people say, what they do, or what they experience over time?” The answer will help you land on the right method.
Explore ethnographic research to get a better idea of which approach might be most suitable and what to expect with it.
How do I avoid bias in qualitative research?
Beware: bias creeps in easily, so active prevention is critical. Include these measures to keep it at bay:
Write neutral, open questions—avoid phrasing that suggests a “right” answer. Let participants lead instead of steering them toward your assumptions. Diversify recruitment to avoid only hearing from easy-to-reach groups.
During analysis, you might want to involve multiple researchers to compare interpretations, reducing individual bias. Use frameworks like affinity mapping to cluster themes systematically. Do not cherry-pick quotes that confirm your hypothesis (that plays into confirmation bias); include counterexamples instead.
Reflect on your own perspective and try to acknowledge blind spots. Recording (always with the permission of participants) and transcribing sessions keeps evidence transparent, too.
Dig deeper into bias to know what to watch out for and how to steer clear of it for more genuine research results.
How do I analyze qualitative data from user interviews or studies?
Start with transcription or detailed notes. Break responses into discrete statements or behaviors. Use coding: label data with categories like “frustration,” “motivation,” or “workaround.” Group codes into themes with affinity mapping and look for recurring patterns: do multiple participants describe the same barrier? Contrast differences across groups, such as new users versus experts.
Summarize with quotes to preserve the authentic voices of users. Build artifacts like journey maps or workflow diagrams that show where pain points cluster. Prioritize findings by frequency, severity, and design impact. Always tie insights back to research questions. Qualitative analysis is iterative, so revisit codes and clusters as new patterns emerge. Soon, any messy anecdotes should fall into place as structured insights you can act upon.
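To make the coding-and-prioritizing step concrete, here is a minimal sketch (Python; the participants, theme codes, and severity ratings are hypothetical, invented for illustration) of tallying coded observations and ranking themes by how many participants raised them and how severe they were:

```python
# Hypothetical coded observations from interview transcripts:
# (participant, theme code, severity 1-3). All values are illustrative;
# real codes come from your own transcripts and notes.
observations = [
    ("P1", "confusing checkout language", 3),
    ("P2", "confusing checkout language", 3),
    ("P3", "slow search results", 2),
    ("P1", "workaround: uses browser back button", 1),
    ("P4", "confusing checkout language", 2),
    ("P2", "slow search results", 2),
]

# Cluster observations by theme code, tracking who raised each theme
# and the worst severity observed for it.
themes = {}
for participant, code, severity in observations:
    theme = themes.setdefault(code, {"participants": set(), "max_severity": 0})
    theme["participants"].add(participant)
    theme["max_severity"] = max(theme["max_severity"], severity)

# Rank themes by breadth (distinct participants) and then severity --
# a simple starting point you would still weigh against design impact.
ranked = sorted(themes.items(),
                key=lambda kv: (len(kv[1]["participants"]), kv[1]["max_severity"]),
                reverse=True)
for code, info in ranked:
    print(f"{code}: {len(info['participants'])} participants, max severity {info['max_severity']}")
```

A spreadsheet or affinity-mapping tool does the same job; the point is that the ranking logic stays explicit and repeatable rather than living in the researcher’s memory.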
What are common mistakes designers make in qualitative research?
Designers acting as researchers often fall into these traps:
Asking leading questions that push answers away from true insights.
Recruiting unrepresentative participants, such as colleagues or friends.
Collecting too little data and then overgeneralizing.
Skipping systematic analysis and relying on memory or impressions.
Ignoring negative or contradictory findings (and so falling victim to confirmation bias).
Overloading participants with long or unclear sessions.
Not documenting consent or protecting privacy.
Another critical mistake is treating qualitative insights as statistically valid; they are not: they reveal depth, not breadth. To avoid these traps, prepare carefully, ask neutrally, recruit diverse users, document rigorously, and analyze systematically.
Ground yourself in statistical significance to understand what is important and what is not in research findings, in this video with William Hudson.
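For a concrete sense of what statistical significance adds on the quantitative side, here is a minimal sketch (Python; the visitor and conversion counts are hypothetical) of a two-proportion z-test you might apply to A/B-test results after acting on qualitative findings:

```python
from math import sqrt, erf

# Hypothetical A/B results: conversions out of visitors for the
# original design (A) and the redesigned flow (B).
conv_a, n_a = 240, 1000
conv_b, n_b = 285, 1000

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se                                # two-proportion z statistic
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-tailed p-value

print(f"A: {p_a:.1%}, B: {p_b:.1%}, z = {z:.2f}, p = {p_value:.3f}")
# A p-value below your chosen threshold (commonly 0.05) suggests the
# difference is unlikely to be chance alone. Qualitative research says
# why it happened; a test like this says whether it holds at scale.
```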
What ethical issues should I consider in qualitative research?
Ethics begin with informed consent: participants must know what is studied, how data is used, and their right to withdraw at any time. Respect privacy; store recordings securely, anonymize data, and share only with the permission of the respective user. Avoid harm by ensuring participation does not cause distress or undue pressure.
Compensate participants fairly for their time. Be transparent about your role—never mislead or disguise purposes. When you are observing in context, respect boundaries; if users reveal sensitive information, handle it responsibly and reassure them about how you will use it. Avoid power imbalances; for example, if you are interviewing employees, assure them that findings will not affect their jobs. Always ask: does this process respect autonomy, dignity, and trust? Ethical rigor protects participants, strengthens research credibility, and helps researchers sleep at night, too.
How do qualitative insights support user personas or journey maps?
Personas and journey maps must reflect real user needs, not designer assumptions. Qualitative insights provide the raw stories, behaviors, and motivations that give these artifacts credibility.
Interviews reveal goals, frustrations, and contexts you synthesize into persona traits. Field studies and diary studies uncover touchpoints and pain points that shape journey maps. Quotes and anecdotes enrich these deliverables with authenticity, which makes them relatable to stakeholders.
Without qualitative research, personas risk becoming stereotypes, and journey maps risk reflecting guesswork. With it, they become grounded, evidence-based “tools” that help guide design decisions and remind teams that behind metrics and wireframes and more stand real human perspectives every design should honor.
Learn how important personas are, and how design without them falls short, in this video with William Hudson.
What are some helpful resources about qualitative research for UX designers?
Books
Goodman, E., Kuniavsky, M., & Moed, A. (2012). Observing the User Experience: A Practitioner's Guide to User Research (2nd ed.). Morgan Kaufmann. This book is widely regarded as a foundational UX research text. It provides a thorough walkthrough of qualitative and quantitative user research methods—from planning and recruiting to data analysis and reporting. Aimed at practitioners working in digital product development, it includes case studies, practical guidance, and checklists. Its breadth and clarity have made it a staple in UX design curricula and professional practice, especially for those working in agile or interdisciplinary teams.
Hall, E. (2024). Just Enough Research (2nd ed.). Mule Books. Just Enough Research by Erika Hall remains one of the most widely recommended guides for design and product professionals entering the world of UX research. This newly updated 2024 edition, now published independently by Mule Books, refreshes the original content with a new layout, edits, and additions like a section on Jobs to Be Done. The book is brief, engaging, and accessible, with a reputation for demystifying research without dumbing it down. It is particularly valuable for cross-functional teams, startups, and anyone needing a practical, evidence-focused approach to design decisions.
Buley, L., & Natoli, J. (2024). The User Experience Team of One: A Research and Design Survival Guide (2nd ed.). Rosenfeld Media. Now in its second edition, The User Experience Team of One is a widely respected guide for UX practitioners working solo or within small teams. With contributions from Leah Buley and Joe Natoli, the 2024 update offers new insights, tools, and real-life examples. It balances UX philosophy with 25 step-by-step methods to tackle design challenges pragmatically. This edition broadens its scope, making it valuable not just for lone UX professionals but for any designer seeking lean, scalable techniques that prioritize clarity over complexity. Its enduring popularity comes from its practicality, accessibility, and relevance across experience levels.
Pettit, F. A. (2011). People Aren't Robots: A Practical Guide to the Psychology and Technique of Questionnaire Design. F. Annie Pettit. People Aren't Robots by Dr. F. Annie Pettit offers a refreshingly human-centered approach to questionnaire design. With over fifteen years of industry experience, Pettit critiques the robotic tone of traditional survey design and argues for a style that better resonates with real respondents. The book is concise, practical, and filled with clear examples, making it ideal for marketers, brand managers, and seasoned researchers alike. It addresses the psychology behind how people interpret and respond to survey questions and provides strategies for improving data quality through smarter, friendlier design.