
Qualitative Data Analysis: Step-by-Step Guide (Manual vs. Automatic)

When we conduct qualitative research, need to explain changes in metrics, or want to understand people's opinions, we turn to qualitative data. Qualitative data is typically generated through:

  • Interview transcripts
  • Surveys with open-ended questions
  • Contact center transcripts
  • Texts and documents
  • Audio and video recordings
  • Observational notes

Compared to quantitative data, which captures structured information, qualitative data is unstructured and has more depth. It can answer our questions, help us formulate hypotheses, and build understanding.

It's important to understand the differences between quantitative and qualitative data. Unfortunately, analyzing qualitative data is difficult. While tools like Excel, Tableau and Power BI crunch and visualize quantitative data with ease, there are only a limited number of mainstream tools for analyzing qualitative data. The majority of qualitative data analysis still happens manually.

That said, there are two new trends that are changing this. First, there are advances in natural language processing (NLP) which is focused on understanding human language. Second, there is an explosion of user-friendly software designed for both researchers and businesses. Both help automate the qualitative data analysis process.

In this post we want to teach you how to conduct a successful qualitative data analysis. There are two primary qualitative data analysis methods: manual and automatic. We'll guide you through the steps of a manual analysis, and show the role technology can play in automating the process using software solutions powered by NLP.

More businesses are switching to fully-automated analysis of qualitative customer data because it is cheaper, faster, and just as accurate. Primarily, businesses purchase subscriptions to feedback analytics platforms so that they can understand customer pain points and sentiment.


We'll take you through 5 steps to conduct a successful qualitative data analysis. Within each step we will highlight the key differences between the manual and the automated approach. Here's an overview of the steps:

The 5 steps to doing qualitative data analysis

  • Gathering and collecting your qualitative data
  • Organizing and connecting all your qualitative data
  • Coding your qualitative data
  • Analyzing the qualitative data for insights
  • Reporting on the insights derived from your analysis

What is Qualitative Data Analysis?

Qualitative data analysis is a process of gathering, structuring and interpreting qualitative data to understand what it represents.

Qualitative data is non-numerical and unstructured. Qualitative data generally refers to text, such as open-ended responses to survey questions or user interviews, but also includes audio, photos and video.

Businesses often perform qualitative data analysis on customer feedback. In this context, qualitative data generally refers to verbatim text data collected from sources such as reviews, complaints, chat messages, support center interactions, customer interviews, case notes or social media comments.

How is qualitative data analysis different from quantitative data analysis?

Understanding the differences between quantitative & qualitative data is important. When it comes to analyzing data, Qualitative Data Analysis serves a very different role to Quantitative Data Analysis. But what sets them apart?

Qualitative Data Analysis dives into the stories hidden in non-numerical data such as interviews, open-ended survey answers, or notes from observations. It uncovers the ‘whys’ and ‘hows’ giving a deep understanding of people’s experiences and emotions.

Quantitative Data Analysis on the other hand deals with numerical data, using statistics to measure differences, identify preferred options, and pinpoint root causes of issues.  It steps back to address questions like "how many" or "what percentage" to offer broad insights we can apply to larger groups.

In short, Qualitative Data Analysis is like a microscope, helping us understand specific detail. Quantitative Data Analysis is like a telescope, giving us a broader perspective. Both are important, working together to decode data for different objectives.

Qualitative Data Analysis methods

Once all the data has been captured, there are a variety of analysis techniques available and the choice is determined by your specific research objectives and the kind of data you’ve gathered.  Common qualitative data analysis methods include:

Content Analysis

This is a popular approach to qualitative data analysis; other qualitative analysis techniques, such as thematic analysis, fit within its broad scope. Content analysis is used to identify the patterns that emerge from text by grouping content into words, concepts, and themes. It is also useful for quantifying the relationships between the grouped content. The Columbia School of Public Health has a detailed breakdown of content analysis.

Narrative Analysis

Narrative analysis focuses on the stories people tell and the language they use to make sense of them.  It is particularly useful in qualitative research methods where customer stories are used to get a deep understanding of customers’ perspectives on a specific issue. A narrative analysis might enable us to summarize the outcomes of a focused case study.

Discourse Analysis

Discourse analysis is used to get a thorough understanding of the political, cultural and power dynamics that exist in specific situations. The focus is on the way people express themselves in different social contexts. Discourse analysis is commonly used by brand strategists who want to understand why a group of people feel the way they do about a brand or product.

Thematic Analysis

Thematic analysis is used to deduce the meaning behind the words people use. This is accomplished by discovering repeating themes in text. These meaningful themes reveal key insights into data and can be quantified, particularly when paired with sentiment analysis . Often, the outcome of thematic analysis is a code frame that captures themes in terms of codes, also called categories. So the process of thematic analysis is also referred to as “coding”. A common use-case for thematic analysis in companies is analysis of customer feedback.

Grounded Theory

Grounded theory is a useful approach when little is known about a subject. It starts by formulating a theory around a single data case, which means the theory is "grounded" in actual data rather than speculation. Additional cases can then be examined to see if they are relevant and can add to the original grounded theory.

Approaches and techniques for qualitative data analysis

Challenges of Qualitative Data Analysis

While Qualitative Data Analysis offers rich insights, it comes with its challenges. Each unique QDA method has its unique hurdles. Let’s take a look at the challenges researchers and analysts might face, depending on the chosen method.

  • Time and Effort (Narrative Analysis): Narrative analysis, which focuses on personal stories, demands patience. Sifting through lengthy narratives to find meaningful insights is time-consuming and requires dedicated effort.
  • Being Objective (Grounded Theory): Grounded theory, which builds theories from data, faces the challenge of personal bias. Staying objective while interpreting the data is crucial, ensuring conclusions are rooted in the data itself.
  • Complexity (Thematic Analysis): Thematic analysis involves identifying themes within data, a process that can be intricate. Categorizing and understanding themes can be complex, especially when each piece of data varies in context and structure. Thematic Analysis software can simplify this process.
  • Generalizing Findings (Narrative Analysis): Narrative analysis deals with individual stories, which makes drawing broad conclusions challenging. Extending findings from a single narrative to a broader context requires careful consideration.
  • Managing Data (Thematic Analysis): Thematic analysis involves organizing and managing vast amounts of unstructured data, like interview transcripts. Managing this can be a hefty task, requiring effective data management strategies.
  • Skill Level (Grounded Theory): Grounded theory demands specific skills to build theories from the ground up. Finding or training analysts with these skills poses a challenge, requiring investment in building expertise.

Benefits of qualitative data analysis

Qualitative Data Analysis (QDA) is like a versatile toolkit, offering a tailored approach to understanding your data. The benefits it offers are as diverse as the methods. Let’s explore why choosing the right method matters.

  • Tailored Methods for Specific Needs: QDA isn't one-size-fits-all. Depending on your research objectives and the type of data at hand, different methods offer unique benefits. If you want emotive customer stories, narrative analysis paints a strong picture. When you want to explain a score, thematic analysis reveals insightful patterns.
  • Flexibility with Thematic Analysis: Thematic analysis is like a chameleon in the QDA toolkit. It adapts well to different types of data and research objectives, making it a top choice for many qualitative analyses.
  • Deeper Understanding, Better Products: QDA helps you dive into people's thoughts and feelings. This deep understanding helps you build products and services that truly match what people want, ensuring satisfied customers.
  • Finding the Unexpected: Qualitative data often reveals surprises that we miss in quantitative data. QDA offers us new ideas and perspectives, for insights we might otherwise miss.
  • Building Effective Strategies: Insights from QDA are like strategic guides. They help businesses in crafting plans that match people’s desires.
  • Creating Genuine Connections: Understanding people’s experiences lets businesses connect on a real level. This genuine connection helps build trust and loyalty, priceless for any business.

How to do Qualitative Data Analysis: 5 steps

Now we are going to show how you can do your own qualitative data analysis. We will guide you through this process step by step. As mentioned earlier, you will learn how to do qualitative data analysis manually, and also automatically, using modern qualitative data and thematic analysis software.

To get the best value from the analysis and research process, it's important to be very clear about the nature and scope of the question being researched. This will help you select the data collection channels that are most likely to help you answer your question.

Whether you are a business looking to understand customer sentiment or an academic surveying a school, your approach to qualitative data analysis will be unique.

Once you’re clear, there’s a sequence to follow. And, though there are differences in the manual and automatic approaches, the process steps are mostly the same.

The use case for our step-by-step guide is a company looking to collect customer feedback data and analyze it in order to improve customer experience. By analyzing the feedback, the company derives insights about its business and its customers. You can follow these same steps regardless of the nature of your research. Let's get started.

Step 1: Gather your qualitative data and conduct research

The first step of qualitative research is data collection. Put simply, data collection is gathering all of your data for analysis. A common situation is that qualitative data is spread across various sources.

Classic methods of gathering qualitative data

Most companies use traditional methods for gathering qualitative data: conducting interviews with research participants, running surveys, and running focus groups. This data is typically stored in documents, CRMs, databases and knowledge bases. It’s important to examine which data is available and needs to be included in your research project, based on its scope.

Using your existing qualitative feedback

As it becomes easier for customers to engage across a range of different channels, companies are gathering increasingly large amounts of both solicited and unsolicited qualitative feedback.

Most organizations have now invested in Voice of Customer programs , support ticketing systems, chatbot and support conversations, emails and even customer Slack chats.

These new channels provide companies with new ways of getting feedback, and also allow the collection of unstructured feedback data at scale.

The great thing about this data is that it contains a wealth of valuable insights and that it's already there! When you have a new question about user behavior or your customers, you don't need to set up a new research study or focus group. You can find most answers in the data you already have.

Typically, this data is stored in third-party solutions or a central database, but there are ways to export it or connect to a feedback analysis solution through integrations or an API.

Utilize untapped qualitative data channels

There are many online qualitative data sources you may not have considered. For example, you can find useful qualitative data in social media channels like Twitter or Facebook. Online forums, review sites, and online communities such as Discourse or Reddit also contain valuable data about your customers, or research questions.

If you are considering performing a qualitative benchmark analysis against competitors - the internet is your best friend. Gathering feedback in competitor reviews on sites like Trustpilot, G2, Capterra, Better Business Bureau or on app stores is a great way to perform a competitor benchmark analysis.

Customer feedback analysis software often has integrations into social media and review sites, or you could use a solution like DataMiner to scrape the reviews.

G2.com reviews of the product Airtable. You could pull reviews from G2 for your analysis.

Step 2: Connect & organize all your qualitative data

Now you have all this qualitative data, but there's a problem: the data is unstructured. Before feedback can be analyzed and assigned any value, it needs to be organized in a single place. Why is this important? Consistency!

If all data is easily accessible in one place and analyzed in a consistent manner, you will have an easier time summarizing and making decisions based on this data.

The manual approach to organizing your data

The classic method of structuring qualitative data is to plot all the raw data you’ve gathered into a spreadsheet.

Typically, research and support teams would share large Excel sheets and different business units would make sense of the qualitative feedback data on their own. Each team collects and organizes the data in a way that best suits them, which means the feedback tends to be kept in separate silos.

An alternative and a more robust solution is to store feedback in a central database, like Snowflake or Amazon Redshift .

Keep in mind that when you organize your data in this way, you are often preparing it to be imported into other software. If you go the route of a database, you will need to use an API to push the feedback into the third-party software.
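If you do go the API route, the sketch below shows the general shape of that kind of push, using Python's requests library. The endpoint URL, API key and field names here are hypothetical placeholders rather than any specific vendor's API, so treat it as an illustration and check your platform's documentation for the real schema.

```python
# Minimal sketch: pushing feedback rows from a database export into a
# feedback-analysis tool over HTTP. The endpoint URL, API key and JSON
# field names are hypothetical placeholders.
import requests

API_URL = "https://api.example-feedback-tool.com/v1/feedback"  # hypothetical
API_KEY = "YOUR_API_KEY"

rows = [
    {"id": 1, "text": "The new dashboard is confusing", "source": "survey"},
    {"id": 2, "text": "Support resolved my issue quickly", "source": "chat"},
]

for row in rows:
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json=row,
        timeout=10,
    )
    response.raise_for_status()  # stop early if an upload fails
```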

Computer-assisted qualitative data analysis software (CAQDAS)

Traditionally within the manual analysis approach (but not always), qualitative data is imported into CAQDAS software for coding.

In the early 2000s, CAQDAS software was popularised by developers such as ATLAS.ti, NVivo and MAXQDA and eagerly adopted by researchers to assist with the organizing and coding of data.  

The benefits of using computer-assisted qualitative data analysis software:

  • Assists in the organizing of your data
  • Opens you up to exploring different interpretations of your data analysis
  • Allows you to share your dataset more easily and supports group collaboration (including secondary analysis)

However, you still need to code the data, uncover the themes and do the analysis yourself. It is therefore still a manual approach.

The user interface of CAQDAS software 'NVivo'

Organizing your qualitative data in a feedback repository

Another solution for organizing your qualitative data is to upload it into a feedback repository, where it can be unified with your other data and is easily searchable and taggable. There are a number of software solutions that act as a central repository for your qualitative research data. Here are a couple of solutions that you could investigate:

  • Dovetail: Dovetail is a research repository with a focus on video and audio transcriptions. You can tag your transcriptions within the platform for theme analysis. You can also upload your other qualitative data such as research reports, survey responses, support conversations, and customer interviews. Dovetail acts as a single, searchable repository. And makes it easier to collaborate with other people around your qualitative research.
  • EnjoyHQ: EnjoyHQ is another research repository with similar functionality to Dovetail. It boasts a more sophisticated search engine, but it has a higher starting subscription cost.

Organizing your qualitative data in a feedback analytics platform

If you have a lot of qualitative customer or employee feedback, from the likes of customer surveys or employee surveys, you will benefit from a feedback analytics platform. A feedback analytics platform is software that automates the process of both sentiment analysis and thematic analysis. Companies use the integrations offered by these platforms to tap directly into their qualitative data sources (review sites, social media, survey responses, etc.). The data collected is then organized and analyzed consistently within the platform.

If you have data prepared in a spreadsheet, it can also be imported into feedback analytics platforms.

Once all this rich data has been organized within the feedback analytics platform, it is ready to be coded and themed, within the same platform. Thematic is a feedback analytics platform that offers one of the largest libraries of integrations with qualitative data sources.

Some of the qualitative data integrations offered by Thematic

Step 3: Coding your qualitative data

Your feedback data is now organized in one place, whether that's a spreadsheet, CAQDAS, a feedback repository or a feedback analytics platform. The next step is to code your feedback data so that you can extract meaningful insights in the next step.

Coding is the process of labelling and organizing your data in such a way that you can then identify themes in the data, and the relationships between these themes.

To simplify the coding process, you will take small samples of your customer feedback data, come up with a set of codes, or categories capturing themes, and label each piece of feedback, systematically, for patterns and meaning. Then you will take a larger sample of data, revising and refining the codes for greater accuracy and consistency as you go.

If you choose to use a feedback analytics platform, much of this process will be automated and accomplished for you.

The terms to describe different categories of meaning (‘theme’, ‘code’, ‘tag’, ‘category’ etc) can be confusing as they are often used interchangeably.  For clarity, this article will use the term ‘code’.

To code means to identify key words or phrases and assign them to a category of meaning. For example, "I really hate the customer service of this computer software company" would be coded as "poor customer service".
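To make the idea concrete, here is a minimal Python sketch of keyword-based deductive coding (the approach described in the steps below). The code frame and keywords are invented for the example; real coding relies on far more judgement than simple keyword matching.

```python
# Minimal sketch of deductive, keyword-based coding.
# The code frame and keywords below are illustrative only.
CODE_FRAME = {
    "poor customer service": ["customer service", "support", "agent"],
    "pricing": ["price", "expensive", "cost"],
    "ease of use": ["easy to use", "intuitive", "confusing"],
}

def assign_codes(feedback: str) -> list[str]:
    """Return every code whose keywords appear in the feedback text."""
    text = feedback.lower()
    return [
        code
        for code, keywords in CODE_FRAME.items()
        if any(keyword in text for keyword in keywords)
    ]

print(assign_codes("I really hate the customer service of this computer software company"))
# -> ['poor customer service']
```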

How to manually code your qualitative data

  1. Decide whether you will use deductive or inductive coding. Deductive coding is when you create a list of predefined codes and then assign them to the qualitative data. Inductive coding is the opposite: codes arise directly from the data, and you label them as you go. Weigh up the pros and cons of each coding method and select the most appropriate.
  2. Read through the feedback data to get a broad sense of what it reveals. Now it's time to start assigning your first set of codes to statements and sections of text.
  3. Keep repeating step 2, adding new codes and revising the code descriptions as often as necessary. Once everything has been coded, go through it all again to be sure there are no inconsistencies and that nothing has been overlooked.
  4. Create a code frame to group your codes. The code frame is the organizational structure of all your codes. There are two commonly used types of coding frames: flat and hierarchical. A hierarchical code frame will make it easier for you to derive insights from your analysis.
  5. Based on the number of times a particular code occurs, you can now see the common themes in your feedback data. This is insightful! If 'bad customer service' is a common code, it's time to take action.

We have a detailed guide dedicated to manually coding your qualitative data .

Example of a hierarchical coding frame in qualitative data analysis

Using software to speed up manual coding of qualitative data

An Excel spreadsheet is still a popular method for coding. But various software solutions can help speed up this process. Here are some examples.

  • CAQDAS / NVivo - CAQDAS software has built-in functionality that allows you to code text within their software. You may find the interface the software offers easier for managing codes than a spreadsheet.
  • Dovetail/EnjoyHQ - You can tag transcripts and other textual data within these solutions. As they are also repositories you may find it simpler to keep the coding in one platform.
  • IBM SPSS - SPSS is a statistical analysis software that may make coding easier than in a spreadsheet.
  • Ascribe - Ascribe’s ‘Coder’ is a coding management system. Its user interface will make it easier for you to manage your codes.

Automating the qualitative coding process using thematic analysis software

In solutions which speed up the manual coding process, you still have to come up with valid codes and often apply codes manually to pieces of feedback. But there are also solutions that automate both the discovery and the application of codes.

Advances in machine learning have now made it possible to read, code and structure qualitative data automatically. This type of automated coding is offered by thematic analysis software .

Automation makes it far simpler and faster to code the feedback and group it into themes. By incorporating natural language processing (NLP) into the software, the AI looks across sentences and phrases to identify common themes and meaningful statements. Some automated solutions detect repeating patterns and assign codes to them; others require you to train the AI by providing examples. You could say that the AI learns the meaning of the feedback on its own.
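To give a rough sense of what happens under the hood, here is a simplified sketch of theme discovery using sentence embeddings and clustering. It assumes the open-source sentence-transformers and scikit-learn packages; commercial platforms use far more sophisticated pipelines, so treat this purely as an illustration.

```python
# Simplified illustration of automated theme discovery:
# embed each piece of feedback, then cluster similar pieces together.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

feedback = [
    "The app keeps crashing on startup",
    "Crashes every time I open it",
    "Support took days to reply",
    "Customer service never answered my email",
]

# Encode feedback into dense vectors that capture meaning, not just keywords
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(feedback)

# Group semantically similar feedback; each cluster is a candidate theme
labels = KMeans(n_clusters=2, random_state=0).fit_predict(embeddings)

for cluster_id, text in sorted(zip(labels, feedback)):
    print(cluster_id, text)
```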

Thematic automates the coding of qualitative feedback regardless of source. There’s no need to set up themes or categories in advance. Simply upload your data and wait a few minutes. You can also manually edit the codes to further refine their accuracy.  Experiments conducted indicate that Thematic’s automated coding is just as accurate as manual coding .

Paired with sentiment analysis and advanced text analytics - these automated solutions become powerful for deriving quality business or research insights.

You could also build your own , if you have the resources!

The key benefits of using an automated coding solution

Automated analysis can often be set up fast and there’s the potential to uncover things that would never have been revealed if you had given the software a prescribed list of themes to look for.

Because the model applies a consistent rule to the data, it captures phrases or statements that a human eye might have missed.

Complete and consistent analysis of customer feedback enables more meaningful findings. Leading us into step 4.

Step 4: Analyze your data: Find meaningful insights

Now we are going to analyze our data to find insights. This is where we start to answer our research questions. Keep in mind that step 4 and step 5 (tell the story) overlap somewhat, because creating visualizations is part of both the analysis process and the reporting process.

The task of uncovering insights is to scour through the codes that emerge from the data and draw meaningful correlations from them. It is also about making sure each insight is distinct and has enough data to support it.

Part of the analysis is to establish how much each code relates to different demographics and customer profiles, and identify whether there’s any relationship between these data points.

Manually create sub-codes to improve the quality of insights

If your code frame only has one level, you may find that your codes are too broad to extract meaningful insights from. This is where it is valuable to create sub-codes for your primary codes. This process is sometimes referred to as meta coding.

Note: If you take an inductive coding approach, you can create sub-codes as you are reading through your feedback data and coding it.

While time-consuming, this exercise will improve the quality of your analysis. Here is an example of what sub-codes could look like.

Example of sub-codes

You need to carefully read your qualitative data to create quality sub-codes. But as you can see, the depth of analysis is greatly improved. By calculating the frequency of these sub-codes you can see which customer service problems you can immediately address.
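If your coded feedback already sits in a spreadsheet-style table, the roll-up of sub-codes into parent codes can be scripted. Here is a small sketch using pandas; the column names are invented for the example rather than a required format.

```python
# Sketch: count sub-codes and roll them up to parent codes with pandas.
# Column names ("code", "sub_code") are illustrative, not a required schema.
import pandas as pd

coded_feedback = pd.DataFrame(
    {
        "code": ["customer service", "customer service", "customer service", "pricing"],
        "sub_code": ["slow response", "slow response", "rude agent", "too expensive"],
    }
)

# Frequency of each sub-code within its parent code
print(coded_feedback.groupby(["code", "sub_code"]).size())

# Roll up to parent-code totals
print(coded_feedback["code"].value_counts())
```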

Correlate the frequency of codes to customer segments

Many businesses use customer segmentation, and you may have your own respondent segments that you can apply to your qualitative analysis. Segmentation is the practice of dividing customers or research respondents into subgroups.

Segments can be based on:

  • Demographics
  • Any other data type that you care to segment by

It is particularly useful to see the occurrence of codes within your segments. If one of your customer segments is considered unimportant to your business, but they are the cause of nearly all customer service complaints, it may be in your best interest to focus attention elsewhere. This is a useful insight!
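One quick way to produce this view, assuming each row of a table holds one coded piece of feedback with a segment label (again, the column names are just an example), is a cross-tabulation:

```python
# Sketch: cross-tabulate code frequency by customer segment.
import pandas as pd

df = pd.DataFrame(
    {
        "segment": ["Enterprise", "SMB", "SMB", "Enterprise", "SMB"],
        "code": ["pricing", "customer service", "customer service", "pricing", "pricing"],
    }
)

# Rows = codes, columns = segments, cells = how often the code occurs
print(pd.crosstab(df["code"], df["segment"]))
```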

Manually visualizing coded qualitative data

There are formulas you can use to visualize key insights in your data. These formulas are especially useful if you are measuring a score alongside your feedback.

Impact of codes on your score

If you are collecting a metric alongside your qualitative data, this is a key visualization. Impact answers the question: "What's the impact of a code on my overall score?". Using Net Promoter Score (NPS) as an example, first you need to:

  • Calculate overall NPS (A)
  • Calculate NPS in the subset of responses that do not contain that theme (B)
  • Subtract B from A

Then you can use this simple formula to calculate code impact on NPS.

Visualizing qualitative data: Calculating the impact of a code on your score
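As a worked sketch of that calculation: NPS is the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6), and the impact of a theme is the overall NPS (A) minus the NPS of responses that do not mention the theme (B). The toy responses below are invented for the example.

```python
# Sketch: impact of a theme on NPS, following the A - B calculation above.
def nps(scores):
    """NPS = % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

def theme_impact(responses, theme):
    """responses: list of (score, set_of_themes) tuples."""
    all_scores = [score for score, _ in responses]
    without_theme = [score for score, themes in responses if theme not in themes]
    overall = nps(all_scores)      # A: overall NPS
    baseline = nps(without_theme)  # B: NPS excluding the theme
    return overall - baseline      # impact = A - B

responses = [
    (10, {"ease of use"}),
    (9, {"ease of use"}),
    (3, {"poor customer service"}),
    (6, {"poor customer service", "pricing"}),
    (8, set()),
]
print(round(theme_impact(responses, "poor customer service"), 1))  # -> -66.7
```

A negative impact means the theme drags your score down; a positive impact means it lifts it.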

You can then visualize this data using a bar chart.

You can download our CX toolkit - it includes a template to recreate this.

Trends over time

This analysis can help you answer questions like: “Which codes are linked to decreases or increases in my score over time?”

We need to compare two sequences of numbers: NPS over time and code frequency over time. Using Excel, calculate the correlation between the two sequences; it can be either positive (the more often the code appears, the higher the NPS, see picture below) or negative (the more often the code appears, the lower the NPS).

Now you need to plot code frequency against the absolute value of code correlation with NPS. Here is the formula:

Analyzing qualitative data: Calculate which codes are linked to increases or decreases in my score

The visualization could look like this:

Visualizing qualitative data trends over time
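If you prefer to script the correlation step instead of using Excel, here is a minimal sketch with NumPy. The monthly numbers are invented for illustration.

```python
# Sketch: correlate a code's frequency over time with NPS over time.
import numpy as np

# Monthly values (illustrative numbers only)
nps_by_month = [32, 30, 28, 25, 27, 24]
code_frequency_by_month = [12, 15, 18, 24, 20, 26]  # e.g. "slow support" mentions

correlation = np.corrcoef(nps_by_month, code_frequency_by_month)[0, 1]
print(round(correlation, 2))  # negative here: the more the code appears, the lower the NPS

# For the chart described above, plot each code's average frequency against
# abs(correlation), e.g. with matplotlib's scatter().
```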

These are two examples, but there are more. For a third manual formula, and to learn why word clouds are not an insightful form of analysis, read our visualizations article .

Using a text analytics solution to automate analysis

Automated text analytics solutions enable codes and sub-codes to be pulled out of the data automatically. This makes it far faster and easier to identify what’s driving negative or positive results. And to pick up emerging trends and find all manner of rich insights in the data.

Another benefit of AI-driven text analytics software is its built-in capability for sentiment analysis, which provides the emotive context behind your feedback and other qualitative text data.
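For a feel of what sentiment analysis produces, here is a tiny sketch using the open-source Hugging Face transformers pipeline. This is just one of many possible libraries; feedback analytics platforms use their own models tuned for customer feedback.

```python
# Sketch: off-the-shelf sentiment analysis on feedback snippets.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")  # downloads a default English model

feedback = [
    "The onboarding was smooth and the team was helpful.",
    "I waited two weeks for a reply and the issue is still not fixed.",
]

for text, result in zip(feedback, sentiment(feedback)):
    print(result["label"], round(result["score"], 2), "-", text)
```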

Thematic provides text analytics that goes further by allowing users to apply their expertise on business context to edit or augment the AI-generated outputs.

Since the move away from manual research is generally about reducing the human element, adding human input to the technology might sound counter-intuitive. However, this is mostly to make sure important business nuances in the feedback aren’t missed during coding. The result is a higher accuracy of analysis. This is sometimes referred to as augmented intelligence .

Codes displayed by volume within Thematic. You can 'manage themes' to introduce human input.

Step 5: Report on your data: Tell the story

The last step of analyzing your qualitative data is to report on it, to tell the story. At this point, the codes are fully developed and the focus is on communicating the narrative to the audience.

A coherent outline of the qualitative research, the findings and the insights is vital for stakeholders to discuss and debate before they can devise a meaningful course of action.

Creating graphs and reporting in PowerPoint

Typically, qualitative researchers take the tried and tested approach of distilling their report into a series of charts, tables and other visuals, which are woven into a narrative for presentation in PowerPoint.

Using visualization software for reporting

With data transformation and APIs, the analyzed data can be shared with data visualization software such as Power BI, Tableau, Google Data Studio or Looker. Power BI and Tableau are among the most popular options.

Visualizing your insights inside a feedback analytics platform

Feedback analytics platforms, like Thematic, incorporate visualization tools that intuitively turn key data and insights into graphs. This removes the time-consuming work of constructing charts to visually identify patterns, and creates more time to focus on building a compelling narrative that highlights the insights, in bite-size chunks, for executive teams to review.

Using a feedback analytics platform with visualization tools means you don't have to use a separate product for visualizations. You can export graphs into PowerPoint straight from the platform.

Two examples of qualitative data visualizations within Thematic

Conclusion - Manual or Automated?

There are those who remain deeply invested in the manual approach - because it’s familiar, because they’re reluctant to spend money and time learning new software, or because they’ve been burned by the overpromises of AI.  

For projects that involve small datasets, manual analysis makes sense. For example, if the objective is simply to answer a straightforward question like "Do customers prefer X to Y?", or if the findings are being extracted from a small set of focus groups and interviews, sometimes it's easier to just read them.

However, as new generations come into the workplace, it’s technology-driven solutions that feel more comfortable and practical. And the merits are undeniable.  Especially if the objective is to go deeper and understand the ‘why’ behind customers’ preference for X or Y. And even more especially if time and money are considerations.

The ability to collect a free flow of qualitative feedback data at the same time as the metric means AI can cost-effectively scan, crunch, score and analyze a ton of feedback from one system in one go. And time-intensive processes like focus groups, or coding, that used to take weeks, can now be completed in a matter of hours or days.

But aside from the ever-present business case to speed things up and keep costs down, there are also powerful research imperatives for automated analysis of qualitative data: namely, accuracy and consistency.

Finding insights hidden in feedback requires consistency, especially in coding.  Not to mention catching all the ‘unknown unknowns’ that can skew research findings and steering clear of cognitive bias.

Some say that without manual data analysis researchers won't get an accurate "feel" for the insights. However, the larger the data sets are, the harder it is to sort through and organize feedback that has been pulled from different places. And the more difficult it is to stay on course, the greater the risk of drawing incorrect or incomplete conclusions.

Though the process steps for qualitative data analysis have remained largely unchanged since psychologist Paul Felix Lazarsfeld paved the path a hundred years ago, the impact digital technology has had on the types of qualitative feedback data and the approach to analysis is profound.

If you want to try an automated feedback analysis solution on your own qualitative data, you can get started with Thematic .


Analyst Answers


Data Analysis for Qualitative Research: 6 Step Guide

Data analysis for qualitative research is not intuitive. This is because qualitative data stands in opposition to traditional data analysis methodologies: while data analysis is concerned with quantities, qualitative data is by definition unquantified. But there is an easy, methodical approach that anyone can use to get reliable results when performing data analysis for qualitative research. The process consists of 6 steps that I'll break down in this article:

  • Perform interviews (if necessary)
  • Gather all documents and transcribe any non-paper records
  • Decide whether to code the data, analyze word frequencies, or both
  • Decide what interpretive angle you want to take: content analysis, narrative analysis, discourse analysis, framework analysis, and/or grounded theory
  • Compile your data in a spreadsheet using document saving techniques (Windows and Mac)
  • Identify trends in words, themes, metaphors, natural patterns, and more

To complete these steps, you will need:

  • Microsoft Word
  • Microsoft Excel
  • Internet access

You can get the free Intro to Data Analysis eBook to cover the fundamentals and ensure strong progression in all your data endeavors.

What is qualitative research?

Qualitative research is not the same as quantitative research. In short, qualitative research is the interpretation of non-numeric data. It usually aims at drawing conclusions that explain why a phenomenon occurs, rather than simply showing that it occurs. Here's a great quote from a nursing journal about quantitative vs qualitative research:

"A traditional quantitative study… uses a predetermined (and auditable) set of steps to confirm or refute [a] hypothesis. In contrast, qualitative research often takes the position that an interpretive understanding is only possible by way of uncovering or deconstructing the meanings of a phenomenon. Thus, a distinction between explaining how something operates (explanation) and why it operates in the manner that it does (interpretation) may be [an] effective way to distinguish quantitative from qualitative analytic processes involved in any particular study." (bold added) (EBN)


Step 1a: Data collection methods and techniques in qualitative research: interviews and focus groups

Step 1 is collecting the data that you will need for the analysis. If you are not performing any interviews or focus groups to gather data, then you can skip this step. It’s for people who need to go into the field and collect raw information as part of their qualitative analysis.

Since the whole point of an interview, and of qualitative analysis in general, is to understand a research question better, you should start by making sure you have a specific, refined research question. Whether you're a researcher by trade or a data analyst working on a one-time project, you must know specifically what you want to understand in order to get results.

Good research questions are specific enough to guide action but open enough to leave room for insight and growth. Examples of good research questions include:

  • Good : To what degree does living in a city impact the quality of a person’s life? (open-ended, complex)
  • Bad : Does living in a city impact the quality of a person’s life? (closed, simple)

Once you understand the research question, you need to develop a list of interview questions. These questions should likewise be open-ended and provide liberty of expression to the responder. They should support the research question in an active way without prejudicing the response. Examples of good interview questions include:

  • Good : Tell me what it’s like to live in a city versus in the country. (open, not leading)
  • Bad : Don’t you prefer the city to the country because there are more people? (closed, leading)

Some additional helpful tips include:

  • Begin each interview with a neutral question to get the person relaxed
  • Limit each question to a single idea
  • If you don’t understand, ask for clarity
  • Do not pass any judgements
  • Do not spend more than 15 minutes on an interview, lest the quality of responses drop

Focus groups

The alternative to interviews is focus groups. Focus groups are a great way to get an idea of how people communicate their opinions in a group setting, rather than the one-on-one setting of an interview.

In short, focus groups are gatherings of small groups of people from representative backgrounds who receive instruction, or “facilitation,” from a focus group leader. Typically, the leader will ask questions to stimulate conversation, reformulate questions to bring the discussion back to focus, and prevent the discussion from turning sour or giving way to bad faith.

Focus group questions should be open-ended like their interview neighbors, and they should stimulate some degree of disagreement. Disagreement often leads to valuable information about differing opinions, as people tend to say what they mean if contradicted.

However, focus group leaders must be careful not to let disagreements escalate, as anger can make people lie to be hurtful or simply to win an argument. And lies are not helpful in data analysis for qualitative research.

Step 1b: Tools for qualitative data collection

When it comes to data analysis for qualitative research, the tools you use to collect data should align to some degree with the tools you will use to analyze the data.

As mentioned in the intro, you will be focusing on analysis techniques that only require the traditional Microsoft suite programs: Microsoft Excel and Microsoft Word . At the same time, you can source supplementary tools from various websites, like Text Analyzer and WordCounter.

In short, the tools for qualitative data collection that you need are Excel and Word , as well as web-based free tools like Text Analyzer and WordCounter . These online tools are helpful in the quantitative part of your qualitative research.

Step 2: Gather all documents & transcribe non-written docs

Once you have your interviews and/or focus group transcripts, it’s time to decide if you need other documentation. If you do, you’ll need to gather it all into one place first, then develop a strategy for how to transcribe any non-written documents.

When do you need documentation other than interviews and focus groups? Two situations usually call for documentation. First, if you have little funding, you may not be able to afford to run expensive interviews and focus groups.

Second, social science researchers typically focus on documents, since their research questions are less concerned with subject-oriented data, while hard science and business researchers typically focus on interviews and focus groups because they want to know what people think, and they want to know today.

Non-written records

Other factors at play include the type of research, the field, and the specific research goal. For those who need documentation and have non-written records, there are some steps to follow:

  • Put all hard copy source documents into a sealed binder (I use plastic paper holders with elastic seals ).
  • If you are sourcing directly from printed books or journals, you will need to digitize them by scanning them and making them readable by the computer. To do so, turn all PDFs into Word documents using online tools such as PDF to Word Converter. This process is never foolproof, and it may be a source of error in the data collection, but it's part of the process.
  • If you are sourcing online documents, try as often as possible to get computer-readable PDF documents that you can easily copy/paste or convert. Locked PDFs are essentially a lost cause .
  • Transcribe any audio files into written documents. There are free online tools available to help with this, such as 360converter. If you run a test through the system, you'll see that the output is not 100% accurate. The best way to use this tool is as a first-draft generator; you can then correct and complete it with old-fashioned, direct transcription.

Step 3: Decide on the type of qualitative research

Before step 3 you should have collected your data, transcribed it all into written-word documents, and compiled it in one place. Now comes the interesting part. You need to decide what you want to get out of your research by choosing an analytic angle, or type of qualitative research.

The available types of qualitative research are listed below. Each of them takes a unique angle that you must choose based on what information you want from the analysis. In addition, each of them has a different impact on the data analysis technique (coding vs word frequency) that we use.

  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Framework analysis, and/or
  • Grounded theory

From a high level, content, narrative, and discourse analysis are actionable independent tactics, whereas framework analysis and grounded theory are ways of honing and applying the first three.

Content analysis

  • Definition: Content analysis is identifying and labelling themes of any kind within a text.
  • Focus: Identifying any kind of pattern in written text, transcribed audio, or transcribed video. This could be thematic, word repetition, or idea repetition. Most often, the patterns we find are ideas that make up an argument.
  • Goal: To simplify, standardize, and quickly reference ideas from any given text. Content analysis is a way to pull the main ideas from huge documents for comparison. In this way, it's more a means to an end.
  • Pros: The huge advantage of doing content analysis is that you can quickly process huge amounts of text using the simple coding and word frequency techniques we will look at below. To use a metaphor, it is to qualitative documents what SparkNotes are to books.
  • Cons: The downside to content analysis is that it's quite general. If you have a very specific, narrow research question, then tracing "any and all ideas" will not be very helpful to you.
Narrative analysis

  • Definition: Narrative analysis is the reformulation and simplification of interview answers or documentation into small narrative components to identify story-like patterns.
  • Focus: Understanding the text based on its narrative components as opposed to themes or other qualities.
  • Goal: To reference the text from an angle closer to the nature of texts in order to obtain further insights.
  • Pros: Narrative analysis is very useful for getting perspective on a topic in which you're extremely limited. It can be easy to get tunnel vision when you're digging for themes and ideas from a reason-centric perspective. Turning to a narrative approach will help you stay grounded. More importantly, it helps reveal different kinds of trends.
  • Cons: Narrative analysis adds another layer of subjectivity to the instinctive nature of qualitative research. Many see it as too dependent on the researcher to hold any critical value.
Discourse analysis

  • Definition: Discourse analysis is the textual analysis of naturally occurring speech. Any oral expression must be transcribed before undergoing legitimate discourse analysis.
  • Focus: Understanding ideas and themes through language communicated orally rather than pre-processed on paper.
  • Goal: To obtain insights from an angle outside the traditional content analysis of text.
  • Pros: Provides a considerable advantage in some areas of study by helping us understand how people communicate an idea, versus the idea itself. For example, discourse analysis is important in political campaigning. People rarely vote for the candidate who most closely corresponds to their beliefs, but rather for the person they like the most.
  • Cons: As with narrative analysis, discourse analysis is more subjective in nature than content analysis, which focuses on ideas and patterns. Some do not consider it rigorous enough to be a legitimate subset of qualitative analysis, but these people are few.

Framework analysis

  • Definition: Framework analysis is a kind of qualitative analysis that includes 5 ordered steps: coding, indexing, charting, mapping, and interpreting. In most ways, framework analysis is a synonym for qualitative analysis; the significant difference is the importance it places on the perspective used in the analysis.
  • Focus: Understanding patterns in themes and ideas.
  • Goal: Creating one specific framework for looking at a text.
  • Pros: Framework analysis is helpful when the researcher clearly understands what he/she wants from the project, as it's a limiting approach. Since each of its steps has defined parameters, framework analysis is very useful for teamwork.
  • Cons: It can lead to tunnel vision.
Grounded theory

  • Definition: The use of content, narrative, and discourse analysis to examine a single case, in the hope that discoveries from that case will lead to a foundational theory used to examine other, similar cases.
  • Focus: A broad approach using multiple techniques in order to establish patterns.
  • Goal: To develop a foundational theory.
  • Pros: When successful, grounded theories can revolutionize entire fields of study.
  • Cons: It's very difficult to establish grounded theories, and there's an enormous amount of risk involved.

Step 4: Coding, word frequency, or both

Coding in data analysis for qualitative research is the process of writing 2-5 word codes that summarize at least one paragraph of text (not writing computer code). This allows researchers to keep track of and analyze those codes. Word frequency, on the other hand, is the process of counting the presence and orientation of words within a text, which makes it the quantitative element in qualitative data analysis.

Video example of coding for data analysis in qualitative research

In short, coding in the context of data analysis for qualitative research follows 2 steps (video below):

  • Reading through the text one time
  • Adding 2-5 word summaries each time a significant theme or idea appears

Let’s look at a brief example of how to code for qualitative research in this video:

Click here for a link to the source text. 1

Example of word frequency processing

And word frequency is the process of finding a specific word or identifying the most common words through 3 steps:

  • Decide if you want to find 1 word or identify the most common ones
  • Use Word's "Replace" function to find a word or phrase
  • Use Text Analyzer to find the most common terms

Here's another look at word frequency processing and how you do it. Let's look at the same example as above, but from a quantitative perspective.

Imagine we are already familiar with melanoma and KITs, and we want to analyze the text based on these keywords. One thing we can do is look for these words using the Replace function in Word:

  • Locate the search bar
  • Click replace
  • Type in the word
  • See the total results

Here’s a brief video example:

Another option is to use an online Text Analyzer. This methodology won't help us find a specific word, but it will help us discover the top-performing phrases and words. All you need to do is put in a link to a target page or paste in a text. I pasted the abstract from our source text, and what turns up is as expected. Here's a picture:

text analyzer example
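If you are comfortable with a little scripting, a few lines of Python can reproduce both word-frequency approaches (finding one keyword and listing the most common terms). This is an optional aside; the steps above need nothing beyond Word and a browser. The file name below is a placeholder for your own transcribed text.

```python
# Sketch: count a specific word and list the most common words in a text.
import re
from collections import Counter

text = open("abstract.txt", encoding="utf-8").read()  # placeholder: your transcribed text

words = re.findall(r"[a-z']+", text.lower())  # crude tokenization
counts = Counter(words)

print(counts["melanoma"])      # occurrences of one keyword
print(counts.most_common(10))  # the top terms, like an online text analyzer
```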

Step 5: Compile your data in a spreadsheet

After you have some coded data in the Word document, you need to get it into Excel for analysis. This process requires saving the Word doc with an .htm extension, which turns it into a web page. Once you have the web page, it's as simple as opening it, scrolling to the bottom, and copying/pasting the comments, or codes, into an Excel document.

You will need to wrangle the data slightly in order to make it readable in Excel. I've made a video to explain this process and placed it below.

Step 6: Identify trends & analyze!

There are literally thousands of different ways to analyze qualitative data, and in most situations, the best technique depends on the information you want to get out of the research.

Nevertheless, there are a few go-to techniques. The most important of these is occurrences. In this short video, we finish the example from above by counting the number of times our codes appear. In this way, it's very similar to word frequency (discussed above).
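Counting occurrences can also be scripted once your codes are in the spreadsheet from step 5. Here is a short sketch with pandas; the file and column names are placeholders for whatever you produced in that step.

```python
# Sketch: count how often each code appears in the compiled spreadsheet.
import pandas as pd

codes = pd.read_excel("coded_data.xlsx")  # placeholder name for the step 5 file
print(codes["code"].value_counts())       # occurrences per code, most frequent first
```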

A few other options include:

  • Ranking each code on a set of relevant criteria and clustering
  • Pure cluster analysis
  • Causal analysis

We cover different types of analysis like this on the website, so be sure to check out other articles on the home page .

How to analyze qualitative data from an interview

To analyze qualitative data from an interview, follow the same 6 steps outlined above:

  • Perform the interviews
  • Transcribe the interviews onto paper
  • Decide whether to code the data (open, axial, selective), analyze word frequencies, or both
  • Decide what interpretive angle you want to take
  • Compile your data in a spreadsheet using document saving techniques (for Windows and Mac)
  • Identify trends in words, themes, metaphors, and natural patterns

Source text [ ↩ ]

About the Author

Noah is the founder & Editor-in-Chief at AnalystAnswers. He is a transatlantic professional and entrepreneur with 5+ years of corporate finance and data analytics experience, as well as 3+ years in consumer financial products and business software. He started AnalystAnswers to provide aspiring professionals with accessible explanations of otherwise dense finance and data concepts. Noah believes everyone can benefit from an analytical mindset in a growing digital world. When he's not busy at work, Noah likes to explore new European cities, exercise, and spend time with friends and family.


Grad Coach

Qualitative Data Analysis Methods 101:

The “big 6” methods + examples.

By: Kerryn Warren (PhD) | Reviewed By: Eunice Rautenbach (D.Tech) | May 2020 (Updated April 2023)

Qualitative data analysis methods. Wow, that’s a mouthful. 

If you’re new to the world of research, qualitative data analysis can look rather intimidating. So much bulky terminology and so many abstract, fluffy concepts. It certainly can be a minefield!

Don’t worry – in this post, we’ll unpack the most popular analysis methods , one at a time, so that you can approach your analysis with confidence and competence – whether that’s for a dissertation, thesis or really any kind of research project.


What (exactly) is qualitative data analysis?

To understand qualitative data analysis, we need to first understand qualitative data – so let’s step back and ask the question, “what exactly is qualitative data?”.

Qualitative data refers to pretty much any data that’s “not numbers” . In other words, it’s not the stuff you measure using a fixed scale or complex equipment, nor do you analyse it using complex statistics or mathematics.

So, if it’s not numbers, what is it?

Words, you guessed? Well… sometimes , yes. Qualitative data can, and often does, take the form of interview transcripts, documents and open-ended survey responses – but it can also involve the interpretation of images and videos. In other words, qualitative isn’t just limited to text-based data.

So, how’s that different from quantitative data, you ask?

Simply put, qualitative research focuses on words, descriptions, concepts or ideas – while quantitative research focuses on numbers and statistics . Qualitative research investigates the “softer side” of things to explore and describe , while quantitative research focuses on the “hard numbers”, to measure differences between variables and the relationships between them. If you’re keen to learn more about the differences between qual and quant, we’ve got a detailed post over here .

qualitative data analysis vs quantitative data analysis

So, qualitative analysis is easier than quantitative, right?

Not quite. In many ways, qualitative data can be challenging and time-consuming to analyse and interpret. At the end of your data collection phase (which itself takes a lot of time), you’ll likely have many pages of text-based data or hours upon hours of audio to work through. You might also have subtle nuances of interactions or discussions that have danced around in your mind, or that you scribbled down in messy field notes. All of this needs to work its way into your analysis.

Making sense of all of this is no small task and you shouldn’t underestimate it. Long story short – qualitative analysis can be a lot of work! Of course, quantitative analysis is no piece of cake either, but it’s important to recognise that qualitative analysis still requires a significant investment in terms of time and effort.


In this post, we’ll explore qualitative data analysis by looking at some of the most common analysis methods we encounter. We’re not going to cover every possible qualitative method and we’re not going to go into heavy detail – we’re just going to give you the big picture. That said, we will of course include links to loads of extra resources so that you can learn more about whichever analysis method interests you.

Without further delay, let’s get into it.

The “Big 6” Qualitative Analysis Methods 

There are many different types of qualitative data analysis, all of which serve different purposes and have unique strengths and weaknesses . We’ll start by outlining the analysis methods and then we’ll dive into the details for each.

The 6 most popular methods (or at least the ones we see at Grad Coach) are:

  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Thematic analysis
  • Grounded theory (GT)
  • Interpretive phenomenological analysis (IPA)

Let’s take a look at each of them…

QDA Method #1: Qualitative Content Analysis

Content analysis is possibly the most common and straightforward QDA method. At the simplest level, content analysis is used to evaluate patterns within a piece of content (for example, words, phrases or images) or across multiple pieces of content or sources of communication. For example, a collection of newspaper articles or political speeches.

With content analysis, you could, for instance, identify the frequency with which an idea is shared or spoken about – like the number of times a Kardashian is mentioned on Twitter. Or you could identify patterns of deeper underlying interpretations – for instance, by identifying phrases or words in tourist pamphlets that highlight India as an ancient country.

Because content analysis can be used in such a wide variety of ways, it’s important to go into your analysis with a very specific question and goal, or you’ll get lost in the fog. With content analysis, you’ll group large amounts of text into codes, summarise these into categories, and possibly even tabulate the data to calculate the frequency of certain concepts or variables. Because of this, content analysis provides a small splash of quantitative thinking within a qualitative method.
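
To make that “splash of quantitative thinking” concrete, here is a minimal, hypothetical sketch in Python of how concept frequencies might be tabulated once a coding frame exists. The documents, concept names and keyword lists below are invented for illustration – in practice the coding frame would come from careful reading and re-reading of the texts, not a simple keyword lookup.

```python
# Hypothetical sketch: tabulating concept frequency across a tiny corpus.
from collections import Counter

# Invented example documents (e.g., excerpts from tourist pamphlets).
documents = [
    "The ancient temples draw visitors from around the world.",
    "An ancient land of timeless heritage, welcoming travellers year round.",
    "Modern cities sit alongside ancient forts and palaces.",
]

# Invented coding frame: concept -> indicative keywords.
concepts = {
    "ancient_heritage": ["ancient", "heritage", "timeless"],
    "tourism": ["visitors", "travellers", "welcoming"],
}

frequency = Counter()
for doc in documents:
    text = doc.lower()
    for concept, keywords in concepts.items():
        # Attribute each keyword occurrence to its concept.
        frequency[concept] += sum(text.count(word) for word in keywords)

for concept, count in frequency.most_common():
    print(f"{concept}: {count}")
```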

Naturally, while content analysis is widely useful, it’s not without its drawbacks. One of the main issues with content analysis is that it can be very time-consuming, as it requires lots of reading and re-reading of the texts. Also, because of its multidimensional focus on both qualitative and quantitative aspects, it is sometimes accused of losing important nuances in communication.

Content analysis also tends to concentrate on a very specific timeline and doesn’t take into account what happened before or after that timeline. This isn’t necessarily a bad thing though – just something to be aware of. So, keep these factors in mind if you’re considering content analysis. Every analysis method has its limitations, so don’t be put off by these – just be aware of them!

QDA Method #2: Narrative Analysis 

As the name suggests, narrative analysis is all about listening to people telling stories and analysing what that means . Since stories serve a functional purpose of helping us make sense of the world, we can gain insights into the ways that people deal with and make sense of reality by analysing their stories and the ways they’re told.

You could, for example, use narrative analysis to explore whether how something is being said is important. For instance, the narrative of a prisoner trying to justify their crime could provide insight into their view of the world and the justice system. Similarly, analysing the ways entrepreneurs talk about the struggles in their careers or cancer patients telling stories of hope could provide powerful insights into their mindsets and perspectives . Simply put, narrative analysis is about paying attention to the stories that people tell – and more importantly, the way they tell them.

Of course, the narrative approach has its weaknesses , too. Sample sizes are generally quite small due to the time-consuming process of capturing narratives. Because of this, along with the multitude of social and lifestyle factors which can influence a subject, narrative analysis can be quite difficult to reproduce in subsequent research. This means that it’s difficult to test the findings of some of this research.

Similarly, researcher bias can have a strong influence on the results here, so you need to be particularly careful about the potential biases you can bring into your analysis when using this method. Nevertheless, narrative analysis is still a very useful qualitative analysis method – just keep these limitations in mind and be careful not to draw broad conclusions.

QDA Method #3: Discourse Analysis 

Discourse is simply a fancy word for written or spoken language or debate . So, discourse analysis is all about analysing language within its social context. In other words, analysing language – such as a conversation, a speech, etc – within the culture and society it takes place. For example, you could analyse how a janitor speaks to a CEO, or how politicians speak about terrorism.

To truly understand these conversations or speeches, the culture and history of those involved in the communication are important factors to consider. For example, a janitor might speak more casually with a CEO in a company that emphasises equality among workers. Similarly, a politician might speak more about terrorism if there was a recent terrorist incident in the country.

So, as you can see, by using discourse analysis, you can identify how culture , history or power dynamics (to name a few) have an effect on the way concepts are spoken about. So, if your research aims and objectives involve understanding culture or power dynamics, discourse analysis can be a powerful method.

Because there are many social influences in terms of how we speak to each other, the potential use of discourse analysis is vast. Of course, this also means it’s important to have a very specific research question (or questions) in mind when analysing your data and looking for patterns and themes, or you might end up going down a winding rabbit hole.

Discourse analysis can also be very time-consuming, as you need to sample the data to the point of saturation – in other words, until no new information and insights emerge. But this is, of course, part of what makes discourse analysis such a powerful technique. So, keep these factors in mind when considering this QDA method.

QDA Method #4: Thematic Analysis

Thematic analysis looks at patterns of meaning in a data set – for example, a set of interviews or focus group transcripts. But what exactly does that… mean? Well, a thematic analysis takes bodies of data (which are often quite large) and groups them according to similarities – in other words, themes . These themes help us make sense of the content and derive meaning from it.

Let’s take a look at an example.

With thematic analysis, you could analyse 100 online reviews of a popular sushi restaurant to find out what patrons think about the place. By reviewing the data, you would then identify the themes that crop up repeatedly within the data – for example, “fresh ingredients” or “friendly wait staff”.

So, as you can see, thematic analysis can be pretty useful for finding out about people’s experiences , views, and opinions . Therefore, if your research aims and objectives involve understanding people’s experience or view of something, thematic analysis can be a great choice.
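
If you wanted to scale that sushi-restaurant example up, a very rough first pass could be scripted. The sketch below tags each (invented) review with candidate themes based on simple keyword matching – a crude stand-in for the interpretive work of genuine thematic analysis, but useful for seeing how theme frequencies can be summarised. All theme names and keywords here are hypothetical.

```python
# Hypothetical sketch: a crude first-pass theme tagger for restaurant reviews.

reviews = [
    "Loved the fresh ingredients and the friendly wait staff!",
    "The fish tasted incredibly fresh, but the wait was long.",
    "Friendly staff, though the prices felt a bit high.",
]

# Invented candidate themes and indicative keywords.
themes = {
    "fresh ingredients": ["fresh", "ingredients"],
    "friendly wait staff": ["friendly", "staff", "welcoming"],
    "price concerns": ["price", "prices", "expensive"],
}

theme_counts = {theme: 0 for theme in themes}
for review in reviews:
    text = review.lower()
    for theme, keywords in themes.items():
        # Count a review once per theme if any indicative keyword appears.
        if any(keyword in text for keyword in keywords):
            theme_counts[theme] += 1

for theme, count in sorted(theme_counts.items(), key=lambda item: -item[1]):
    print(f"{theme}: mentioned in {count} of {len(reviews)} reviews")
```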

Since thematic analysis is a bit of an exploratory process, it’s not unusual for your research questions to develop , or even change as you progress through the analysis. While this is somewhat natural in exploratory research, it can also be seen as a disadvantage as it means that data needs to be re-reviewed each time a research question is adjusted. In other words, thematic analysis can be quite time-consuming – but for a good reason. So, keep this in mind if you choose to use thematic analysis for your project and budget extra time for unexpected adjustments.

Thematic analysis takes bodies of data and groups them according to similarities (themes), which help us make sense of the content.

QDA Method #5: Grounded theory (GT) 

Grounded theory is a powerful qualitative analysis method where the intention is to create a new theory (or theories) using the data at hand, through a series of “tests” and “revisions”. Strictly speaking, GT is more a research design type than an analysis method, but we’ve included it here as it’s often referred to as a method.

What’s most important with grounded theory is that you go into the analysis with an open mind and let the data speak for itself – rather than dragging existing hypotheses or theories into your analysis. In other words, your analysis must develop from the ground up (hence the name). 

Let’s look at an example of GT in action.

Assume you’re interested in developing a theory about what factors influence students to watch a YouTube video about qualitative analysis. Using Grounded theory , you’d start with this general overarching question about the given population (i.e., graduate students). First, you’d approach a small sample – for example, five graduate students in a department at a university. Ideally, this sample would be reasonably representative of the broader population. You’d interview these students to identify what factors lead them to watch the video.

After analysing the interview data, a general pattern could emerge. For example, you might notice that graduate students are more likely to watch a video about qualitative methods if they are just starting on their dissertation journey, or if they have an upcoming test about research methods.

From here, you’ll look for another small sample – for example, five more graduate students in a different department – and see whether this pattern holds true for them. If not, you’ll look for commonalities and adapt your theory accordingly. As this process continues, the theory would develop . As we mentioned earlier, what’s important with grounded theory is that the theory develops from the data – not from some preconceived idea.

So, what are the drawbacks of grounded theory? Well, some argue that there’s a tricky circularity to grounded theory. For it to work, in principle, you should know as little as possible regarding the research question and population, so that you reduce the bias in your interpretation. However, in many circumstances, it’s also thought to be unwise to approach a research question without knowledge of the current literature . In other words, it’s a bit of a “chicken or the egg” situation.

Regardless, grounded theory remains a popular (and powerful) option. Naturally, it’s a very useful method when you’re researching a topic that is completely new or has very little existing research about it, as it allows you to start from scratch and work your way from the ground up .

Grounded theory is used to create a new theory (or theories) by using the data at hand, as opposed to existing theories and frameworks.

QDA Method #6:   Interpretive Phenomenological Analysis (IPA)

Interpretive. Phenomenological. Analysis. IPA . Try saying that three times fast…

Let’s just stick with IPA, okay?

IPA is designed to help you understand the personal experiences of a subject (for example, a person or group of people) concerning a major life event, an experience or a situation . This event or experience is the “phenomenon” that makes up the “P” in IPA. Such phenomena may range from relatively common events – such as motherhood, or being involved in a car accident – to those which are extremely rare – for example, someone’s personal experience in a refugee camp. So, IPA is a great choice if your research involves analysing people’s personal experiences of something that happened to them.

It’s important to remember that IPA is subject-centred. In other words, it’s focused on the experiencer. This means that, while you’ll likely use a coding system to identify commonalities, it’s important not to lose the depth of experience or meaning by trying to reduce everything to codes. Also, keep in mind that since your sample size will generally be very small with IPA, you often won’t be able to draw broad conclusions about the generalisability of your findings. But that’s okay as long as it aligns with your research aims and objectives.

Another thing to be aware of with IPA is personal bias . While researcher bias can creep into all forms of research, self-awareness is critically important with IPA, as it can have a major impact on the results. For example, a researcher who was a victim of a crime himself could insert his own feelings of frustration and anger into the way he interprets the experience of someone who was kidnapped. So, if you’re going to undertake IPA, you need to be very self-aware or you could muddy the analysis.

IPA can help you understand the personal experiences of a person or group concerning a major life event, an experience or a situation.

How to choose the right analysis method

In light of all of the qualitative analysis methods we’ve covered so far, you’re probably asking yourself the question, “ How do I choose the right one? ”

Much like all the other methodological decisions you’ll need to make, selecting the right qualitative analysis method largely depends on your research aims, objectives and questions . In other words, the best tool for the job depends on what you’re trying to build. For example:

  • Perhaps your research aims to analyse the use of words and what they reveal about the intention of the storyteller and the cultural context of the time.
  • Perhaps your research aims to develop an understanding of the unique personal experiences of people that have experienced a certain event, or
  • Perhaps your research aims to develop insight regarding the influence of a certain culture on its members.

As you can probably see, each of these research aims is distinctly different, and therefore a different analysis method would be suitable for each one. For example, narrative analysis would likely be a good option for the first aim, while grounded theory wouldn’t be as relevant.

It’s also important to remember that each method has its own set of strengths, weaknesses and general limitations. No single analysis method is perfect . So, depending on the nature of your research, it may make sense to adopt more than one method (this is called triangulation ). Keep in mind though that this will of course be quite time-consuming.

As we’ve seen, all of the qualitative analysis methods we’ve discussed make use of coding and theme-generating techniques, but the intent and approach of each analysis method differ quite substantially. So, it’s very important to come into your research with a clear intention before you decide which analysis method (or methods) to use.

Start by reviewing your research aims , objectives and research questions to assess what exactly you’re trying to find out – then select a qualitative analysis method that fits. Never pick a method just because you like it or have experience using it – your analysis method (or methods) must align with your broader research aims and objectives.

No single analysis method is perfect, so it can often make sense to adopt more than one  method (this is called triangulation).

Let’s recap on QDA methods…

In this post, we looked at six popular qualitative data analysis methods:

  • First, we looked at content analysis , a straightforward method that blends a little bit of quant into a primarily qualitative analysis.
  • Then we looked at narrative analysis , which is about analysing how stories are told.
  • Next up was discourse analysis – which is about analysing conversations and interactions.
  • Then we moved on to thematic analysis – which is about identifying themes and patterns.
  • From there, we dug into grounded theory – which is about starting from scratch with a specific question and using the data alone to build a theory in response to that question.
  • And finally, we looked at IPA – which is about understanding people’s unique experiences of a phenomenon.

Of course, these aren’t the only options when it comes to qualitative data analysis, but they’re a great starting point if you’re dipping your toes into qualitative research for the first time.

If you’re still feeling a bit confused, consider our private coaching service, where we hold your hand through the research process to help you develop your best work.


Research-Methodology

Qualitative Data Analysis

Qualitative data refers to non-numeric information such as interview transcripts, notes, video and audio recordings, images and text documents. Qualitative data analysis can be divided into the following five categories:

1. Content analysis. This refers to the process of categorizing verbal or behavioural data to classify, summarize and tabulate the data.

2. Narrative analysis. This method involves the reformulation of stories presented by respondents, taking into account the context of each case and the different experiences of each respondent. In other words, narrative analysis is the revision of primary qualitative data by the researcher.

3. Discourse analysis. A method of analysing naturally occurring talk and all types of written text.

4. Framework analysis. This is a more advanced method that consists of several stages, such as familiarization, identifying a thematic framework, coding, charting, mapping and interpretation.

5. Grounded theory. This method of qualitative data analysis starts with the analysis of a single case to formulate a theory. Then, additional cases are examined to see if they contribute to the theory.

Qualitative data analysis can be conducted through the following three steps:

Step 1: Developing and Applying Codes. Coding can be explained as the categorization of data. A ‘code’ can be a word or a short phrase that represents a theme or an idea. All codes need to be assigned meaningful titles. A wide range of non-quantifiable elements such as events, behaviours, activities and meanings can be coded (a simplified sketch of what coded data can look like follows the list of coding types below).

There are three types of coding:

  • Open coding. The initial organization of raw data to try to make sense of it.
  • Axial coding. Interconnecting and linking the categories of codes.
  • Selective coding. Formulating the story through connecting the categories.
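
Before turning to software, it can help to see what “applied codes” might look like as structured data. The sketch below is a simplified, hypothetical illustration in Python – the excerpts, code labels and categories are all invented – showing an open-coding style pass (codes attached to excerpts) followed by an axial-coding style grouping of related codes into categories. It is not how any particular QDA package stores its data.

```python
# Hypothetical sketch: open codes attached to invented interview excerpts,
# then grouped into broader categories (in the spirit of axial coding).

coded_excerpts = [
    {"excerpt": "I never know who to ask when something breaks.",
     "codes": ["uncertainty", "support access"]},
    {"excerpt": "My manager checks in every week, which really helps.",
     "codes": ["manager support"]},
    {"excerpt": "The onboarding documents were out of date.",
     "codes": ["documentation quality"]},
]

# Axial-coding style grouping: category -> related codes (invented labels).
categories = {
    "organisational support": ["support access", "manager support"],
    "information quality": ["documentation quality", "uncertainty"],
}

# Collect the excerpts that provide evidence for each category.
for category, codes in categories.items():
    evidence = [item["excerpt"] for item in coded_excerpts
                if any(code in item["codes"] for code in codes)]
    print(f"{category} ({len(evidence)} excerpts):")
    for quote in evidence:
        print(f"  - {quote}")
```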

Coding can be done manually or using qualitative data analysis software such as NVivo, ATLAS.ti, HyperRESEARCH, MAXQDA and others.

When coding manually, you can use folders, filing cabinets, wallets and so on to gather together materials that are examples of similar themes or analytic ideas. The manual method of coding in qualitative data analysis is rightly considered labour-intensive, time-consuming and outdated.

In computer-based coding, on the other hand, physical files and cabinets are replaced with computer-based directories and files. When choosing software for qualitative data analysis, you need to consider a wide range of factors, such as the type and amount of data you need to analyse, the time required to master the software and cost considerations.

Moreover, it is important to get confirmation from your dissertation supervisor prior to applying any specific qualitative data analysis software.

The following table contains examples of research titles, elements to be coded and identification of relevant codes:

[Table: Qualitative data coding – example research titles, elements to be coded, and relevant codes]

Step 2: Identifying themes, patterns and relationships. Unlike quantitative methods, in qualitative data analysis there are no universally applicable techniques that can be applied to generate findings. The analytical and critical thinking skills of the researcher play a significant role in data analysis in qualitative studies. Therefore, no qualitative study can be repeated to generate exactly the same results.

Nevertheless, there is a set of techniques that you can use to identify common themes, patterns and relationships within the responses of sample group members in relation to the codes specified in the previous stage.

Specifically, the most popular and effective methods of qualitative data interpretation include the following:

  • Word and phrase repetitions – scanning primary data for the words and phrases respondents use most often, as well as words and phrases used with unusual emotional intensity (see the sketch after this list);
  • Primary and secondary data comparisons – comparing the findings of interviews, focus groups, observations or any other qualitative data collection method with the findings of the literature review, and discussing the differences between them;
  • Search for missing information – discussing which aspects of the issue were not mentioned by respondents, although you expected them to be mentioned;
  • Metaphors and analogies – comparing primary research findings to phenomena from a different area and discussing similarities and differences.
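
The first of these techniques – word and phrase repetitions – lends itself to a simple illustration. The sketch below counts the most frequent words across a few invented responses after removing common stop words; it is only a starting point, since phrases, context and emotional tone still require the researcher’s judgement.

```python
# Hypothetical sketch: finding the most repeated words across invented responses.
import re
from collections import Counter

responses = [
    "The waiting times were far too long, and nobody explained the delay.",
    "Waiting around with no explanation made me anxious.",
    "Staff were kind, but the waiting area was crowded and noisy.",
]

# A tiny illustrative stop-word list; a real analysis would use a fuller one.
stop_words = {"the", "and", "was", "were", "with", "but", "me", "no", "too", "a", "of"}

words = []
for response in responses:
    # Lowercase, keep alphabetic tokens only, and drop stop words.
    words.extend(w for w in re.findall(r"[a-z']+", response.lower()) if w not in stop_words)

for word, count in Counter(words).most_common(5):
    print(f"{word}: {count}")
```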

Step 3: Summarizing the data. At this last stage you need to link your research findings to your hypotheses or research aims and objectives. When writing the data analysis chapter, you can use noteworthy quotations from the transcripts in order to highlight major themes within the findings and possible contradictions.

It is important to note that the process of qualitative data analysis described above is general and different types of qualitative studies may require slightly different methods of data analysis.

My e-book, The Ultimate Guide to Writing a Dissertation in Business Studies: a step by step approach, contains a detailed, yet simple explanation of qualitative data analysis methods. The e-book explains all stages of the research process, starting from the selection of the research area to writing the personal reflection. Important elements of dissertations such as research philosophy, research approach, research design, methods of data collection and data analysis are explained in simple words. John Dudovskiy


The Oxford Handbook of Qualitative Research (2nd edn)

29 Qualitative Data Analysis Strategies

Johnny Saldaña, School of Theatre and Film, Arizona State University

Published: 02 September 2020

This chapter provides an overview of selected qualitative data analysis strategies with a particular focus on codes and coding. Preparatory strategies for a qualitative research study and data management are first outlined. Six coding methods are then profiled using comparable interview data: process coding, in vivo coding, descriptive coding, values coding, dramaturgical coding, and versus coding. Strategies for constructing themes and assertions from the data follow. Analytic memo writing is woven throughout as a method for generating additional analytic insight. Next, display and arts-based strategies are provided, followed by recommended qualitative data analytic software programs and a discussion on verifying the researcher’s analytic findings.

Qualitative Data Analysis Strategies

Anthropologist Clifford Geertz ( 1983 ) charmingly mused, “Life is just a bowl of strategies” (p. 25). Strategy , as I use it here, refers to a carefully considered plan or method to achieve a particular goal. The goal in this case is to develop a write-up of your analytic work with the qualitative data you have been given and collected as part of a study. The plans and methods you might employ to achieve that goal are what this article profiles.

Some may perceive strategy as an inappropriate, if not manipulative, word, suggesting formulaic or regimented approaches to inquiry. I assure you that is not my intent. My use of strategy is dramaturgical in nature: Strategies are actions that characters in plays take to overcome obstacles to achieve their objectives. Actors portraying these characters rely on action verbs to generate belief within themselves and to motivate them as they interpret their lines and move appropriately on stage.

What I offer is a qualitative researcher’s array of actions from which to draw to overcome the obstacles to thinking to achieve an analysis of your data. But unlike the prescripted text of a play in which the obstacles, strategies, and outcomes have been predetermined by the playwright, your work must be improvisational—acting, reacting, and interacting with data on a moment-by-moment basis to determine what obstacles stand in your way and thus what strategies you should take to reach your goals.

Another intriguing quote to keep in mind comes from research methodologist Robert E. Stake ( 1995 ), who posited, “Good research is not about good methods as much as it is about good thinking” (p. 19). In other words, strategies can take you only so far. You can have a box full of tools, but if you do not know how to use them well or use them creatively, the collection seems rather purposeless. One of the best ways we learn is by doing . So, pick up one or more of these strategies (in the form of verbs) and take analytic action with your data. Also keep in mind that these are discussed in the order in which they may typically occur, although humans think cyclically, iteratively, and reverberatively, and each research project has its unique contexts and needs. Be prepared for your mind to jump purposefully and/or idiosyncratically from one strategy to another throughout the study.

Qualitative Data Analysis Strategy: To Foresee

To foresee in qualitative data analysis (QDA) is to reflect beforehand on what forms of data you will most likely need and collect, which thus informs what types of data analytic strategies you anticipate using. Analysis, in a way, begins even before you collect data (Saldaña & Omasta, 2018 ). As you design your research study in your mind and on a text editing page, one strategy is to consider what types of data you may need to help inform and answer your central and related research questions. Interview transcripts, participant observation field notes, documents, artifacts, photographs, video recordings, and so on are not only forms of data but also foundations for how you may plan to analyze them. A participant interview, for example, suggests that you will transcribe all or relevant portions of the recording and use both the transcription and the recording itself as sources for data analysis. Any analytic memos (discussed later) you make about your impressions of the interview also become data to analyze. Even the computing software you plan to employ will be relevant to data analysis because it may help or hinder your efforts.

As your research design formulates, compose one to two paragraphs that outline how your QDA may proceed. This will necessitate that you have some background knowledge of the vast array of methods available to you. Thus, surveying the literature is vital preparatory work.

Qualitative Data Analysis Strategy: To Survey

To survey in QDA is to look for and consider the applicability of the QDA literature in your field that may provide useful guidance for your forthcoming data analytic work. General sources in QDA will provide a good starting point for acquainting you with the data analysis strategies available for the variety of methodologies or genres in qualitative inquiry (e.g., ethnography, phenomenology, case study, arts-based research, mixed methods). One of the most accessible (and humorous) is Galman’s (2013) The Good, the Bad, and the Data, and one of the most richly detailed is Frederick J. Wertz et al.’s (2011) Five Ways of Doing Qualitative Analysis. The author’s core texts for this chapter come from The Coding Manual for Qualitative Researchers (Saldaña, 2016) and Qualitative Research: Analyzing Life (Saldaña & Omasta, 2018).

If your study’s methodology or approach is grounded theory, for example, then a survey of methods works by authors such as Barney G. Glaser, Anselm L. Strauss, Juliet Corbin, and, in particular, the prolific Kathy Charmaz (2014) may be expected. But there has been a recent outpouring of additional book publications in grounded theory by Birks and Mills (2015), Bryant (2017), Bryant and Charmaz (2019), and Stern and Porr (2011), plus the legacy of thousands of articles and chapters across many disciplines that have addressed grounded theory in their studies.

Fields such as education, psychology, social work, healthcare, and others also have their own QDA methods literature in the form of texts and journals, as well as international conferences and workshops for members of the profession. It is important to have had some university coursework and/or mentorship in qualitative research to suitably prepare you for the intricacies of QDA, and you must acknowledge that the emergent nature of qualitative inquiry may require you to adopt analysis strategies that differ from what you originally planned.

Qualitative Data Analysis Strategy: To Collect

To collect in QDA is to receive the data given to you by participants and those data you actively gather to inform your study. Qualitative data analysis is concurrent with data collection and management. As interviews are transcribed, field notes are fleshed out, and documents are filed, the researcher uses opportunities to carefully read the corpus and make preliminary notations directly on the data documents by highlighting, bolding, italicizing, or noting in some way any particularly interesting or salient portions. As these data are initially reviewed, the researcher also composes supplemental analytic memos that include first impressions, reminders for follow-up, preliminary connections, and other thinking matters about the phenomena at work.

Some of the most common fieldwork tools you might use to collect data are notepads, pens and pencils; file folders for hard-copy documents; a laptop, tablet, or desktop with text editing software (Microsoft Word and Excel are most useful) and Internet access; and a digital camera and voice recorder (functions available on many electronic devices such as smartphones). Some fieldworkers may even employ a digital video camera to record social action, as long as participant permissions have been secured. But everything originates from the researcher. Your senses are immersed in the cultural milieu you study, taking in and holding onto relevant details, or significant trivia, as I call them. You become a human camera, zooming out to capture the broad landscape of your field site one day and then zooming in on a particularly interesting individual or phenomenon the next. Your analysis is only as good as the data you collect.

Fieldwork can be an overwhelming experience because so many details of social life are happening in front of you. Take a holistic approach to your entrée, but as you become more familiar with the setting and participants, actively focus on things that relate to your research topic and questions. Keep yourself open to the intriguing, surprising, and disturbing (Sunstein & Chiseri-Strater, 2012 , p. 115), because these facets enrich your study by making you aware of the unexpected.

Qualitative Data Analysis Strategy: To Feel

To feel in QDA is to gain deep emotional insight into the social worlds you study and what it means to be human. Virtually everything we do has an accompanying emotion(s), and feelings are both reactions and stimuli for action. Others’ emotions clue you to their motives, values, attitudes, beliefs, worldviews, identities, and other subjective perceptions and interpretations. Acknowledge that emotional detachment is not possible in field research. Attunement to the emotional experiences of your participants plus sympathetic and empathetic responses to the actions around you are necessary in qualitative endeavors. Your own emotional responses during fieldwork are also data because they document the tacit and visceral. It is important during such analytic reflection to assess why your emotional reactions were as they were. But it is equally important not to let emotions alone steer the course of your study. A proper balance must be found between feelings and facts.

Qualitative Data Analysis Strategy: To Organize

To organize in QDA is to maintain an orderly repository of data for easy access and analysis. Even in the smallest of qualitative studies, a large amount of data will be collected across time. Prepare both a hard drive and hard-copy folders for digital data and paperwork, and back up all materials for security from loss. I recommend that each data unit (e.g., one interview transcript, one document, one day’s worth of field notes) have its own file, with subfolders specifying the data forms and research study logistics (e.g., interviews, field notes, documents, institutional review board correspondence, calendar).

For small-scale qualitative studies, I have found it quite useful to maintain one large master file with all participant and field site data copied and combined with the literature review and accompanying researcher analytic memos. This master file is used to cut and paste related passages together, deleting what seems unnecessary as the study proceeds and eventually transforming the document into the final report itself. Cosmetic devices such as font style, font size, rich text (italicizing, bolding, underlining, etc.), and color can help you distinguish between different data forms and highlight significant passages. For example, descriptive, narrative passages of field notes are logged in regular font. “Quotations, things spoken by participants, are logged in bold font.” Observer’s comments, such as the researcher’s subjective impressions or analytic jottings, are set in italics.

Qualitative Data Analysis Strategy: To Jot

To jot in QDA is to write occasional, brief notes about your thinking or reminders for follow-up. A jot is a phrase or brief sentence that will fit on a standard-size sticky note. As data are brought and documented together, take some initial time to review their contents and jot some notes about preliminary patterns, participant quotes that seem vivid, anomalies in the data, and so forth.

As you work on a project, keep something with you at all times to write with or to record your voice so you can capture fleeting thoughts. You will most likely find yourself thinking about your research when you are not working exclusively on the project, and a “mental jot” may occur to you as you ruminate on logistical or analytic matters. Document the thought in some way for later retrieval and elaboration as an analytic memo.

Qualitative Data Analysis Strategy: To Prioritize

To prioritize in QDA is to determine which data are most significant in your corpus and which tasks are most necessary. During fieldwork, massive amounts of data in various forms may be collected, and your mind can be easily overwhelmed by the magnitude of the quantity, its richness, and its management. Decisions will need to be made about the most pertinent data because they help answer your research questions or emerge as salient pieces of evidence. As a sweeping generalization, approximately one half to two thirds of what you collect may become unnecessary as you proceed toward the more formal stages of QDA.

To prioritize in QDA is also to determine what matters most in your assembly of codes, categories, patterns, themes, assertions, propositions, and concepts. Return to your research purpose and questions to keep you framed for what the focus should be.

Qualitative Data Analysis Strategy: To Analyze

To analyze in QDA is to observe and discern patterns within data and to construct meanings that seem to capture their essences and essentials. Just as there are a variety of genres, elements, and styles of qualitative research, so too are there a variety of methods available for QDA. Analytic choices are most often based on what methods will harmonize with your genre selection and conceptual framework, what will generate the most sufficient answers to your research questions, and what will best represent and present the project’s findings.

Analysis can range from the factual to the conceptual to the interpretive. Analysis can also range from a straightforward descriptive account to an emergently constructed grounded theory to an evocatively composed short story. A qualitative research project’s outcomes may range from rigorously achieved, insightful answers to open-ended, evocative questions; from rich descriptive detail to a bullet-point list of themes; and from third-person, objective reportage to first-person, emotion-laden poetry. Just as there are multiple destinations in qualitative research, there are multiple pathways and journeys along the way.

Analysis is accelerated as you take cognitive ownership of your data. By reading and rereading the corpus, you gain intimate familiarity with its contents and begin to notice significant details as well as make new connections and insights about their meanings. Patterns, categories, themes, and their interrelationships become more evident the more you know the subtleties of the database.

Since qualitative research’s design, fieldwork, and data collection are most often provisional, emergent, and evolutionary processes, you reflect on and analyze the data as you gather them and proceed through the project. If preplanned methods are not working, you change them to secure the data you need. There is generally a postfieldwork period when continued reflection and more systematic data analysis occur, concurrent with or followed by additional data collection, if needed, and the more formal write-up of the study, which is in itself an analytic act. Through field note writing, interview transcribing, analytic memo writing, and other documentation processes, you gain cognitive ownership of your data; and the intuitive, tacit, synthesizing capabilities of your brain begin sensing patterns, making connections, and seeing the bigger picture. The purpose and outcome of data analysis is to reveal to others through fresh insights what we have observed and discovered about the human condition. Fortunately, there are heuristics for reorganizing and reflecting on your qualitative data to help you achieve that goal.

Qualitative Data Analysis Strategy: To Pattern

To pattern in QDA is to detect similarities within and regularities among the data you have collected. The natural world is filled with patterns because we, as humans, have constructed them as such. Stars in the night sky are not just a random assembly; our ancestors pieced them together to form constellations like the Big Dipper. A collection of flowers growing wild in a field has a pattern, as does an individual flower’s patterns of leaves and petals. Look at the physical objects humans have created and notice how pattern oriented we are in our construction, organization, and decoration. Look around you in your environment and notice how many patterns are evident on your clothing, in a room, and on most objects themselves. Even our sometimes mundane daily and long-term human actions are reproduced patterns in the form of routines, rituals, rules, roles, and relationships (Saldaña & Omasta, 2018 ).

This human propensity for pattern-making follows us into QDA. From the vast array of interview transcripts, field notes, documents, and other forms of data, there is this instinctive, hardwired need to bring order to the collection—not just to reorganize it but to look for and construct patterns out of it. The discernment of patterns is one of the first steps in the data analytic process, and the methods described next are recommended ways to construct them.

Qualitative Data Analysis Strategy: To Code

To code in QDA is to assign a truncated, symbolic meaning to each datum for purposes of qualitative analysis—primarily patterning and categorizing. Coding is a heuristic—a method of discovery—to the meanings of individual sections of data. These codes function as a way of patterning, classifying, and later reorganizing them into emergent categories for further analysis. Different types of codes exist for different types of research genres and qualitative data analytic approaches, but this chapter will focus on only a few selected methods. First, a code can be defined as follows:

A code in qualitative data analysis is most often a word or short phrase that symbolically assigns a summative, salient, essence-capturing, and/or evocative attribute for a portion of language-based or visual data. The data can consist of interview transcripts, participant observation field notes, journals, documents, open-ended survey responses, drawings, artifacts, photographs, video, Internet sites, e-mail correspondence, academic and fictional literature, and so on. The portion of data coded … can range in magnitude from a single word to a full paragraph, an entire page of text or a stream of moving images.… Just as a title represents and captures a book or film or poem’s primary content and essence, so does a code represent and capture a datum’s primary content and essence. (Saldaña, 2016 , p. 4)

One helpful precoding task is to divide or parse long selections of field note or interview transcript data into shorter stanzas. Stanza division unitizes or “chunks” the corpus into more manageable paragraph-like units for coding assignments and analysis. The transcript sample that follows illustrates one possible way of inserting line breaks between self-standing passages of interview text for easier readability.

Process Coding

As a first coding example, the following interview excerpt about an employed, single, lower middle-class adult male’s spending habits during a difficult economic period in the United States is coded in the right-hand margin in capital letters. The superscript numbers match the beginning of the datum unit with its corresponding code. This method is called process coding (Charmaz, 2014 ), and it uses gerunds (“-ing” words) exclusively to represent action suggested by the data. Processes can consist of observable human actions (e.g., BUYING BARGAINS), mental or internal processes (e.g., THINKING TWICE), and more conceptual ideas (e.g., APPRECIATING WHAT YOU’VE GOT). Notice that the interviewer’s (I) portions are not coded, just the participant’s (P). A code is applied each time the subtopic of the interview shifts—even within a stanza—and the same codes can (and should) be used more than once if the subtopics are similar. The central research question driving this qualitative study is, “In what ways are middle-class Americans influenced and affected by an economic recession?”

Different researchers analyzing this same piece of data may develop completely different codes, depending on their personal lenses, filters, and angles. The previous codes are only one person’s interpretation of what is happening in the data, not a definitive list. The process codes have transformed the raw data units into new symbolic representations for analysis. A listing of the codes applied to this interview transcript, in the order they appear, reads:

BUYING BARGAINS

QUESTIONING A PURCHASE

THINKING TWICE

STOCKING UP

REFUSING SACRIFICE

PRIORITIZING

FINDING ALTERNATIVES

LIVING CHEAPLY

NOTICING CHANGES

STAYING INFORMED

MAINTAINING HEALTH

PICKING UP THE TAB

APPRECIATING WHAT YOU’VE GOT

Coding the data is the first step in this approach to QDA, and categorization is just one of the next possible steps.

Qualitative Data Analysis Strategy: To Categorize

To categorize in QDA is to cluster similar or comparable codes into groups for pattern construction and further analysis. Humans categorize things in innumerable ways. Think of an average apartment or house’s layout. The rooms of a dwelling have been constructed or categorized by their builders and occupants according to function. A kitchen is designated as an area to store and prepare food and to store the cooking and dining materials, such as pots, pans, and utensils. A bedroom is designated for sleeping, a closet for clothing storage, a bathroom for bodily functions and hygiene, and so on. Each room is like a category in which related and relevant patterns of human action occur. There are exceptions now and then, such as eating breakfast in bed rather than in a dining area or living in a small studio apartment in which most possessions are contained within one large room (but nonetheless are most often organized and clustered into subcategories according to function and optimal use of space).

The point is that the patterns of social action we designate into categories during QDA are not perfectly bounded. Category construction is our best attempt to cluster the most seemingly alike things into the most seemingly appropriate groups. Categorizing is reorganizing and reordering the vast array of data from a study because it is from these smaller, meaning-rich units that we can better grasp the particular features of each one and the categories’ possible interrelationships with one another.

One analytic strategy with a list of codes is to classify them into similar clusters. Codes that are alike share the same category, but it is also possible for a single code to merit its own group if you feel it is unique enough. After the codes have been classified, a category label is applied to each grouping. Sometimes a code can also double as a category name if you feel it best summarizes the totality of the cluster. Like coding, categorizing is an interpretive act, because there can be different ways of separating and collecting codes that seem to belong together. The cut-and-paste functions of text editing software are most useful for exploring which codes share something in common.

Below is my categorization of the 15 codes generated from the interview transcript presented earlier. Like the gerunds for process codes, the categories have also been labeled as “-ing” words to connote action. And there was no particular reason why 15 codes resulted in three categories—there could have been fewer or even more, but this is how the array came together after my reflections on which codes seemed to belong together. The category labels are ways of answering why they belong together. For at-a-glance differentiation, I place codes in CAPITAL LETTERS and categories in upper- and lowercase Bold Font:

Category 1: Thinking Strategically

Category 2: Spending Strategically

Category 3: Living Strategically

Notice that the three category labels share a common word: strategically. Where did this word come from? It came from analytic reflection on the original data, the codes, and the process of categorizing the codes and generating their category labels. It was the analyst’s choice based on the interpretation of what primary action was happening. Your categories generated from your coded data do not need to share a common word or phrase, but I find that this technique, when appropriate, helps build a sense of unity to the initial analytic scheme.

The three categories—Thinking Strategically, Spending Strategically, and Living Strategically—are then reflected on for how they might interact and interplay. This is where the next major facet of data analysis, analytic memos, enters the scheme. But a necessary section on the basic principles of interrelationship and analytic reasoning must precede that discussion.

Qualitative Data Analysis Strategy: To Interrelate

To interrelate in QDA is to propose connections within, between, and among the constituent elements of analyzed data. One task of QDA is to explore the ways our patterns and categories interact and interplay. I use these terms to suggest the qualitative equivalent of statistical correlation, but interaction and interplay are much more than a simple relationship. They imply interrelationship . Interaction refers to reverberative connections—for example, how one or more categories might influence and affect the others, how categories operate concurrently, or whether there is some kind of domino effect to them. Interplay refers to the structural and processual nature of categories—for example, whether some type of sequential order, hierarchy, or taxonomy exists; whether any overlaps occur; whether there is superordinate and subordinate arrangement; and what types of organizational frameworks or networks might exist among them. The positivist construct of cause and effect becomes influences and affects in QDA.

There can even be patterns of patterns and categories of categories if your mind thinks conceptually and abstractly enough. Our minds can intricately connect multiple phenomena, but only if the data and their analyses support the constructions. We can speculate about interaction and interplay all we want, but it is only through a more systematic investigation of the data—in other words, good thinking—that we can plausibly establish any possible interrelationships.

Qualitative Data Analysis Strategy: To Reason

To reason in QDA is to think in ways that lead to summative findings, causal probabilities, and evaluative conclusions. Unlike quantitative research, with its statistical formulas and established hypothesis-testing protocols, qualitative research has no standardized methods of data analysis. Rest assured, there are recommended guidelines from the field’s scholars and a legacy of analysis strategies from which to draw. But the primary heuristics (or methods of discovery) you apply during a study are retroductive, inductive, substructive, abductive, and deductive reasoning.

Retroduction is historic reconstruction, working backward to figure out how the current conditions came to exist. Induction is what we experientially explore and infer to be transferable from the particular to the general, based on an examination of the evidence and an accumulation of knowledge. Substruction takes things apart to more carefully examine the constituent elements of the whole. Abduction is surmising from a range of possibilities that which is most likely, those explanatory hunches of plausibility based on clues. Deduction is what we generally draw and conclude from established facts and evidence.

It is not always necessary to know the names of these five ways of reasoning as you proceed through analysis. In fact, you will more than likely reverberate quickly from one to another depending on the task at hand. But what is important to remember about reasoning is:

to examine the evidence carefully and make reasonable inferences;

to base your conclusions primarily on the participants’ experiences, not just your own;

not to take the obvious for granted, because sometimes the expected will not happen;

to accept that your hunches can be right and, at other times, quite wrong; and

to logically yet imaginatively think about what is going on and how it all comes together.

Futurists and inventors propose three questions when they think about creating new visions for the world: What is possible (induction)? What is plausible (abduction)? What is preferable (deduction)? These same three questions might be posed as you proceed through QDA and particularly through analytic memo writing, which is substructive and retroductive reflection on your analytic work thus far.

Qualitative Data Analysis Strategy: To Memo

To memo in QDA is to reflect in writing on the nuances, inferences, meanings, and transfer of coded and categorized data plus your analytic processes. Like field note writing, perspectives vary among practitioners as to the methods for documenting the researcher’s analytic insights and subjective experiences. Some advise that such reflections should be included in field notes as relevant to the data. Others advise that a separate researcher’s journal should be maintained for recording these impressions. And still others advise that these thoughts be documented as separate analytic memos. I prescribe the latter as a method because it is generated by and directly connected to the data themselves.

An analytic memo is a “think piece” of reflective free writing, a narrative that sets in words your interpretations of the data. Coding and categorizing are heuristics to detect some of the possible patterns and interrelationships at work within the corpus, and an analytic memo further articulates your retroductive, inductive, substructive, abductive, and deductive thinking processes on what things may mean. Though the metaphor is a bit flawed and limiting, think of codes and their consequent categories as separate jigsaw puzzle pieces and their integration into an analytic memo as the trial assembly of the complete picture.

What follows is an example of an analytic memo based on the earlier process coded and categorized interview transcript. It is intended not as the final write-up for a publication, but as an open-ended reflection on the phenomena and processes suggested by the data and their analysis thus far. As the study proceeds, however, initial and substantive analytic memos can be revisited and revised for eventual integration into the final report. Note how the memo is dated and given a title for future and further categorization, how participant quotes are occasionally included for evidentiary support, and how the category names are bolded and the codes kept in capital letters to show how they integrate or weave into the thinking:

April 14, 2017
EMERGENT CATEGORIES: A STRATEGIC AMALGAM

There’s a popular saying: “Smart is the new rich.” This participant is Thinking Strategically about his spending through such tactics as THINKING TWICE and QUESTIONING A PURCHASE before he decides to invest in a product. There’s a heightened awareness of both immediate trends and forthcoming economic bad news that positively affects his Spending Strategically. However, he seems unaware that there are even more ways of LIVING CHEAPLY by FINDING ALTERNATIVES. He dines at all-you-can-eat restaurants as a way of STOCKING UP on meals, but doesn’t state that he could bring lunch from home to work, possibly saving even more money. One of his “bad habits” is cigarettes, which he refuses to give up; but he doesn’t seem to realize that by quitting smoking he could save even more money, not to mention possible health care costs. He balks at the idea of paying $2.00 for a soft drink, but doesn’t mind paying $6.00–$7.00 for a pack of cigarettes. Penny-wise and pound-foolish. Addictions skew priorities. Living Strategically, for this participant during “scary times,” appears to be a combination of PRIORITIZING those things which cannot be helped, such as pet care and personal dental care; REFUSING SACRIFICE for maintaining personal creature-comforts; and FINDING ALTERNATIVES to high costs and excessive spending. Living Strategically is an amalgam of thinking and action-oriented strategies.

There are several recommended topics for analytic memo writing throughout the qualitative study. Memos are opportunities to reflect on and write about:

A descriptive summary of the data;

How the researcher personally relates to the participants and/or the phenomenon;

The participants’ actions, reactions, and interactions;

The participants’ routines, rituals, rules, roles, and relationships;

What is surprising, intriguing, or disturbing (Sunstein & Chiseri-Strater, 2012 , p. 115);

Code choices and their operational definitions;

Emergent patterns, categories, themes, concepts, assertions, and propositions;

The possible networks and processes (links, connections, overlaps, flows) among the codes, patterns, categories, themes, concepts, assertions, and propositions;

An emergent or related existent theory;

Any problems with the study;

Any personal or ethical dilemmas with the study;

Future directions for the study;

The analytic memos generated thus far (i.e., metamemos);

Tentative answers to the study’s research questions; and

The final report for the study. (adapted from Saldaña & Omasta, 2018 , p. 54)

Since writing is analysis, analytic memos expand on the inferential meanings of the truncated codes, categories, and patterns as a transitional stage into a more coherent narrative with hopefully rich social insight.

Qualitative Data Analysis Strategy: To Code—A Different Way

The first example of coding illustrated process coding, a way of exploring general social action among humans. But sometimes a researcher works with an individual case study in which the language is unique or with someone the researcher wishes to honor by maintaining the authenticity of his or her speech in the analysis. These reasons suggest that a more participant-centered form of coding may be more appropriate.

In Vivo Coding

A second frequently applied method of coding is called in vivo coding. The root meaning of in vivo is “in that which is alive”; it refers to a code based on the actual language used by the participant (Strauss, 1987 ). The words or phrases in the data record you select as codes are those that seem to stand out as significant or summative of what is being said.

Using the same transcript of the male participant living in difficult economic times, in vivo codes are listed in the right-hand column. I recommend that in vivo codes be placed in quotation marks as a way of designating that the code is extracted directly from the data record. Note that instead of 15 codes generated from process coding, the total number of in vivo codes is 30. This is not to suggest that there should be specific numbers or ranges of codes used for particular methods. In vivo codes, however, tend to be applied more frequently to data. Again, the interviewer’s questions and prompts are not coded, just the participant’s responses:

The 30 in vivo codes are then extracted from the transcript and could be listed in the order they appear, but this time they are placed in alphabetical order as a heuristic to prepare them for analytic action and reflection:

“ALL-YOU-CAN-EAT”

“ANOTHER DING IN MY WALLET”

“BAD HABITS”

“CHEAP AND FILLING”

“COUPLE OF THOUSAND”

“DON’T REALLY NEED”

“HAVEN’T CHANGED MY HABITS”

“HIGH MAINTENANCE”

“INSURANCE IS JUST WORTHLESS”

“IT ALL ADDS UP”

“LIVED KIND OF CHEAP”

“NOT A BIG SPENDER”

“NOT AS BAD OFF”

“NOT PUTTING AS MUCH INTO SAVINGS”

“PICK UP THE TAB”

“SCARY TIMES”

“SKYROCKETED”

“SPENDING MORE”

“THE LITTLE THINGS”

“THINK TWICE”

“TWO-FOR-ONE”

Even though no systematic categorization has been conducted with the codes thus far, an analytic memo of first impressions can still be composed:

March 19, 2017
CODE CHOICES: THE EVERYDAY LANGUAGE OF ECONOMICS

After eyeballing the in vivo codes list, I noticed that variants of “CHEAP” appear most often. I recall a running joke between me and a friend of mine when we were shopping for sales. We’d say, “We’re not ‘cheap,’ we’re frugal.” There’s no formal economic or business language in this transcript—no terms such as “recession” or “downsizing”—just the everyday language of one person trying to cope during “SCARY TIMES” with “ANOTHER DING IN MY WALLET.” The participant notes that he’s always “LIVED KIND OF CHEAP” and is “NOT A BIG SPENDER” and, due to his employment, “NOT AS BAD OFF” as others in the country. Yet even with his middle class status, he’s still feeling the monetary pinch, dining at inexpensive “ALL-YOU-CAN-EAT” restaurants and worried about the rising price of peanut butter, observing that he’s “NOT PUTTING AS MUCH INTO SAVINGS” as he used to. Of all the codes, “ANOTHER DING IN MY WALLET” stands out to me, particularly because on the audio recording he sounded bitter and frustrated. It seems that he’s so concerned about “THE LITTLE THINGS” because of high veterinary and dental charges. The only way to cope with a “COUPLE OF THOUSAND” dollars worth of medical expenses is to find ways of trimming the excess in everyday facets of living: “IT ALL ADDS UP.”

Like process coding, in vivo codes could be clustered into similar categories, but another simple data analytic strategy is also possible.

Qualitative Data Analysis Strategy: To Outline

To outline in QDA is to hierarchically, processually, and/or temporally assemble such things as codes, categories, themes, assertions, propositions, and concepts into a coherent, text-based display. Traditional outlining formats and content provide not only templates for writing a report but also templates for analytic organization. This principle can be found in several computer-assisted qualitative data analysis software (CAQDAS) programs through their use of such functions as “hierarchies,” “trees,” and “nodes,” for example. Basic outlining is simply a way of arranging primary, secondary, and subsecondary items into a patterned display. For example, an organized listing of things in a home might consist of the following:

Large appliances
    Refrigerator
    Stove-top oven
    Microwave oven
Small appliances
    Coffee maker
Dining room

In QDA, outlining may include descriptive nouns or topics but, depending on the study, it may also involve processes or phenomena in extended passages, such as in vivo codes or themes.

The complexity of what we learn in the field can be overwhelming, and outlining is a way of organizing and ordering that complexity so that it does not become complicated. The cut-and-paste and tab functions of a text editing page enable you to arrange and rearrange the salient items from your preliminary coded analytic work into a more streamlined flow. By no means do I suggest that the intricate messiness of life can always be organized into neatly formatted arrangements, but outlining is an analytic act that stimulates deep reflection on both the interconnectedness and the interrelationships of what we study. As an example, here are the 30 in vivo codes generated from the initial transcript analysis, arranged in such a way as to construct five major categories:

Now that the codes have been rearranged into an outline format, an analytic memo is composed to expand on the rationale and constructed meanings in progress:

March 19, 2017
NETWORKS: EMERGENT CATEGORIES

The five major categories I constructed from the in vivo codes are: “SCARY TIMES,” “PRIORITY,” “ANOTHER DING IN MY WALLET,” “THE LITTLE THINGS,” and “LIVED KIND OF CHEAP.” One of the things that hit me today was that the reason he may be pinching pennies on smaller purchases is that he cannot control the larger ones he has to deal with. Perhaps the only way we can cope with or seem to have some sense of agency over major expenses is to cut back on the smaller ones that we can control. $1,000 for a dental bill? Skip lunch for a few days a week. Insulin medication to buy for a pet? Don’t buy a soft drink from a vending machine. Using this reasoning, let me try to interrelate and weave the categories together as they relate to this particular participant: During these scary economic times, he prioritizes his spending because there seems to be just one ding after another to his wallet. A general lifestyle of living cheaply and keeping an eye out for how to save money on the little things compensates for those major expenses beyond his control.

Qualitative Data Analysis Strategy: To Code—In Even More Ways

The process and in vivo coding examples thus far have demonstrated only two specific methods of 33 documented approaches (Saldaña, 2016 ). Which one(s) you choose for your analysis depends on such factors as your conceptual framework, the genre of qualitative research for your project, the types of data you collect, and so on. The following sections present four additional approaches available for coding qualitative data that you may find useful as starting points.

Descriptive Coding

Descriptive codes are primarily nouns that simply summarize the topic of a datum. This coding approach is particularly useful when you have different types of data gathered for one study, such as interview transcripts, field notes, open-ended survey responses, documents, and visual materials such as photographs. Descriptive codes not only help categorize but also index the data corpus’s basic contents for further analytic work. An example of an interview portion coded descriptively, taken from the participant living in tough economic times, follows to illustrate how the same data can be coded in multiple ways:

For initial analysis, descriptive codes are clustered into similar categories to detect such patterns as frequency (i.e., categories with the largest number of codes) and interrelationship (i.e., categories that seem to connect in some way). Keep in mind that descriptive coding should be used sparingly with interview transcript data because other coding methods will reveal richer participant dynamics.

Values Coding

Values coding identifies the values, attitudes, and beliefs of a participant, as shared by the individual and/or interpreted by the analyst. This coding method infers the “heart and mind” of an individual or group’s worldview as to what is important, perceived as true, maintained as opinion, and felt strongly. The three constructs are coded separately but are part of a complex interconnected system.

Briefly, a value (V) is what we attribute as important, be it a person, thing, or idea. An attitude (A) is the evaluative way we think and feel about ourselves, others, things, or ideas. A belief (B) is what we think and feel as true or necessary, formed from our “personal knowledge, experiences, opinions, prejudices, morals, and other interpretive perceptions of the social world” (Saldaña, 2016 , p. 132). Values coding explores intrapersonal, interpersonal, and cultural constructs, or ethos . It is an admittedly slippery task to code this way because it is sometimes difficult to discern what is a value, attitude, or belief since they are intricately interrelated. But the depth you can potentially obtain is rich. An example of values coding follows:

For analysis, categorize the codes for each of the three different constructs together (i.e., all values in one group, attitudes in a second group, and beliefs in a third group). Analytic memo writing about the patterns and possible interrelationships may reveal a more detailed and intricate worldview of the participant.

Dramaturgical Coding

Dramaturgical coding perceives life as performance and its participants as characters in a social drama. Codes are assigned to the data (i.e., a “play script”) that analyze the characters in action, reaction, and interaction. Dramaturgical coding of participants examines their objectives (OBJ) or wants, needs, and motives; the conflicts (CON) or obstacles they face as they try to achieve their objectives; the tactics (TAC) or strategies they employ to reach their objectives; their attitudes (ATT) toward others and their given circumstances; the particular emotions (EMO) they experience throughout; and their subtexts (SUB), or underlying and unspoken thoughts. The following is an example of dramaturgically coded data:

Not included in this particular interview excerpt are the emotions the participant may have experienced or talked about. His later line, “that’s another ding in my wallet,” would have been coded EMO: BITTER. A reader may not have inferred that specific emotion from seeing the line in print. But the interviewer, present during the event and listening carefully to the audio recording during transcription, noted that feeling in his tone of voice.

For analysis, group similar codes together (e.g., all objectives in one group, all conflicts in another group, all tactics in a third group) or string together chains of how participants deal with their circumstances to overcome their obstacles through tactics:

OBJ: SAVING MEAL MONEY → TAC: SKIPPING MEALS + COUPONS

Dramaturgical coding is particularly useful as preliminary work for narrative inquiry story development or arts-based research representations such as performance ethnography. The method explores how the individuals or groups manage problem solving in their daily lives.

Versus Coding

Versus (VS) coding identifies the conflicts, struggles, and power issues observed in social action, reaction, and interaction as an X VS Y code, such as MEN VS WOMEN, CONSERVATIVES VS LIBERALS, FAITH VS LOGIC, and so on. Conflicts are rarely this dichotomous; they are typically nuanced and much more complex. But humans tend to perceive these struggles with an US VS THEM mindset. The codes can range from the observable to the conceptual and can be applied to data that show humans in tension with others, themselves, or ideologies.

What follows are examples of versus codes applied to the case study participant’s descriptions of his major medical expenses:

As an initial analytic tactic, group the versus codes into one of three categories: the Stakeholders , their Perceptions and/or Actions , and the Issues at stake. Examine how the three interrelate and identify the central ideological conflict at work as an X VS Y category. Analytic memos and the final write-up can detail the nuances of the issues.

Remember that what has been profiled in this section is a broad brushstroke description of just a few basic coding processes, several of which can be compatibly mixed and matched within a single analysis (see Saldaña’s 2016 The Coding Manual for Qualitative Researchers for a complete discussion). Certainly with additional data, more in-depth analysis can occur, but coding is only one approach to extracting and constructing preliminary meanings from the data corpus. What now follows are additional methods for qualitative analysis.

Qualitative Data Analysis Strategy: To Theme

To theme in QDA is to construct summative, phenomenological meanings from data through extended passages of text. Unlike codes, which are most often single words or short phrases that symbolically represent a datum, themes are extended phrases or sentences that summarize the manifest (apparent) and latent (underlying) meanings of data (Auerbach & Silverstein, 2003 ; Boyatzis, 1998 ). Themes, intended to represent the essences and essentials of humans’ lived experiences, can also be categorized or listed in superordinate and subordinate outline formats as an analytic tactic.

Below is the interview transcript example used in the previous coding sections. (Hopefully you are not too fatigued at this point with the transcript, but it is important to know how inquiry with the same data set can be approached in several different ways.) During the investigation of the ways middle-class Americans are influenced and affected by an economic recession, the researcher noticed that participants’ stories exhibited facets of what he labeled economic intelligence, or EI (based on the previously developed theories of Howard Gardner’s multiple intelligences and Daniel Goleman’s emotional intelligence). Notice how theming interprets what is happening through the use of two distinct phrases—ECONOMIC INTELLIGENCE IS (i.e., manifest or apparent meanings) and ECONOMIC INTELLIGENCE MEANS (i.e., latent or underlying meanings):

Unlike the 15 process codes and 30 in vivo codes in the previous examples, there are now 14 themes to work with. They could be listed in the order they appear, but one initial heuristic is to group them separately by “is” and “means” statements to detect any possible patterns (discussed later):

EI IS TAKING ADVANTAGE OF UNEXPECTED OPPORTUNITY

EI IS BUYING CHEAP

EI IS SAVING A FEW DOLLARS NOW AND THEN

EI IS SETTING PRIORITIES

EI IS FINDING CHEAPER FORMS OF ENTERTAINMENT

EI IS NOTICING PERSONAL AND NATIONAL ECONOMIC TRENDS

EI IS TAKING CARE OF ONE’S OWN HEALTH

EI MEANS THINKING BEFORE YOU ACT

EI MEANS SACRIFICE

EI MEANS KNOWING YOUR FLAWS

EI MEANS LIVING AN INEXPENSIVE LIFESTYLE

EI MEANS YOU CANNOT CONTROL EVERYTHING

EI MEANS KNOWING YOUR LUCK

There are several ways to categorize the themes as preparation for analytic memo writing. The first is to arrange them in outline format with superordinate and subordinate levels, based on how the themes seem to take organizational shape and structure. Simply cutting and pasting the themes in multiple arrangements on a text editing page eventually develops a sense of order to them. For example:

A second approach is to categorize the themes into similar clusters and to develop different category labels or theoretical constructs. A theoretical construct is an abstraction that transforms the central phenomenon’s themes into broader applications but can still use “is” and “means” as prompts to capture the bigger picture at work:

Theoretical Construct 1: EI Means Knowing the Unfortunate Present

Supporting Themes:

Theoretical Construct 2: EI Is Cultivating a Small Fortune

Theoretical Construct 3: EI Means a Fortunate Future

What follows is an analytic memo generated from the cut-and-paste arrangement of themes into “is” and “means” statements, into an outline, and into theoretical constructs:

March 19, 2017
EMERGENT THEMES: FORTUNE/FORTUNATELY/UNFORTUNATELY

I first reorganized the themes by listing them in two groups: “is” and “means.” The “is” statements seemed to contain positive actions and constructive strategies for economic intelligence. The “means” statements held primarily a sense of caution and restriction with a touch of negativity thrown in. The first outline with two major themes, LIVING AN INEXPENSIVE LIFESTYLE and YOU CANNOT CONTROL EVERYTHING, also had this same tone. This reminded me of the old children’s picture book, Fortunately/Unfortunately, and the themes of “fortune” as a motif for the three theoretical constructs came to mind. Knowing the Unfortunate Present means knowing what’s (most) important and what’s (mostly) uncontrollable in one’s personal economic life. Cultivating a Small Fortune consists of those small money-saving actions that, over time, become part of one’s lifestyle. A Fortunate Future consists of heightened awareness of trends and opportunities at micro and macro levels, with the understanding that health matters can idiosyncratically affect one’s fortune. These three constructs comprise this particular individual’s EI—economic intelligence.

Again, keep in mind that the examples for coding and theming were from one small interview transcript excerpt. The number of codes and their categorization would increase, given a longer interview and/or multiple interviews to analyze. But the same basic principles apply: codes and themes relegated into patterned and categorized forms are heuristics—stimuli for good thinking through the analytic memo-writing process on how everything plausibly interrelates. Methodologists vary in the number of recommended final categories that result from analysis, ranging anywhere from three to seven, with traditional grounded theorists prescribing one central or core category from coded work.

Qualitative Data Analysis Strategy: To Assert

To assert in QDA is to put forward statements that summarize particular fieldwork and analytic observations that the researcher believes credibly represent and transcend the experiences. Educational anthropologist Frederick Erickson ( 1986 ) wrote a significant and influential chapter on qualitative methods that outlined heuristics for assertion development. Assertions are declarative statements of summative synthesis, supported by confirming evidence from the data and revised when disconfirming evidence or discrepant cases require modification of the assertions. These summative statements are generated from an interpretive review of the data corpus and then supported and illustrated through narrative vignettes—reconstructed stories from field notes, interview transcripts, or other data sources that provide a vivid profile as part of the evidentiary warrant.

Coding or theming data can certainly precede assertion development as a way of gaining intimate familiarity with the data, but Erickson’s ( 1986 ) methods are a more admittedly intuitive yet systematic heuristic for analysis. Erickson promotes analytic induction and exploration of and inferences about the data, based on an examination of the evidence and an accumulation of knowledge. The goal is not to look for “proof” to support the assertions, but to look for plausibility of inference-laden observations about the local and particular social world under investigation.

Assertion development is the writing of general statements, plus subordinate yet related ones called subassertions and a major statement called a key assertion that represents the totality of the data. One also looks for key linkages between them, meaning that the key assertion links to its related assertions, which then link to their respective subassertions. Subassertions can include particulars about any discrepant related cases or specify components of their parent assertions.

Excerpts from the interview transcript of our case study will be used to illustrate assertion development at work. By now, you should be quite familiar with the contents, so I will proceed directly to the analytic example. First, there is a series of thematically related statements the participant makes:

“Buy one package of chicken, get the second one free. Now that was a bargain. And I got some.”

“With Sweet Tomatoes I get those coupons for a few bucks off for lunch, so that really helps.”

“I don’t go to movies anymore. I rent DVDs from Netflix or Redbox or watch movies online—so much cheaper than paying over ten or twelve bucks for a movie ticket.”

Assertions can be categorized into low-level and high-level inferences. Low-level inferences address and summarize what is happening within the particulars of the case or field site—the micro. High-level inferences extend beyond the particulars to speculate on what it means in the more general social scheme of things—the meso or macro. A reasonable low-level assertion about the three statements above collectively might read, The participant finds several small ways to save money during a difficult economic period. A high-level inference that transcends the case to the meso level might read, Selected businesses provide alternatives and opportunities to buy products and services at reduced rates during a recession to maintain consumer spending.

Assertions are instantiated (i.e., supported) by concrete instances of action or participant testimony, whose patterns lead to more general description outside the specific field site. The author’s interpretive commentary can be interspersed throughout the report, but the assertions should be supported with the evidentiary warrant. A few assertions and subassertions based on the case interview transcript might read as follows (and notice how high-level assertions serve as the paragraphs’ topic sentences):

Selected businesses provide alternatives and opportunities to buy products and services at reduced rates during a recession to maintain consumer spending. Restaurants, for example, need to find ways to attract customers during difficult economic periods, when potential patrons may opt to eat inexpensively at home rather than spend more money by dining out. Special offers can motivate cash-strapped clientele to patronize restaurants more frequently. An adult male dealing with such major expenses as underinsured dental care offers: “With Sweet Tomatoes I get those coupons for a few bucks off for lunch, so that really helps.” The film and video industries also seem to be suffering from a double-whammy during the current recession: less consumer spending on higher-priced entertainment, resulting in a reduced rate of movie theater attendance (recently 39 percent of the American population, according to a CNN report); coupled with a media technology and business revolution that provides consumers with less costly alternatives through video rentals and Internet viewing: “I don’t go to movies anymore. I rent DVDs from Netflix or Redbox or watch movies online—so much cheaper than paying over ten or twelve bucks for a movie ticket.”

To clarify terminology, any assertion that has an if–then or predictive structure to it is called a proposition since it proposes a conditional event. For example, this assertion is also a proposition: “Special offers can motivate cash-strapped clientele to patronize restaurants more frequently.” Propositions are the building blocks of hypothesis testing in the field and for later theory construction. Research not only documents human action but also can sometimes formulate statements that predict it. This provides a transferable and generalizable quality in our findings to other comparable settings and contexts. And to clarify terminology further, all propositions are assertions, but not all assertions are propositions.

Particularizability—the search for specific and unique dimensions of action at a site and/or the specific and unique perspectives of an individual participant—is not intended to filter out trivial excess but to magnify the salient characteristics of local meaning. Although generalizable knowledge is difficult to formulate in qualitative inquiry since each naturalistic setting will contain its own unique set of social and cultural conditions, there will be some aspects of social action that are plausibly universal or “generic” across settings and perhaps even across time.

To work toward this, Erickson advocates that the interpretive researcher look for “concrete universals” by studying actions at a particular site in detail and then comparing those actions to actions at other sites that have also been studied in detail. The exhibit or display of these generalizable features is to provide a synoptic representation, or a view of the whole. What the researcher attempts to uncover is what is both particular and general at the site of interest, preferably from the perspective of the participants. It is from the detailed analysis of actions at a specific site that these universals can be concretely discerned, rather than abstractly constructed as in grounded theory.

In sum, assertion development is a qualitative data analytic strategy that relies on the researcher’s intense review of interview transcripts, field notes, documents, and other data to inductively formulate, with reasonable certainty, composite statements that credibly summarize and interpret participant actions and meanings and their possible representation of and transfer into broader social contexts and issues.

Qualitative Data Analysis Strategy: To Display

To display in QDA is to visually present the processes and dynamics of human or conceptual action represented in the data. Qualitative researchers use not only language but also illustrations to both analyze and display the phenomena and processes at work in the data. Tables, charts, matrices, flow diagrams, and other models and graphics help both you and your readers cognitively and conceptually grasp the essence and essentials of your findings. As you have seen thus far, even simple outlining of codes, categories, and themes is one visual tactic for organizing the scope of the data. Rich text, font, and format features such as italicizing, bolding, capitalizing, indenting, and bullet pointing provide simple emphasis to selected words and phrases within the longer narrative.

Think display was a phrase coined by methodologists Miles and Huberman ( 1994 ) to encourage the researcher to think visually as data were collected and analyzed. The magnitude of text can be essentialized into graphics for at-a-glance review. Bins in various shapes and lines of various thicknesses, along with arrows suggesting pathways and direction, render the study a portrait of action. Bins can include the names of codes, categories, concepts, processes, key participants, and/or groups.

As a simple example, Figure 29.1 illustrates the three categories’ interrelationship derived from process coding. It displays what could be the apex of this interaction, LIVING STRATEGICALLY, and its connections to THINKING STRATEGICALLY, which influences and affects SPENDING STRATEGICALLY.

Three categories’ interrelationship derived from process coding.

Figure 29.2 represents a slightly more complex (if not playful) model, based on the five major in vivo codes/categories generated from analysis. The graphic is used as a way of initially exploring the interrelationship and flow from one category to another. The use of different font styles, font sizes, and line and arrow thicknesses is intended to suggest the visual qualities of the participant’s language and his dilemmas—a way of heightening in vivo coding even further.

In vivo categories in rich text display

Accompanying graphics are not always necessary for a qualitative report. They can be very helpful for the researcher during the analytic stage as a heuristic for exploring how major ideas interrelate, but illustrations are generally included in published work when they will help supplement and clarify complex processes for readers. Photographs of the field setting or the participants (and only with their written permission) also provide evidentiary reality to the write-up and help your readers get a sense of being there.

Qualitative Data Analysis Strategy: To Narrate

To narrate in QDA is to create an evocative literary representation and presentation of the data in the form of creative nonfiction. All research reports are stories of one kind or another. But there is yet another approach to QDA that intentionally documents the research experience as story, in its traditional literary sense. Narrative inquiry serves to plot and story-line the participant’s experiences into what might be initially perceived as a fictional short story or novel. But the story is carefully crafted and creatively written to provide readers with an almost omniscient perspective about the participants’ worldview. The transformation of the corpus from database to creative nonfiction ranges from systematic transcript analysis to open-ended literary composition. The narrative, however, should be solidly grounded in and emerge from the data as a plausible rendering of social life.

The following is a narrative vignette based on interview transcript selections from the participant living through tough economic times:

Jack stood in front of the soft drink vending machine at work and looked almost worriedly at the selections. With both hands in his pants pockets, his fingers jingled the few coins he had inside them as he contemplated whether he could afford the purchase. Two dollars for a twenty-ounce bottle of Diet Coke. Two dollars. “I can practically get a two-liter bottle for that same price at the grocery store,” he thought. Then Jack remembered the upcoming dental surgery he needed—that would cost one thousand dollars—and the bottle of insulin and syringes he needed to buy for his diabetic, high maintenance cat—almost two hundred dollars. He sighed, took his hands out of his pockets, and walked away from the vending machine. He was skipping lunch that day anyway so he could stock up on dinner later at the cheap-but-filling all-you-can-eat Chinese buffet. He could get his Diet Coke there.

Narrative inquiry representations, like literature, vary in tone, style, and point of view. The common goal, however, is to create an evocative portrait of participants through the aesthetic power of literary form. A story does not always have to have a moral explicitly stated by its author. The reader reflects on personal meanings derived from the piece and how the specific tale relates to one’s self and the social world.

Qualitative Data Analysis Strategy: To Poeticize

To poeticize in QDA is to create an evocative literary representation and presentation of the data in poetic form. One approach to analyzing or documenting analytic findings is to strategically truncate interview transcripts, field notes, and other pertinent data into poetic structures. Like coding, poetic constructions capture the essence and essentials of data in a creative, evocative way. The elegance of the format attests to the power of carefully chosen language to represent and convey complex human experience.

In vivo codes (codes based on the actual words used by participants themselves) can provide imagery, symbols, and metaphors for rich category, theme, concept, and assertion development, in addition to evocative content for arts-based interpretations of the data. Poetic inquiry takes note of what words and phrases seem to stand out from the data corpus as rich material for reinterpretation. Using some of the participant’s own language from the interview transcript illustrated previously, a poetic reconstruction or “found poetry” might read as follows:

Scary Times

Scary times … spending more
  (another ding in my wallet)
a couple of thousand
  (another ding in my wallet)
insurance is just worthless
  (another ding in my wallet)
pick up the tab
  (another ding in my wallet)
not putting as much into savings
  (another ding in my wallet)
It all adds up.

Think twice:
  don’t really need
  skip

Think twice, think cheap:
  coupons
  bargains
  two-for-one
  free

Think twice, think cheaper:
  stock up
  all-you-can-eat
  (cheap—and filling)

It all adds up.

Anna Deavere Smith, a verbatim theatre performer, attests that people speak in forms of “organic poetry” in everyday life. Thus, in vivo codes can provide core material for poetic representation and presentation of lived experiences, potentially transforming the routine and mundane into the epic. Some researchers also find the genre of poetry to be the most effective way to compose original work that reflects their own fieldwork experiences and autoethnographic stories.

Qualitative Data Analysis Strategy: To Compute

To compute in QDA is to employ specialized software programs for qualitative data management and analysis. The acronym for computer-assisted qualitative data analysis software is CAQDAS. There are diverse opinions among practitioners in the field about the utility of such specialized programs for qualitative data management and analysis. The software, unlike statistical computation, does not actually analyze data for you at higher conceptual levels. These CAQDAS software packages serve primarily as a repository for your data (both textual and visual) that enables you to code them, and they can perform such functions as calculating the number of times a particular word or phrase appears in the data corpus (a particularly useful function for content analysis) and can display selected facets after coding, such as possible interrelationships. Basic software such as Microsoft Word and Excel provides utilities that can store and, with some preformatting and strategic entry, organize qualitative data to enable the researcher’s analytic review. The following Internet addresses are listed to help in exploring selected CAQDAS packages and obtaining demonstration/trial software; video tutorials are available on the companies’ websites and on YouTube:

ATLAS.ti: http://www.atlasti.com

Dedoose: http://www.dedoose.com

HyperRESEARCH: http://www.researchware.com

MAXQDA: http://www.maxqda.com

NVivo: http://www.qsrinternational.com

QDA Miner: http://www.provalisresearch.com

Quirkos: http://www.quirkos.com

Transana: http://www.transana.com

V-Note: http://www.v-note.org
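The word- and phrase-counting function mentioned above (tallying how often a particular word or phrase appears in the data corpus) can be approximated in a few lines of general-purpose code. The snippet below is a minimal sketch rather than a CAQDAS replacement; the transcript excerpts, the stopword set, and the function names are hypothetical stand-ins for a real corpus:

```python
from collections import Counter
import re

# Hypothetical transcript excerpts; in practice these would be loaded from files.
transcripts = [
    "Insurance is just worthless, another ding in my wallet.",
    "I think twice before spending; coupons and bargains add up.",
    "Another ding in my wallet, so I stock up at the all-you-can-eat buffet.",
]

STOPWORDS = {"the", "a", "an", "in", "is", "at", "my", "i", "so", "and", "before"}

def term_frequencies(texts):
    """Count how often each word appears across all texts, ignoring stopwords."""
    counts = Counter()
    for text in texts:
        words = re.findall(r"[a-z']+", text.lower())
        counts.update(w for w in words if w not in STOPWORDS)
    return counts

def phrase_frequency(texts, phrase):
    """Count occurrences of an exact phrase across all texts."""
    return sum(text.lower().count(phrase.lower()) for text in texts)

print(term_frequencies(transcripts).most_common(5))
print(phrase_frequency(transcripts, "ding in my wallet"))  # -> 2
```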

Some qualitative researchers attest that the software is indispensable for qualitative data management, especially for large-scale studies. Others feel that the learning curve of most CAQDAS programs is too lengthy to be of pragmatic value, especially for small-scale studies. From my own experience, if you have an aptitude for picking up quickly on the scripts and syntax of software programs, explore one or more of the packages listed. If you are a novice to qualitative research, though, I recommend working manually or “by hand” for your first project so you can focus exclusively on the data and not on the software.

Qualitative Data Analysis Strategy: To Verify

To verify in QDA is to administer an audit of “quality control” to your analysis. After your data analysis and the development of key findings, you may be thinking to yourself, “Did I get it right?” “Did I learn anything new?” Reliability and validity are terms and constructs of the positivist quantitative paradigm that refer to the replicability and accuracy of measures. But in the qualitative paradigm, other constructs are more appropriate.

Credibility and trustworthiness (Lincoln & Guba, 1985 ) are two factors to consider when collecting and analyzing the data and presenting your findings. In our qualitative research projects, we must present a convincing story to our audiences that we “got it right” methodologically. In other words, the amount of time we spent in the field, the number of participants we interviewed, the analytic methods we used, the thinking processes evident to reach our conclusions, and so on should be “just right” to assure the reader that we have conducted our jobs soundly. But remember that we can never conclusively prove something; we can only, at best, convincingly suggest. Research is an act of persuasion.

Credibility in a qualitative research report can be established in several ways. First, citing the key writers of related works in your literature review is essential. Seasoned researchers will sometimes assess whether a novice has “done her homework” by reviewing the bibliography or references. You need not list everything that seminal writers have published about a topic, but their names should appear at least once as evidence that you know the field’s key figures and their work.

Credibility can also be established by specifying the particular data analysis methods you employed (e.g., “Interview transcripts were taken through two cycles of process coding, resulting in three primary categories”), through corroboration of data analysis with the participants themselves (e.g., “I asked my participants to read and respond to a draft of this report for their confirmation of accuracy and recommendations for revision”), or through your description of how data and findings were substantiated (e.g., “Data sources included interview transcripts, participant observation field notes, and participant response journals to gather multiple perspectives about the phenomenon”).

Statistician W. Edwards Deming is often credited with this cautionary advice about making a convincing argument: “Without data, you’re just another person with an opinion.” Thus, researchers can also support their findings with relevant, specific evidence by quoting participants directly and/or including field note excerpts from the data corpus. These serve both as illustrative examples for readers and as more credible testimony of what happened in the field.

Trustworthiness, or providing credibility to the writing, is when we inform the reader of our research processes. Some make the case by stating the duration of fieldwork (e.g., “Forty-five clock hours were spent in the field”; “The study extended over a 10-month period”). Others put forth the amounts of data they gathered (e.g., “Sixteen individuals were interviewed”; “My field notes totaled 157 pages”). Sometimes trustworthiness is established when we are up front or confessional with the analytic or ethical dilemmas we encountered (e.g., “It was difficult to watch the participant’s teaching effectiveness erode during fieldwork”; “Analysis was stalled until I recoded the entire data corpus with a new perspective”).

The bottom line is that credibility and trustworthiness are matters of researcher honesty and integrity . Anyone can write that he worked ethically, rigorously, and reflexively, but only the writer will ever know the truth. There is no shame if something goes wrong with your research. In fact, it is more than likely the rule, not the exception. Work and write transparently to achieve credibility and trustworthiness with your readers.

The length of this chapter does not enable me to expand on other QDA strategies such as to conceptualize, theorize, and write. Yet there are even more subtle thinking strategies to employ throughout the research enterprise, such as to synthesize, problematize, and create. Each researcher has his or her own ways of working, and deep reflexivity (another strategy) on your own methodology and methods as a qualitative inquirer throughout fieldwork and writing provides you with metacognitive awareness of data analysis processes and possibilities.

Data analysis is one of the most elusive practices in qualitative research, perhaps because it is a backstage, behind-the-scenes, in-your-head enterprise. It is not that there are no models to follow. It is just that each project is contextual and case specific. The unique data you collect from your unique research design must be approached with your unique analytic signature. It truly is a learning-by-doing process, so accept that and leave yourself open to discovery and insight as you carefully scrutinize the data corpus for patterns, categories, themes, concepts, assertions, propositions, and possibly new theories through strategic analysis.

Auerbach, C. F., & Silverstein, L. B. (2003). Qualitative data: An introduction to coding and analysis. New York, NY: New York University Press.


Birks, M., & Mills, J. (2015). Grounded theory: A practical guide (2nd ed.). London, England: Sage.

Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development. Thousand Oaks, CA: Sage.

Bryant, A. (2017). Grounded theory and grounded theorizing: Pragmatism in research practice. New York, NY: Oxford University Press.

Bryant, A., & Charmaz, K. (Eds.). (2019). The Sage handbook of current developments in grounded theory. London, England: Sage.

Charmaz, K. (2014). Constructing grounded theory: A practical guide through qualitative analysis (2nd ed.). London, England: Sage.

Erickson, F. (1986). Qualitative methods in research on teaching. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 119–161). New York, NY: Macmillan.

Galman, S. C. (2013). The good, the bad, and the data: Shane the lone ethnographer’s basic guide to qualitative data analysis. Walnut Creek, CA: Left Coast Press.

Geertz, C. (1983). Local knowledge: Further essays in interpretive anthropology. New York, NY: Basic Books.

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage.

Saldaña, J. (2016). The coding manual for qualitative researchers (3rd ed.). London, England: Sage.

Saldaña, J., & Omasta, M. (2018). Qualitative research: Analyzing life. Thousand Oaks, CA: Sage.

Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage.

Stern, P. N., & Porr, C. J. (2011). Essentials of accessible grounded theory. Walnut Creek, CA: Left Coast Press.

Strauss, A. L. (1987). Qualitative analysis for social scientists. Cambridge, England: Cambridge University Press.

Sunstein, B. S., & Chiseri-Strater, E. (2012). FieldWorking: Reading and writing research (4th ed.). Boston, MA: Bedford/St. Martin’s.

Wertz, F. J., Charmaz, K., McMullen, L. M., Josselson, R., Anderson, R., & McSpadden, E. (2011). Five ways of doing qualitative analysis: Phenomenological psychology, grounded theory, discourse analysis, narrative research, and intuitive inquiry. New York, NY: Guilford Press.


Qualitative data analysis: a practical example

Helen Noble (School of Nursing and Midwifery, Queen's University Belfast) and Joanna Smith (Department of Health Sciences, University of Huddersfield)

https://doi.org/10.1136/eb-2013-101603

The aim of this paper is to equip readers with an understanding of the principles of qualitative data analysis and offer a practical example of how analysis might be undertaken in an interview-based study.

What is qualitative data analysis?

What are the approaches in undertaking qualitative data analysis?

Although qualitative data analysis is inductive and focuses on meaning, approaches in analysing data are diverse with different purposes and ontological (concerned with the nature of being) and epistemological (knowledge and understanding) underpinnings. 2 Identifying an appropriate approach to analysing qualitative data to meet the aim of a study can be challenging. One way to understand qualitative data analysis is to consider the processes involved. 3 Approaches can be divided into four broad groups: quasistatistical approaches such as content analysis; the use of frameworks or matrices such as a framework approach and thematic analysis; interpretative approaches that include interpretative phenomenological analysis and grounded theory; and sociolinguistic approaches such as discourse analysis and conversation analysis. However, there are commonalities across approaches. Data analysis is an interactive process, where data are systematically searched and analysed in order to provide an illuminating description of phenomena; for example, the experience of carers supporting dying patients with renal disease 4 or student nurses’ experiences following assignment referral. 5 Data analysis is an iterative or recurring process, essential to the creativity of the analysis, development of ideas, clarifying meaning and the reworking of concepts as new insights ‘emerge’ or are identified in the data.

Do you need data software packages when analysing qualitative data?

Qualitative data software packages are not a prerequisite for undertaking qualitative analysis but a range of programmes are available that can assist the qualitative researcher. Software programmes vary in design and application but can be divided into text retrievers, code and retrieve packages and theory builders. 6 NVivo and NUD*IST are widely used because they have sophisticated code and retrieve functions and modelling capabilities, which speed up the process of managing large data sets and data retrieval. Repetitions within data can be quantified and memos and hyperlinks attached to data. Analytical processes can be mapped and tracked and linkages across data visualised leading to theory development. 6 Disadvantages of using qualitative data software packages include the complexity of the software and some programmes are not compatible with standard text format. Extensive coding and categorising can result in data becoming unmanageable and researchers may find visualising data on screen inhibits conceptualisation of the data.

How do you begin analysing qualitative data?

Despite the diversity of qualitative methods, the subsequent analysis is based on a common set of principles and for interview data includes: transcribing the interviews; immersing oneself within the data to gain detailed insights into the phenomena being explored; developing a data coding system; and linking codes or units of data to form overarching themes/concepts, which may lead to the development of theory. 2 Identifying recurring and significant themes, whereby data are methodically searched to identify patterns in order to provide an illuminating description of a phenomenon, is a central skill in undertaking qualitative data analysis. Table 1 contains an extract of data taken from a research study which included interviews with carers of people with end-stage renal disease managed without dialysis. The extract is taken from a carer who is trying to understand why her mother was not offered dialysis. The first stage of data analysis involves the process of initial coding, whereby each line of the data is considered to identify keywords or phrases; these are sometimes known as in vivo codes (highlighted) because they retain participants’ words.


Table 1. Data extract containing units of data and line-by-line coding

When transcripts have been broken down into manageable sections, the researcher sorts and sifts them, searching for types, classes, sequences, processes, patterns or wholes. The next stage of data analysis involves bringing similar categories together into broader themes. Table 2 provides an example of the early development of codes and categories and how these link to form broad initial themes.

Table 2. Development of initial themes from descriptive codes

Table 3 presents an example of further category development leading to final themes which link to an overarching concept.

Table 3. Development of final themes and overarching concept

How do qualitative researchers ensure data analysis procedures are transparent and robust?

In congruence with quantitative researchers, ensuring qualitative studies are methodologically robust is essential. Qualitative researchers need to be explicit in describing how and why they undertook the research. However, qualitative research is criticised for lacking transparency in relation to the analytical processes employed, which hinders the ability of the reader to critically appraise study findings. 7 In the three tables presented the progress from units of data to coding to theme development is illustrated. ‘Not involved in treatment decisions’ appears in each table and informs one of the final themes. Documenting the movement from units of data to final themes allows for transparency of data analysis. Although other researchers may interpret the data differently, appreciating and understanding how the themes were developed is an essential part of demonstrating the robustness of the findings. Qualitative researchers must demonstrate rigour, associated with openness, relevance to practice and congruence of the methodological approach. 2 In summary, qualitative research is complex in that it produces large amounts of data, and analysis is time consuming. High-quality data analysis requires a researcher with expertise, vision and veracity.
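The audit trail the authors describe, from units of data to codes to final themes, can also be represented as a simple nested data structure. The sketch below is purely illustrative: only the code "Not involved in treatment decisions" comes from the worked example above, and every other label and data unit is a hypothetical placeholder, not a quote from the study.

```python
# Hypothetical audit trail from line-by-line codes to a final theme.
# Only "Not involved in treatment decisions" is taken from the example above;
# the theme name, the second code, and the data units are invented placeholders.
analysis = {
    "theme": "Exclusion from the care pathway",              # hypothetical final theme
    "categories": {
        "Not involved in treatment decisions": [              # code named in the example
            "placeholder data unit: carer describes not being consulted",
            "placeholder data unit: decision seemed already made",
        ],
        "Seeking information": [                               # hypothetical category
            "placeholder data unit: carer searches for explanations",
        ],
    },
}

# Walking the structure reproduces the transparency the authors describe:
for category, units in analysis["categories"].items():
    print(f"{analysis['theme']} <- {category}: {len(units)} unit(s) of data")
```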





Data Analysis in Research: Types & Methods


Definition of research in data analysis: According to LeCompte and Schensul, research data analysis is a process used by researchers to reduce data to a story and interpret it to derive insights. The data analysis process helps reduce a large chunk of data into smaller, meaningful fragments.

Three essential things occur during the data analysis process. The first is data organization. The second is data reduction through summarization and categorization, which helps find patterns and themes in the data for easy identification and linking. The third is the analysis itself, which researchers approach in both top-down and bottom-up fashion.


On the other hand, Marshall and Rossman describe data analysis as a messy, ambiguous, and time-consuming but creative and fascinating process through which a mass of collected data is brought to order, structure and meaning.

We can say that data analysis and data interpretation is a process that applies deductive and inductive logic to the research data.

Researchers rely heavily on data as they have a story to tell or research problems to solve. It starts with a question, and data is nothing but an answer to that question. But, what if there is no question to ask? Well! It is possible to explore data even without a problem – we call it ‘Data Mining’, which often reveals some interesting patterns within the data that are worth exploring.

Regardless of the type of data researchers explore, their mission and their audience’s vision guide them to find the patterns that shape the story they want to tell. One of the essential things expected from researchers while analyzing data is to stay open and remain unbiased toward unexpected patterns, expressions, and results. Remember, sometimes data analysis tells the most unforeseen yet exciting stories that were not expected when initiating the analysis. Therefore, rely on the data you have at hand and enjoy the journey of exploratory research.


Every kind of data describes something once a specific value is assigned to it. For analysis, these values need to be organized, processed, and presented in a given context to make them useful. Data can come in different forms; here are the primary data types.

  • Qualitative data: When the data presented has words and descriptions, we call it qualitative data. Although you can observe this data, it is subjective and harder to analyze in research, especially for comparison. Example: anything describing taste, experience, texture, or an opinion is qualitative data. This type of data is usually collected through focus groups, personal qualitative interviews, qualitative observation or open-ended questions in surveys.
  • Quantitative data: Any data expressed in numbers or numerical figures is called quantitative data. This type of data can be distinguished into categories, grouped, measured, calculated, or ranked. Example: age, rank, cost, length, weight, scores, etc. all come under this type of data. You can present such data in graphical format or charts, or apply statistical analysis methods to it. Outcomes Measurement Systems (OMS) questionnaires in surveys are a significant source of numeric data.
  • Categorical data: This is data presented in groups. However, an item included in the categorical data cannot belong to more than one group. Example: a person responding to a survey by indicating their living style, marital status, smoking habit, or drinking habit provides categorical data. A chi-square test is a standard method used to analyze this data.


Data analysis in qualitative research

Data analysis in qualitative research works a little differently from numerical data, as qualitative data is made up of words, descriptions, images, objects, and sometimes symbols. Getting insight from such unstructured information is a complicated process, which is why it is typically used for exploratory research and data analysis.

Although there are several ways to find patterns in textual information, a word-based method is the most relied-upon and widely used technique for research and data analysis. Notably, much of the data analysis process in qualitative research is manual. Here the researchers usually read the available data and find repetitive or commonly used words.

For example, while studying data collected from African countries to understand the most pressing issues people face, researchers might find  “food”  and  “hunger” are the most commonly used words and will highlight them for further analysis.


The keyword context is another widely used word-based technique. In this method, the researcher tries to understand the concept by analyzing the context in which the participants use a particular keyword.  

For example , researchers conducting research and data analysis for studying the concept of ‘diabetes’ amongst respondents might analyze the context of when and how the respondent has used or referred to the word ‘diabetes.’
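A keyword-in-context listing like the ‘diabetes’ example above can be produced with a short script. This is a minimal sketch assuming the responses are available as plain strings; the responses, the function name, and the window size are hypothetical choices for illustration:

```python
import re

# Hypothetical open-ended responses; in practice these come from survey exports.
responses = [
    "My mother managed her diabetes with diet alone for years.",
    "The clinic never explained what diabetes would mean for her day to day.",
]

def keyword_in_context(texts, keyword, window=5):
    """Return each occurrence of `keyword` with `window` words on either side."""
    results = []
    for text in texts:
        words = text.split()
        for i, word in enumerate(words):
            if re.sub(r"\W", "", word).lower() == keyword.lower():
                left = " ".join(words[max(0, i - window):i])
                right = " ".join(words[i + 1:i + 1 + window])
                results.append((left, word, right))
    return results

for left, kw, right in keyword_in_context(responses, "diabetes"):
    print(f"... {left} [{kw}] {right} ...")
```

Reading the surrounding words for every hit is what lets the researcher judge how the keyword is actually being used, rather than just how often it appears.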

The scrutiny-based technique is another highly recommended text analysis method used to identify patterns in qualitative data. Compare and contrast is the most widely used method under this technique; it is used to work out how one piece of text is similar to or different from another.

For example: To find out the “importance of a resident doctor in a company,” the collected data is divided into people who think it is necessary to hire a resident doctor and those who think it is unnecessary. Compare and contrast is the best method for analyzing polls with single-answer question types.

Metaphors can be used to reduce the data pile and find patterns in it so that it becomes easier to connect data with theory.

Variable Partitioning is another technique used to split variables so that researchers can find more coherent descriptions and explanations from the enormous data.


There are several techniques to analyze the data in qualitative research, but here are some commonly used methods:

  • Content Analysis: It is widely accepted and the most frequently employed technique for data analysis in research methodology. It can be used to analyze documented information from text, images, and sometimes physical items. When and where to use this method depends on the research questions.
  • Narrative Analysis: This method is used to analyze content gathered from various sources such as personal interviews, field observation, and surveys. Most of the time, the stories or opinions shared by people are examined to find answers to the research questions.
  • Discourse Analysis: Similar to narrative analysis, discourse analysis is used to analyze interactions with people. Nevertheless, this particular method considers the social context in which the communication between the researcher and respondent takes place. In addition, discourse analysis also considers the respondent's lifestyle and day-to-day environment when deriving any conclusion.
  • Grounded Theory: When you want to explain why a particular phenomenon happened, grounded theory is the best resort for analyzing qualitative data. Grounded theory is applied to study data about a host of similar cases occurring in different settings. When researchers use this method, they may alter their explanations or produce new ones until they arrive at a conclusion.


Data analysis in quantitative research

The first stage in research and data analysis is to prepare the data for analysis so that the nominal data can be converted into something meaningful. Data preparation consists of the phases below.

Phase I: Data Validation

Data validation is done to check whether the collected data sample meets the pre-set standards or is a biased sample. It is divided into four different stages:

  • Fraud: To ensure an actual human being records each response to the survey or the questionnaire
  • Screening: To make sure each participant or respondent is selected or chosen in compliance with the research criteria
  • Procedure: To ensure ethical standards were maintained while collecting the data sample
  • Completeness: To ensure that the respondent has answered all the questions in an online survey, or that the interviewer asked all the questions devised in the questionnaire.

Phase II: Data Editing

More often than not, an extensive research data sample comes loaded with errors. Respondents sometimes fill in some fields incorrectly or skip them accidentally. Data editing is a process wherein the researchers confirm that the provided data is free of such errors. They conduct the necessary consistency and outlier checks to edit the raw data and make it ready for analysis.

Phase III: Data Coding

Out of all three, this is the most critical phase of data preparation associated with grouping and assigning values to the survey responses . If a survey is completed with a 1000 sample size, the researcher will create an age bracket to distinguish the respondents based on their age. Thus, it becomes easier to analyze small data buckets rather than deal with the massive data pile.
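The age-bracketing step described above can be sketched with pandas, assuming it is available; the ages, bin edges, and labels below are hypothetical and would normally follow the study's own coding frame:

```python
import pandas as pd

# Hypothetical respondent ages from a survey export.
ages = pd.Series([19, 24, 31, 45, 52, 67, 73])

# Assign each respondent to an age bracket so responses can be analyzed per bucket.
age_groups = pd.cut(ages, bins=[0, 24, 34, 44, 54, 64, 120],
                    labels=["18-24", "25-34", "35-44", "45-54", "55-64", "65+"])

print(age_groups.value_counts().sort_index())
```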


After the data is prepared for analysis, researchers are open to using different research and data analysis methods to derive meaningful insights. Statistical analysis is by far the most favored approach for analyzing numerical data. In statistical analysis, distinguishing between categorical data and numerical data is essential, as categorical data involves distinct categories or labels, while numerical data consists of measurable quantities. The methods fall into two groups: descriptive statistics, which are used to describe the data, and inferential statistics, which help in comparing and generalizing from the data.

Descriptive statistics

This method is used to describe the basic features of the various types of data in research. It presents the data in such a meaningful way that patterns in the data start to make sense. Nevertheless, descriptive analysis does not go beyond summarizing the data; any conclusions drawn are still based on the hypotheses researchers have formulated so far. Here are a few major types of descriptive analysis methods (a short computational sketch follows the lists below).

Measures of Frequency

  • Count, Percent, Frequency
  • It is used to denote how often a particular event occurs.
  • Researchers use it when they want to showcase how often a response is given.

Measures of Central Tendency

  • Mean, Median, Mode
  • These measures are widely used to describe the central point of a distribution.
  • Researchers use this method when they want to show the most common or average response.

Measures of Dispersion or Variation

  • Range, Variance, Standard deviation
  • The range is the difference between the highest and lowest observed scores.
  • Variance and standard deviation are based on the differences between each observed score and the mean.
  • These measures are used to identify the spread of scores by stating intervals.
  • Researchers use this method to show how spread out the data are, and how much that spread affects the mean.

Measures of Position

  • Percentile ranks, Quartile ranks
  • It relies on standardized scores helping researchers to identify the relationship between different scores.
  • It is often used when researchers want to compare scores with the average count.
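As referenced above, the four families of descriptive measures can be computed directly with Python's standard library. This is a minimal sketch using a hypothetical set of survey scores:

```python
import statistics
from collections import Counter

# Hypothetical survey scores (e.g., satisfaction ratings on a 1-10 scale).
scores = [7, 8, 8, 9, 6, 7, 8, 10, 5, 8]

# Measures of frequency: how often each response was given.
print("Frequencies:", Counter(scores))

# Measures of central tendency.
print("Mean:", statistics.mean(scores))
print("Median:", statistics.median(scores))
print("Mode:", statistics.mode(scores))

# Measures of dispersion or variation.
print("Range:", max(scores) - min(scores))
print("Variance:", statistics.variance(scores))
print("Standard deviation:", statistics.stdev(scores))

# Measures of position: quartile cut points.
print("Quartiles:", statistics.quantiles(scores, n=4))
```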

For quantitative research, descriptive analysis often gives absolute numbers, but those numbers alone are rarely sufficient to explain the rationale behind them. Nevertheless, it is necessary to think about which method of analysis best suits your survey questionnaire and the story you want to tell. For example, the mean is the best way to demonstrate students’ average scores in schools. It is better to rely on descriptive statistics when researchers intend to keep the research or outcome limited to the provided sample without generalizing it. For example, when you want to compare the average votes cast in two different cities, descriptive statistics are enough.

Descriptive analysis is also called a ‘univariate analysis’ since it is commonly used to analyze a single variable.

Inferential statistics

Inferential statistics are used to make predictions about a larger population after analyzing a sample collected to represent that population. For example, you can ask some 100-odd audience members at a movie theater whether they like the movie they are watching. Researchers then use inferential statistics on the collected sample to reason that about 80-90% of people like the movie (a minimal computational sketch of this estimate follows the list below).

Here are two significant areas of inferential statistics.

  • Estimating parameters: It takes statistics from the sample research data and demonstrates something about the population parameter.
  • Hypothesis test: It’s about sampling research data to answer the survey research questions. For example, researchers might be interested to understand if the new shade of lipstick recently launched is good or not, or if multivitamin capsules help children perform better at games.
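Estimating a parameter from a sample, as in the movie-theatre example above, can be sketched with the standard library. The figures below (85 "likes" out of 100 respondents) are hypothetical, and the normal-approximation interval is only one of several ways to compute this:

```python
from statistics import NormalDist

# Hypothetical result: 85 of 100 sampled viewers said they liked the movie.
liked, n = 85, 100
p_hat = liked / n

# Estimating a parameter: 95% confidence interval for the population proportion
# using the normal approximation.
z = NormalDist().inv_cdf(0.975)
margin = z * (p_hat * (1 - p_hat) / n) ** 0.5
print(f"Estimated share who like the movie: {p_hat:.0%} "
      f"(95% CI {p_hat - margin:.0%} to {p_hat + margin:.0%})")
```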

These are sophisticated analysis methods used to showcase the relationship between different variables instead of describing a single variable. It is often used when researchers want something beyond absolute numbers to understand the relationship between variables.

Here are some of the commonly used methods for data analysis in research.

  • Correlation: When researchers are not conducting experimental or quasi-experimental research but are interested in understanding the relationship between two or more variables, they opt for correlational research methods.
  • Cross-tabulation: Also called contingency tables, cross-tabulation is used to analyze the relationship between multiple variables. Suppose the data has age and gender categories presented in rows and columns; a two-dimensional cross-tabulation shows the number of males and females in each age category (see the sketch after this list).
  • Regression analysis: To understand the strength of the relationship between two variables, researchers most often turn to regression analysis, which is also a type of predictive analysis. In this method there is a dependent variable and one or more independent variables, and the aim is to find out the impact of the independent variables on the dependent variable. The values of both are assumed to have been recorded in an error-free, random manner.
  • Frequency tables: Frequency tables summarize how often each value or category occurs in the data, giving a simple overview of the distribution of responses.
  • Analysis of variance (ANOVA): This statistical procedure tests the degree to which two or more groups vary or differ in an experiment. A considerable degree of variation means the research findings are significant. In many contexts, ANOVA testing and variance analysis are used interchangeably.
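As referenced in the cross-tabulation bullet above, here is a minimal sketch of a two-dimensional cross-tabulation and a simple correlation using pandas, assuming it is available; the data frame, column names, and values are hypothetical:

```python
import pandas as pd

# Hypothetical survey records with age group, gender, and two numeric measures.
df = pd.DataFrame({
    "age_group": ["18-24", "18-24", "25-34", "25-34", "35-44", "35-44"],
    "gender":    ["F", "M", "F", "M", "F", "M"],
    "satisfaction": [8, 7, 9, 6, 7, 8],
    "spend":        [120, 90, 150, 80, 110, 130],
})

# Cross-tabulation: number of respondents per age group and gender.
print(pd.crosstab(df["age_group"], df["gender"]))

# Correlation between the two numeric variables.
print(df["satisfaction"].corr(df["spend"]))
```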
Considerations in research data analysis

  • Researchers must have the necessary research skills to analyze and manipulate the data, and should be trained to demonstrate a high standard of research practice. Ideally, researchers should possess more than a basic understanding of the rationale for selecting one statistical method over another to obtain better data insights.
  • Usually, research and data analytics projects differ by scientific discipline; therefore, getting statistical advice at the beginning of the analysis helps design the survey questionnaire, select data collection methods, and choose samples.


  • The primary aim of research data analysis is to derive insights that are unbiased. Any mistake, or any bias, in collecting data, selecting an analysis method, or choosing an audience sample will lead to a biased inference.
  • No amount of sophistication in the analysis can rectify poorly defined objective outcome measurements. Whether the design is at fault or the intentions are unclear, a lack of clarity can mislead readers, so avoid the practice.
  • The motive behind data analysis in research is to present accurate and reliable data. As far as possible, avoid statistical errors, and find a way to deal with everyday challenges like outliers, missing data, data altering, data mining, or developing graphical representations.

The sheer amount of data generated daily is staggering, especially now that data analysis has taken center stage. In 2018 alone, the total data supply amounted to 2.8 trillion gigabytes. Hence, it is clear that enterprises willing to survive in the hypercompetitive world must possess an excellent capability to analyze complex research data, derive actionable insights, and adapt to new market needs.





Chapter 10: Qualitative Data Collection & Analysis Methods

10.5 Analysis of Qualitative Interview Data

Analysis of qualitative interview data typically begins with a set of transcripts of the interviews conducted. Obtaining said transcripts requires either having taken exceptionally good notes during an interview or, preferably, having recorded the interview and then transcribed it. To transcribe an interview means to create a complete, written copy of the recorded interview by playing the recording back and typing in each word that is spoken on the recording, noting who spoke which words. In general, it is best to aim for a verbatim transcription, i.e., one that reports word for word exactly what was said in the recorded interview. If possible, it is also best to include nonverbal responses in the written transcription of an interview (if the interview is completed face-to-face, or some other form of visual contact is maintained, such as with Skype). Gestures made by respondents should be noted, as should the tone of voice and notes about when, where, and how spoken words may have been emphasized by respondents.

If you have the time, it is best to transcribe your interviews yourself. If the researcher who conducted the interviews transcribes them herself, that person will also be able to record associated nonverbal behaviors and interactions that may be relevant to analysis but that could not be picked up by audio recording. Interviewees may roll their eyes, wipe tears from their face, and even make obscene gestures that speak volumes about their feelings; however, such non-verbal gestures are not captured by an audio recording, so being able to remember them and note them in writing while transcribing the interviews is invaluable.

Overall, the goal of analysis is to reach some inferences, lessons, or conclusions by condensing large amounts of data into relatively smaller, more manageable bits of understandable information. Analysis of qualitative interview data often works inductively (Glaser & Strauss, 1967; Patton, 2001). To move from the specific observations an interviewer collects to identifying patterns across those observations, qualitative interviewers will often begin by reading through transcripts of their interviews and trying to identify codes. A code is a shorthand representation of some more complex set of issues or ideas. The process of identifying codes in one’s qualitative data is often referred to as coding . Coding involves identifying themes across interview data by reading and re-reading (and re-reading again) interview transcripts, until the researcher has a clear idea about what sorts of themes come up across the interviews. Coding helps to achieve the goal of data management and data reduction (Palys & Atchison, 2014, p. 304).

Coding can be inductive or deductive. Deductive coding is the approach used by research analysts who have a well-specified or pre-defined set of interests (Palys & Atchison, 2014, P. 304). The process of deductive coding begins with the analyst utilizing those specific or pre-defined interests to identify “relevant” passages, quotes, images, scenes, etc., to develop a set of preliminary codes (often referred to as descriptive coding ). From there, the analyst elaborates on these preliminary codes, making finer distinctions within each coding category (known as interpretative coding ). Pattern coding is another step an analyst might take as different associations become apparent. For example, if you are studying at-risk behaviours in youth, and you discover that the various behaviours have different characteristics and meanings depending upon the social context (e.g., school, family, work) in which the various behaviours occur, you have identified a pattern (Palys & Atchison, 2014, p. 304).

In contrast, inductive coding begins with the identification of general themes and ideas that emerge as the researcher reads through the data. This process is also referred to as open coding (Palys & Atchison, 2014, p. 305), because it will probably require multiple analyses. As you read through your transcripts, it is likely that you will begin to see some commonalities across the categories or themes that you’ve jotted down (Saylor Academy, 2012). The open coding process can go one of two ways: either the researcher elaborates on a category by making finer, and then even finer distinctions, or the researcher starts with a very specific descriptive category that is subsequently collapsed into another category (Palys & Atchison, 2014, p. 305). In other words, the development and elaboration of codes arise out of the material that is being examined.

The next step for the research analyst is to begin more specific coding, which is known as focused or axial coding . Focused coding involves collapsing or narrowing themes and categories identified in open coding by reading through the notes you made while conducting open coding, identifying themes or categories that seem to be related, and perhaps merging some. Then give each collapsed/merged theme or category a name (or code) and identify passages of data that fit each named category or theme. To identify passages of data that represent your emerging codes, you will need to read through your transcripts several times. You might also write up brief definitions or descriptions of each code. Defining codes is a way of giving meaning to your data, and developing a way to talk about your findings and what your data means (Saylor Academy, 2012).

As tedious and laborious as it might seem to read through hundreds of pages of transcripts multiple times, sometimes getting started with the coding process is actually the hardest part. If you find yourself struggling to identify themes at the open coding stage, ask yourself some questions about your data. The answers should give you a clue about what sorts of themes or categories you are reading (Saylor Academy, 2012). Lofland and Lofland (1995) identify a set of questions that are useful when coding qualitative data. They suggest asking the following:

  • Of what topic, unit, or aspect is this an instance?
  • What question about a topic does this item of data suggest?
  • What sort of answer to a question about a topic does this item of data suggest (i.e., what proposition is suggested)?

Asking yourself these questions about the passages of data that you are reading can help you begin to identify and name potential themes and categories.

Table 10.3, “Interview coding,” presents an example drawn from research with child-free adults (Saylor Academy, 2012). It shows two codes that emerged from the inductive analysis of the interview transcripts, along with a brief description of each code and a few (of many) interview excerpts from which each code was developed.

Table 10.3 Interview coding

Just as quantitative researchers rely on the assistance of special computer programs designed to help sort through and analyze their data, so do qualitative researchers. Where quantitative researchers have SPSS and MicroCase (and many others), qualitative researchers have programs such as NVivo ( http://www.qsrinternational.com ) and ATLAS.ti ( http://www.atlasti.com ). These are programs specifically designed to assist qualitative researchers in organizing, managing, sorting, and analyzing large amounts of qualitative data. The programs allow researchers to import interview transcripts contained in an electronic file and then label or code passages, cut and paste passages, search for various words or phrases, and organize complex interrelationships among passages and codes.
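The code-and-retrieve workflow these packages support can be illustrated with a toy example. This is a minimal sketch only: the codes, transcript identifiers, and passages below are hypothetical placeholders rather than data from any study, and a dedicated package adds far more (memos, visual models, multi-coder support) on top of this basic logic.

```python
from collections import defaultdict

# Hypothetical coded segments: (code, transcript id, passage).
coded_segments = [
    ("money talk", "interview_01", "We argue about money almost every week."),
    ("time for money", "interview_01", "I picked up extra shifts to cover the bills."),
    ("money talk", "interview_02", "My partner and I budget together every Sunday."),
]

# Build a code -> passages index, the core of a code-and-retrieve workflow.
index = defaultdict(list)
for code, source, passage in coded_segments:
    index[code].append((source, passage))

def retrieve(code):
    """Return all passages tagged with a given code."""
    return index.get(code, [])

def search(term):
    """Return all coded passages containing a word or phrase."""
    return [(c, s, p) for c, s, p in coded_segments if term.lower() in p.lower()]

print(retrieve("money talk"))
print(search("money"))
```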

Research Methods for the Social Sciences: An Introduction Copyright © 2020 by Valerie Sheppard is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.


Qualitative Data Analysis Methodologies and Methods

Qualitative data analysis involves interpreting non-numerical data to identify patterns, themes, and insights. There are several methodologies and methods used in qualitative data analysis.


In this article, we will explore qualitative data analysis techniques in great detail, with each method providing a different perspective on how to interpret qualitative data.


Let's look at each of these methodologies in turn.

1. Content Analysis

Content analysis involves systematically reading text or other forms of communication to identify patterns, themes, and meanings within the content. It provides a structured approach to examining large volumes of data to uncover insights or trends. Researchers categorize and code the content based on predetermined criteria or emergent themes, allowing for both quantitative and qualitative interpretation of the data. Content analysis is often an iterative process, with researchers revisiting and refining the coding scheme, collecting additional data, or conducting further analysis as needed to deepen understanding or address new research questions.

There are three fundamental approaches to content analysis:

  • Conventional Content Analysis: In conventional content analysis, researchers approach the data without preconceived categories or theoretical frameworks. Instead, they allow categories and themes to emerge naturally from the data through an iterative process of coding and analysis. This approach is exploratory and flexible, allowing for the discovery of new insights and patterns within the content.
  • Directed Content Analysis: Directed content analysis involves analyzing the data based on existing theories or concepts. Researchers start with predefined categories or themes derived from theoretical frameworks or previous research findings. The analysis focuses on confirming, refining, or extending existing theories rather than discovering new ones. Directed content analysis is particularly useful when researchers aim to test hypotheses or explore specific concepts in the data.
  • Summative Content Analysis: Summative content analysis focuses on quantifying the presence or frequency of specific content within the data. Researchers develop predetermined categories or coding schemes based on predefined criteria, and then systematically code the data according to those categories. The emphasis is on counting occurrences of predefined attributes or themes to provide a numerical summary of the content (a minimal sketch of such a tally follows this list). Summative content analysis is often used to track changes over time, compare different sources of content, or assess the prevalence of particular themes within a dataset.
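As noted in the summative bullet above, a predefined coding scheme can be tallied with a short script. This is a minimal sketch; the categories, keywords, and responses below are hypothetical placeholders:

```python
# Hypothetical predefined coding scheme: category -> keywords to count.
coding_scheme = {
    "cost concerns": ["price", "expensive", "cost", "afford"],
    "service quality": ["helpful", "rude", "support", "wait"],
}

# Hypothetical open-ended responses.
responses = [
    "The support team was helpful but the wait was long.",
    "Too expensive for what you get; I can't afford the price increase.",
]

# Summative tally: how often each predefined category's keywords occur.
tallies = {category: 0 for category in coding_scheme}
for response in responses:
    text = response.lower()
    for category, keywords in coding_scheme.items():
        tallies[category] += sum(text.count(keyword) for keyword in keywords)

print(tallies)
```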

When to Use Content Analysis?

  • Exploratory Research: Content analysis is appropriate for exploratory research where the goal is to uncover new insights, discover emerging trends, or understand the breadth of communication on a particular topic.
  • Comparative Analysis: It is useful for comparative analysis, allowing researchers to compare communication across different sources, time periods, or cultural contexts.
  • Historical Analysis: Content analysis can be applied in historical research, allowing researchers to analyze historical documents, media content, or archival materials to understand communication patterns over time.
  • Policy Analysis: It is valuable for policy analysis, helping researchers examine the portrayal of issues in media or public discourse and inform policy-making processes.
  • Market Research: Content analysis is commonly used in market research to examine advertising materials, social media content, and customer reviews, providing insights into consumer perceptions and preferences.

2. Thematic Analysis

Thematic analysis is a method for identifying, analyzing, and reporting patterns or themes within qualitative data. It involves systematically coding and categorizing data to identify common themes, patterns, or ideas that emerge from the dataset. Researchers engage in a process of inductive reasoning to generate themes that capture the essence of the data, allowing for interpretation and exploration of underlying meanings.

Thematic analysis is appropriate when researchers seek to identify, analyze, and report patterns or themes within qualitative data. It is especially useful for exploratory research where the aim is to uncover new insights or understand the breadth of experiences and perspectives related to a specific phenomenon.

Thematic analysis offers a flexible and systematic approach to identifying and analyzing patterns or themes within qualitative data, making it a valuable method for exploring complex phenomena and generating insights that inform theory, practice, and policy.

When to Use Thematic Analysis?

  • Psychology: Thematic analysis is used to explore psychological phenomena, such as coping mechanisms in response to stress, attitudes towards mental health, or experiences of trauma.
  • Education: Researchers apply thematic analysis to understand student perceptions of learning environments, teaching methods, or educational interventions.
  • Healthcare: Thematic analysis helps examine patient experiences with healthcare services, attitudes towards treatment options, or barriers to accessing healthcare.
  • Market Research: Thematic analysis is applied to analyze customer feedback, identify product preferences, or understand brand perceptions in market research studies.

Narrative analysis involves examining and interpreting the stories or narratives that people use to make sense of their experiences. It focuses on the structure, content, and meaning of narratives, and is particularly useful for exploring how individuals construct and communicate their identities, values, and beliefs through storytelling.

When to Use Narrative Analysis?

Narrative analysis is widely used across numerous disciplines, including sociology, psychology, anthropology, literary studies, and communication research. Some applications of narrative analysis in qualitative data analysis are:

  • Understanding Identity Construction : Narrative analysis can be used to explore how individuals construct their identities through the stories they tell about themselves. Researchers can examine the themes, plot structures, and language used in narratives to uncover how individuals perceive themselves and their place in the world.
  • Exploring Life Experiences : Researchers frequently use narrative analysis to study the lived experiences of individuals or groups. By examining the narratives shared by participants, researchers can gain insights into the challenges, triumphs, and significant events that shape people's lives.
  • Examining Cultural Meanings and Practices: Narrative analysis can provide valuable insights into cultural meanings and practices. By studying the stories shared within a particular cultural context, researchers can uncover shared values, beliefs, and norms that influence behavior and social interactions.
  • Exploring Trauma and Healing : Narrative analysis is commonly used in research on trauma and recovery processes. By studying the narratives of trauma survivors, researchers can explore how individuals make sense of their experiences, cope with adversity, and embark on journeys of recovery and resilience.
  • Analyzing Media and Popular Culture : Narrative analysis can also be applied to media texts, including films, television shows, and literature. Researchers can examine the narratives constructed within these texts to understand how they reflect and shape cultural beliefs, ideologies, and norms.

Narrative analysis offers a powerful approach for exploring the structure, content, and meaning of the narratives or stories people tell, providing insights into their lived experiences, identities, and perspectives. However, researchers need to navigate the interpretive subjectivity, time-intensive nature, and ethical considerations involved in analyzing narratives in qualitative research.

Discourse analysis examines the ways in which language is used to construct meaning, shape social interactions, and reproduce power relations within society. It focuses on analyzing spoken or written texts, as well as the broader social and cultural contexts in which communication occurs. Researchers explore how language reflects and shapes social norms, ideologies, and power dynamics.

Discourse analysis is employed when researchers seek to investigate social interactions, power dynamics, and identity construction through language. It is used to study how language shapes social relations, constructs identities, and reflects cultural norms and values.

When to Use Discourse Analysis?

  • Linguistics and Language Studies : Discourse analysis is foundational to linguistics and language research, where it is used to study language use, communication patterns, and discourse structures. Linguists conduct discourse analysis to investigate how language shapes social interactions, constructs identities, and reflects cultural norms, uncovering the underlying meanings, ideologies, and power dynamics embedded in language.
  • Media and Communication : Discourse analysis is applied in media and communication studies to examine media representations, discursive practices, and ideological frameworks. Researchers analyze media texts, news coverage, and political speeches, exploring how language constructs and disseminates social meanings and values. Discourse analysis informs media literacy efforts, media criticism, and media policy debates.
  • Political Science : Discourse analysis is used in political science to study political rhetoric, public discourse, and policymaking processes. Researchers analyze political speeches, party manifestos, and policy documents, examining how language constructs political identities, legitimizes authority, and shapes public opinion. Discourse analysis informs political communication strategies, political campaigning, and policy advocacy.

Grounded theory analysis is an inductive research approach used to develop theories or explanations based on empirical data. It involves systematically analyzing qualitative data to identify concepts, categories, and relationships that emerge from the data itself, rather than testing preconceived hypotheses. Researchers engage in a process of constant comparison and theoretical sampling to refine and develop theoretical insights.

Grounded theory analysis is employed when researchers seek to uncover patterns, relationships, and processes that emerge from the data itself, without imposing preconceived hypotheses or theoretical assumptions.

When to Use Grounded Theory Analysis?

Grounded theory analysis is applied across various disciplines and research contexts, such as:

  • Social Sciences Research : Grounded theory analysis is widely used in sociology, anthropology, psychology, and related disciplines to explore social phenomena such as group dynamics, social interactions, cultural practices, and societal structures.
  • Healthcare Research : In healthcare, grounded theory can be applied to understand patient experiences, provider-patient interactions, healthcare delivery processes, and the impact of healthcare policies on individuals and communities.
  • Organizational Studies : Researchers use grounded theory to examine organizational behavior, leadership, workplace culture, and employee dynamics. It helps in understanding how organizations function and how they can be improved.
  • Educational Research : In education, grounded theory analysis can be used to explore teaching and learning processes, student experiences, educational policies, and the effectiveness of educational interventions.

Text analysis involves examining written or verbal communication to extract meaningful insights or patterns. It encompasses techniques such as sentiment analysis, topic modeling, and keyword extraction. For example, in a study of customer reviews of a restaurant, text analysis might be used to identify recurring topics such as food quality, service experience, and atmosphere. The key components and techniques involved in text analysis are listed below; a short code sketch follows the list.

  • Sentiment Analysis : This technique involves determining the sentiment expressed in a piece of text, whether it is positive, negative, or neutral. Sentiment analysis algorithms use natural language processing (NLP) to analyze the words, phrases, and context within the text to infer the overall sentiment. For instance, in customer reviews of a restaurant, sentiment analysis could be used to gauge customer satisfaction levels based on the emotions expressed in the reviews.
  • Topic Modeling : Topic modeling is a statistical technique used to identify the underlying topics or themes present within a collection of documents or text data. It involves uncovering the latent patterns of co-occurring words or terms that constitute distinct topics. Techniques like Latent Dirichlet Allocation (LDA) and Latent Semantic Analysis (LSA) are commonly used for topic modeling. In the context of restaurant reviews, topic modeling could help identify common themes such as food quality, service experience, and cleanliness across a large corpus of reviews.
  • Keyword Extraction : Keyword extraction involves identifying and extracting the most relevant words or phrases from a piece of text that capture its essence or main topics. This technique helps summarize the key content or subjects discussed in the text. For example, in restaurant reviews, keyword extraction could surface frequently mentioned phrases like “delicious food,” “friendly staff,” or “long wait times,” providing a quick summary of customer sentiments and concerns.
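To make the topic-modeling idea above concrete, here is a minimal, illustrative sketch in Python. It assumes scikit-learn is installed, and the four restaurant reviews are invented examples rather than real data; for a real project you would preprocess the text more carefully and tune the number of topics.

```python
# A minimal topic-modeling sketch using scikit-learn's LDA implementation.
# The reviews below are invented examples for illustration only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

reviews = [
    "The food was delicious but the wait was long",
    "Friendly staff and great service",
    "Long wait times and cold food",
    "Great atmosphere, the staff were friendly",
]

# Convert the raw text into a document-term matrix of word counts.
vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(reviews)

# Fit an LDA model with two topics (a guess; tune n_components on real data).
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(dtm)

# Print the top words for each discovered topic.
terms = vectorizer.get_feature_names_out()
for idx, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:4]]
    print(f"Topic {idx}: {', '.join(top)}")
```

The same document-term matrix could also feed a keyword-extraction step, for example by ranking terms by frequency or TF-IDF weight.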

When to Use Text Analysis?

Text analysis has numerous applications across diverse domains, including:

  • Business and Marketing: Analyzing customer feedback, running sentiment analysis on social media posts, monitoring brands, and analyzing market trends.
  • Healthcare: Extracting medical information from clinical notes, analyzing patient feedback, and detecting adverse drug reactions from text data.
  • Social Sciences: Studying public discourse, political communication, opinion mining, and discourse analysis in social media.
  • Academic Research: Conducting literature reviews, analyzing research articles, and identifying emerging research topics and trends.
  • Customer Experience : Understanding customer sentiment, identifying product or service issues, and improving customer satisfaction through text-based feedback analysis.

Ethnographic analysis involves immersion in a particular cultural or social setting to understand the perspectives, behaviors, and interactions of the people within that context. Researchers conduct observations, interviews, and participant observation to gain insights into the culture, practices, and social dynamics of the community under study. It is suitable when researchers aim to gain an in-depth understanding of a particular cultural or social setting, and it is particularly valuable for studying complex social phenomena in their natural environment, where observations and interactions occur organically.

When to Use Ethnographic Analysis?

  • Cultural Understanding : Ethnographic analysis is ideal when researchers aim to gain deep insights into the culture, beliefs, and social practices of a particular group or community.
  • Behavioral Observation : It is useful when researchers want to observe and understand the behaviors, interactions, and daily activities of individuals in their natural environment.
  • Contextual Exploration : Ethnographic analysis is valuable for exploring the context and lived experiences of individuals, providing rich, detailed descriptions of their social and cultural worlds.
  • Complex Social Dynamics: It is suitable when studying complex social phenomena, or phenomena that are deeply embedded within social contexts, such as rituals, traditions, or community dynamics.
  • Qualitative Inquiry: Ethnographic analysis is preferred when researchers seek to conduct qualitative inquiry focused on understanding the subjective meanings and perspectives of individuals within their cultural context.

Ethnographic analysis offers a powerful approach for studying complex social phenomena in their natural context, yielding rich and nuanced insights into the cultural practices, social dynamics, and lived experiences of individuals within a particular community. However, researchers must carefully consider the time commitment, ethical considerations, and potential biases associated with ethnographic research.

Whichever method you choose, a few practices help keep the analysis rigorous:

  • Clearly Defined Research Question : Ground the analysis in a clear and focused research question. This guides data collection and keeps the analysis on track.
  • Systematic Coding : Develop a coding scheme to categorize data into meaningful themes or concepts, and use software tools to help organize and manage the codes.
  • Constant Comparison : Continuously compare new data with existing codes and themes to refine interpretations and ensure consistency.
  • Triangulation : Validate findings by using multiple data sources, methods, or researchers to corroborate results and enhance credibility.

Refine themes and interpretations by engaging in repeated cycles of gathering, coding, and analysis.

Qualitative data analysis techniques are an effective means of revealing deep insights and understanding intricate phenomena in both practice and research. Through rigorous analytical approaches, researchers can convert qualitative data into meaningful ideas, interpretations, and narratives that advance knowledge and support evidence-based decision-making.

Is it possible to mix quantitative and qualitative methodologies for data analysis?

A: Yes. Researchers often use mixed methods approaches, combining quantitative and qualitative techniques to triangulate results and gain a thorough understanding of their research questions.

How can I choose the best approach for analyzing qualitative data for my study?

A: To choose the best approach, take the research topic, the properties of the data, and the theoretical framework into consideration.

What strategies can I use to improve the reliability and validity of my qualitative data analysis?

A: Use peer debriefing and member checking to improve validity, and maintain transparency, reflexivity, and methodological coherence throughout the analytic process.


What is Qualitative Data Analysis Software (QDA Software)?


Qualitative Data Analysis Software (QDA software) allows researchers to organize, analyze and visualize their data, finding the patterns in qualitative or unstructured data: interviews, surveys, field notes, videos, audio files, images, journal articles, web content, etc.

Quantitative vs. Qualitative Data Analysis

What is the difference between quantitative and qualitative data analysis? As the name implies, quantitative data analysis has to do with numbers. For example, any time you are doing statistical analysis, you are doing quantitative data analysis. Some examples of quantitative data analysis software are SPSS, STATA, SAS, and Lumivero’s own powerful statistics software, XLSTAT.

In contrast, qualitative analysis "helps you understand people’s perceptions and experiences by systematically coding and analyzing the data", as described in Qualitative vs Quantitative Research 101 . It tends to deal more with words than numbers. It can be useful when working with a lot of rich and deep data and when you aren’t trying to test something very specific. Some examples of qualitative data analysis software are MAXQDA, ATLAS.ti, Quirkos, and Lumivero’s NVivo, the leading tool for qualitative data analysis .

When would you use each one? Well, qualitative data analysis is often used for exploratory research or developing a theory, whereas quantitative is better if you want to test a hypothesis, find averages, and determine relationships between variables. With quantitative research you often want a large sample size to get relevant statistics. In contrast, qualitative research, because so much data in the form of text is involved, can have much smaller sample sizes and still yield valuable insights.

Of course, it’s not always so cut and dried, and many researchers end up taking a “mixed methods” approach, meaning that they combine both types of research. In this case they might use a combination of both types of software programs.

Learn how some qualitative researchers use QDA software for text analysis in the on-demand webinar Twenty-Five Qualitative Researchers Share How-To's for Data Analysis .


How is Qualitative Data Analysis Software Used for Research?

Qualitative Data Analysis Software works with any qualitative research methodology a researcher uses. For example, a social scientist wanting to develop new concepts or theories may take a ‘grounded theory’ approach, while a researcher looking for ways to improve health policy or program design might use ‘evaluation methods’. QDA software analysis tools don't favor a particular methodology; they're designed to facilitate common qualitative techniques no matter what method you use.

NVivo can help you to manage, explore and find patterns in your data and conduct thematic and sentiment analysis, but it cannot replace your analytical expertise.

Qualitative Research as an Iterative Process

Handling qualitative and mixed methods data is not usually a step-by-step process. Instead, it tends to be an iterative process where you explore, code, reflect, memo, code some more, query and so on, moving back and forth through your data in QDA software, like NVivo, as your analysis develops.

How Do I Choose the Best Approach for My Research Project with QDA Software?

Every research project is unique — the way you organize and analyze the material depends on your methodology, data and research design.

Here are some example scenarios for handling different types of research projects in QDA software — these are just suggestions to get you up and running.

A study with interviews exploring stakeholder perception of a community arts program

Your files consist of unstructured interview documents. You would set up a case for each interview participant, then code to codes and cases. You could then explore your data with simple queries or charts and use memos to record your discoveries.


A study exploring community perceptions about climate change using autocoding with AI

Your files consist of structured, consistently formatted interviews (where each participant is asked the same set of questions). With AI, you could autocode the interviews and set up cases for each participant. Then code themes to query and visualize your data.


A literature review on adolescent depression

Your files consist of journal articles, books and web pages. You would classify your files before coding and querying them; and then you could critique each file in a memo. With Citavi integration in NVivo, you can import your Citavi references into NVivo.


A social media study of the language used by members of an online community

Your files consist of Facebook data captured with NCapture. You would import it as a dataset ready to code and query. Use memos to record your insights.


A quick analysis of a local government budget survey

Your file is a large dataset of survey responses. You would import it using the Survey Import Wizard, which prepares your data for analysis. As part of the import, choose to run automated insights with AI to identify and code to themes and sentiment so that you can quickly review results and report broad findings.


Ways to Get Started with Your Project with Qualitative Analysis Software

Since projects (and researchers) are unique there is no one 'best practice' approach to organizing and analyzing your data but there are some useful strategies to help you get up and running:

  • Start now - don't wait until you have collected all the data. Import your research design, grant application or thesis proposal.
  • Make a project journal and state your research questions and record your goals. Why are you doing the project? What is it about? What do you expect to find and why?
  • Make a mind map for your preliminary ideas. Show the relationships or patterns you expect to find in your data based on prior experience or preliminary reading.
  • Import your interviews, field notes, focus groups, and organize these files into folders for easy access.
  • Set up an initial code structure based on your early reading and ideas. You could run a Word Frequency query over your data to tease out the common themes for creating your code structure.
  • Set up cases for the people, places, or other case types in your project.
  • Explore your material and code themes as they emerge in your data, creating memos to describe your discoveries and interpretations.
  • To protect your work, get in the habit of making regular back-ups.

QDA Analysis Tools Help You Work Toward Outcomes that are Robust and Transparent

Using QDA software to organize and analyze your data also increases the 'transparency' of your research outcomes—for example, you can:

  • Demonstrate the evolution of your ideas in memos and maps.
  • Document your early preconceptions and biases (in a memo or map) and demonstrate how these have been acknowledged and tested.
  • Easily find illustrative quotes.
  • Always return to the original context of your coded material.
  • Save and revisit the queries and visualizations that helped you to arrive at your conclusions.

QDA software, like NVivo, can demonstrate the credibility of your findings in the following ways:

  • If you used NVivo for your literature review, run a query or create a chart to demonstrate how your findings compare with the views of other authors.
  • Was an issue or theme reported by more than one participant? Run a Matrix Coding query to see how many participants talked about a theme.
  • Were multiple methods used to collect the data (interviews, observations, surveys), and are the findings supported across these text and video data files? Run a Matrix Coding query to see how often a theme is reported across all your files.


  • If multiple researchers analyzed the material — were their findings consistent? Use coding stripes (or filter the contents in a code) to see how various team members have coded the material and run a Coding Comparison query to assess the level of agreement.


QDA Software Integrations

Many qualitative analysis software options have integration with other software to enhance your research process. NVivo integrates or can be used with the following software:

  • NVivo Transcription to save you time and jump start your qualitative data analysis. Learn how in the on-demand webinar Transcription – Go Beyond the Words .
  • Reference management software, like Lumivero’s Citavi, for reference management and writing. By combining Citavi and NVivo, you can create complex searches for specific keywords, terms, and categories using advanced search syntax such as wildcards, Boolean operators, and regular expressions. This integration takes your analyses beyond reference management by providing a central location to collect references and thoughts, analyze literature, and connect empirical data.
  • Statistical software, like Lumivero’s XLSTAT, SPSS, or STATA, to export your queries from NVivo and run statistical analysis.
  • Qualtrics or SurveyMonkey, to import your survey results into NVivo and start analyzing.

Make Choosing QDA Software Easy —  Try NVivo Today!

It's tough choosing QDA software! Test out NVivo, the most cited qualitative data analysis tool, by requesting a free 14-day trial of NVivo to start improving your qualitative and mixed methods research today.


How to analyze survey data: Survey data analysis best practices

Survey data analysis involves reviewing raw materials and transforming them into key insights. Learn how to analyze survey data and best practices here.


The results are in. You’ve written the questions, found the right people to ask, and got your answers back—now what?

Perfect surveys sent to insightful respondents can become entirely useless if the results aren't coherently and comprehensively analyzed.

So, don’t run and hide.

We know the phrase “survey data analysis” sounds technical and exclusive, but it’s not as scary as it seems. By following a few simple guidelines on how to analyze survey data, you can draw insights from it yourself.

What is survey data analysis?

Survey data analysis is the process of turning survey responses into compelling insights. This includes taking all of your collected data and transforming it into something actionable. Whether it’s open-ended essays, multiple-choice answers, or other question types, you can take this information and uncover patterns and helpful insights.

Some survey data analysis methods include sorting data into categories and using statistical tactics to identify trends and patterns. The goal is to take these raw data responses and turn them into a clear story that answers your research questions to help you make informed decisions.

Getting started with survey data analysis

Before you get started crunching the numbers and performing a survey data analysis, there are a few pieces of information you need to gather.

First, you need to know the number of total respondents. This number will indicate how large your sample is and how much you can rely on your results. It’s always a good idea to gather people’s opinions, but if 5,000 people attended a concert and only five people answered your survey, you can’t really treat those five answers as representative of the whole group.

Secondly, you need to calculate your survey response rate . This is a straightforward percentage—calculate it by dividing the number of responses you received in total by the number of people you asked to fill out the survey. The higher your response rate and the higher your total number of respondents, the more you can trust your survey data to be representative of the sample as a whole.
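As a quick illustration of that calculation, here is a minimal sketch in Python; the function name and the numbers (5,000 people invited, 350 responses) are made up for the example.

```python
# A minimal sketch of the response-rate calculation described above.
def response_rate(responses_received: int, people_invited: int) -> float:
    """Return the survey response rate as a percentage."""
    return responses_received / people_invited * 100

# Hypothetical numbers: 5,000 attendees invited, 350 responded.
print(f"Response rate: {response_rate(350, 5000):.1f}%")  # -> 7.0%
```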

How to analyze survey data

The first step when analyzing survey data is to turn your individualized responses into aggregated numbers. This sounds complicated, but really, it just means you need to do some counting.

For every question in your survey, you need to know the total number of people who answered with each response. Take a look at this example question:

By aggregating your responses, you're simply counting how many people answered a, b, c, and d, respectively. If 100 people took your survey, the aggregated results would look something like this:

In the last six months: 30

Six months to a year ago: 40

One to two years ago: 20

Over two years ago: 10

Now, if your survey was conducted through a survey host, your online survey results should be aggregated automatically, so there’ll be no need to add the numbers.
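If you do need to aggregate responses yourself, a few lines of Python will do the counting. This is only an illustrative sketch: the answer labels mirror the example above, and the simulated list of responses is invented.

```python
# A minimal sketch of aggregating raw answers into counts and percentages.
from collections import Counter

responses = (
    ["In the last six months"] * 30
    + ["Six months to a year ago"] * 40
    + ["One to two years ago"] * 20
    + ["Over two years ago"] * 10
)

counts = Counter(responses)
total = len(responses)
for answer, count in counts.items():
    print(f"{answer}: {count} ({count / total:.0%})")
```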

Qualitative vs. quantitative data

Once you have all of your aggregated answers, it’s time to start making some sense of the survey data.

Our brains can make sense of percentages much more quickly and easily than whole numbers. It's also far easier to compare different percentages rather than whole numbers.

Say you wrote a survey asking 5-year-olds for their favorite colors. Just saying that 67 children chose red as their favorite color means very little. However, saying that 23% of the children chose red as their favorite color, compared to 50% who chose blue, gives you a much clearer indication of the relative popularity of one color.

If you’ve asked people to write feedback or long-form answers, leave these until the end.

You don’t want the qualitative data to bias your quantitative analysis. Focus on the numbers first, and hopefully, once you have a clear idea of what the sentiment is, the qualitative answers will be able to help you understand why that might be the case.

How to cross-tabulate survey data

Cross-tabulating your data is where you can really begin to draw insights from your survey results instead of just statistics. It can help you add context to your numbers and explore how different groups of people behave or how different factors might affect a single outcome.

When you plan your survey, you'll have considered the different comparisons you'd like to make. For example, maybe you’d like to know if older people are more likely to enjoy eating olives.

Your question might be something like this:

A screenshot showing a survey of people who like or dislike olives.

Now, in the first round of your data analysis, you might have already divided the respondents into two groups to work out the split between people who like and don't like eating olives.

So let’s say the results of this olive question were:

Like olives: 542 people (46%)

Dislike olives: 630 people (54%)

To cross-tabulate your data, you’ll need to map another variable onto this one.

We’re interested in whether tastes change with age, so let’s use age as our second variable and ask:

A screenshot showing the ages of survey respondents.

With these results, you can plug them into a Google Sheet and start to see if there are any correlations:

A screenshot of the survey data analysis.

Imagine you have a client who is looking at marketing their olive brand directly to people under 35. You could ask these two questions and look at the split between olive lovers and haters just within this subgroup and see how it compares to the overall average splits.
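If your responses live in a spreadsheet or export file, a cross-tabulation like the one described above can be produced with pandas. The snippet below is a minimal sketch with invented data; the column names and values are hypothetical, not the article's actual results.

```python
# A minimal cross-tabulation sketch with pandas, using invented survey data.
import pandas as pd

df = pd.DataFrame({
    "age_group": ["Under 35", "Under 35", "35-54", "35-54", "55+", "55+"],
    "likes_olives": ["Yes", "No", "Yes", "No", "Yes", "Yes"],
})

# Counts of each combination, plus row percentages for easier comparison.
counts = pd.crosstab(df["age_group"], df["likes_olives"])
shares = pd.crosstab(df["age_group"], df["likes_olives"], normalize="index")

print(counts)
print((shares * 100).round(1))
```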

Benchmarking survey data

Data means very little to us without context and meaning. Turning your numbers into percentages makes comparisons easier, but although proportionally, we can recognize exactly what 75% means, how can we know if that is good?

The answer is benchmarks.

Setting benchmarks is key to making sense of the data and working out what those percentages really mean.

Some of the most common benchmarking techniques involve comparisons between this survey’s results and the data from the last time the survey was collected. To do this effectively, you need to make sure that you are comparing the results of the same question from each survey .

Setting a benchmark using last year’s data is easy. You simply take the percentage splits of responses to a certain question and treat these as your starting point. Then, you can easily see if this month’s data is above or below that benchmark.

Year-over-year or month-over-month comparisons are an excellent way of tracking progress. They allow you to see whether trends are emerging or how much responses have changed in a given period. This is known as longitudinal analysis.

If this is your first time collecting data, no worries, you can still set some benchmarks. Instead of comparing your results to last month's or last year’s data, you can calculate the overall total split between responses for each question and treat this as your benchmark or baseline.

Once you begin to cross-tabulate and break your respondents down into further categories, you can compare their results to your benchmark to place their statistics in context. 

If a value is higher than the average, we can say that this category is over-indexing, and if the value is lower, we can say that the category under-indexes. This gives some context to the statistics and starts letting you draw out some real insights from your survey data.
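Here is a minimal sketch of that over-/under-indexing calculation in Python; the benchmark and segment percentages are invented for illustration.

```python
# A minimal sketch of benchmarking: comparing a segment's share of a response
# against the overall (baseline) share and flagging over/under-indexing.
def index_against_benchmark(segment_pct: float, benchmark_pct: float) -> float:
    """Return an index where 100 means the segment matches the benchmark."""
    return segment_pct / benchmark_pct * 100

benchmark = 46.0         # e.g. 46% of all respondents like olives
under_35_segment = 58.0  # e.g. 58% of under-35s like olives (invented)

idx = index_against_benchmark(under_35_segment, benchmark)
label = "over-indexes" if idx > 100 else "under-indexes" if idx < 100 else "matches"
print(f"Index: {idx:.0f} -> this segment {label} against the benchmark")
```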

Why you need to analyze survey data

Quantitative data is extremely valuable when interpreting survey results. However, the numbers themselves are unlikely to provide a concrete answer as to why something happened or why people hold a certain opinion.

Understanding why respondents answered in the way that they did is when you can really start to address problems and make changes. This is where the real insight is born.

Sometimes, the “why” will be answered with direct questions in the survey and sometimes with multiple-choice boxes. Other times, it will be up to you as the survey analyst to determine causation, if possible. And this is where we need to be careful.

It's easy to become sucked into a trap when analyzing survey data and start to see patterns everywhere. This isn't necessarily a bad thing, as identifying a correlation between two variables is a key part of interpreting survey results. However, the danger is that we often make assumptions instead.

Assumptions about the data can be hopes or expectations, conscious or subconscious. However, realizing when we are making assumptions can help us avoid any problems further down the line and prevent us from wasting time.

Ultimately, no one wants to find out their assumptions were false after the survey analysis is complete. Similarly, you wouldn’t want a critical assumption to be false and never even realized.

Survey data analysis examples

Correlation occurs when two different variables move at the same time.

A classic example is the sale of seasonal products. During the summer, swimming pool and barbecue sales rise. When plotted on a graph, the two variables move in the same direction at the same time. However, there's no direct connection between these two variables. People buying barbecues isn't the reason the sales of swimming pools increase.

Causation, on the other hand, occurs when one factor directly causes a change in another factor.

For example, in the case of seasonal products, the weather is a key factor. As the temperature rises in the summer, so do barbecue sales. Barbecue sales here are a variable that's dependent on the weather, and there's a key link between them.

When interpreting survey results, it's easy to mistake correlation for causation. Just because two variables move at the same time, it doesn't mean that one is directly influencing the other.
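The distinction matters because the correlation coefficient on its own cannot tell the two situations apart. The short sketch below, using invented monthly sales figures, computes a correlation that is high even though neither product drives the other.

```python
# A minimal sketch (invented numbers) showing that a correlation coefficient
# only measures co-movement; it cannot establish causation by itself.
import numpy as np

# Hypothetical monthly sales figures for two seasonal products.
barbecue_sales = np.array([12, 15, 30, 55, 80, 95, 90, 70, 40, 20, 14, 11])
pool_sales     = np.array([ 5,  8, 20, 45, 70, 88, 85, 60, 30, 12,  7,  5])

r = np.corrcoef(barbecue_sales, pool_sales)[0, 1]
print(f"Correlation coefficient: {r:.2f}")
# A value near 1 shows the series move together, but the driver here is a
# third variable (warm weather), not a causal link between the two products.
```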

This is where qualitative data comes in. If you’ve asked your respondents to fill in longer-form answers to explain why they chose a certain response, analyzing these answers can give you the insight you need to work out why.

How to report back on your survey data

When sharing your survey data analysis, remember that the story is what makes it interesting, not the numbers.

The percentages you've calculated are vital evidence for your argument, but your analysis needs a narrative to have a real impact on people's thinking.

If you can, always provide context with your statistics, either comparing them to the same survey from last year or comparing groups of people in the same year’s data. Benchmark your numbers so that your audience is immediately aware of whether what they are seeing is positive or negative.

If you are unable to provide recommended actions based on your survey data analysis, at least signpost the key areas that need attention so the relevant parties can begin to tackle the problem if necessary.

When you visualize your data, remember that while long reports can be fascinating, most people won’t read them. Whoever you are presenting to is unlikely to want to listen or read as you walk them through your survey analysis methods step-by-step, so don’t feel like you have to include every single calculation you made in your report.

Put yourself in your audience’s shoes and determine their interests and priorities. Only give them the information if it is relevant to them, they will understand it, and there's something they can do with this new information.



A Practical Guide to Writing Quantitative and Qualitative Research Questions and Hypotheses in Scholarly Articles

Edward Barroga

1 Department of General Education, Graduate School of Nursing Science, St. Luke’s International University, Tokyo, Japan.

Glafera Janet Matanguihan

2 Department of Biological Sciences, Messiah University, Mechanicsburg, PA, USA.

The development of research questions and the subsequent hypotheses are prerequisites to defining the main research purpose and specific objectives of a study. Consequently, these objectives determine the study design and research outcome. The development of research questions is a process based on knowledge of current trends, cutting-edge studies, and technological advances in the research field. Excellent research questions are focused and require a comprehensive literature search and in-depth understanding of the problem being investigated. Initially, research questions may be written as descriptive questions which could be developed into inferential questions. These questions must be specific and concise to provide a clear foundation for developing hypotheses. Hypotheses are more formal predictions about the research outcomes. These specify the possible results that may or may not be expected regarding the relationship between groups. Thus, research questions and hypotheses clarify the main purpose and specific objectives of the study, which in turn dictate the design of the study, its direction, and outcome. Studies developed from good research questions and hypotheses will have trustworthy outcomes with wide-ranging social and health implications.

INTRODUCTION

Scientific research is usually initiated by posing evidenced-based research questions which are then explicitly restated as hypotheses. 1 , 2 The hypotheses provide directions to guide the study, solutions, explanations, and expected results. 3 , 4 Both research questions and hypotheses are essentially formulated based on conventional theories and real-world processes, which allow the inception of novel studies and the ethical testing of ideas. 5 , 6

It is crucial to have knowledge of both quantitative and qualitative research 2 as both types of research involve writing research questions and hypotheses. 7 However, these crucial elements of research are sometimes overlooked; if not overlooked, then framed without the forethought and meticulous attention they need. Planning and careful consideration are needed when developing quantitative or qualitative research, particularly when conceptualizing research questions and hypotheses. 4

There is a continuing need to support researchers in the creation of innovative research questions and hypotheses, as well as for journal articles that carefully review these elements. 1 When research questions and hypotheses are not carefully thought of, unethical studies and poor outcomes usually ensue. Carefully formulated research questions and hypotheses define well-founded objectives, which in turn determine the appropriate design, course, and outcome of the study. This article then aims to discuss in detail the various aspects of crafting research questions and hypotheses, with the goal of guiding researchers as they develop their own. Examples from the authors and peer-reviewed scientific articles in the healthcare field are provided to illustrate key points.

DEFINITIONS AND RELATIONSHIP OF RESEARCH QUESTIONS AND HYPOTHESES

A research question is what a study aims to answer after data analysis and interpretation. The answer is written in length in the discussion section of the paper. Thus, the research question gives a preview of the different parts and variables of the study meant to address the problem posed in the research question. 1 An excellent research question clarifies the research writing while facilitating understanding of the research topic, objective, scope, and limitations of the study. 5

On the other hand, a research hypothesis is an educated statement of an expected outcome. This statement is based on background research and current knowledge. 8 , 9 The research hypothesis makes a specific prediction about a new phenomenon 10 or a formal statement on the expected relationship between an independent variable and a dependent variable. 3 , 11 It provides a tentative answer to the research question to be tested or explored. 4

Hypotheses employ reasoning to predict a theory-based outcome. 10 These can also be developed from theories by focusing on components of theories that have not yet been observed. 10 The validity of hypotheses is often based on the testability of the prediction made in a reproducible experiment. 8

Conversely, hypotheses can also be rephrased as research questions. Several hypotheses based on existing theories and knowledge may be needed to answer a research question. Developing ethical research questions and hypotheses creates a research design that has logical relationships among variables. These relationships serve as a solid foundation for the conduct of the study. 4 , 11 Haphazardly constructed research questions can result in poorly formulated hypotheses and improper study designs, leading to unreliable results. Thus, the formulations of relevant research questions and verifiable hypotheses are crucial when beginning research. 12

CHARACTERISTICS OF GOOD RESEARCH QUESTIONS AND HYPOTHESES

Excellent research questions are specific and focused. These integrate collective data and observations to confirm or refute the subsequent hypotheses. Well-constructed hypotheses are based on previous reports and verify the research context. These are realistic, in-depth, sufficiently complex, and reproducible. More importantly, these hypotheses can be addressed and tested. 13

There are several characteristics of well-developed hypotheses. Good hypotheses are 1) empirically testable 7 , 10 , 11 , 13 ; 2) backed by preliminary evidence 9 ; 3) testable by ethical research 7 , 9 ; 4) based on original ideas 9 ; 5) have evidenced-based logical reasoning 10 ; and 6) can be predicted. 11 Good hypotheses can infer ethical and positive implications, indicating the presence of a relationship or effect relevant to the research theme. 7 , 11 These are initially developed from a general theory and branch into specific hypotheses by deductive reasoning. In the absence of a theory to base the hypotheses, inductive reasoning based on specific observations or findings form more general hypotheses. 10

TYPES OF RESEARCH QUESTIONS AND HYPOTHESES

Research questions and hypotheses are developed according to the type of research, which can be broadly classified into quantitative and qualitative research. We provide a summary of the types of research questions and hypotheses under quantitative and qualitative research categories in Table 1 .

Research questions in quantitative research

In quantitative research, research questions inquire about the relationships among variables being investigated and are usually framed at the start of the study. These are precise and typically linked to the subject population, dependent and independent variables, and research design. 1 Research questions may also attempt to describe the behavior of a population in relation to one or more variables, or describe the characteristics of variables to be measured ( descriptive research questions ). 1 , 5 , 14 These questions may also aim to discover differences between groups within the context of an outcome variable ( comparative research questions ), 1 , 5 , 14 or elucidate trends and interactions among variables ( relationship research questions ). 1 , 5 We provide examples of descriptive, comparative, and relationship research questions in quantitative research in Table 2 .

Hypotheses in quantitative research

In quantitative research, hypotheses predict the expected relationships among variables. 15 Relationships among variables that can be predicted include 1) between a single dependent variable and a single independent variable ( simple hypothesis ) or 2) between two or more independent and dependent variables ( complex hypothesis ). 4 , 11 Hypotheses may also specify the expected direction to be followed and imply an intellectual commitment to a particular outcome ( directional hypothesis ) 4 . On the other hand, hypotheses may not predict the exact direction and are used in the absence of a theory, or when findings contradict previous studies ( non-directional hypothesis ). 4 In addition, hypotheses can 1) define interdependency between variables ( associative hypothesis ), 4 2) propose an effect on the dependent variable from manipulation of the independent variable ( causal hypothesis ), 4 3) state a negative relationship between two variables ( null hypothesis ), 4 , 11 , 15 4) replace the working hypothesis if rejected ( alternative hypothesis ), 15 explain the relationship of phenomena to possibly generate a theory ( working hypothesis ), 11 5) involve quantifiable variables that can be tested statistically ( statistical hypothesis ), 11 6) or express a relationship whose interlinks can be verified logically ( logical hypothesis ). 11 We provide examples of simple, complex, directional, non-directional, associative, causal, null, alternative, working, statistical, and logical hypotheses in quantitative research, as well as the definition of quantitative hypothesis-testing research in Table 3 .

Research questions in qualitative research

Unlike research questions in quantitative research, research questions in qualitative research are usually continuously reviewed and reformulated. The central question and associated subquestions are stated more than the hypotheses. 15 The central question broadly explores a complex set of factors surrounding the central phenomenon, aiming to present the varied perspectives of participants. 15

There are varied goals for which qualitative research questions are developed. These questions can function in several ways, such as to 1) identify and describe existing conditions ( contextual research questions ); 2) describe a phenomenon ( descriptive research questions ); 3) assess the effectiveness of existing methods, protocols, theories, or procedures ( evaluation research questions ); 4) examine a phenomenon or analyze the reasons or relationships between subjects or phenomena ( explanatory research questions ); or 5) focus on unknown aspects of a particular topic ( exploratory research questions ). 5 In addition, some qualitative research questions provide new ideas for the development of theories and actions ( generative research questions ) or advance specific ideologies of a position ( ideological research questions ). 1 Other qualitative research questions may build on a body of existing literature and become working guidelines ( ethnographic research questions ). Research questions may also be broadly stated without specific reference to the existing literature or a typology of questions ( phenomenological research questions ), may be directed towards generating a theory of some process ( grounded theory questions ), or may address a description of the case and the emerging themes ( qualitative case study questions ). 15 We provide examples of contextual, descriptive, evaluation, explanatory, exploratory, generative, ideological, ethnographic, phenomenological, grounded theory, and qualitative case study research questions in qualitative research in Table 4 , and the definition of qualitative hypothesis-generating research in Table 5 .

Qualitative studies usually pose at least one central research question and several subquestions starting with How or What . These research questions use exploratory verbs such as explore or describe . These also focus on one central phenomenon of interest, and may mention the participants and research site. 15

Hypotheses in qualitative research

Hypotheses in qualitative research are stated in the form of a clear statement concerning the problem to be investigated. Unlike in quantitative research where hypotheses are usually developed to be tested, qualitative research can lead to both hypothesis-testing and hypothesis-generating outcomes. 2 When studies require both quantitative and qualitative research questions, this suggests an integrative process between both research methods wherein a single mixed-methods research question can be developed. 1

FRAMEWORKS FOR DEVELOPING RESEARCH QUESTIONS AND HYPOTHESES

Research questions followed by hypotheses should be developed before the start of the study. 1 , 12 , 14 It is crucial to develop feasible research questions on a topic that is interesting to both the researcher and the scientific community. This can be achieved by a meticulous review of previous and current studies to establish a novel topic. Specific areas are subsequently focused on to generate ethical research questions. The relevance of the research questions is evaluated in terms of clarity of the resulting data, specificity of the methodology, objectivity of the outcome, depth of the research, and impact of the study. 1 , 5 These aspects constitute the FINER criteria (i.e., Feasible, Interesting, Novel, Ethical, and Relevant). 1 Clarity and effectiveness are achieved if research questions meet the FINER criteria. In addition to the FINER criteria, Ratan et al. described focus, complexity, novelty, feasibility, and measurability for evaluating the effectiveness of research questions. 14

The PICOT and PEO frameworks are also used when developing research questions. 1 The following elements are addressed in these frameworks, PICOT: P-population/patients/problem, I-intervention or indicator being studied, C-comparison group, O-outcome of interest, and T-timeframe of the study; PEO: P-population being studied, E-exposure to preexisting conditions, and O-outcome of interest. 1 Research questions are also considered good if these meet the “FINERMAPS” framework: Feasible, Interesting, Novel, Ethical, Relevant, Manageable, Appropriate, Potential value/publishable, and Systematic. 14

As we indicated earlier, research questions and hypotheses that are not carefully formulated result in unethical studies or poor outcomes. To illustrate this, we provide some examples of ambiguous research question and hypotheses that result in unclear and weak research objectives in quantitative research ( Table 6 ) 16 and qualitative research ( Table 7 ) 17 , and how to transform these ambiguous research question(s) and hypothesis(es) into clear and good statements.

a These statements were composed for comparison and illustrative purposes only.

b These statements are direct quotes from Higashihara and Horiuchi. 16

a This statement is a direct quote from Shimoda et al. 17

The other statements were composed for comparison and illustrative purposes only.

CONSTRUCTING RESEARCH QUESTIONS AND HYPOTHESES

To construct effective research questions and hypotheses, it is very important to 1) clarify the background and 2) identify the research problem at the outset of the research, within a specific timeframe. 9 Then, 3) review or conduct preliminary research to collect all available knowledge about the possible research questions by studying theories and previous studies. 18 Afterwards, 4) construct research questions to investigate the research problem. Identify variables to be accessed from the research questions 4 and make operational definitions of constructs from the research problem and questions. Thereafter, 5) construct specific deductive or inductive predictions in the form of hypotheses. 4 Finally, 6) state the study aims . This general flow for constructing effective research questions and hypotheses prior to conducting research is shown in Fig. 1 .


Research questions are used more frequently in qualitative research than objectives or hypotheses. 3 These questions seek to discover, understand, explore or describe experiences by asking “What” or “How.” The questions are open-ended to elicit a description rather than to relate variables or compare groups. The questions are continually reviewed, reformulated, and changed during the qualitative study. 3 Research questions are also used more frequently in survey projects than hypotheses in experiments in quantitative research to compare variables and their relationships.

Hypotheses are constructed based on the variables identified and as an if-then statement, following the template, ‘If a specific action is taken, then a certain outcome is expected.’ At this stage, some ideas regarding expectations from the research to be conducted must be drawn. 18 Then, the variables to be manipulated (independent) and influenced (dependent) are defined. 4 Thereafter, the hypothesis is stated and refined, and reproducible data tailored to the hypothesis are identified, collected, and analyzed. 4 The hypotheses must be testable and specific, 18 and should describe the variables and their relationships, the specific group being studied, and the predicted research outcome. 18 Hypotheses construction involves a testable proposition to be deduced from theory, and independent and dependent variables to be separated and measured separately. 3 Therefore, good hypotheses must be based on good research questions constructed at the start of a study or trial. 12

In summary, research questions are constructed after establishing the background of the study. Hypotheses are then developed based on the research questions. Thus, it is crucial to have excellent research questions to generate superior hypotheses. In turn, these would determine the research objectives and the design of the study, and ultimately, the outcome of the research. 12 Algorithms for building research questions and hypotheses are shown in Fig. 2 for quantitative research and in Fig. 3 for qualitative research.

[Fig. 2. Algorithm for building research questions and hypotheses in quantitative research (image: jkms-37-e121-g002.jpg)]

EXAMPLES OF RESEARCH QUESTIONS FROM PUBLISHED ARTICLES

  • EXAMPLE 1. Descriptive research question (quantitative research)
  • - Presents research variables to be assessed (distinct phenotypes and subphenotypes)
  • “BACKGROUND: Since COVID-19 was identified, its clinical and biological heterogeneity has been recognized. Identifying COVID-19 phenotypes might help guide basic, clinical, and translational research efforts.
  • RESEARCH QUESTION: Does the clinical spectrum of patients with COVID-19 contain distinct phenotypes and subphenotypes? ” 19
  • EXAMPLE 2. Relationship research question (quantitative research)
  • - Shows interactions between dependent variable (static postural control) and independent variable (peripheral visual field loss)
  • “Background: Integration of visual, vestibular, and proprioceptive sensations contributes to postural control. People with peripheral visual field loss have serious postural instability. However, the directional specificity of postural stability and sensory reweighting caused by gradual peripheral visual field loss remain unclear.
  • Research question: What are the effects of peripheral visual field loss on static postural control ?” 20
  • EXAMPLE 3. Comparative research question (quantitative research)
  • - Clarifies the difference among groups with an outcome variable (patients enrolled in COMPERA with moderate PH or severe PH in COPD) and another group without the outcome variable (patients with idiopathic pulmonary arterial hypertension (IPAH))
  • “BACKGROUND: Pulmonary hypertension (PH) in COPD is a poorly investigated clinical condition.
  • RESEARCH QUESTION: Which factors determine the outcome of PH in COPD?
  • STUDY DESIGN AND METHODS: We analyzed the characteristics and outcome of patients enrolled in the Comparative, Prospective Registry of Newly Initiated Therapies for Pulmonary Hypertension (COMPERA) with moderate or severe PH in COPD as defined during the 6th PH World Symposium who received medical therapy for PH and compared them with patients with idiopathic pulmonary arterial hypertension (IPAH) .” 21
  • EXAMPLE 4. Exploratory research question (qualitative research)
  • - Explores areas that have not been fully investigated (perspectives of families and children who receive care in clinic-based child obesity treatment) to have a deeper understanding of the research problem
  • “Problem: Interventions for children with obesity lead to only modest improvements in BMI and long-term outcomes, and data are limited on the perspectives of families of children with obesity in clinic-based treatment. This scoping review seeks to answer the question: What is known about the perspectives of families and children who receive care in clinic-based child obesity treatment? This review aims to explore the scope of perspectives reported by families of children with obesity who have received individualized outpatient clinic-based obesity treatment.” 22
  • EXAMPLE 5. Relationship research question (quantitative research)
  • - Defines interactions between dependent variable (use of ankle strategies) and independent variable (changes in muscle tone)
  • “Background: To maintain an upright standing posture against external disturbances, the human body mainly employs two types of postural control strategies: “ankle strategy” and “hip strategy.” While it has been reported that the magnitude of the disturbance alters the use of postural control strategies, it has not been elucidated how the level of muscle tone, one of the crucial parameters of bodily function, determines the use of each strategy. We have previously confirmed using forward dynamics simulations of human musculoskeletal models that an increased muscle tone promotes the use of ankle strategies. The objective of the present study was to experimentally evaluate a hypothesis: an increased muscle tone promotes the use of ankle strategies. Research question: Do changes in the muscle tone affect the use of ankle strategies ?” 23

EXAMPLES OF HYPOTHESES IN PUBLISHED ARTICLES

  • EXAMPLE 1. Working hypothesis (quantitative research)
  • - A hypothesis that is initially accepted for further research to produce a feasible theory
  • “As fever may have benefit in shortening the duration of viral illness, it is plausible to hypothesize that the antipyretic efficacy of ibuprofen may be hindering the benefits of a fever response when taken during the early stages of COVID-19 illness .” 24
  • “In conclusion, it is plausible to hypothesize that the antipyretic efficacy of ibuprofen may be hindering the benefits of a fever response . The difference in perceived safety of these agents in COVID-19 illness could be related to the more potent efficacy to reduce fever with ibuprofen compared to acetaminophen. Compelling data on the benefit of fever warrant further research and review to determine when to treat or withhold ibuprofen for early stage fever for COVID-19 and other related viral illnesses .” 24
  • EXAMPLE 2. Exploratory hypothesis (qualitative research)
  • - Explores particular areas deeper to clarify subjective experience and develop a formal hypothesis potentially testable in a future quantitative approach
  • “We hypothesized that when thinking about a past experience of help-seeking, a self distancing prompt would cause increased help-seeking intentions and more favorable help-seeking outcome expectations .” 25
  • “Conclusion
  • Although a priori hypotheses were not supported, further research is warranted as results indicate the potential for using self-distancing approaches to increasing help-seeking among some people with depressive symptomatology.” 25
  • EXAMPLE 3. Hypothesis-generating research to establish a framework for hypothesis testing (qualitative research)
  • “We hypothesize that compassionate care is beneficial for patients (better outcomes), healthcare systems and payers (lower costs), and healthcare providers (lower burnout). ” 26
  • Compassionomics is the branch of knowledge and scientific study of the effects of compassionate healthcare. Our main hypotheses are that compassionate healthcare is beneficial for (1) patients, by improving clinical outcomes, (2) healthcare systems and payers, by supporting financial sustainability, and (3) HCPs, by lowering burnout and promoting resilience and well-being. The purpose of this paper is to establish a scientific framework for testing the hypotheses above . If these hypotheses are confirmed through rigorous research, compassionomics will belong in the science of evidence-based medicine, with major implications for all healthcare domains.” 26
  • EXAMPLE 4. Statistical hypothesis (quantitative research)
  • - An assumption is made about the relationship among several population characteristics (gender differences in sociodemographic and clinical characteristics of adults with ADHD). Validity is tested by statistical experiment or analysis (chi-square test, Student's t-test, and logistic regression analysis); a brief illustrative sketch of such an analysis appears after this list
  • “Our research investigated gender differences in sociodemographic and clinical characteristics of adults with ADHD in a Japanese clinical sample. Due to unique Japanese cultural ideals and expectations of women's behavior that are in opposition to ADHD symptoms, we hypothesized that women with ADHD experience more difficulties and present more dysfunctions than men . We tested the following hypotheses: first, women with ADHD have more comorbidities than men with ADHD; second, women with ADHD experience more social hardships than men, such as having less full-time employment and being more likely to be divorced.” 27
  • “Statistical Analysis
  • ( text omitted ) Between-gender comparisons were made using the chi-squared test for categorical variables and Students t-test for continuous variables…( text omitted ). A logistic regression analysis was performed for employment status, marital status, and comorbidity to evaluate the independent effects of gender on these dependent variables.” 27
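For readers curious what the workflow quoted in Example 4 might look like in code, the following is a hypothetical Python sketch. It is not the authors' analysis: the data and variable names are invented for illustration, and the three steps simply mirror the methods named in the quote (chi-squared test for categorical comparisons, Student's t-test for continuous variables, and logistic regression for independent effects).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import chi2_contingency, ttest_ind

# Invented toy data standing in for the study variables; names are illustrative only
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "female":        rng.integers(0, 2, n),   # 1 = woman, 0 = man
    "age":           rng.normal(35, 8, n),    # a continuous characteristic
    "full_time_job": rng.integers(0, 2, n),   # a categorical outcome
})

# Chi-squared test for a categorical comparison (gender vs. full-time employment)
chi2, p_chi, _, _ = chi2_contingency(pd.crosstab(df["female"], df["full_time_job"]))

# Student's t-test for a continuous variable (age compared between genders)
t_stat, p_t = ttest_ind(df.loc[df["female"] == 1, "age"], df.loc[df["female"] == 0, "age"])

# Logistic regression: independent effect of gender (and age) on employment status
logit = sm.Logit(df["full_time_job"], sm.add_constant(df[["female", "age"]])).fit(disp=False)
print(p_chi, p_t)
print(logit.params)
```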

EXAMPLES OF HYPOTHESIS AS WRITTEN IN PUBLISHED ARTICLES IN RELATION TO OTHER PARTS

  • EXAMPLE 1. Background, hypotheses, and aims are provided
  • “Pregnant women need skilled care during pregnancy and childbirth, but that skilled care is often delayed in some countries …( text omitted ). The focused antenatal care (FANC) model of WHO recommends that nurses provide information or counseling to all pregnant women …( text omitted ). Job aids are visual support materials that provide the right kind of information using graphics and words in a simple and yet effective manner. When nurses are not highly trained or have many work details to attend to, these job aids can serve as a content reminder for the nurses and can be used for educating their patients (Jennings, Yebadokpo, Affo, & Agbogbe, 2010) ( text omitted ). Importantly, additional evidence is needed to confirm how job aids can further improve the quality of ANC counseling by health workers in maternal care …( text omitted )” 28
  • “ This has led us to hypothesize that the quality of ANC counseling would be better if supported by job aids. Consequently, a better quality of ANC counseling is expected to produce higher levels of awareness concerning the danger signs of pregnancy and a more favorable impression of the caring behavior of nurses .” 28
  • “This study aimed to examine the differences in the responses of pregnant women to a job aid-supported intervention during ANC visit in terms of 1) their understanding of the danger signs of pregnancy and 2) their impression of the caring behaviors of nurses to pregnant women in rural Tanzania.” 28
  • EXAMPLE 2. Background, hypotheses, and aims are provided
  • “We conducted a two-arm randomized controlled trial (RCT) to evaluate and compare changes in salivary cortisol and oxytocin levels of first-time pregnant women between experimental and control groups. The women in the experimental group touched and held an infant for 30 min (experimental intervention protocol), whereas those in the control group watched a DVD movie of an infant (control intervention protocol). The primary outcome was salivary cortisol level and the secondary outcome was salivary oxytocin level.” 29
  • “ We hypothesize that at 30 min after touching and holding an infant, the salivary cortisol level will significantly decrease and the salivary oxytocin level will increase in the experimental group compared with the control group .” 29
  • EXAMPLE 3. Background, aim, and hypothesis are provided
  • “In countries where the maternal mortality ratio remains high, antenatal education to increase Birth Preparedness and Complication Readiness (BPCR) is considered one of the top priorities [1]. BPCR includes birth plans during the antenatal period, such as the birthplace, birth attendant, transportation, health facility for complications, expenses, and birth materials, as well as family coordination to achieve such birth plans. In Tanzania, although increasing, only about half of all pregnant women attend an antenatal clinic more than four times [4]. Moreover, the information provided during antenatal care (ANC) is insufficient. In the resource-poor settings, antenatal group education is a potential approach because of the limited time for individual counseling at antenatal clinics.” 30
  • “This study aimed to evaluate an antenatal group education program among pregnant women and their families with respect to birth-preparedness and maternal and infant outcomes in rural villages of Tanzania.” 30
  • “ The study hypothesis was if Tanzanian pregnant women and their families received a family-oriented antenatal group education, they would (1) have a higher level of BPCR, (2) attend antenatal clinic four or more times, (3) give birth in a health facility, (4) have less complications of women at birth, and (5) have less complications and deaths of infants than those who did not receive the education .” 30

Research questions and hypotheses are crucial components to any type of research, whether quantitative or qualitative. These questions should be developed at the very beginning of the study. Excellent research questions lead to superior hypotheses, which, like a compass, set the direction of research, and can often determine the successful conduct of the study. Many research studies have floundered because the development of research questions and subsequent hypotheses was not given the thought and meticulous attention needed. The development of research questions and hypotheses is an iterative process based on extensive knowledge of the literature and insightful grasp of the knowledge gap. Focused, concise, and specific research questions provide a strong foundation for constructing hypotheses which serve as formal predictions about the research outcomes. Research questions and hypotheses are crucial elements of research that should not be overlooked. They should be carefully thought of and constructed when planning research. This avoids unethical studies and poor outcomes by defining well-founded objectives that determine the design, course, and outcome of the study.

Disclosure: The authors have no potential conflicts of interest to disclose.

Author Contributions:

  • Conceptualization: Barroga E, Matanguihan GJ.
  • Methodology: Barroga E, Matanguihan GJ.
  • Writing - original draft: Barroga E, Matanguihan GJ.
  • Writing - review & editing: Barroga E, Matanguihan GJ.


The ultimate guide to quantitative data analysis

Numbers help us make sense of the world. We collect quantitative data on our speed and distance as we drive, the number of hours we spend on our cell phones, and how much we save at the grocery store.

Our businesses run on numbers, too. We spend hours poring over key performance indicators (KPIs) like lead-to-client conversions, net profit margins, and bounce and churn rates.

But all of this quantitative data can feel overwhelming and confusing. Lists and spreadsheets of numbers don’t tell you much on their own—you have to conduct quantitative data analysis to understand them and make informed decisions.


This guide explains what quantitative data analysis is and why it’s important, and gives you a four-step process to conduct a quantitative data analysis, so you know exactly what’s happening in your business and what your users need .

Collect quantitative customer data with Hotjar

Use Hotjar’s tools to gather the customer insights you need to make quantitative data analysis a breeze.

What is quantitative data analysis? 

Quantitative data analysis is the process of analyzing and interpreting numerical data. It helps you make sense of information by identifying patterns, trends, and relationships between variables through mathematical calculations and statistical tests. 

With quantitative data analysis, you turn spreadsheets of individual data points into meaningful insights to drive informed decisions. Columns of numbers from an experiment or survey transform into useful insights—like which marketing campaign asset your average customer prefers or which website factors are most closely connected to your bounce rate. 

Without analytics, data is just noise. Analyzing data helps you make informed decisions that are less prone to bias.

What quantitative data analysis is not

But as powerful as quantitative data analysis is, it’s not without its limitations. It only gives you the what, not the why . For example, it can tell you how many website visitors or conversions you have on an average day, but it can’t tell you why users visited your site or made a purchase.

For the why behind user behavior, you need qualitative data analysis , a process for making sense of qualitative research like open-ended survey responses, interview clips, or behavioral observations. By analyzing non-numerical data, you gain useful contextual insights to shape your strategy, product, and messaging. 

Quantitative data analysis vs. qualitative data analysis 

Let’s take an even deeper dive into the differences between quantitative data analysis and qualitative data analysis to explore what they do and when you need them.


The bottom line: quantitative data analysis and qualitative data analysis are complementary processes. They work hand-in-hand to tell you what’s happening in your business and why.  

💡 Pro tip: easily toggle between quantitative and qualitative data analysis with Hotjar Funnels . 

The Funnels tool helps you visualize quantitative metrics like drop-off and conversion rates in your sales or conversion funnel to understand when and where users leave your website. You can break down your data even further to compare conversion performance by user segment.

Spot a potential issue? A single click takes you to relevant session recordings , where you see user behaviors like mouse movements, scrolls, and clicks. With this qualitative data to provide context, you'll better understand what you need to optimize to streamline the user experience (UX) and increase conversions .

Hotjar Funnels lets you quickly explore the story behind the quantitative data

4 benefits of quantitative data analysis

There’s a reason product, web design, and marketing teams take time to analyze metrics: the process pays off big time. 

Four major benefits of quantitative data analysis include:

1. Make confident decisions 

With quantitative data analysis, you know you’ve got data-driven insights to back up your decisions . For example, if you launch a concept testing survey to gauge user reactions to a new logo design, and 92% of users rate it ‘very good’—you'll feel certain when you give the designer the green light. 

Since you’re relying less on intuition and more on facts, you reduce the risks of making the wrong decision. (You’ll also find it way easier to get buy-in from team members and stakeholders for your next proposed project. 🙌)

2. Reduce costs

By crunching the numbers, you can spot opportunities to reduce spend . For example, if an ad campaign has lower-than-average click-through rates , you might decide to cut your losses and invest your budget elsewhere. 

Or, by analyzing ecommerce metrics , like website traffic by source, you may find you’re getting very little return on investment from a certain social media channel—and scale back spending in that area.

3. Personalize the user experience

Quantitative data analysis helps you map the customer journey , so you get a better sense of customers’ demographics, what page elements they interact with on your site, and where they drop off or convert . 

These insights let you better personalize your website, product, or communication, so you can segment ads, emails, and website content for specific user personas or target groups.

4. Improve user satisfaction and delight

Quantitative data analysis lets you see where your website or product is doing well—and where it falls short for your users . For example, you might see stellar results from KPIs like time on page, but conversion rates for that page are low. 

These quantitative insights encourage you to dive deeper into qualitative data to see why that’s happening—looking for moments of confusion or frustration on session recordings, for example—so you can make adjustments and optimize your conversions by improving customer satisfaction and delight.

💡Pro tip: use Net Promoter Score® (NPS) surveys to capture quantifiable customer satisfaction data that’s easy for you to analyze and interpret. 

With an NPS tool like Hotjar, you can create an on-page survey to ask users how likely they are to recommend you to others on a scale from 0 to 10. (And for added context, you can ask follow-up questions about why customers selected the rating they did—rich qualitative data is always a bonus!)
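If you ever need to reproduce the calculation yourself, the arithmetic behind NPS is straightforward: respondents who answer 9–10 count as promoters, 7–8 as passives, and 0–6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. Here is a minimal Python sketch of that formula (illustrative only, not Hotjar's implementation):

```python
def net_promoter_score(ratings):
    """Compute NPS from a list of 0-10 'how likely to recommend' ratings."""
    if not ratings:
        raise ValueError("No ratings provided")
    promoters = sum(1 for r in ratings if r >= 9)   # 9-10
    detractors = sum(1 for r in ratings if r <= 6)  # 0-6
    return 100 * (promoters - detractors) / len(ratings)

# 5 promoters, 3 passives, 2 detractors out of 10 responses -> NPS of 30
print(net_promoter_score([10, 9, 9, 10, 9, 8, 7, 8, 4, 6]))
```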


Hotjar graphs your quantitative NPS data to show changes over time

4 steps to effective quantitative data analysis 

Quantitative data analysis sounds way more intimidating than it actually is. Here’s how to make sense of your company’s numbers in just four steps:

1. Collect data

Before you can actually start the analysis process, you need data to analyze. This involves conducting quantitative research and collecting numerical data from various sources, including: 

Interviews or focus groups 

Website analytics

Observations, from tools like heatmaps or session recordings

Questionnaires, like surveys or on-page feedback widgets

Just ensure the questions you ask in your surveys are close-ended—giving respondents a fixed set of answers to choose from, rather than open-ended questions that allow free-text responses.


Hotjar’s pricing plans survey template provides close-ended questions

2. Clean data

Once you’ve collected your data, it’s time to clean it up. Look through your results to find errors, duplicates, and omissions. Keep an eye out for outliers, too. Outliers are data points that differ significantly from the rest of the set—and they can skew your results if you don’t remove them.

By taking the time to clean your data set, you ensure your data is accurate, consistent, and relevant before it’s time to analyze. 
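If your responses live in a CSV export, a few lines of Python with pandas can handle the routine cleanup. The snippet below is a generic sketch—the file name, column name, and outlier rule are placeholders to adapt to your own data, not tied to any particular tool:

```python
import pandas as pd

# Hypothetical survey export with a numeric 'rating' column
df = pd.read_csv("survey_results.csv")

df = df.drop_duplicates()                                     # remove duplicate submissions
df["rating"] = pd.to_numeric(df["rating"], errors="coerce")   # turn stray text into NaN
df = df.dropna(subset=["rating"])                             # drop rows with missing ratings

# Flag outliers with the common 1.5 * IQR rule before deciding whether to exclude them
q1, q3 = df["rating"].quantile([0.25, 0.75])
iqr = q3 - q1
is_outlier = (df["rating"] < q1 - 1.5 * iqr) | (df["rating"] > q3 + 1.5 * iqr)
df_clean = df[~is_outlier]
print(f"Removed {is_outlier.sum()} outliers; {len(df_clean)} clean rows remain")
```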

3. Analyze and interpret data

At this point, your data’s all cleaned up and ready for the main event. This step involves crunching the numbers to find patterns and trends via mathematical and statistical methods. 

Two main branches of quantitative data analysis exist: 

Descriptive analysis : methods to summarize or describe attributes of your data set. For example, you may calculate key stats like distribution and frequency, or mean, median, and mode.

Inferential analysis : methods that let you draw conclusions from statistics—like analyzing the relationship between variables or making predictions. These methods include t-tests, cross-tabulation, and factor analysis. (For more detailed explanations and how-tos, head to our guide on quantitative data analysis methods.)

Then, interpret your data to determine the best course of action. What does the data suggest you do ? For example, if your analysis shows a strong correlation between email open rate and time sent, you may explore optimal send times for each user segment.
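Here's a small, illustrative example of both branches in Python, using pandas for the descriptive summary and SciPy for a simple inferential test. The numbers are invented and stand in for, say, time on page under two design variants:

```python
import pandas as pd
from scipy import stats

# Hypothetical time-on-page (seconds) for two design variants
variant_a = pd.Series([34, 48, 52, 41, 39, 60, 45, 50])
variant_b = pd.Series([55, 62, 58, 49, 70, 66, 61, 57])

# Descriptive analysis: summarize each group's distribution
print(variant_a.describe())   # count, mean, std, min, quartiles, max
print(variant_b.describe())

# Inferential analysis: independent-samples t-test comparing the two groups
t_stat, p_value = stats.ttest_ind(variant_a, variant_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")   # a small p-value suggests a real difference
```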

4. Visualize and share data

Once you’ve analyzed and interpreted your data, create easy-to-read, engaging data visualizations—like charts, graphs, and tables—to present your results to team members and stakeholders. Data visualizations highlight similarities and differences between data sets and show the relationships between variables.
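If you're assembling charts by hand rather than in an analytics tool, a basic comparison chart takes only a few lines. The page names and conversion rates below are invented purely to illustrate the idea:

```python
import matplotlib.pyplot as plt

# Hypothetical conversion rates for your top pages
pages = ["Home", "Pricing", "Blog", "Docs"]
conversion_rate = [3.2, 5.1, 1.4, 2.7]   # percent

plt.bar(pages, conversion_rate)
plt.ylabel("Conversion rate (%)")
plt.title("Conversion rate by top page")
plt.tight_layout()
plt.savefig("conversion_by_page.png")    # or plt.show() in a notebook
```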

Software can do this part for you. For example, the Hotjar Dashboard shows all of your key metrics in one place—and automatically creates bar graphs to show how your top pages’ performance compares. And with just one click, you can navigate to the Trends tool to analyze product metrics for different segments on a single chart. 

Hotjar Trends lets you compare metrics across segments

Discover rich user insights with quantitative data analysis

Conducting quantitative data analysis takes a little bit of time and know-how, but it’s much more manageable than you might think. 

By choosing the right methods and following clear steps, you gain insights into product performance and customer experience —and you’ll be well on your way to making better decisions and creating more customer satisfaction and loyalty.

FAQs about quantitative data analysis

What is quantitative data analysis?

Quantitative data analysis is the process of making sense of numerical data through mathematical calculations and statistical tests. It helps you identify patterns, relationships, and trends to make better decisions.

How is quantitative data analysis different from qualitative data analysis?

Quantitative and qualitative data analysis are both essential processes for making sense of quantitative and qualitative research .

Quantitative data analysis helps you summarize and interpret numerical results from close-ended questions to understand what is happening. Qualitative data analysis helps you summarize and interpret non-numerical results, like opinions or behavior, to understand why the numbers look like they do.

 If you want to make strong data-driven decisions, you need both.

What are some benefits of quantitative data analysis?

Quantitative data analysis turns numbers into rich insights. Some benefits of this process include: 

Making more confident decisions

Identifying ways to cut costs

Personalizing the user experience

Improving customer satisfaction

What methods can I use to analyze quantitative data?

Quantitative data analysis has two branches: descriptive statistics and inferential statistics. 

Descriptive statistics provide a snapshot of the data’s features by calculating measures like mean, median, and mode. 

Inferential statistics , as the name implies, involves making inferences about what the data means. Dozens of methods exist for this branch of quantitative data analysis, but three commonly used techniques are: 

T-tests

Cross tabulation

Factor analysis
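To make one of these concrete, here is a minimal cross tabulation sketch in Python with pandas, using invented data; running a chi-squared test on the resulting table then checks whether the two variables appear related:

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical responses: plan type vs. whether the user converted
df = pd.DataFrame({
    "plan":      ["free", "pro", "free", "pro", "free", "pro", "free", "pro"],
    "converted": ["no",   "yes", "no",   "yes", "yes",  "yes", "no",   "no"],
})

# Cross tabulation: counts of converted / not converted within each plan
table = pd.crosstab(df["plan"], df["converted"])
print(table)

# Chi-squared test of independence on the same table
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```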

Leveraging collective action and environmental literacy to address complex sustainability challenges

  • Perspective
  • Open access
  • Published: 09 August 2022
  • Volume 52, pages 30–44 (2023)


  • Nicole M. Ardoin   ORCID: orcid.org/0000-0002-3290-8211 1 ,
  • Alison W. Bowers 2 &
  • Mele Wheaton 3  


Developing and enhancing societal capacity to understand, debate elements of, and take actionable steps toward a sustainable future at a scale beyond the individual are critical when addressing sustainability challenges such as climate change, resource scarcity, biodiversity loss, and zoonotic disease. Although mounting evidence exists for how to facilitate individual action to address sustainability challenges, there is less understanding of how to foster collective action in this realm. To support research and practice promoting collective action to address sustainability issues, we define the term “collective environmental literacy” by delineating four key potent aspects: scale, dynamic processes, shared resources, and synergy. Building on existing collective constructs and thought, we highlight areas where researchers, practitioners, and policymakers can support individuals and communities as they come together to identify, develop, and implement solutions to wicked problems. We close by discussing limitations of this work and future directions in studying collective environmental literacy.


Introduction

For socio-ecologically intertwined issues—such as climate change, land conversion, biodiversity loss, resource scarcity, and zoonotic diseases—and their associated multi-decadal timeframes, individual action is necessary, yet not sufficient, for systemic, sustained change (Amel et al. 2017 ; Bodin 2017 ; Niemiec et al. 2020 ; Spitzer and Fraser 2020 ). Instead, collective action, or individuals working together toward a common good, is essential for achieving the scope and scale of solutions to current sustainability challenges. To support communities as they engage in policy and action for socio-environmental change, communicators, land managers, policymakers, and other practitioners need an understanding of how communities coalesce and leverage their shared knowledge, skills, connections, and experiences.

Engagement efforts, such as those grounded in behavior-change approaches or community-based social marketing initiatives, that address socio-environmental issues have often emphasized individuals as the pathway to change. Such efforts address a range of domains including, but not limited to, residential energy use, personal transportation choices, and workplace recycling efforts, often doing so in a stepwise fashion, envisioning each setting or suite of behaviors as discrete spheres of action and influence (Heimlich and Ardoin 2008 ; McKenzie-Mohr 2011 ). In this way, specific actions are treated incrementally and linearly, considering first the individual barriers to be removed and then the motivations to be activated (and, sometimes, sustained; Monroe 2003 ; Gifford et al. 2011 ). Once each behavior is successfully instantiated, the next barrier is then addressed. Proceeding methodically from one action to the next, such initiatives often quite successfully alter a series of actions or group of related behaviors (at least initially) by addressing them incrementally, one at a time (Byerly et al. 2018 ). Following this aspirational logic chain, many resources have been channeled into such programs under the assumption that, by raising awareness and knowledge, such information, communication, and educational outreach efforts will shift attitudes and behaviors to an extent that, ultimately, mass-scale change will follow. (See discussion in Wals et al. 2014 .)

Numerous studies have demonstrated, however, that challenges arise with these stepwise approaches, particularly with regard to their ability to address complex issues and persist over time (Heimlich and Ardoin 2008 ; Wals et al. 2014 ). Such approaches place a tremendous—and unrealistic—burden on individuals, ignoring key aspects not only of behavioral science but also of social science more broadly, including the view that humans exist nested within socio-ecological systems and, thus, are most successful at achieving lasting change when it is meaningful, relevant, and undertaken within a supportive context (Swim et al. 2011 ; Feola 2015 ). Individualized approaches often require multiple steps or nudges (Byerly et al. 2018 ), or ongoing reminders to retain their salience (Stern et al. 2008 ). Because of the emphasis on decontextualized action, such approaches can miss, ignore, obfuscate, or minimize the importance of the bigger picture, which includes the sociocultural, biophysical, and political economic contexts (Ardoin 2006 ; Amel et al. 2017 ). Although the tightly trained focus on small, actionable steps and reliance on individual willpower may help in initially achieving success with initial habit formation (Carden and Wood 2018 ), it becomes questionable in terms of bringing about a wave of transformation on larger scales in the longer term. For those decontextualized actions to persist, they require continued prompting, constancy, and support in the social and biophysical context (Schultz 2014 ; Manfredo et al. 2016 ; Wood and Rünger 2016 ).

Less common in practice are theoretically based initiatives that embrace the holistic nature of the human experience, which occurs within complex systems spanning time and space in a multidimensional, weblike fashion (Bronfenbrenner 1979 ; Rogoff 2003 ; Barron 2006 ; DeCaro and Stokes 2008 ; Gould et al. 2019 ; Hovardas 2020 ). These systems-thinking approaches, while varying across disciplines and epistemological perspectives, envision human experiences, including learning and behavior, as occurring within a milieu that includes the social, political, cultural, and historical contexts (Rogoff 2003 ; Roth and Lee 2007 ; Swim et al. 2011 ; Gordon 2019 ). In such a view, people’s everyday practices continuously reflect and grow out of past learning and experiences, not only at the individual, but also at the collective level (Lave 1991 ; Gutiérrez and Rogoff 2003 ; Nasir et al. 2020 ; Ardoin and Heimlich 2021 ). The multidimensional context in which we exist—including the broader temporal and spatial ecosystem—both facilitates and constrains our actions.

Scholars across diverse areas of study discuss the need for and power of collective thought and action, using various conceptual frames, models, and terms, such as collective action, behavior, impact, and intelligence; collaborative governance; communities of practice; crowdsourcing; and social movement theory; among many others (Table 1 ). These scholars acknowledge and explore the influence of our multidimensional context on collective thought and action. In this paper, we explore the elements and processes that constitute collective environmental literacy . We draw on the vast, relevant literature and, in so doing, we attempt to invoke the power of the collective: by reviewing and synthesizing ideas from a variety of fields, we strive to leverage existing constructs and perspectives that explore notions of the “collective” (see Table 1 for a summary of constructs and theories reviewed to develop our working definition of collective environmental literacy). A primary goal of this paper is to dialogue with other researchers and practitioners working in this arena who are eager to uncover and further explore related avenues.

First, we present a formal definition of collective environmental literacy. Next, we briefly review the dominant view of environmental literacy at the individual level and, in support of a collective take on environmental literacy, we examine various collective constructs. We then delve more deeply into the definition of collective environmental literacy by outlining four key aspects: scale, dynamic processes, shared resources, and synergy. We conclude by providing suggestions for future directions in studying collective environmental literacy.

Defining collective environmental literacy

Decades of research in political science, economics, anthropology, sociology, psychology, and the learning sciences, among other fields (Chawla and Cushing 2007 ; Ostrom 2009 ; Sawyer 2014 ; Bamberg et al. 2015 ; Chan 2016 ; Jost et al. 2017 ) repeatedly demonstrates the effectiveness, and indeed necessity of, collective action when addressing problems that are inherently social in nature. Yet theoretical frameworks and empirical documentation emphasize that such collective activities rarely arise spontaneously and, when they do, are a result of preconditions that have sown fertile ground (van Zomeren et al. 2008 ; Duncan 2018 ). Persistent and effective collective action then requires scaffolding in the form of institutional, sociocultural, and political economic structure that provides ongoing support. To facilitate discussions of how to effectively support collective action around sustainability issues, we suggest the concept of “collective environmental literacy.” We conceptualize collective environmental literacy as more than collective action; rather, we suggest that the term encapsulates action along with its various supporting structures and resources. Additionally, we employ the word “literacy” as it connotes learning, intention, and the idea that knowledge, skills, attitudes, and behaviors can be enhanced iteratively over time. By using “literacy,” we strive to highlight the efforts, often unseen, that lead to effective collective action in communities. We draw on scholarship in science and health education, areas that have begun over the past two decades to theorize about related areas of collective science literacy (Roth and Lee 2002 , 2004 ; Lee and Roth 2003 ; Feinstein 2018 ) and health literacy (Freedman et al. 2009 ; Papen 2009 ; Chinn 2011 ; Guzys et al. 2015 ). Although these evolving constructs lack consensus definitions, they illuminate affordances and constraints that exist when conceptualizing collective environmental literacy (National Academies of Sciences, Engineering, and Medicine [NASEM] 2016 ).

Some of the key necessary—but not sufficient—conditions that facilitate aligned, collective actions include a common body of decision-making information; shared attitudes, values, and beliefs toward a motivating issue or concern; and efficacy skills that facilitate change-making (Sturmer and Simon 2004 ; van Zomeren et al. 2008 ; Jagers et al. 2020 ). In addition, other contextual factors are essential, such as trust, reciprocity, collective efficacy, and communication among group members and societal-level facilitators, such as social norms, institutions, and technology (Bandura 2000 ; Ostrom 2010 ; McAdam and Boudet 2012 ; Jagers et al. 2020 ). Taken together, we term this body of knowledge, dispositions, skills, and the context in which they flourish collective environmental literacy . More formally, we define collective environmental literacy as: a dynamic, synergistic process that occurs as group members develop and leverage shared resources to undertake individual and aggregate actions over time to address sustainability issues within the multi-scalar context of a socio-environmental system (Fig.  1 ).

Fig. 1 Key elements of collective environmental literacy

Environmental literacy: Historically individual, increasingly collective

Over the past five decades, the term “environmental literacy” has come into increasingly frequent use. Breaking from the traditional association of “literacy” with reading and writing in formal school contexts, environmental literacy emphasizes associations with character and behavior, often in the form of responsible environmental stewardship (Roth 1992 ). Footnote 1 Such perspectives define the concept as including affective (attitudinal), cognitive (knowledge-based), and behavioral domains, emphasizing that environmental literacy is both a process and outcome that develops, builds, and morphs over time (Hollweg et al. 2011 ; Wheaton et al. 2018 ; Clark et al. 2020 ).

The emphasis on defining, measuring, and developing interventions to bring about environmental literacy has primarily remained at the individual scale, as evidenced by frequent descriptions of an environmentally literate person (Roth 1992 ; Hollweg et al. 2011 among others) rather than community or community member. In most understandings, discussions, and manifestations of environmental literacy, the implicit assumption remains that the unit of action, intervention, and therefore analysis occurs at the individual level. Yet instinctively and perhaps by nature, community members often seek information and, as a result, take action collectively, sharing what some scholars call “the hive mind” or “group mind,” relying on each other for distributed knowledge, expertise, motivation, and support (Surowiecki 2005 ; Sunstein 2008 ; Sloman and Fernbach 2017 ; Paul 2021 ).

As with the proverbial elephant (Saxe, n.d.), each person, household, or neighborhood group may understand or “see” a different part of an issue or challenge, bring a novel understanding to the table, and have a certain perspective or skill to contribute. Although some environmental literacy discussions allude to a collective lens (e.g., Hollweg et al. 2011 ; Ardoin et al. 2013 ; Wheaton et al. 2018 ; Bey et al. 2020 ), defining, developing frameworks, and creating measures to assess the efficacy of such collective-scale sustainability-related endeavors has remained elusive. Footnote 2 Looking to related fields and disciplines—such as ecosystem theory, epidemiology and public health, sociology, network theory, and urban planning, among others—can provide insight, theoretical frames, and empirical examples to assist in such conceptualizations (McAdam and Boudet 2012 ; National Research Council 2015 ) (See Table 1 for an overview of some of the many areas of study that informed our conceptualization of collective environmental literacy).

Seeking the essence of the collective: Looking to and learning from others

The social sciences have long focused on “the kinds of activities engaged in by sizable but loosely organized groups of people” (Turner et al. 2020 , para. 1) and addressed various collective constructs, such as collective behavior, action, intelligence, and memory (Table 1 ). Although related constructs in both the social and natural sciences—such as communities of practice (Wenger and Snyder 2000 ), collaborative governance (Ansell and Gash 2008 ; Emerson et al. 2012 ), and the collaboration–coordination continuum (Sadoff and Grey 2005 ; Prager 2015 ), as well as those from social movement theory and related areas (McAdam and Boudet 2012 ; de Moor and Wahlström 2019 )—lack the word “collective” in name, they too leverage the benefits of collectivity. A central tenet connects all of these areas: powerful processes, actions, and outcomes can arise when individuals coalesce around a common purpose or cause. This notion of a dynamic, potent force transcending the individual to enhance the efficacy of outcomes motivates the application of a collective lens to the environmental literacy concept.

Dating to the 1800s, discussions of collective behavior have explored connections to social order, structures, and norms (Park 1927 ; Smelser 2011 /1962; Turner and Killian 1987 ). Initially, the focus emphasized spontaneous, often violent crowd behaviors, such as riots, mobs, and rebellions. More contemporarily, sociologists, political scientists, and others who study social movements and collective behaviors acknowledge that such phenomena may take many forms, including those occurring in natural ecosystems, such as ant colonies, bird flocks, and even the human brain (Gordon 2019 ). In sociology, collective action represents a paradigm shift highlighting coordinated, purposeful pro-social movements, while de-emphasizing aroused emotions and crowd behavior (Miller 2014 ). In political science, Ostrom’s ( 1990 , 2000 , 2010 ) theory of collective action in the context of the management of shared resources extends the concept’s reach to economics and other fields. In education and the learning sciences, social learning and sociocultural theories tap into the idea of learning as a social-cognitive-cultural endeavor (Vygotsky 1980 ; Lave and Wenger 1991 ; Tudge and Winterhoff 1993 ; Rogoff 2003 ; Reed et al. 2010 ).

Collective action, specifically, and collective constructs, generally, have found their way into the research and practice in the fields of conservation, natural resources, and environmental management. Collective action theory has been applied in a range of settings and scenarios, including agriculture (Mills et al. 2011 ), invasive species management (Marshall et al. 2016 ; Sullivan et al. 2017 ; Lubeck et al. 2019 ; Clarke et al. 2021 ), fire management (Canadas et al. 2016 ; Charnley et al. 2020 ), habitat conservation (Raymond 2006 ; Niemiec et al. 2020 ), and water governance (Lopez-Gunn 2003 ; Baldwin et al. 2018 ), among others. Frameworks and methods that emphasize other collective-related ideas—like collaboration, co-production, and group learning—are also ubiquitous in natural resource and environmental management. These constructs include community-based conservation (DeCaro and Stokes 2008 ; Niemiec et al. 2016 ), community natural resource management (Kellert et al. 2000 ; Dale et al. 2020 ), collaboration/coordination (Sadoff and Grey 2005 ; Prager 2015 ), polycentricity (Galaz et al. 2012 ; Heikkila et al. 2018 ), knowledge co-production (Armitage et al. 2011 ; Singh et al. 2021 ), and social learning (Reed et al. 2010 ; Hovardas 2020 ). Many writings on collective efforts in the social sciences broadly, and applied in the area of environment specifically, provide insights into collective action’s necessary preconditions, which prove invaluable to further defining and later operationalizing collective environmental literacy.

Unpacking the definition of collective environmental literacy: Anchoring principles

As described, we propose the following working definition of collective environmental literacy drawing on our analysis of related literatures and informed by scholarly and professional experience in the sustainability and conservation fields: a dynamic, synergistic process that occurs as group members develop and leverage shared resources to undertake individual and aggregate actions over time to address sustainability issues within the multi-scalar context of a socio-environmental system (Fig.  1 ). This definition centers on four core, intertwined ideas: the scale of the group involved; the dynamic nature of the process; shared resources brought by, available to, and needed by the group; and the synergy that arises from group interaction.

Multi-scalar

When transitioning from the focus on individual to collective actions—and, herein, principles of environmental literacy—the most obvious and primary requisite shift is one of scale. Yet, moving to a collective scale does not mean abandoning action at the individual scale; rather, success at the collective level is intrinsically tied to what occurs at an individual level. Such collective-scale impacts leverage the power of the hive, harnessing people’s willingness, ability, and motivation to take action alongside others, share their ideas and resources to build collective ideas and resources, contribute to making a difference in an impactful way, and participate communally in pro-social activities.

Collective environmental literacy is likely dynamic in its orientation to scale, incorporating place-based notions, such as ecoregional or community-level environmental literacy (with an emphasis on geographic boundaries). On the other hand, it may encapsulate environmental literacy of a group or organization united by a common identity (e.g., organizational membership) or cause (e.g., old-growth forests, coastal protection), rather than solely or even primarily by geography. Although shifting scales can make measuring collective environmental literacy more difficult, dynamic levels may be a benefit when addressing planetary boundary issues such as climate change, biodiversity, and ocean acidification (Galaz et al. 2012 ). Some scholars have called for a polycentric approach to these large-scale issues in response to a perceived failure of global-wide, top-down solutions (Ostrom 2010 , 2012 ; Jordan et al. 2018 ). Conceptualizing and consequently supporting collective environmental literacy at multiple scales can facilitate such desired polycentricity.

Dynamic

Rather than representing a static outcome, environmental literacy is a dynamic process that is fluctuating and complex, reflective of iterative interactions among community members, whose discussions and negotiations reflect the changing context of sustainability issues. Footnote 3 Such open-minded processes allow for, and indeed welcome, adaptation in a way that builds social-ecological resilience (Berkes and Jolly 2002 ; Adger et al. 2005 ; Berkes 2007 ). Additionally, this dynamism allows for collective development and maturation, supporting community growth in collective knowledge, attitudes, skills, and actions via new experiences, interactions, and efforts (Berkman et al. 2010 ). With this mindset, and within a sociocultural perspective, collective environmental literacy evolves through drawing on and contributing to the community’s funds of knowledge (González et al. 2006 ). Movement and actions within and among groups impact collective literacy, as members share knowledge and other resources, shifting individuals and the group in the course of their shared practices (Samerski 2019 ).

Shared resources

In a collective mode, effectiveness is heightened as shared resources are streamlined, waste is minimized, and innovation maximized. Rather than each group member developing individual expertise in every matter of concern, the shared knowledge, skills, and behaviors can be distributed, pursued, and amplified among group members efficiently and effectively, with collective literacy emerging from the process of pooling diverse forms of capital and aggregating resources. This perspective builds on ideas of social capital as a collective good (Ostrom 1990 ; Putnam 2020 ), wherein relationships of trust and reciprocity are both inputs and outcomes (Pretty and Ward 2001 ). The shared resources then catalyze and sustain action as they are reassembled and coalesced at the group level for collective impact.

The pooled resources—likely vast—may include, but are not limited to, physical and human resources, funding, time, energy, and space and place (physical or digital). Shared resources may also include forms of theorized capital, such as intellectual and social (Putnam 2020 ). Also of note is the recognition that these resources extend far beyond information and knowledge. Of particular interest when building collective environmental literacy are resources previously ignored or overlooked by those in power in prior sustainability efforts. For example, collective environmental literacy can draw strength from shared resources unique to the community or even subgroups within the larger community. Discussions of Indigenous knowledge (Gadgil et al. 1993 ) and funds of knowledge (González et al. 2006 ; Cruz et al. 2018 ) suggest critical, shared resources that highlight strengths of an individual community and its members. Another dimension of shared resources relates to the strength of institutional connections, such as the benefits that accrue from leveraging the collective knowledge, expertise, and resources of organizational collaborators working in adjacent areas to further and amplify each other’s impact (Wojcik et al. 2021 ).

Synergistic

Finally, given the inherent complexities related to defining, deploying, implementing, and measuring these dynamic, at-times ephemeral processes, resources, and outcomes at a collective scale, working in such a manner must be clearly advantageous to pressing sustainability issues at hand. Numerous related constructs and approaches from a range of fields emphasize the benefits of diverse collaboration to collective thought and action, including improved solutions, more effective and fair processes, and more socioculturally just outcomes (Klein 1990 ; Jörg 2011 ; Wenger and Snyder 2000 ; Djenontin and Meadow 2018 ). These benefits go beyond efficient aggregation and distribution of resources, invoking an almost magical quality that defines synergy, resulting in robust processes and outcomes that are more than the sum of the parts.

This synergy relies on the diversity of a group across various dimensions, bringing power, strength, and insight to a decision-making process (Bear and Woolley 2011 ; Curşeu and Pluut 2013 ; Freeman and Huang 2015 ; Lu et al. 2017 ; Bendor and Page 2019 ). Individuals are limited not only to singular knowledge-perspectives and skillsets, but also to their own experiences, which influence their self-affirming viewpoints and tendencies to seek out confirmatory information for existing beliefs (Kahan et al. 2011 ). Although the coming together of those from different racial, cultural, social, and economic backgrounds facilitates a collective literacy process that draws on a wider range of resources and equips a gestalt, it also sets up the need to consider issues of power, privilege, voice, and representation (Bäckstrand 2006 ) and the role of social capital, leading to questions related to trust and reciprocity in effective collectives (Pretty and Ward 2001 ; Folke et al. 2005 ).

Leveraging the ‘Hive’: Proceeding with collective environmental literacy

This paper presents one conceptualization of collective environmental literacy, with the understanding that numerous ways exist to envision its definition, formation, deployment, and measurement. Characterized by a collective effort, such literacies at scale offer a way to imagine, measure, and support the synergy that occurs when the emphasis moves from an individual to a larger whole. By expanding the scale and focusing on shared responsibility among actors at the systems level, opportunities arise for inspiring and enabling a broader contribution to a sustainable future. These evolving notions serve to invite ongoing conversation, both in research and practice, about how to enact our collective responsibility toward, as well as vision of, a thriving future.

Emerging from the many discussions of shared and collaborative efforts to address socio-environmental issues, our conceptualization of collective environmental literacy is a first step toward supporting communities as they work to identify, address, and solve sustainability problems. We urge continued discussions on this topic, with the goal of understanding the concept of collective environmental literacy, how to measure it, and the implications of this work for practitioners. The conceptual roots of collective environmental literacy reach into countless fields of study and, as such, a transdisciplinary approach, which includes an eye toward practice, is necessary to fully capture and maximize the tremendous amount of knowledge, wisdom, and experience around this topic. Specifically, next steps to evolve the concept include engaging sustainability researchers and practitioners in discussions of the saliency of the presented definition of collective environmental literacy. These discussions include verifying the completeness of the definition and ensuring a thorough review of relevant research: Are parts of the definition missing or unclear? What are the “blank, blind, bald, and bright spots” in the literature (Reid 2019 p. 158)? Additionally, recognizing and leveraging literacy at a collective scale most certainly is not unique to environmental work, nor is adopting literacy-related language to conceptualize and measure process outcomes, although the former has consistently proven more challenging. Moreover, although we (the authors) appreciate the connotations and structures gained by using a literacy framework, we struggle with whether “environmental literacy” is the most appropriate and useful term for the conceptualizations as described herein; we, thus, welcome lively discussions about the need for new terminology.

Even at this early stage of conceptualization, this work has implications for practitioners. For scientists, communicators, policymakers, land managers, and other professionals desiring to work with communities to address sustainability issues, a primary take-away message concerns the holistic nature of what is needed for effective collective action in the environmental realm. Many previous efforts have focused on conveying information and, while a lack of knowledge and awareness may be a barrier to action in some cases, the need for a more holistic lens is increasingly clear. This move beyond an individually focused, information-deficit model is essential for effective impact (Bolderdijk et al. 2013 ; van der Linden 2014 ; Geiger et al. 2019 ). The concept of collective environmental literacy suggests a role for developing shared resources that can foster effective collective action. When working with communities, a critical early step includes some form of needs assessment—a systematic, in-depth process that allows for meaningfully gauging gaps in shared resources required to tackle sustainability issues (Braus 2011). Following this initial, evaluative step, an understanding of the components of collective environmental literacy, as outlined in this paper, can be used to guide the development of interventions to support communities in their efforts to address those issues.

Growing discussion of collective literacy constructs, and related areas, suggests researchers, practitioners, and policymakers working in pro-social areas recognize and value collective efforts, despite the need for clearer definitions and effective measures. This definitional and measurement work, in both research and practice, is not easy. The ever-changing, dynamic contexts in which collective environmental literacy exists make defining the concept a moving target, compounded by a need to draw upon work in countless, often distinct academic fields of study. Furthermore, the hard-to-see, inner workings of collective constructs make measurement difficult. Yet, the “power of the hive” is intriguing, as the synergism that arises from communities working in an aligned manner toward a unified vision suggests a potency and wave of motivated action essential to coalescing and leveraging individual goodwill, harnessing its power and potential toward effective sustainability solutions.

Notes

See Stables and Bishop’s (2001) idea of defining environmental literacy by viewing the environment as “text.”

The climate change education literature also includes a nascent, but growing, discussion of collective-lens thinking and literacy. See, for example, Waldron et al. (2019), Mochizuki and Bryan (2015), and Kopnina (2016).

This conceptualization is similar to how some scholars describe collective health literacy (Berkman et al. 2010; Mårtensson and Hensing 2012).

References

Adger, W.N. 2003. Social capital, collective action, and adaptation to climate change. Economic Geography 79: 387–404.

Adger, W.N., T.P. Hughes, C. Folke, S.R. Carpenter, and J. Rockström. 2005. Social-ecological resilience to coastal disasters. Science 309: 1036–1039. https://doi.org/10.1126/science.1112122 .

Adler, P.S., and S.-W. Kwon. 2002. Social capital: Prospects for a new concept. Academy of Management Review 27: 17–40. https://doi.org/10.5465/amr.2002.5922314 .

Agrawal, A. 1995. Dismantling the divide between Indigenous and scientific knowledge. Development and Change 26: 413–439. https://doi.org/10.1111/j.1467-7660.1995.tb00560.x .

Aguilar, O.M. 2018. Examining the literature to reveal the nature of community EE/ESD programs and research. Environmental Education Research 24: 26–49. https://doi.org/10.1080/13504622.2016.1244658 .

Aguilar, O., A. Price, and M. Krasny. 2015. Perspectives on community environmental education. In M.C. Monroe & M.E. Krasny (Eds.), Across the spectrum: Resources for environmental educators (3rd edn., pp. 235–249). North American Association for Environmental Education.

Aldrich, D.P., and M.A. Meyer. 2015. Social capital and community resilience. American Behavioral Scientist 59: 254–269. https://doi.org/10.1177/0002764214550299 .

Amel, E., C. Manning, B. Scott, and S. Koger. 2017. Beyond the roots of human inaction: Fostering collective effort toward ecosystem conservation. Science 356: 275–279. https://doi.org/10.1126/science.aal1931 .

Ansell, C., and A. Gash. 2008. Collaborative governance in theory and practice. Journal of Public Administration Research and Theory 18: 543–571. https://doi.org/10.1093/jopart/mum032 .

Ardoin, N.M. 2006. Toward an interdisciplinary understanding of place: Lessons for environmental education. Canadian Journal of Environmental Education 11: 112–126.

Ardoin, N.M., and J.E. Heimlich. 2021. Environmental learning in everyday life: Foundations of meaning and a context for change. Environmental Education Research 27: 1681–1699. https://doi.org/10.1080/13504622.2021.1992354 .

Ardoin, N.M., C. Clark, and E. Kelsey. 2013. An exploration of future trends in environmental education research. Environmental Education Research 19: 499–520. https://doi.org/10.1080/13504622.2012.709823 .

Armitage, D., F. Berkes, A. Dale, E. Kocho-Schellenberg, and E. Patton. 2011. Co-management and the co-production of knowledge: Learning to adapt in Canada’s Arctic. Global Environmental Change 21: 995–1004. https://doi.org/10.1016/j.gloenvcha.2011.04.006 .

Assis Neto, F.R., and C.A.S. Santos. 2018. Understanding crowdsourcing projects: A systematic review of tendencies, workflow, and quality management. Information Processing & Management 54: 490–506. https://doi.org/10.1016/j.ipm.2018.03.006 .

Bäckstrand, K. 2006. Multi-stakeholder partnerships for sustainable development: Rethinking legitimacy, accountability and effectiveness. European Environment 16: 290–306. https://doi.org/10.1002/eet.425 .

Baldwin, E., P. McCord, J. Dell’Angelo, and T. Evans. 2018. Collective action in a polycentric water governance system. Environmental Policy and Governance 28: 212–222. https://doi.org/10.1002/eet.1810 .

Bamberg, S., J. Rees, and S. Seebauer. 2015. Collective climate action: Determinants of participation intention in community-based pro-environmental initiatives. Journal of Environmental Psychology 43: 155–165. https://doi.org/10.1016/j.jenvp.2015.06.006 .

Bandura, A. 1977. Social learning theory . Englewood Cliffs: Prentice Hall.

Bandura, A. 2000. Exercise of human agency through collective efficacy. Current Directions in Psychological Science 9: 75–78. https://doi.org/10.1111/1467-8721.00064 .

Barron, B. 2006. Interest and self-sustained learning as catalysts of development: A learning ecology perspective. Human Development 49: 193–224. https://doi.org/10.1159/000094368 .

Barry, M.M., M. D’Eath, and J. Sixsmith. 2013. Interventions for improving population health literacy: Insights from a rapid review of the evidence. Journal of Health Communication 18: 1507–1522. https://doi.org/10.1080/10810730.2013.840699 .

Barton, A.C., and E. Tan. 2009. Funds of knowledge and discourses and hybrid space. Journal of Research in Science Teaching 46: 50–73. https://doi.org/10.1002/tea.20269 .

Bear, J.B., and A.W. Woolley. 2011. The role of gender in team collaboration and performance. Interdisciplinary Science Reviews 36: 146–153. https://doi.org/10.1179/030801811X13013181961473 .

Bendor, J., and S.E. Page. 2019. Optimal team composition for tool-based problem solving. Journal of Economics & Management Strategy 28: 734–764. https://doi.org/10.1111/jems.12295 .

Berkes, F. 2007. Understanding uncertainty and reducing vulnerability: Lessons from resilience thinking. Natural Hazards 41: 283–295. https://doi.org/10.1007/s11069-006-9036-7 .

Berkes, F., and D. Jolly. 2002. Adapting to climate change: Social-ecological resilience in a Canadian western Arctic community. Conservation Ecology 5: 45.

Berkes, F., and H. Ross. 2013. Community resilience: Toward an integrated approach. Society & Natural Resources 26: 5–20. https://doi.org/10.1080/08941920.2012.736605 .

Berkes, F., M.K. Berkes, and H. Fast. 2007. Collaborative integrated management in Canada’s north: The role of local and traditional knowledge and community-based monitoring. Coastal Management 35: 143–162.

Berkman, N.D., T.C. Davis, and L. McCormack. 2010. Health literacy: What is it? Journal of Health Communication 15: 9–19. https://doi.org/10.1080/10810730.2010.499985 .

Bey, G., C. McDougall, and S. Schoedinger. 2020. Report on the NOAA office of education environmental literacy program community resilience education theory of change. National Oceanic and Atmospheric Administration . https://doi.org/10.25923/mh0g-5q69 .

Blumer, H. 1971. Social problems as collective behavior. Social Problems 18: 298–306.

Bodin, Ö. 2017. Collaborative environmental governance: Achieving collective action in social-ecological systems. Science . https://doi.org/10.1126/science.aan1114 .

Bolderdijk, J.W., M. Gorsira, K. Keizer, and L. Steg. 2013. Values determine the (in)effectiveness of informational interventions in promoting pro-environmental behavior. PLoS ONE 8: e83911. https://doi.org/10.1371/journal.pone.0083911 .

Brabham, D.C. 2013. Crowdsourcing . Cambridge: MIT Press.

Braus, J. (Ed.). 2011. Tools of engagement: A toolkit for engaging people in conservation. NAAEE/Audubon. https://cdn.naaee.org/sites/default/files/eepro/resource/files/toolsofengagement.pdf .

Brieger, S.A. 2019. Social identity and environmental concern: The importance of contextual effects. Environment and Behavior 51: 828–855. https://doi.org/10.1177/0013916518756988 .

Briggs, J. 2005. The use of Indigenous knowledge in development: Problems and challenges. Progress in Development Studies 5: 99–114. https://doi.org/10.1191/1464993405ps105oa .

Briggs, J., and J. Sharp. 2004. Indigenous knowledges and development: A postcolonial caution. Third World Quarterly 25: 661–676. https://doi.org/10.1080/01436590410001678915 .

Bronfenbrenner, U. 1979. The ecology of human development: Experiments by nature and design . Cambridge: Harvard University Press.

Bruce, C., and P. Chesterton. 2002. Constituting collective consciousness: Information literacy in university curricula. International Journal for Academic Development 7: 31–40. https://doi.org/10.1080/13601440210156457 .

Byerly, H., A. Balmford, P.J. Ferraro, C.H. Wagner, E. Palchak, S. Polasky, T.H. Ricketts, A.J. Schwartz, et al. 2018. Nudging pro-environmental behavior: Evidence and opportunities. Frontiers in Ecology and the Environment 16: 159–168. https://doi.org/10.1002/fee.1777 .

Canadas, M.J., A. Novais, and M. Marques. 2016. Wildfires, forest management and landowners’ collective action: A comparative approach at the local level. Land Use Policy 56: 179–188. https://doi.org/10.1016/j.landusepol.2016.04.035 .

Carden, L., and W. Wood. 2018. Habit formation and change. Current Opinion in Behavioral Sciences 20: 117–122. https://doi.org/10.1016/j.cobeha.2017.12.009 .

Chan, M. 2016. Psychological antecedents and motivational models of collective action: Examining the role of perceived effectiveness in political protest participation. Social Movement Studies 15: 305–321. https://doi.org/10.1080/14742837.2015.1096192 .

Charnley, S., E.C. Kelly, and A.P. Fischer. 2020. Fostering collective action to reduce wildfire risk across property boundaries in the American West. Environmental Research Letters 15: 025007. https://doi.org/10.1088/1748-9326/ab639a .

Chawla, L., and D.F. Cushing. 2007. Education for strategic environmental behavior. Environmental Education Research 13: 437–452. https://doi.org/10.1080/13504620701581539 .

Chinn, D. 2011. Critical health literacy: A review and critical analysis. Social Science & Medicine 73: 60–67. https://doi.org/10.1016/j.socscimed.2011.04.004 .

Clark, C.R., J.E. Heimlich, N.M. Ardoin, and J. Braus. 2020. Using a Delphi study to clarify the landscape and core outcomes in environmental education. Environmental Education Research 26: 381–399. https://doi.org/10.1080/13504622.2020.1727859 .

Clarke, M., Z. Ma, S.A. Snyder, and K. Floress. 2021. Factors influencing family forest owners’ interest in community-led collective invasive plant management. Environmental Management 67: 1088–1099. https://doi.org/10.1007/s00267-021-01454-1 .

Cruz, A.R., S.T. Selby, and W.H. Durham. 2018. Place-based education for environmental behavior: A ‘funds of knowledge’ and social capital approach. Environmental Education Research 24: 627–647. https://doi.org/10.1080/13504622.2017.1311842 .

Curşeu, P.L., and H. Pluut. 2013. Student groups as learning entities: The effect of group diversity and teamwork quality on groups’ cognitive complexity. Studies in Higher Education 38: 87–103. https://doi.org/10.1080/03075079.2011.565122 .

Cutter, S.L., L. Barnes, M. Berry, C. Burton, E. Evans, E. Tate, and J. Webb. 2008. A place-based model for understanding community resilience to natural disasters. Global Environmental Change 18: 598–606. https://doi.org/10.1016/j.gloenvcha.2008.07.013 .

Dale, A., K. Vella, S. Ryan, K. Broderick, R. Hill, R. Potts, and T. Brewer. 2020. Governing community-based natural resource management in Australia: International implications. Land 9: 234. https://doi.org/10.3390/land9070234 .

de Moor, J., and M. Wahlström. 2019. Narrating political opportunities: Explaining strategic adaptation in the climate movement. Theory and Society 48: 419–451. https://doi.org/10.1007/s11186-019-09347-3 .

DeCaro, D., and M. Stokes. 2008. Social-psychological principles of community-based conservation and conservancy motivation: Attaining goals within an autonomy-supportive environment. Conservation Biology 22: 1443–1451.

Djenontin, I.N.S., and A.M. Meadow. 2018. The art of co-production of knowledge in environmental sciences and management: Lessons from international practice. Environmental Management 61: 885–903. https://doi.org/10.1007/s00267-018-1028-3 .

Duncan, L.E. 2018. The psychology of collective action. In The Oxford handbook of personality and social psychology , ed. K. Deaux and M. Snyder. Oxford: Oxford University Press.

Edwards, M., F. Wood, M. Davies, and A. Edwards. 2015. ‘Distributed health literacy’: Longitudinal qualitative analysis of the roles of health literacy mediators and social networks of people living with a long-term health condition. Health Expectations 18: 1180–1193. https://doi.org/10.1111/hex.12093 .

Emerson, K., T. Nabatchi, and S. Balogh. 2012. An integrative framework for collaborative governance. Journal of Public Administration Research and Theory 22: 1–29.

Engeström, Y. 2001. Expansive learning at work: Toward an activity theoretical reconceptualization. Journal of Education and Work 14: 133–156. https://doi.org/10.1080/13639080020028747 .

Ensor, J., and B. Harvey. 2015. Social learning and climate change adaptation: Evidence for international development practice. Wires Climate Change 6: 509–522. https://doi.org/10.1002/wcc.348 .

Fanta, V., M. Šálek, and P. Sklenicka. 2019. How long do floods throughout the millennium remain in the collective memory? Nature Communications 10: 1105. https://doi.org/10.1038/s41467-019-09102-3 .

Feinstein, N.W. 2018. Collective science literacy: A key to community science capacity [Conference session]. American Association for the Advancement of Science Annual Meeting, Austin, TX, USA https://d32ogoqmya1dw8.cloudfront.net/files/earthconnections/collective_science_literacy_key.pdf .

Feola, G. 2015. Societal transformation in response to global environmental change: A review of emerging concepts. Ambio 44: 376–390. https://doi.org/10.2139/ssrn.2689741 .

Fernandez-Gimenez, M.E., H.L. Ballard, and V.E. Sturtevant. 2008. Adaptive management and social learning in collaborative and community-based monitoring: A study of five community-based forestry organizations in the western USA. Ecology and Society 13: 15.

Folke, C., T. Hahn, P. Olsson, and J. Norberg. 2005. Adaptive governance of social-ecological systems. Annual Review of Environment and Resources 30: 441–473. https://doi.org/10.1146/annurev.energy.30.050504.144511 .

Freedman, D.A., K.D. Bess, H.A. Tucker, D.L. Boyd, A.M. Tuchman, and K.A. Wallston. 2009. Public health literacy defined. American Journal of Preventive Medicine 36: 446–451. https://doi.org/10.1016/j.amepre.2009.02.001 .

Freeman, R.B., and W. Huang. 2015. Collaborating with people like me: Ethnic coauthorship within the United States. Journal of Labor Economics 33: S289–S318.

Gadgil, M., F. Berkes, and C. Folke. 1993. Indigenous knowledge for biodiversity conservation. Ambio 22: 151–156.

Galaz, V., B. Crona, H. Österblom, P. Olsson, and C. Folke. 2012. Polycentric systems and interacting planetary boundaries—Emerging governance of climate change–ocean acidification–marine biodiversity. Ecological Economics 81: 21–32. https://doi.org/10.1016/j.ecolecon.2011.11.012 .

Geiger, S.M., M. Geiger, and O. Wilhelm. 2019. Environment-specific vs general knowledge and their role in pro-environmental behavior. Frontiers in Psychology 10: 718. https://doi.org/10.3389/fpsyg.2019.00718 .

Gifford, R., C. Kormos, and A. McIntyre. 2011. Behavioral dimensions of climate change: Drivers, responses, barriers, and interventions. Wires Climate Change 2: 801–827. https://doi.org/10.1002/wcc.143 .

González, N., L.C. Moll, and C. Amanti. 2006. Funds of knowledge: Theorizing practices in households, communities, and classrooms . New York: Routledge.

Gordon, D.M. 2019. Measuring collective behavior: An ecological approach. Theory in Biosciences . https://doi.org/10.1007/s12064-019-00302-5 .

Gould, R.K., N.M. Ardoin, J.M. Thomsen, and N. Wyman Roth. 2019. Exploring connections between environmental learning and behavior through four everyday-life case studies. Environmental Education Research 25: 314–340.

Graham, S., A.L. Metcalf, N. Gill, R. Niemiec, C. Moreno, T. Bach, V. Ikutegbe, L. Hallstrom, et al. 2019. Opportunities for better use of collective action theory in research and governance for invasive species management. Conservation Biology 33: 275–287. https://doi.org/10.1111/cobi.13266 .

Granovetter, M. 1978. Threshold models of collective behavior. American Journal of Sociology 83: 1420–1443.

Groulx, M., M.C. Brisbois, C.J. Lemieux, A. Winegardner, and L. Fishback. 2017. A role for nature-based citizen science in promoting individual and collective climate change action? A systematic review of learning outcomes. Science Communication 39: 45–76. https://doi.org/10.1177/1075547016688324 .

Gutiérrez, K.D., and B. Rogoff. 2003. Cultural ways of learning: Individual traits or repertoires of practice. Educational Researcher 32: 19–25. https://doi.org/10.3102/0013189X032005019 .

Guzys, D., A. Kenny, V. Dickson-Swift, and G. Threlkeld. 2015. A critical review of population health literacy assessment. BMC Public Health 15: 1–7. https://doi.org/10.1186/s12889-015-1551-6 .

Halbwachs, M. 1992. On collective memory (L. A. Coser, Ed. & Trans.). University of Chicago Press. (Original works published 1941 and 1952).

Heikkila, T., S. Villamayor-Tomas, and D. Garrick. 2018. Bringing polycentric systems into focus for environmental governance. Environmental Policy and Governance 28: 207–211. https://doi.org/10.1002/eet.1809 .

Heimlich, J.E., and N.M. Ardoin. 2008. Understanding behavior to understand behavior change: A literature review. Environmental Education Research 14: 215–237. https://doi.org/10.1080/13504620802148881 .

Hill, R., F.J. Walsh, J. Davies, A. Sparrow, M. Mooney, R.M. Wise, and M. Tengö. 2020. Knowledge co-production for Indigenous adaptation pathways: Transform post-colonial articulation complexes to empower local decision-making. Global Environmental Change 65: 102161. https://doi.org/10.1016/j.gloenvcha.2020.102161 .

Hollweg, K.S., J. Taylor, R.W. Bybee, T.J. Marcinkowski, W.C. McBeth, and P. Zoido. 2011. Developing a framework for assessing environmental literacy: Executive summary . North American Association for Environmental Education. https://cdn.naaee.org/sites/default/files/envliteracyexesummary.pdf .

Hovardas, T. 2020. A social learning approach for stakeholder engagement in large carnivore conservation and management. Frontiers in Ecology and Evolution 8: 436. https://doi.org/10.3389/fevo.2020.525278 .

Jagers, S.C., N. Harring, Å. Löfgren, M. Sjöstedt, F. Alpizar, B. Brülde, D. Langlet, A. Nilsson, et al. 2020. On the preconditions for large-scale collective action. Ambio 49: 1282–1296. https://doi.org/10.1007/s13280-019-01284-w .

Jordan, A., D. Huitema, H. van Asselt, and J. Forster. 2018. Governing climate change: Polycentricity in action? Cambridge: Cambridge University Press.

Jörg, T. 2011. New thinking in complexity for the social sciences and humanities: A generative, transdisciplinary approach . New York: Springer Science & Business Media.

Jost, J.T., J. Becker, D. Osborne, and V. Badaan. 2017. Missing in (collective) action: Ideology, system justification, and the motivational antecedents of two types of protest behavior. Current Directions in Psychological Science 26: 99–108. https://doi.org/10.1177/0963721417690633 .

Jull, J., A. Giles, and I.D. Graham. 2017. Community-based participatory research and integrated knowledge translation: Advancing the co-creation of knowledge. Implementation Science 12: 150. https://doi.org/10.1186/s13012-017-0696-3 .

Kahan, D.M., H. Jenkins-Smith, and D. Braman. 2011. Cultural cognition of scientific consensus. Journal of Risk Research 14: 147–174. https://doi.org/10.1080/13669877.2010.511246 .

Kania, J., and M. Kramer. 2011. Collective impact. Stanford Social Innovation Review 9: 36–41.

Karachiwalla, R., and F. Pinkow. 2021. Understanding crowdsourcing projects: A review on the key design elements of a crowdsourcing initiative. Creativity and Innovation Management 30: 563–584. https://doi.org/10.1111/caim.12454 .

Kellert, S.R., J.N. Mehta, S.A. Ebbin, and L.L. Lichtenfeld. 2000. Community natural resource management: Promise, rhetoric, and reality. Society & Natural Resources 13: 705–715.

Klein, J.T. 1990. Interdisciplinarity: History, theory, and practice . Detroit: Wayne State University Press.

Knapp, C.N., R.S. Reid, M.E. Fernández-Giménez, J.A. Klein, and K.A. Galvin. 2019. Placing transdisciplinarity in context: A review of approaches to connect scholars, society and action. Sustainability 11: 4899. https://doi.org/10.3390/su11184899 .

Koliou, M., J.W. van de Lindt, T.P. McAllister, B.R. Ellingwood, M. Dillard, and H. Cutler. 2020. State of the research in community resilience: Progress and challenges. Sustainable and Resilient Infrastructure 5: 131–151. https://doi.org/10.1080/23789689.2017.1418547 .

Kopnina, H. 2016. Of big hegemonies and little tigers: Ecocentrism and environmental justice. The Journal of Environmental Education 47: 139–150. https://doi.org/10.1080/00958964.2015.1048502 .

Krasny, M.E., M. Mukute, O. Aguilar, M.P. Masilela, and L. Olvitt. 2017. Community environmental education. In Urban environmental education review , ed. A. Russ and M.E. Krasny, 124–132. Ithaca: Cornell University Press.

Lave, J. 1991. Situating learning in communities of practice.

Lave, J., and E. Wenger. 1991. Situated learning: Legitimate peripheral participation . Cambridge: Cambridge University Press.

Lee, S., and W.-M. Roth. 2003. Science and the “good citizen”: Community-based scientific literacy. Science, Technology, & Human Values 28: 403–424. https://doi.org/10.1177/0162243903028003003 .

Lévy, P., and R. Bononno. 1997. Collective intelligence: Mankind’s emerging world in cyberspace . New York: Perseus Books.

Lloyd, A. 2005. No man (or woman) is an island: Information literacy, affordances and communities of practice. The Australian Library Journal 54: 230–237. https://doi.org/10.1080/00049670.2005.10721760 .

Lopez-Gunn, E. 2003. The role of collective action in water governance: A comparative study of groundwater user associations in La Mancha aquifers in Spain. Water International 28: 367–378. https://doi.org/10.1080/02508060308691711 .

Lu, J.G., A.C. Hafenbrack, P.W. Eastwick, D.J. Wang, W.W. Maddux, and A.D. Galinsky. 2017. “Going out” of the box: Close intercultural friendships and romantic relationships spark creativity, workplace innovation, and entrepreneurship. Journal of Applied Psychology 102: 1091–1108. https://doi.org/10.1037/apl0000212 .

Lubeck, A., A. Metcalf, C. Beckman, L. Yung, and J. Angle. 2019. Collective factors drive individual invasive species control behaviors: Evidence from private lands in Montana, USA. Ecology and Society . https://doi.org/10.5751/ES-10897-240232 .

Mackay, C.M.L., M.T. Schmitt, A.E. Lutz, and J. Mendel. 2021. Recent developments in the social identity approach to the psychology of climate change. Current Opinion in Psychology 42: 95–101. https://doi.org/10.1016/j.copsyc.2021.04.009 .

Magis, K. 2010. Community resilience: An indicator of social sustainability. Society & Natural Resources 23: 401–416. https://doi.org/10.1080/08941920903305674 .

Manfredo, M.J., T.L. Teel, and A.M. Dietsch. 2016. Implications of human value shift and persistence for biodiversity conservation. Conservation Biology 30: 287–296. https://doi.org/10.1111/cobi.12619 .

Marshall, G.R., M.J. Coleman, B.M. Sindel, I.J. Reeve, and P.J. Berney. 2016. Collective action in invasive species control, and prospects for community-based governance: The case of serrated tussock ( Nassella trichotoma ) in New South Wales, Australia. Land Use Policy 56: 100–111. https://doi.org/10.1016/j.landusepol.2016.04.028 .

Mårtensson, L., and G. Hensing. 2012. Health literacy: A heterogeneous phenomenon: A literature review. Scandinavian Journal of Caring Sciences 26: 151–160. https://doi.org/10.1111/j.1471-6712.2011.00900.x .

Martin, C., and C. Steinkuehler. 2010. Collective information literacy in massively multiplayer online games. E-Learning and Digital Media 7: 355–365. https://doi.org/10.2304/elea.2010.7.4.355 .

Masson, T., and I. Fritsche. 2021. We need climate change mitigation and climate change mitigation needs the ‘We’: A state-of-the-art review of social identity effects motivating climate change action. Current Opinion in Behavioral Sciences 42: 89–96. https://doi.org/10.1016/j.cobeha.2021.04.006 .

Massung, E., D. Coyle, K.F. Cater, M. Jay, and C. Preist. 2013. Using crowdsourcing to support pro-environmental community activism. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems . https://doi.org/10.1145/2470654.2470708 .

McAdam, D. 2017. Social movement theory and the prospects for climate change activism in the United States. Annual Review of Political Science 20: 189–208. https://doi.org/10.1146/annurev-polisci-052615-025801 .

McAdam, D., and H. Boudet. 2012. Putting social movements in their place: Explaining opposition to energy projects in the United States, 2000–2005 . Cambridge University Press.

McKenzie-Mohr, D. 2011. Fostering sustainable behavior: An introduction to community-based social marketing (3rd edn.). New Society Publishers.

McKinley, D.C., A.J. Miller-Rushing, H.L. Ballard, R. Bonney, H. Brown, S.C. Cook-Patton, D.M. Evans, R.A. French, et al. 2017. Citizen science can improve conservation science, natural resource management, and environmental protection. Biological Conservation 208: 15–28.

Miller, D.L. 2014. Introduction to collective behavior and collective action (3rd ed.). Waveland Press.

Mills, J., D. Gibbon, J. Ingram, M. Reed, C. Short, and J. Dwyer. 2011. Organising collective action for effective environmental management and social learning in Wales. The Journal of Agricultural Education and Extension 17: 69–83. https://doi.org/10.1080/1389224X.2011.536356 .

Mistry, J., and A. Berardi. 2016. Bridging Indigenous and scientific knowledge. Science 352: 1274–1275. https://doi.org/10.1126/science.aaf1160 .

Mochizuki, Y., and A. Bryan. 2015. Climate change education in the context of education for sustainable development: Rationale and principles. Journal of Education for Sustainable Development 9: 4–26. https://doi.org/10.1177/0973408215569109 .

Monroe, M.C. 2003. Two avenues for encouraging conservation behaviors. Human Ecology Review 10: 113–125.

Nasir, N.S., M.M. de Royston, B. Barron, P. Bell, R. Pea, R. Stevens, and S. Goldman. 2020. Learning pathways: How learning is culturally organized. In Handbook of the cultural foundations of learning , ed. N.S. Nasir, C.D. Lee, R. Pea, and M.M. de Royston, 195–211. Routledge.

National Academies of Sciences, Engineering, and Medicine. 2016. Science literacy: Concepts, contexts, and consequences . https://doi.org/10.17226/23595

National Research Council. 2015. Collective behavior: From cells to societies: Interdisciplinary research team summaries . National Academies Press. https://doi.org/10.17226/21737

Niemiec, R.M., N.M. Ardoin, C.B. Wharton, and G.P. Asner. 2016. Motivating residents to combat invasive species on private lands: Social norms and community reciprocity. Ecology and Society 21. https://doi.org/10.5751/ES-08362-210230 .

Niemiec, R.M., S. McCaffrey, and M.S. Jones. 2020. Clarifying the degree and type of public good collective action problem posed by natural resource management challenges. Ecology and Society 25: 30. https://doi.org/10.5751/ES-11483-250130 .

Norström, A.V., C. Cvitanovic, M.F. Löf, S. West, C. Wyborn, P. Balvanera, A.T. Bednarek, E.M. Bennett, et al. 2020. Principles for knowledge co-production in sustainability research. Nature Sustainability 3: 182–190. https://doi.org/10.1038/s41893-019-0448-2 .

Olick, J.K. 1999. Collective memory: The two cultures. Sociological Theory 17: 333–348. https://doi.org/10.1111/0735-2751.00083 .

Ostrom, E. 1990. Governing the commons: The evolution of institutions for collective action . Cambridge University Press.

Ostrom, E. 2000. Collective action and the evolution of social norms. Journal of Economic Perspectives 14: 137–158. https://doi.org/10.1257/jep.14.3.137 .

Ostrom, E. 2009. A general framework for analyzing sustainability of social-ecological systems. Science 325: 419–422. https://doi.org/10.1126/science.1172133 .

Ostrom, E. 2010. Polycentric systems for coping with collective action and global environmental change. Global Environmental Change 20: 550–557. https://doi.org/10.1016/j.gloenvcha.2010.07.004 .

Ostrom, E. 2012. Nested externalities and polycentric institutions: Must we wait for global solutions to climate change before taking actions at other scales? Economic Theory 49: 353–369. https://doi.org/10.1007/s00199-010-0558-6 .

Ostrom, E., and T.K. Ahn. 2009. The meaning of social capital and its link to collective action. In Handbook of social capital: The troika of sociology, political science and economics , ed. G.T. Svendsen and G.L.H. Svendsen, 17–35. Edward Elgar Publishing.

Papen, U. 2009. Literacy, learning and health: A social practices view of health literacy. Literacy and Numeracy Studies . https://doi.org/10.5130/lns.v0i0.1275 .

Park, R.E. 1927. Human nature and collective behavior. American Journal of Sociology 32: 733–741.

Paul, A.M. 2021. The extended mind: The power of thinking outside the brain . Boston: Mariner Books.

Pawilen, G.T. 2021. Integrating Indigenous knowledge in the Philippine elementary science curriculum: Integrating Indigenous knowledge. International Journal of Curriculum and Instruction 13: 1148–1160.

Prager, K. 2015. Agri-environmental collaboratives for landscape management in Europe. Current Opinion in Environmental Sustainability 12: 59–66. https://doi.org/10.1016/j.cosust.2014.10.009 .

Pretty, J., and H. Ward. 2001. Social capital and the environment. World Development 29: 209–227. https://doi.org/10.1016/S0305-750X(00)00098-X .

Putnam, R.D. 2020. Bowling alone: The collapse and revival of American community, revised and updated (anniversary edition). New York: Simon & Schuster.

Raymond, L. 2006. Cooperation without trust: Overcoming collective action barriers to endangered species protection. Policy Studies Journal 34: 37–57. https://doi.org/10.1111/j.1541-0072.2006.00144.x .

Reed, M.S., A.C. Evely, G. Cundill, I. Fazey, J. Glass, A. Laing, J. Newig, B. Parrish, et al. 2010. What is social learning? Ecology and Society 15: 12.

Reicher, S., R. Spears, and S.A. Haslam. 2010. The social identity approach in social psychology. In The SAGE handbook of identities (pp. 45–62). SAGE. https://doi.org/10.4135/9781446200889

Reid, A. 2019. Blank, blind, bald and bright spots in environmental education research. Environmental Education Research 25: 157–171. https://doi.org/10.1080/13504622.2019.1615735 .

Rogoff, B. 2003. The cultural nature of human development (Reprint edition) . Oxford: Oxford University Press.

Roth, C.E. 1992. Environmental literacy: Its roots, evolution and directions in the 1990s . http://eric.ed.gov/?id=ED348235

Roth, W.-M. 2003. Scientific literacy as an emergent feature of collective human praxis. Journal of Curriculum Studies 35: 9–23. https://doi.org/10.1080/00220270210134600 .

Roth, W.-M., and A.C. Barton. 2004. Rethinking scientific literacy . London: Psychology Press.

Roth, W.-M., and S. Lee. 2002. Scientific literacy as collective praxis. Public Understanding of Science 11: 33–56. https://doi.org/10.1088/0963-6625/11/1/302 .

Roth, W.-M., and S. Lee. 2004. Science education as/for participation in the community. Science Education 88: 263–291.

Roth, W.-M., and Y.-J. Lee. 2007. “Vygotsky’s neglected legacy”: Cultural-historical activity theory. Review of Educational Research 77: 186–232.

Sadoff, C.W., and D. Grey. 2005. Cooperation on international rivers: A continuum for securing and sharing benefits. Water International 30: 420–427.

Samerski, S. 2019. Health literacy as a social practice: Social and empirical dimensions of knowledge on health and healthcare. Social Science & Medicine 226: 1–8. https://doi.org/10.1016/j.socscimed.2019.02.024 .

Sawyer, R.K. 2014. The future of learning: Grounding educational innovation in the learning sciences. In The Cambridge handbook of the learning sciences , ed. R.K. Sawyer, 726–746. Cambridge: Cambridge University Press.

Saxe, J.G. n.d. The blind man and the elephant. All Poetry. Retrieved October 6, 2020, from https://allpoetry.com/The-Blind-Man-And-The-Elephant .

Scheepers, D., and N. Ellemers. 2019. Social identity theory. In Social psychology in action: Evidence-based interventions from theory to practice , ed. K. Sassenberg and M.L.W. Vliek, 129–143. New York: Springer International Publishing.

Schipper, E.L.F., N.K. Dubash, and Y. Mulugetta. 2021. Climate change research and the search for solutions: Rethinking interdisciplinarity. Climatic Change 168: 18. https://doi.org/10.1007/s10584-021-03237-3 .

Schoerning, E. 2018. A no-conflict approach to informal science education increases community science literacy and engagement. Journal of Science Communication 17: A05. https://doi.org/10.22323/2.17030205 .

Schultz, P.W. 2014. Strategies for promoting proenvironmental behavior: Lots of tools but few instructions. European Psychologist 19: 107–117. https://doi.org/10.1027/1016-9040/a000163 .

Sharifi, A. 2016. A critical review of selected tools for assessing community resilience. Ecological Indicators 69: 629–647. https://doi.org/10.1016/j.ecolind.2016.05.023 .

Sherrieb, K., F.H. Norris, and S. Galea. 2010. Measuring capacities for community resilience. Social Indicators Research 99: 227–247. https://doi.org/10.1007/s11205-010-9576-9 .

Singh, R.K., A. Singh, K.K. Zander, S. Mathew, and A. Kumar. 2021. Measuring successful processes of knowledge co-production for managing climate change and associated environmental stressors: Adaptation policies and practices to support Indian farmers. Journal of Environmental Management 282: 111679. https://doi.org/10.1016/j.jenvman.2020.111679 .

Sloman, S., and P. Fernbach. 2017. The knowledge illusion: Why we never think alone . New York: Riverhead Books.

Smelser, N.J. 2011. Theory of collective behavior . Quid Pro Books. (Original work published 1962).

Sørensen, K., S. Van den Broucke, J. Fullam, G. Doyle, J. Pelikan, Z. Slonska, H. Brand, and (HLS-EU) Consortium Health Literacy Project European. 2012. Health literacy and public health: A systematic review and integration of definitions and models. BMC Public Health 12: 80. https://doi.org/10.1186/1471-2458-12-80 .

Spitzer, W., and J. Fraser. 2020. Advancing community science literacy. Journal of Museum Education 45: 5–15. https://doi.org/10.1080/10598650.2020.1720403 .

Stables, A., and K. Bishop. 2001. Weak and strong conceptions of environmental literacy: Implications for environmental education. Environmental Education Research 7: 89. https://doi.org/10.1080/13504620125643 .

Stern, M.J., R.B. Powell, and N.M. Ardoin. 2008. What difference does it make? Assessing outcomes from participation in a residential environmental education program. The Journal of Environmental Education 39: 31–43. https://doi.org/10.3200/JOEE.39.4.31-43 .

Stets, J.E., and P.J. Burke. 2000. Identity theory and social identity theory. Social Psychology Quarterly 63: 224–237. https://doi.org/10.2307/2695870 .

Sturmer, S., and B. Simon. 2004. Collective action: Towards a dual-pathway model. European Review of Social Psychology 15: 59–99. https://doi.org/10.1080/10463280340000117 .

Sullivan, A., A. York, D. White, S. Hall, and S. Yabiku. 2017. De jure versus de facto institutions: Trust, information, and collective efforts to manage the invasive mile-a-minute weed (Mikania micrantha). International Journal of the Commons 11: 171–199. https://doi.org/10.18352/ijc.676 .

Sunstein, C.R. 2008. Infotopia: How many minds produce knowledge . Oxford: Oxford University Press.

Surowiecki, J. 2005. The wisdom of crowds . New York: Anchor.

Swim, J.K., S. Clayton, and G.S. Howard. 2011. Human behavioral contributions to climate change: Psychological and contextual drivers. American Psychologist 66: 251–264.

Thaker, J., P. Howe, A. Leiserowitz, and E. Maibach. 2019. Perceived collective efficacy and trust in government influence public engagement with climate change-related water conservation policies. Environmental Communication 13: 681–699. https://doi.org/10.1080/17524032.2018.1438302 .

Tudge, J.R.H., and P.A. Winterhoff. 1993. Vygotsky, Piaget, and Bandura: Perspectives on the relations between the social world and cognitive development. Human Development 36: 61–81. https://doi.org/10.1159/000277297 .

Turner, R.H., and L.M. Killian. 1987. Collective behavior , 3rd ed. Englewood Cliffs: Prentice Hall.

Turner, R.H., N.J. Smelser, and L.M. Killian. 2020. Collective behaviour. In Encyclopedia Britannica . Encyclopedia Britannica, Inc. https://www.britannica.com/science/collective-behaviour .

van der Linden, S. 2014. Towards a new model for communicating climate change. In Understanding and governing sustainable tourism mobility , ed. S. Cohen, J. Higham, P. Peeters, and S. Gössling, 263–295. Milton Park: Routledge.

van Zomeren, M., T. Postmes, and R. Spears. 2008. Toward an integrative social identity model of collective action: A quantitative research synthesis of three socio-psychological perspectives. Psychological Bulletin 134: 504–535. https://doi.org/10.1037/0033-2909.134.4.504 .

Vygotsky, L.S. 1980. Mind in society: The development of higher psychological processes . Cambridge: Harvard University Press.

Waldron, F., B. Ruane, R. Oberman, and S. Morris. 2019. Geographical process or global injustice? Contrasting educational perspectives on climate change. Environmental Education Research 25: 895–911. https://doi.org/10.1080/13504622.2016.1255876 .

Wals, A.E.J., M. Brody, J. Dillon, and R.B. Stevenson. 2014. Convergence between science and environmental education. Science 344: 583–584.

Wenger, E.C., and W.M. Snyder. 2000. Communities of practice: The organizational frontier. Harvard Business Review 78: 139–146.

Wechsler, D. 1971. Concept of collective intelligence. American Psychologist 26: 904–907. https://doi.org/10.1037/h0032223 .

Wheaton, M., A. Kannan, and N.M. Ardoin. 2018. Environmental literacy: Setting the stage (Environmental Literacy Brief, Vol. 1). Social Ecology Lab, Stanford University. https://ed.stanford.edu/sites/default/files/news/images/stanfordsocialecologylab-brief-1.pdf .

Wojcik, D.J., N.M. Ardoin, and R.K. Gould. 2021. Using social network analysis to explore and expand our understanding of a robust environmental learning landscape. Environmental Education Research 27: 1263–1283.

Wood, W., and D. Rünger. 2016. Psychology of habit. Annual Review of Psychology 67: 289–314. https://doi.org/10.1146/annurev-psych-122414-033417 .

Woolley, A.W., C.F. Chabris, A. Pentland, N. Hashmi, and T.W. Malone. 2010. Evidence for a collective intelligence factor in the performance of human groups. Science 330: 686–688. https://doi.org/10.1126/science.1193147 .

Acknowledgements

We are grateful to Maria DiGiano, Anna Lee, and Becca Shareff for their feedback and contributions to early drafts of this paper. We appreciate the research and writing assistance supporting this paper provided by various members of the Stanford Social Ecology Lab, especially: Brennecke Gale, Pari Ghorbani, Regina Kong, Naomi Ray, and Austin Stack.

This work was supported by a grant from the Pisces Foundation.

Author information

Authors and Affiliations

Emmett Interdisciplinary Program in Environment and Resources, Graduate School of Education, and Woods Institute for the Environment, Stanford University, 233 Littlefield Hall, Stanford, CA, 94305, USA

Nicole M. Ardoin

Social Ecology Lab, Graduate School of Education and Woods Institute for the Environment, Stanford University, 233 Littlefield Hall, Stanford, CA, 94305, USA

Alison W. Bowers

Emmett Interdisciplinary Program in Environment and Resources, School of Earth, Energy and Environmental Sciences, Stanford University, 473 Via Ortega, Suite 226, Stanford, CA, 94305, USA

Mele Wheaton

Corresponding author

Correspondence to Nicole M. Ardoin.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Ardoin, N.M., Bowers, A.W. & Wheaton, M. Leveraging collective action and environmental literacy to address complex sustainability challenges. Ambio 52, 30–44 (2023). https://doi.org/10.1007/s13280-022-01764-6

Received: 11 July 2021

Revised: 11 January 2022

Accepted: 22 June 2022

Published: 09 August 2022

Issue Date: January 2023

DOI: https://doi.org/10.1007/s13280-022-01764-6

Keywords

  • Collective action
  • Environmental literacy
  • Social movements
  • Sustainability
