9 Survey Tools for Academic Research in 2024

Contents

  • Important Features of Research Survey Software
  • Survey Panels
  • Additional Research Tools
  • The nine tools: 1. SurveyKing, 2. Alchemer, 3. SurveyMonkey, 4. Qualtrics, 5. QuestionPro, 6. Sawtooth, 7. Conjointly, 8. Typeform, 9. Google Forms
Need a research survey tool? Features include MaxDiff, conjoint, and more!

These nine survey tools are perfect for academic research because they offer unique question types, solid reporting options, and support staff to help make your project a success. This article includes a detailed review of each of these nine survey tools. In addition to these survey tools, we include information about other research tools and survey panels.

Below is a quick summary of these nine survey tools. We list the lowest-priced upgrade that includes the features needed for research projects, along with a summary of each tool's unique features. Most survey software is billed monthly; we note when a tool requires annual pricing.

Important Features of Research Survey Software

Academic research surveys often require advanced question types to capture the necessary data. Many of the tools we mention in this article include these questions. However, some projects also require specialized features or the ability to purchase a panel. To help guide your decision in choosing the best piece of software for your project, we’ll summarize some of the most critical aspects.

Research Questions

Standard multiple-choice questions can only get you so far. Here are some question types you should be aware of:

  • MaxDiff – Measures the relative importance of attributes. It goes beyond a standard ranking or rating by forcing respondents to pick the least and most valued items from a list. Rankings and other types can only tell you what is liked, not what is disliked. A statistical model gives you the probability of a user selecting an item as the most important, and latent class analysis can help you identify groups of respondents who value different attributes.
  • Conjoint – Similar to MaxDiff in terms of finding importance, but respondents evaluate a complete product (multiple attributes combined). This simulates real-world purchasing decisions. A statistical model is also used to compute the importance of each attribute.
  • Van Westendorp – Asks respondents to evaluate four price points. This shapes price curves and gives you a range of acceptable prices.
  • Gabor Granger – Asks users whether or not they would purchase an item at specific price points. Price points are shown in random order to simulate real-world buying conditions. The results include a demand curve, giving you the revenue-maximizing price.
  • Likert Scale – Measures attitudes and opinions related to a topic. It’s essential to use a mobile-ready Likert scale tool to increase response rates; many tools render Likert scales as a matrix, which is less user-friendly on small screens.
  • Semantic differential scale – a multirow rating scale that contains grammatically opposite adjectives at each end. It is used similarly to a Likert scale but is much easier for respondents to evaluate.
  • Image heat map – Respondents click on places they like on an image. The results include a heat map showing the density of clicks. This is useful for product packaging.
  • Net Promoter Score – Respondents choose a rating from 0-10. Many companies use this industry-standard question to benchmark their brand perception. This question type is necessary if your academic project measures brand reputation.
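To make the last of these concrete, Net Promoter Score is computed with a standard formula: the percentage of promoters (ratings of 9-10) minus the percentage of detractors (ratings of 0-6). A minimal sketch in Python (the sample ratings are made up for illustration):

```python
def net_promoter_score(ratings):
    """Compute NPS from a list of 0-10 ratings.

    Promoters score 9-10, detractors 0-6; passives (7-8) count
    only toward the total. NPS ranges from -100 to +100.
    """
    total = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / total)

# Example: 5 promoters, 3 passives, 2 detractors out of 10 responses
ratings = [10, 9, 9, 10, 9, 8, 7, 8, 5, 3]
print(net_promoter_score(ratings))  # → 30
```

Because passives dilute the score without adding to it, a survey with many 7-8 ratings can have a surprisingly low NPS even when few respondents are outright detractors.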

Anonymous Survey Links

Academic surveys often deal with sensitive subjects or target vulnerable groups. For this reason, assuring respondents of anonymity is crucial. Choosing a platform with a true anonymous link builds trust with respondents and increases your response rates.

Data Segmentation

Comparing two groups within your survey data is essential for many research projects; this is called cross-tabulation. For example, consider a survey that asks for gender along with product satisfaction. You may notice that males are not satisfied with the product while females are.

You can take this further and compute the statistical significance between the groups. In other words, determine whether the differences between the two data sets are due to random chance. Your comparison is statistically significant if the differences are not due to random chance.
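For a 2x2 cross-tab like the gender/satisfaction example, the usual significance test is a chi-square test of independence. A minimal Python sketch using only the standard library (the counts are hypothetical, and no Yates continuity correction is applied):

```python
import math

def chi_square_2x2(a, b, c, d):
    """Chi-square test of independence for a 2x2 table [[a, b], [c, d]].

    Returns (chi2, p_value). With one degree of freedom, the p-value
    is erfc(sqrt(chi2 / 2)).
    """
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p_value = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p_value

# Hypothetical counts: rows are male/female, columns satisfied/not satisfied
chi2, p = chi_square_2x2(30, 70, 65, 35)
print(f"chi2={chi2:.2f}, p={p:.6f}")

# A p-value below 0.05 means the difference between groups is unlikely
# to be due to random chance, i.e. it is statistically significant.
if p < 0.05:
    print("The gender difference in satisfaction is statistically significant.")
```

In practice you would export the two columns from your survey tool, build the four counts, and feed them to a function like this (or to `scipy.stats.chi2_contingency`, which handles larger tables).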

Some lower-end survey tools may not offer any segmentation features. If this is the case, you will need to download your survey data into a spreadsheet and create pivot tables or set up custom formulas.

Skip Logic and Piping

If your academic project has questions that only a specific subset of respondents need to answer, then some logic will help streamline your survey.

Skip logic will take you to a new page based on answers to previous questions. Display logic will show a question based on previous answers, which is perfect for follow-ups.

Answer piping will allow you to carry forward answers from one question into another. So, for example, ask someone which brand names they have heard of, then pipe those answers into a ranking question.

Data Cleaning

Making sure your responses are high quality is a big part of any survey research project. For example, if people speed through the survey or select the first answer for every question, those are low-quality responses and should be removed from your data set. Some tools highlight these low-quality responses, which can be a helpful feature.

For platforms that do not offer a data cleaning feature, it’s generally possible to export the data to Excel, create formulas to flag short completion times and straight-lined answers, then remove those rows. You can also include a trap question to help filter out low-quality responses.
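The same two checks can be scripted against an export. A minimal sketch, where the field names (`seconds`, `ratings`) and the 60-second threshold are illustrative assumptions, not any particular tool's schema:

```python
# Flag speeders and straight-liners in exported survey data.
MIN_SECONDS = 60  # responses completed faster than this count as "speeders"

def is_straight_liner(answers):
    """True when every matrix/rating answer is identical."""
    return len(set(answers)) == 1

def clean(responses):
    """Keep only responses that pass both quality checks.

    Each response is a dict with 'seconds' (time spent) and
    'ratings' (a list of scale answers).
    """
    return [
        r for r in responses
        if r["seconds"] >= MIN_SECONDS and not is_straight_liner(r["ratings"])
    ]

responses = [
    {"seconds": 250, "ratings": [4, 2, 5, 3]},   # keep
    {"seconds": 20,  "ratings": [4, 3, 5, 2]},   # speeder -> drop
    {"seconds": 300, "ratings": [3, 3, 3, 3]},   # straight-liner -> drop
]
print(len(clean(responses)))  # → 1
```

A trap question fits the same pattern: add a check that drops any response whose answer to the trap item differs from the instructed answer.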

Great Support

Many academic projects require statistical analysis or additional survey options. Using a tool with a support staff that can explain a statistical model’s intricacies, help build custom models, or add features on request will ensure your project is a success. With SurveyKing, custom-built features are billed at $50 per hour, making custom projects feasible for small budgets.

Survey Panels

Asking classmates to take your survey, posting it on social media, or distributing QR code surveys around campus is a great way to collect responses for your project. But if those methods don’t yield enough responses, purchasing responses might be required.

A panel provider will enable you to target a specific demographic, job role, or hobby. When setting up a survey with a panel provider, always include screening questions (on the first page) to ensure respondents meet your criteria, as panel filters may not be 100% accurate. Generally, panel responses start around $2.50 per completed response. Cint is one of the largest panel providers and works well with any survey platform.

Additional Research Tools

Before diving into the survey software list, here are some additional tools and resources that might assist in your project. They can help shape your survey through preliminary research or serve as a substitute if conducting a full study is not feasible.

  • Hotjar – Offers simple surveys and many tools to help capture feedback and data points from a website, including a feedback widget customized for websites and a heat map tool that shows where users click the most and identifies rage clicks. A tool like this could be helpful if your academic project revolves around launching or optimizing a website.
  • Think with Google  – Used to help marketers understand their audience. The site contains links to Google Trends to search for the popularity of key terms over time. They also have a tool that helps you identify your audience based on popular YouTube channels. Finally, they have a “Grow My Store Tool” that recommends tips for improving an online store.
  • Google Scholar  – A specific search engine used for scholarly literature. This can help locate research papers related to the survey you are creating.
  • MIT Theses  – Contains over 58,000 theses and dissertations from all MIT departments. The database is organized by department and lets you search for keywords.

1. SurveyKing

SurveyKing is the best tool for academic research surveys because of a wide variety of question types like MaxDiff, excellent reporting features, a solid support staff, and a low cost of $19 per month.

The survey builder is straightforward to use. Question types include MaxDiff, conjoint, Gabor Granger, Van Westendorp, a mobile optimized Likert scale, and semantic differential.

The MaxDiff question also includes anchored MaxDiff and can collect open-ended feedback on the feature a respondent values most. In addition, cluster analysis is available to group similar data together; some respondents might value specific attributes, while other groups value others.

The reporting section is also a standout feature. It is easy to create filters and segment reports. In addition, the Excel export is well formatted for question types like ranking and Likert scale, making it easy to upload into SPSS. The reporting section also gives the selection probabilities for MaxDiff, one of the few tools to offer that.

The anonymous link on SurveyKing is a valuable feature. A notice at the top of each anonymous survey lets respondents click to understand how their identities are protected.

The software also offers a Net Promoter Score module, which can come in handy for projects that deep dive into brand reputation.

Some downsides to SurveyKing include no answer piping, no image heat maps, no continuous sum question, and no premade data cleaning feature.

2. Alchemer

As a platform with lots of advanced question types and a reasonable cost, Alchemer is an excellent tool for academic research. Question types include MaxDiff, conjoint, semantic differential, image heat map, text highlighter, continuous sum, cascading dropdowns, rankings, and card grouping.

Reporting on Alchemer is a standout feature. Not only can you create filters and segment reports, but you can also create those filters and segments using advanced criteria. So if you ask a question about gender and hobby, you can make advanced criteria that match a specific gender and hobby.

In addition, their reporting section can run chi-square tests to calculate the significance of differences between two groups. Finally, they have a section where you can create and run your own R scripts. This can be useful for various academic research projects, as you can build custom statistical models inside the software without needing to export your data.

Alchemer is less user-friendly than some other tools. The platform is a little clunky; things like MaxDiff require respondents to hit the submit button to get to the next set. Radio buttons need respondents to click inside of them instead of the area around them.

The pricing is reasonable for a student: $249 a month for access to the research questions. If you can organize your project quickly, you may only need one month of access.

3. SurveyMonkey

As the most recognized brand for online surveys, SurveyMonkey is a reliable option for academic research. While the platform does not have any research questions, it offers all the standard question types and a clean user interface to build your surveys.

One advanced question type they do have is the image heat map. Their parent company Momentive does offer things like MaxDiff and conjoint studies, but you would need to contact sales for a quote, meaning this could be out of budget for students.

The reporting on SurveyMonkey is good. You can easily create filters and segments, and you can save those criteria as a view. Views enable you to toggle between rules quickly.

One of the main downsides to SurveyMonkey is the cost. For the image heat map and advanced branching rules, you need to upgrade to their Premier plan, which costs $1,428 annually. To get statistical significance, you would need at least their Advantage plan, which is $468 annually.

4. Qualtrics

As the survey tool known for experience management, Qualtrics has some nice features for research projects. For example, they offer both MaxDiff and conjoint in addition to tools like drill-down, continuous sum, image heat map, and a text highlighter.

Reporting on the tool offers the ability to create filters and segments. Segments are called report breakouts, and there appears to be no way to create a breakout with advanced criteria. However, filters do allow advanced criteria.

There is a custom report builder option to create custom PDF reports. You can add as many elements as needed and customize the information displayed, whether a chart type or a data table.

Overall, Qualtrics is not especially user-friendly and may require training. The survey builder and reporting screens lack cohesion. For example, to add more answer options, you need to click the “plus” symbol on the left-hand side of the question instead of just hitting enter or clicking a button right below the current answer choice. In addition, the reporting section displays things like mean and standard deviation for simple multiple-choice questions before showing simple response counts.

One drawback to Qualtrics is the pricing. For example, you would need to pay $1,440 for an annual plan to use the research questions. But many universities have a licensing agreement with Qualtrics so students can use the platform. When you sign up for a new account, you can select academic use, enter your Edu email, and they will check if your university has a license agreement.

5. QuestionPro

A survey platform with all the needed research questions, including Gabor Granger and Van Westendorp, QuestionPro is a quality research tool.

The reporting on QuestionPro is comprehensive. They offer segment reports with statistical significance using a t-test. In addition, they offer TURF analysis to show answer combinations with the highest reach.

For conjoint, they offer a market simulation tool that can forecast a new product’s market share based on your data. That tool can also calculate how much of a premium consumers will pay for a brand name.

QuestionPro is a little easier to use than Qualtrics. The UI is cleaner but still clumsy. You must navigate to a different section in the builder for things like quotas instead of just having it near skip logic rules. The distribution page has the link at the top but an email body below. The reporting has a lot of different pages to click through for each option. Small things like this mean there is a learning curve to use the platform efficiently.

The biggest downside of QuestionPro is the price. All of their research questions, even Net Promoter Score, require a custom quote under the research plan. Another plan with upgraded question types is $1,188 annually.

6. Sawtooth

When it comes to advanced research projects, Sawtooth is a great resource. While their survey builder is a little limited in question types, they offer different forms of MaxDiff and conjoint. They also provide consulting services, which could help if your academic project is highly specialized.

For MaxDiff, they offer a bandit version, which can be used for MaxDiff studies with over 50 attributes. Each set adapts to show the attributes most relevant to the respondent. This can save panel costs because you can build a suitable statistical model with around 300 bandit responses, compared with 500 to 1,000 standard MaxDiff responses.

Their MaxDiff feature also comes with a TURF analysis option that can show you the potential market reach of various attribute combinations.

For conjoint, they offer adaptive choice-based conjoint and menu-based conjoint. Adaptive choice tailors the product cards toward each respondent based on early responses or screening questions. Menu-based conjoint is for more complex projects, allowing respondents to build their products based on various attributes and prices.

Sawtooth has a high price point and may be out of reach for many academic projects. The lowest plan is $4,500 annually. If you need advanced tools like bandit MaxDiff or adaptive conjoint, you must pay $11,990 annually. They do have a package just for MaxDiff starting at $2,420.

7. Conjointly

Conjointly is a platform geared toward research projects, namely market research. Not only do they have the standard research questions, but they also have several unique ones: claims testing, Kano model testing, and monadic testing. There are also question types like the feature placement matrix, which combines MaxDiff and Gabor Granger into a single question.

You can either use your own respondents or select from a survey panel. The survey panel option comes with predefined audiences, which makes sourcing respondents a breeze.

One unique feature is real-time monitoring for speeders and other low-quality-response criteria. If a respondent is speeding through the survey, a warning message asks them to re-answer questions before they are disqualified. If a question has a lot of information to digest, the system automatically pauses, prompting the respondent to read the question thoroughly before answering.

The pricing is a little steep at $1,795 annually, and response panels for USA residents appear to start around $4 per completed response. The survey builder and reporting section could be cleaner, with options scattered across many screens; it may take time to get up to speed.

8. Typeform

While Typeform doesn’t have any research questions, it is a very well-designed and easy-to-use tool that can assist with your academic survey. For example, it could gather preliminary data for a MaxDiff study.

Typeform offers a lot of integrations with other applications. For example, if your project requires exporting data to a spreadsheet, then Google Sheets or Excel integration might be helpful. Likewise, if your research project is part of a class project, then the Slack or Microsoft Teams integration might help to notify other team members when you get responses.

One unique feature of Typeform is the calculator feature. Add, subtract, and multiply numbers to the @score or @price variable. These variables can be recalled to show scores or used in a payment form.

The reporting in Typeform is basic. There is no option to create a filter or a segment report. Any data analysis would need to be done in Google Sheets or Excel.

For $29 a month you can collect 100 responses; for $59 a month, 1,000 responses each month.

9. Google Forms

One of the most widely used survey tools, Google Forms, is a decent platform for an academic research survey. Unfortunately, the software doesn’t offer any research questions. Still, the questions it does have, like multiple choice, ratings, and open-ended feedback, are enough to collect essential feedback for simple projects or preliminary data for more complex studies.

Skip logic is straightforward to set up on Google Forms. For example, you can select what section to skip based on question answers or choose what to skip once a section is complete. Of course, you can’t create complex rules, but these simple rules can cover many bases.

Overall the user interface is elegant and straightforward. The form design is also elegant, meaning the respondent experience is excellent. Unlike other survey tools, which can have a clunky interface, there is no worry about that with Google Forms; respondents can quickly navigate your form and submit answers.

The spreadsheet export is very well formatted and can be easily imported into SPSS for advanced analysis. However, the export includes the submission date and time but not the start time, so identifying speeders is impossible.

ABOUT THE AUTHOR

Allen is the founder of SurveyKing. A former CPA and government auditor, he understands how important quality data is in decision making. He continues to help SurveyKing accomplish its main goal: providing organizations around the world with low-cost, high-quality feedback tools.


7+ Reasons to Use Surveys in Your Dissertation


Writing a dissertation is a serious milestone. Your degree depends on it, so it takes a lot of effort and time to figure out what direction to choose. Everything starts with the topic: you read background literature, consult with your supervisor and seek approval before you start writing the first draft. After that, you need to decide how you will collect the data that is supposed to contribute to the research field.

This is where it gets complicated. If you have never tried conducting primary research (i.e. working with human subjects), it can seem quite scary. Analyzing articles may sound like the safest and the coolest option. Yet, there might not be enough information for you to claim that your research is somehow novel.

To make sure it is, you might need to conduct primary research, and the survey method is the most widespread tool for doing that. Surveys present a huge number of advantages, though the specific perks depend on the approach you pursue. So, let’s go through all of them before you settle on a dissertation that does nothing but analyze existing literature.

Quantitative Research

In quantitative primary research, students analyze the data received from typical a/b/c/d questionnaires. These provide precise answers and help prove or reject the formulated hypothesis. For the research to be valid, there are several stages to go through:

  • Discarding irrelevant or subjective questions/answers included in questionnaires.
  • Setting criteria for credible answers.
  • Composing an explanation of how you will manage ethical concerns (for participants and university committee).

However, all this is done to prevent issues in the future. Provided you have taken care of all the points above, you will get to enjoy the following benefits.

Data Collection Is Less Tedious

There are numerous services, like SurveyMonkey, that can help you distribute your questionnaire among potential participants. These platforms simplify the data collection process: you don’t have to arrange calls or convince someone that they can safely share information. Just upload the consent letter each participant has to sign and let the platform guide them further.

Data Analysis Is Fast

In quantitative analysis, your main task is data entry. It requires focus and accuracy, but the rest can be done with software. Whether it’s ordinary Excel or something like SPSS, you don’t have to reread loads of text. Just make sure you download the collected data from the platform correctly, remove irrelevant fields, and feed the rest to your computer.


Numbers Rule

Numbers don’t lie (unless you miscalculate them, of course). They give a clear answer: either ‘yes’ or ‘no’. Moreover, they leave more room for creating good visuals and making your paper less boring. Just make sure you explain the numbers properly and compare the results across various graphs and charts.

No Room For Subjectivity

A quantitative dissertation is mostly a technical paper. It’s not about creativity and your ability to impress, as in an admission essay; it’s about following particular procedures, and the analysis is less abstract.

Qualitative Research

Qualitative-oriented surveys involve conducting full-fledged personal interviews, working with focus groups, or distributing open-ended questionnaires that require short but unique answers. Let’s talk about what makes this approach worth trying!


First-Hand Experience

The ability to gain a unique perspective is what distinguishes interviews from other surveys. Close-ended questions may be too rigid and cause participants to omit information that might help the research. In an interview, you can also refine your questions and add detail to them, improving the outcomes.

More Diverse and Honest Answers

When participants are limited by only several options, they might choose something they cannot fully relate to. So, there is no guarantee that the results will be authentic. Meanwhile, with open-ended questions, participants share a lot of details.

Sure, some of them may be less relevant to your topic, but the researcher gains a deeper understanding of the issues lying beneath the topic. Of course, all of it is guaranteed only if the researcher provides anonymity and a safe space for the interviewees to share their thoughts freely.

No Need For Complex Software

In contrast to quantitative analysis, here you won’t have to use formulae or learn how to perform complex tests. You might not even need Excel, except for storing some data about your participants. No calculations will be needed, which is a relief for those who are not used to working with such data.

Both types of research offer other advantages as well:

  • With surveys, you have more chances to fill the literature gap you’ve discovered.
  • Primary research may not be quite easy, but it’s highly valued at the doctoral level of education.
  • You receive a lot of new information and stay away from retelling literature that has been published before.
  • Primary research is less boring.

However, there is a must-remember thing: not every supervisor or university committee approves of surveys and primary research in general. It depends on numerous aspects like topic and subject, the conditions of research, your approach to handling human subjects, etc.

It means that the methodology you are going to use should be approved by your professor first. Otherwise, you may have to discard some parts of your draft and lose time gathering data you won’t be able to use. So, take care and good luck!

7+ Reasons to Use Surveys in Your Dissertation FAQ

What are the benefits of using surveys in a dissertation? Surveys can provide a large amount of data in a short amount of time, they are cost-effective, they allow for anonymity, they can reach a wide audience, and they can be used to obtain feedback from participants.

How can I ensure that my survey results are accurate? Ask questions that are clear and concise, with no bias in the wording. Make sure you have a good sample size and a response rate high enough to provide accurate results.

How can I analyze the survey results? Depending on the type of survey, various analysis techniques can be used, including descriptive statistics, inferential statistics, correlation analysis, and regression analysis.

What are the limitations of surveys? Surveys can be subject to sampling errors, response bias, and interviewer effects. They may also not capture the full range of opinions and attitudes of the population.


Sarath Shyamson

Sarath Shyamson is the customer success person at BlockSurvey and also heads the outreach. He enjoys volunteering for the church choir.



A Tale of Two Diverse Qualtrics Samples: Information for Online Survey Researchers

Carrie A. Miller

1. Department of Health Behavior and Policy, Virginia Commonwealth University, Richmond, VA

Jeanine P. D. Guidry

2. Robertson School of Media and Culture, Virginia Commonwealth University, Richmond, VA

Bassam Dahman

Maria D. Thomson

There is often a lack of transparency in research using online panels related to recruitment methods and sample derivation. The purpose of this study was to describe the recruitment and participation of respondents from two disparate surveys derived from the same online research panel using quota sampling.

A commercial survey sampling and administration company, Qualtrics, was contracted to recruit participants and implement two internet-based surveys. The first survey targeted adults aged 50–75 years old and used sampling quotas to obtain diversity with respect to household income and race/ethnicity. The second focused on women aged 18–49 years and utilized quota sampling to achieve a geographically balanced sample.

A racially and economically diverse sample of older adults (n=419) and a geographically diverse sample of younger women (n=530) were acquired relatively quickly (within 12 and 4 days, respectively). With the exception of the highest income level, quotas were implemented as requested. Recruitment of older adults took longer than recruitment of younger female adults. Although survey completion rates were reasonable in both studies, there were inconsistencies in the proportion of incomplete survey responses and quality fails.

Conclusions

Cancer researchers – and researchers in general – should consider ways to leverage the use of online panels for future studies. To optimize novel and innovative strategies, researchers should proactively ask questions about panels and carefully consider the strengths and drawbacks of online survey features including quota sampling and forced response.

Results provide practical insights for cancer researchers developing future online surveys and recruitment protocols.

Increased use of online (internet) surveys has led to a rise in commercial survey and market research platforms. Companies such as Qualtrics ( www.qualtrics.com ), Survey Monkey ( www.surveymonkey.com ), and Amazon’s Mechanical Turk (MTurk; www.mturk.com ) allow researchers to develop, test, and distribute surveys online. In addition to creating electronic surveys for distribution through typical sample outlets, these companies enable researchers to purchase access to existing pools of potential participants that have agreed to be solicited for survey recruitment. Utilizing online research panels for sample acquisition and data collection is quick and efficient. Compared to traditional survey modes (e.g., mail and telephone), online surveys are typically less expensive [ 1 ], require less time to complete [ 2 ], and more readily provide access to unique populations [ 3 , 4 ].

Innovations in the collection of data using mobile and online platforms are transforming the conduct of survey research [ 5 ]. For example, crowdsourcing (i.e., the practice of soliciting contributions from large groups of people) has been applied to other health-related research and is in the early stages of adoption in cancer research. A systematic review identified 12 studies that applied crowdsourcing to cancer research in a range of capacities, including identifying candidate gene sequences for investigation, developing cancer prognosis models, and assessing cancer knowledge, beliefs, and behaviors [ 6 ]. Within the broad field of cancer, other recent studies have drawn from panel samples to conduct survivorship [ 7 ], risk communication [ 8 ], message testing [ 9 ], and behavioral [ 10 ] research. With increasing use of new technologies for data collection, the use of commercial research panels will become more prevalent in cancer research. Furthermore, large cohorts of individuals open to participating in research are being established for longitudinal epidemiologic research, including the Growth from Knowledge (GfK) KnowledgePanel and National Institutes of Health (NIH) All of Us panel. Future use of panel surveys for collecting health data and biospecimens holds promise for accelerating cancer research across the cancer continuum.

Although online panels may not be representative of the US population [ 11 ], growing evidence suggests samples recruited through online panels can be as representative of the population as traditional recruitment methods [ 12 – 14 ]. Yet, the greatest advantage of online research panels may be their ability to produce samples targeting specific groups, such as respondents who meet a specific condition of interest to the researcher. The use of quota sampling (i.e., a non-probability sampling technique) in online panel research can help researchers obtain survey participants matching specified criteria, such as young adult cancer survivors or mammography screening age-eligible adults. Although online panels provide some advantages over traditional sampling methods, questions about the validity of commercially derived online panel samples have been raised [ 15 – 17 ]. These concerns may arise due to a lack of transparency in the recruitment of panelists and insufficient details on how samples are derived from online panels.

Online panel members are recruited from a variety of sources [ 12 ] and therefore, precise information on how sampling frames are constructed is usually not available. Because researchers generally lack control over sample acquisition procedures, an in-depth characterization of how panel participants are recruited is needed to inform researchers on what to expect when administering an online survey and recruiting participants from online sample panels. The purpose of this study was to describe the recruitment and participation of respondents from two disparate online surveys using quota sampling and administered by the same commercial research platform.

Qualtrics, a commercial survey sampling and administration company, was contracted to recruit participants and implement two internet-based surveys. Samples were acquired from existing pools of research panel participants who have agreed to be contacted for research studies. The Qualtrics network of participant pools, referred to as the Marketplace, consists of hundreds of suppliers with a diverse set of recruitment methodologies [ 18 ]. The compilation of sampling sources helps to ensure that the overall sampling frame is not overly reliant or dependent on any particular demographic or segment of the population. Respondents can be sourced from a variety of methods depending on the individual supply partner, including the following: ads and promotions across various digital networks, word of mouth and membership referrals, social networks, online and mobile games, affiliate marketing, banner ads, TV and radio ads, and offline mail-based recruitment campaigns.

Recruitment targeted potential survey respondents who were likely to qualify based on the demographic characteristics reported in their user profiles (e.g., race and age). Panelists were invited to participate and opted in by activating a survey link directing them to the study consent page and survey instrument. Ineligible respondents were immediately exited from the survey upon providing a response that did not meet inclusion criteria or exceeded set quotas (i.e., a priori quotas for race or household income group already met).
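The screening behavior described above — exiting a respondent as soon as an answer fails an inclusion criterion or lands in a quota cell that is already full — can be sketched as follows. This is a hypothetical illustration, not Qualtrics' actual implementation; the age range and quota cells are loosely modelled on sample one:

```python
# Illustrative quota-screening sketch. Cell targets and criteria are
# hypothetical, loosely based on the sample one design described in the text.
quota_targets = {"White": 140, "Black": 140, "Hispanic": 140}
quota_counts = {"White": 0, "Black": 0, "Hispanic": 0}

def screen(respondent):
    # Inclusion criterion: age 50-75 (illustrative).
    if not 50 <= respondent["age"] <= 75:
        return "ineligible"
    group = respondent["race"]
    # Respondents outside the targeted groups are screened out.
    if group not in quota_targets:
        return "ineligible"
    # Over-quota check: the a priori cell target has already been met.
    if quota_counts[group] >= quota_targets[group]:
        return "over_quota"
    quota_counts[group] += 1
    return "continue"

print(screen({"age": 62, "race": "Black"}))  # continue
print(screen({"age": 42, "race": "Black"}))  # ineligible
```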

To ensure data quality, surveys featured (1) attention checks (i.e., survey items that instructed respondents to provide a specific response); and (2) speeding checks (i.e., respondents with survey duration of ≤ one-third of the median duration of survey). Respondents who failed either quality check were excluded from the final samples. The two surveys were approximately equivalent in terms of survey duration and participant remuneration. Qualtrics charged investigators $6.50 for each completed survey response requested. The data reported in this study were collected according to separate IRB-approved protocols and in accordance with recognized ethical guidelines. Written informed consent was obtained from each participant.
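The two quality checks above can be sketched in a few lines, assuming response records with a known instructed answer for the attention check (all names and values here are hypothetical):

```python
# Illustrative sketch of the attention and speeding checks described
# in the text; records and the instructed answer are hypothetical.
import statistics

CORRECT_ATTENTION = "strongly agree"  # the instructed response

responses = [  # (duration in minutes, attention-check answer)
    {"duration": 26, "attention": "strongly agree"},
    {"duration": 7,  "attention": "strongly agree"},  # likely speeder
    {"duration": 30, "attention": "neutral"},         # attention fail
    {"duration": 24, "attention": "strongly agree"},
]

median_duration = statistics.median(r["duration"] for r in responses)
# Speeding check: duration <= one-third of the median flags a speeder.
speed_cutoff = median_duration / 3

def passes_quality(r):
    return r["duration"] > speed_cutoff and r["attention"] == CORRECT_ATTENTION

clean = [r for r in responses if passes_quality(r)]
print(len(clean))  # 2 of the 4 hypothetical records survive both checks
```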

Study Design and Survey Administration

Sample One.

The first sample was obtained as part of a pre-post parallel trial designed to examine the effects of providing colorectal cancer risk feedback to average risk adults who are age-eligible for colorectal cancer screening. The survey contained 133 items with each item requiring a response (i.e., forced response). The target enrollment was 400 eligible respondents. To be eligible to participate, respondents had to report being aged 50–75 years, have no personal or family history of colorectal cancer or other predisposing condition, be able to read and comprehend English, and reside in the contiguous United States (US). Sampling quotas were implemented for race and annual household income. Specifically, balanced proportions of respondents with non-Hispanic White/Caucasian, non-Hispanic Black/African American, and Hispanic/Latino/Spanish origin racial/ethnic identities were requested, along with diversity in reported household income (approximately 20% less than 20k, 30% between 20–49k, 20% between 50–74k, 10% between 75–99k, and 20% greater than or equal to 100k). Respondents identifying as some other race, ethnicity or origin were not eligible to participate.

Sample Two.

Sample two was acquired during a previously described study testing whether specific message types and various psychosocial variables affect future Zika vaccine uptake intent among women of childbearing age [ 19 ]. The survey contained 105 items and did not utilize forced response. Target enrollment was 500 respondents. To be eligible to participate in this study, respondents had to report being female, between the ages 18 and 49 years, able to read and comprehend English, and a resident of the contiguous US. Sampling quotas were implemented to achieve a geographically varied sample across the four census regions (i.e., Northeast, South, West, and Midwest) with oversampling in the Southern region due to the heightened risk of Zika in this area.

Data Collection

For the first sample, completed survey duration ranged from 10 to 1,922 minutes, with a median duration of 26 minutes. Data collection occurred over a period of 12 days in June 2017. Completed survey duration for the second sample ranged from 8 to 390 minutes, with a median duration of 27 minutes. Data collection occurred over a period of four days in March 2017.

Survey Recruitment and Participant Flow

Survey recruitment and participant flow for each sample are depicted in Figure 1 .

Figure 1. Recruitment and participation within each sample.

Based on target demographics, approximately 63,500 panelists were invited to participate in this survey by email and other methods (e.g., messaging through online portals, text message, and in-app advertisements). Of those contacted, 3,178 panelists interacted with the survey by opening the survey invitation and/or survey link and 1,606 completed the consent page. One hundred and fifty-eight panelists did not consent (9.8%). Of the panelists who consented, 671 did not meet eligibility criteria (46.3%), the majority of whom were ineligible due to health history (n=574 reported a history of colorectal cancer or other predisposing condition). Two hundred and twenty respondents were excluded due to quota sampling (15.2%). Seventy-one respondents did not complete the survey entirely (4.9%) and an additional 67 were removed from the study for failing an attention check (i.e., one of three survey items that required specific responses) (4.6%). A total of n=419 panelists completed the survey (75.2% of those who agreed to participate and were eligible).

The average age of respondents who completed the survey was 58.5 years (sd=6.3). The sample consisted of n=279 females (66.6%) and as requested, an equal proportion of non-Hispanic White/Caucasian, non-Hispanic Black/African American, and Hispanic/Latino/Spanish respondents (33% each). However, the a priori income quotas proposed for this study were not fully implemented. Due to difficulties acquiring participants who reported an annual household income of ≥100k, we elected to eliminate the income quota after two weeks of data collection. To acquire an adequate sample size to ensure statistical power for the parent study (i.e., n=400), we used natural probability sampling to obtain the remaining number of participants. Ultimately, income levels were within 5% of the proportions requested, except respondents with a reported income of ≥100k were 10.5% of the final sample instead of the 20% initially proposed.

The survey for sample two was distributed via email to approximately 56,978 panelists based on target demographics. A total of 2,015 panelists interacted with the survey (i.e., clicked on the survey invitation and/or survey link) and 882 went on to complete the consent page. Three percent of these panelists did not consent (n=27). Among those who consented, 23 (2.7%) did not meet eligibility criteria (due to age and gender). No respondents were screened out due to being over quota. Thirty-eight respondents did not complete the survey entirely (4.4%) and an additional 264 were removed from the study for failing an attention check (i.e., one of three survey items that required specific responses) (30.9%). A total of n=530 panelists completed the survey (63.7% of those who consented and were eligible).

As intended, all respondents who completed the survey were female. On average, respondents were 33.9 years old (sd=7.9). Respondents were predominantly White/Caucasian (73.5%), 9.4% were Black/African American, 8.8% Hispanic, 5.0% Asian, 1.2% American Indian, and 2.1% were some other race/ethnicity. Thirty-nine percent of respondents were from the Southern US. Roughly one-quarter (24.5%) were residents of the Western region and the remaining respondents were from the Midwest (20.9%) and Northeast (15.6%).

This study described the samples resulting from two online surveys that recruited participants using quota sampling through the same commercial research panel. A thorough description of how quota sampling was used to obtain targeted samples – one racially and economically diverse sample of older adults and another geographically dispersed sample of younger adult women – via an online panel was provided. Survey recruitment and participant flow within each sample were examined. Taken together, results provide context and considerations for future cancer researchers – and researchers in general – contemplating the use of commercially administered, online research surveys.

The level of transparency regarding recruitment and participant flow reported in this study (e.g., number of panelists contacted, number of panelists that interacted with the survey, analysis of over quota exclusions, etc.) is greater than that typically reported in other recent studies using online research panels [ 20 , 21 ]. The information outlined indicates that commercial research platforms have access to large panels of research participants. In both cases, approximately 60,000 panelists were sent a survey invitation. About one-half (50.5% and 43.8% in samples one and two, respectively) of those who interacted with the email ultimately completed the consent page of the survey. Although the traditionally calculated response rate for each of these samples was very low (i.e., 3% to 7% of panelists interacted with the survey), these results are consistent with prior research examining response across multiple panel vendors [ 22 ] but lower than another Qualtrics panel study reporting an 18.7% response rate [ 10 ]. However, among those who consented and were eligible for participation, most completed the survey (75.2% in sample one and 63.7% in sample two). For internet derived samples, this ‘completion rate’ (i.e., the proportion of survey completers out of all eligible respondents who initiate the survey) is frequently reported [ 7 , 9 , 23 ]. Our completion rates compare favorably to the typical response rates of epidemiologic studies, which have been declining for the past several decades [ 24 ]. A review of case-control cancer studies conducted in 2001–2010 revealed median response rates of 75.6% for cases, 78.0% for medical source controls, and 53.0% for population controls [ 25 ]. The median response rate of the 2017 Behavioral Risk Factor Surveillance System (BRFSS) survey was approximately 45% [ 26 ].
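The completion-rate arithmetic is easy to verify from the participant-flow counts reported for sample one (419 completers, plus 71 incompletes and 67 attention-check failures among eligible, consenting respondents):

```python
# Completion rate = completers / eligible respondents who initiated the
# survey, using the sample one flow counts reported in the text.
completers = 419
incomplete = 71
attention_fails = 67

eligible_initiators = completers + incomplete + attention_fails  # 557
completion_rate = completers / eligible_initiators

print(f"{completion_rate:.1%}")  # 75.2%, matching the figure reported above
```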

Study-specific inclusion criteria were the primary reason for ineligibility within both samples, including more than 500 consented panelists with a history of colorectal cancer excluded in sample one. The presence of colorectal cancer survivors suggests Qualtrics may be a potentially promising but overlooked platform for recruiting cancer survivors. Additional panelists in the first survey were screened out due to quota sampling (n=220), while no panelists were excluded from survey two for being over quota. This difference is not entirely surprising given that no specific limits were set for the quotas on geographic regions in sample two. That the desired geographic variation was achieved without defining precise proportions represents an important consideration for future study designs: having less stringent quotas could save time (faster sample acquisition), reduce expenses (complex quotas increase sample cost), and potentially reduce selection bias (by retaining otherwise eligible respondents).

Despite initially contacting higher numbers of potential participants relative to traditional methods, online panelists meeting specific demographic criteria can be effectively targeted based on information contained in panelists’ profiles. There were few exclusions based on sociodemographic characteristics. Survey completers in both studies were consistent with the study inclusion criteria and set quotas (except for the highest income level). Therefore, it may be difficult to acquire more affluent participants. It should also be noted that sample one (with no inclusion or sampling criteria related to gender) had a relatively high proportion of female respondents (66.6%) while sample two (with no race/ethnicity quota) yielded a predominately White/Caucasian sample. These results support the use of quotas when demographic characteristics are germane to study aims. For example, cancer researchers may use quotas in case-control studies to identify controls that match on specific criteria (e.g., smoking history). However, large panels would be needed to target exposures with low prevalence. Researchers should carefully weigh the benefits and potential drawbacks of using quota sampling.

Several additional practical implications can be gleaned from the examination of these online surveys. First, researchers received more completed survey responses than were purchased (e.g., 19–30) and samples were acquired more quickly than traditional samples. Although both samples were acquired in a short period of time, it should be noted that sample one took three times longer than sample two, likely due to the older target population. Second, the proportion of eligible respondents who did not complete the survey was relatively low in both samples, but lower in the sample of younger adult women (i.e., 4.6% in sample two vs. 12.7% in sample one). Forced item response, then, may not be necessary to promote complete data and may even have contributed to the higher incomplete rate observed in sample one. In addition, only one “speeder” (i.e., respondent with a total survey duration of ≤ one-third of the median duration of survey) was identified. Thus, researchers should be encouraged to use additional quality checks, such as attention checks, to safeguard against cheaters (i.e., respondents who rush through the survey and threaten data quality). In this analysis, the younger respondents in sample two were more likely than older respondents (sample one) to fail attention checks (31.7% and 12.0%, respectively). This finding may suggest that older respondents are more conscientious, or alternatively, the forced response of items in sample one may underlie the difference.

The present study highlights the relative ease of obtaining diverse samples (i.e., one racially and economically diverse and the other geographically dispersed) via quota sampling and online recruitment methods. Implementing sampling quotas for race/ethnicity represents a major advantage over traditional sampling methods that often yield predominantly White/Caucasian participants [ 27 – 29 ]. Researchers who seek racial/ethnic diversity should utilize available representative samples whenever possible. When access to minorities is limited, however, online panel sampling using quota sampling for race may be a valuable approach for reaching minority participants, as demonstrated in sample one and other panel samples [ 30 , 31 ].

Finally, this study is one of the first to examine in-depth how studies using online panels function in terms of sampling, initial recruiting, and participation. Our results provide practical insights for cancer researchers – and researchers in general – to be cognizant of when designing future online surveys and recruitment strategies. In order to optimize novel and innovative strategies, researchers should carefully consider the strengths and drawbacks of online survey features including quota sampling and forced response. Our goal in documenting the methodologic aspects of recruitment and participation from two different study populations from online panels was to help build transparency in reporting online panel samples, as well as to provide a basis for comparison across different commercially available research panels. Like many other studies conducted using online panels, we were unable to fully ascertain how the panel was created or describe those who did not interact with the survey. According to Qualtrics [ 18 ], additional information related to these panelists (e.g., number of invalid emails and non-respondent characteristics) is not currently tracked. Another limitation of using commercial participant panels is that their overall size and the demographics of the underlying user populations are dynamic and often not available. Future studies should be proactive in negotiating additional information and addressing unknowns in panel recruitment procedures. In doing so, researchers could raise the bar on the standards of information provided by suppliers of commercial panels. These are important steps towards strengthening survey methodologies in the rapidly changing landscape of “citizen science” where the public actively engages in participatory research projects, such as online panels and scientific registries.

In summary, online research panels and quota sampling techniques provide new opportunities for the acquisition of traditionally underrepresented individuals or participants who meet narrowly specified inclusion criteria – an advantage over traditional sampling methods. Our results support leveraging online panels for cancer research. Future epidemiologic research using these methods to recruit targeted populations (e.g., cancer survivors (cases) and individuals (controls) residing in specific geographic areas, such as colorectal cancer “hotspots”) could alleviate the need for time- and cost-intensive methods such as mail-based and in-person correspondence. In comparison, participant incentives and administrative costs for web-based surveys are substantially lower [ 32 ]. In conclusion, the use of panel participants could be leveraged to reach specific population groups, maximize limited research budgets, and thereby enable novel cancer research focused on health disparities and cancer communication that is currently not feasible in either traditional small intervention or large population studies.

Acknowledgments

Financial support was provided in part to the corresponding author (CAM) by a Graduate Training in Disparities Research award (GTDR14302086) from Susan G. Komen® and a National Cancer Institute T32 award (2T32CA093423).

Conflict of interest statement:

The authors declare no potential conflicts of interest.


How I Use It: Survey Monkey


Eugene Waclawski, How I Use It: Survey Monkey, Occupational Medicine , Volume 62, Issue 6, September 2012, Page 477, https://doi.org/10.1093/occmed/kqs075


Survey Monkey is an internet programme and hosting site that enables a person to develop a survey for use over the internet. Other sites include HostedSurvey, CreateSurvey, SurveyMethods and LimeSurvey. Such programmes are commonly used for market research purposes but can be used for surveys in a number of areas including health research.

Different pricing structures exist for each product. Survey Monkey can be used for free on a Basic plan but this has limited functionality. It provides access to 15 available question types and over 20 basic survey templates. You can create unlimited surveys but each survey is limited to 10 questions and 100 respondents.

Three paid subscription services are also available: SELECT, GOLD and PLATINUM. Prices are country specific and in Canada range from $228 to $828 annually. GOLD is the most popular plan providing unlimited questions, unlimited responses, custom survey design, skip-logic and other advanced features (random assignment for A/B testing, question and answer piping, question randomisation), text analysis for open responses and integration with SPSS.

A PLATINUM plan delivers ‘complete brand control’ with separate research.net survey URLs and control of how the survey looks including adding a logo and brand colours.

I used Survey Monkey after initiating the Society of Occupational Medicine eNewsletter to seek information from the membership on what to include as topics and then again after a year for feedback about the eNewsletter to fine tune the service.

Survey Monkey can be used for health research with appropriate ethical approval. An introductory section allows explanation of the purpose of the survey, and consent can be obtained at this stage. The questionnaire can be set up with a variety of response types, including yes/no responses, selecting one or more items from a list and drop-down menu responses. The designer can draft a survey questionnaire and save the draft for further editing. Logic options can be included so that a No answer moves the respondent to the next required question. Similarly, a No response to the consent request in the introductory section can move the respondent to the end page, where a thank you message and exit button can be placed; a Yes response to the consent leads to the first of the survey sections/questions.

Once the questionnaire is in place and working according to the required logic, it can be deployed. The URL can be copied and pasted into an email to the survey population, or placed on a specific web page to which the survey population is directed.

Survey Monkey provides a survey completion progress bar so that the total number of completed questionnaires can be read at a glance. Above the Basic plan, a printable PDF version can be produced to allow postal questionnaires where internet access is not available to all of the survey population. These responses can be entered manually for each returned questionnaire, adding to the responses received via the internet.

The website provides tutorials and information sheets for those starting to use the service, and email customer support is available on paid plans. The service is compliant with US government accessibility standards for respondents with disabilities.

It is possible to design a survey with the wrong logic options. I prefer to start with a paper version and detail all the options following responses and then transfer that over to Survey Monkey. I then trial the survey questionnaire to ensure that the correct logic and questionnaire response types are in place.

Response rates may be low if the survey has not been marketed appropriately to the study population. Email requests may not be opened and alternative methods such as posters, newsletter articles and even letters to individuals may be required to improve response rates. If the questionnaire is anonymous there is no way of identifying the individuals who respond and a mailing reminding people to participate would have to be sent to all of the survey population. If personal information (name, address) is included a mailing to only the non-responders can be sent. A statement to indicate the limited use of personal information and the fact that such identifying information will not be used when the data is exported for analysis can reassure potential participants.

In an increasingly digital world, the use of internet surveys for medical and market research will only grow. Tools like Survey Monkey offer occupational health practitioners an exciting opportunity to carry out their own surveys and research.



Dissertation surveys: Questions, examples, and best practices

Collect data for your dissertation with little effort and great results.

Dissertation surveys are one of the most powerful tools for gathering valuable insights and data for the culmination of your research. However, running one can also be among the most stressful and time-consuming tasks you face. You want useful data from a representative sample that you can analyze and present as part of your dissertation. At SurveyPlanet, we’re committed to making it as easy and stress-free as possible to get the most out of your study.

With an intuitive and user-friendly design, our templates and premade questions can be your allies while creating a survey for your dissertation. Explore all the options we offer by simply signing up for an account—and leave the stress behind.

How to write dissertation survey questions

The first thing to do is to figure out which group of people is relevant for your study. When you know that, you’ll also be able to adjust the survey and write questions that will get the best results.

The next step is to write down the goal of your research and define it properly. Online surveys are one of the best and most inexpensive ways to reach respondents and achieve your goal.

Before writing any questions, think about how you’ll analyze the results. You don’t want to write and distribute a survey without keeping in mind how you’ll report your findings. Once your thesis questionnaire is out in the real world, it’s too late to discover that the data you’re collecting isn’t suitable for assessment. Because of that, create every question with analysis in mind.
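Designing with analysis in mind pays off quickly: responses to a closed-ended item can be tallied in a few lines of code, while open text cannot. A minimal sketch (the responses below are invented for illustration):

```python
from collections import Counter

# Hypothetical answers to a 5-point Likert item (1 = strongly disagree, 5 = strongly agree)
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

counts = Counter(responses)                                # frequency of each rating
mean = sum(responses) / len(responses)                     # average rating
top_box = sum(r >= 4 for r in responses) / len(responses)  # share who agree (4 or 5)

print(counts[4], round(mean, 1), top_box)  # 4 3.9 0.7
```

If an item can't be summarized this directly, that's a sign to rethink its response type before the survey goes out.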

You may find our five survey analysis tips for better insights helpful. We recommend reading it before analyzing your results.

Once you understand the parameters of your representative sample, goals, and analysis methodology, then it’s time to think about distribution. Survey distribution may feel like a headache, but you’ll find that many people will gladly participate.

Find communities where your targeted group hangs out and share the link to your survey with them. If you’re not sure how large your research sample should be, gauge it easily with the survey sample size calculator.
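If you'd rather gauge the sample size yourself, a common approach is Cochran's formula with a finite-population correction. The sketch below assumes a 95% confidence level (z = 1.96), a ±5% margin of error, and the most conservative expected proportion of 0.5:

```python
import math

def sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Required respondents via Cochran's formula with finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)  # sample size for an infinite population
    n = n0 / (1 + (n0 - 1) / population)          # correct for the finite population
    return math.ceil(n)

print(sample_size(10_000))  # 370 respondents for 95% confidence, +/-5% margin
print(sample_size(1_000))   # 278 -- smaller populations need proportionally larger samples
```

Note that these figures describe the responses you need, not the invitations you send; divide by your expected response rate to size the outreach.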

Need help with writing survey questions? Read our guide on well-written examples of good survey questions.

Dissertation survey examples

Whatever field you’re studying, we’re sure the following questions will prove useful when crafting your own.

At the beginning of every questionnaire, inform respondents of your topic and provide a consent form. After that, start with questions like:

  • Please select your gender:
  • What is the highest educational level you’ve completed?
      • High school
      • Bachelor’s degree
      • Master’s degree
  • On a scale of 1-7, how satisfied are you with your current job?
  • Please rate the following statements, from Strongly disagree to Strongly agree (with Neither agree nor disagree in the middle):
      • I always wait for people to text me first.
      • My friends always complain that I never invite them anywhere.
      • I prefer spending time alone.
  • Rank the following personality traits by how important they are when choosing a partner, from 1 (most important) to 7 (least important):
      • Flexibility
      • Independence
  • How openly do you share feelings with your partner? (Almost never to Almost always)
  • In the last two weeks, how often did you experience headaches?

Dissertation survey best practices

There are a lot of DOs and DON’Ts you should keep in mind when conducting any survey, especially for your dissertation. To get valuable data from your targeted sample, follow these best practices:

Use a consent form

A consent form is a must when distributing a research questionnaire. Respondents have to know how you’ll use their answers and that the survey is anonymous.

Avoid leading and double-barreled questions

Leading and double-barreled questions will produce inconclusive results—and you don’t want that. A question such as: “Do you like to watch TV and play video games?” is double-barreled because it has two variables.

On the other hand, leading questions such as “On a scale from 1-10, how would you rate the amazing experience with our customer support?” nudge respondents to answer in a certain way, which produces biased results.

Use easy and straightforward language and questions

Don’t use terms and professional jargon that respondents won’t understand. Take into consideration their educational level and demographic traits and use easy-to-understand language when writing questions.

Mix close-ended and open-ended questions

Too many open-ended questions will annoy respondents, and the responses are harder to analyze. For the best results, rely mostly on close-ended questions and include only a few open-ended ones.

Strategically use different types of responses

Likert scales, multiple-choice, and ranking are all response types you can use to collect data. But some response types suit certain questions better than others, so make sure to match each question with the response type that fits it.

Ensure that data privacy is a priority

Make sure to use an online survey tool that has SSL encryption and secure data processing. You don’t want to risk all your hard work going to waste because of poorly managed data security. Ensure that you only collect data that’s relevant to your dissertation survey and leave out any questions (such as name) that can identify the respondents.

Create dissertation questionnaires with SurveyPlanet

Overall, survey methodology is a great way to collect data for your research study. You have all the tools required for creating a dissertation survey with SurveyPlanet—you only need to sign up. With powerful features like question branching, custom formatting, multiple languages, image choice questions, and easy export, you will find everything needed to create, distribute, and analyze a dissertation survey.

Happy data gathering!

