Inquiry
I had the privilege of taking two inquiry courses: Quantitative Communication and Communication in World Culture. Before I took these classes, I didn’t understand the difference between quantitative and qualitative research methods. Now, I clearly understand the different focuses, methods, and tools used in quantitative versus qualitative research. Two projects forced me to grow in my understanding of the inquiry category. First, in Communication in World Culture, I wrote a paper about Australia’s cultural memory and its relationship to the Aboriginal population. Second, I participated in a group project and paper in which we conducted a quantitative research study: we surveyed San Jose State University students to determine the effect of survey instruction length on response rates. I have chosen to showcase this group paper in my portfolio.
In my cultural memory paper, I engaged in critical research and drew conclusions from previous scholarly work. To complete the project ethically, I needed to consider all of the Australian populations involved. My group research study allowed me to use quantitative survey research. We used this method to analyze response rates, percentages, and correlations across demographic groups. I also learned about the ethical process and paperwork involved in treating subjects fairly.
I chose this group paper because it required an extensive quantitative study, and our paper was modeled after a traditional scholarly article. It showcases my organizational skills and my ability to present a thorough, complete research study. Our research paper, "The Effects of Survey Instruction Length on Response Rates," appears below.

The Effects of Survey Instruction Length on Response Rates
Dominic Amigone, Victoria Ashby, James Jun, Angelique Vega
San Jose State University
Abstract
This study was designed to determine whether the length of survey instructions affects survey response rates. Researchers distributed two online surveys, each to a group of approximately 40 students from San Jose State University. The surveys were identical except for their introductions: the first survey’s introduction contained nearly four hundred words, while the second’s contained fewer than forty. Participants were asked to give feedback on childhood cartoons and then to answer a series of demographic questions covering age, sex, and ethnic identity. Finally, they were given an opportunity to provide honest feedback on the survey itself: why they chose to complete it and whether they had considered abandoning it. Of the 41 participants who received the shorter survey instructions, 12 completed the survey, producing a response rate of 0.29. Of the 40 participants who received the longer survey instructions, 18 completed the survey, producing a response rate of 0.45. However, one-third of the respondents to the longer survey complained about the length of the instructions. The researchers concluded that a longer survey introduction might increase the response rate but might also decrease participants’ satisfaction with the survey experience. They suggest that student researchers use this information to write thorough rather than short survey instructions, and they recommend that professors apply the same principle when writing instructions for assignments.
Introduction
Americans are asked to participate in hundreds of surveys over their lifetimes. Businesses and researchers depend on the responses they receive through social media, email, mail surveys, and telephone surveys, yet people complain about the excessive amounts of junk mail and spam they must sort through. To capture responses, researchers need their subjects to complete the entire survey. The public opinion research community has always faced the issue of survey response and continues to look for ways to maximize it, because results are generally more reliable when a study has more participants. This study therefore uses two versions of a survey to compare how the length of the survey introduction affects response rates. We may be able to conclude that shorter survey introductions increase or decrease response rates, or we may find that response rates are not affected by introduction length at all. These conclusions will allow student researchers at San Jose State University to write survey introductions that effectively increase the response rates to their surveys.
Literature Review
In the past five years, communication scholars have conducted an extensive amount of research on survey methods. Because surveys are an easy and effective tool for communication research, numerous scholars use them to build understanding within the field. In 2016, Moy and Murphy published an article about the problems and prospects in survey research. Their work focused on finding the best methods to use when conducting a survey, and they also examined survey research in light of new technologies and the development of social media. Moy and Murphy studied nearly 500 survey-based articles published in four mass communication journals associated with either the International Communication Association or the Association for Education in Journalism and Mass Communication. They analyzed the mode of data collection that researchers used, the dates of fieldwork, the population studied, and the sample design. The four journals published over 1,150 articles between 2008 and 2014, and according to their research, 27.7% of the articles partly or fully relied on survey data to provide evidence for their conclusions (Moy & Murphy, 2016). Although face-to-face interviews have long been the norm for researchers, online and mobile surveys are becoming more prevalent. However, “an increase in access to data does not necessarily bring with it an increase in publishable research” (Moy & Murphy, 2016, p. 31). Moy and Murphy’s research highlights the importance of the research survey within the field of communication studies and shows the communication community that while online surveys are convenient, they are not always effective. It is therefore important to study survey methods so that researchers can be effective.
Every researcher who uses a survey as a tool faces the issue of finding participants. Their preparation is worthless if no one will take the survey, and it is widely accepted in the scholarly community that results are more reliable when the sample size is larger. Researchers are therefore tasked with increasing the response rates to their surveys. Scholars Hsu, Schmeiser, Haggerty, and Nelson acknowledge the issue of response rates in their own research, explaining that response rates to household surveys have been declining for a long time, which may result in biased inference if the nonresponse is nonrandom (Hsu, Schmeiser, Haggerty, & Nelson, 2017). While they investigated the effects of monetary incentives on survey response rates, another group of researchers, Fazekas, Wall, and Krouwel, studied the effects of email “cover letters” on survey response rates. They observed that email “cover letters” are often used as invitations to attract participants to a survey; something in the letter must strike people as interesting and trustworthy in order to turn readers into participants. They conceptualized the online “cover letter” “as a persuasive document, designed to influence the motivations of respondents in such a way that they are more likely to respond to the survey” (Fazekas, Wall, & Krouwel, 2014, p. 236).
Unsurprisingly, monetary incentives did affect survey response rates. The research by Hsu and his colleagues showed that 29.2% of their original sample chose to take advantage of the advertised incentive and that response rates increased by 20% when a monetary reward was attached to the survey (2017). While monetary incentives may be a productive way to increase survey response rates, they may not be practical for all research studies, so many researchers must find other ways to increase their response rates. Fazekas, Wall, and Krouwel discovered that if the “cover letter” contained altruistic appeals, readers were 4% more likely to complete the survey than if it contained egoistic messages (2014). They also found that complex messages in email “cover letters” yielded a meager 21% response rate, whereas simple language yielded a 29% response rate. Fazekas, Wall, and Krouwel strongly suggest that the invitational wording of email “cover letters” plays an important role in response rates for online surveys (2014).
Our study is designed to build on this previous research and help researchers improve their response rates. We examine the idea that the introduction to a survey may affect its response rate.
Research questions:
How does the length of survey instructions affect the response rate to an online survey?
Are readers more likely to participate if the introduction to the survey is longer or shorter?
Methods
Our team began by recruiting a sample of participants. We then collected data from those participants and, finally, organized and analyzed the data we received.
Description of Sample
Our study achieved a total sample size of 81 participants. Forty participants were assigned to the survey with the long introduction, and forty-one were assigned to the survey with the shorter introduction. Our sample was made up entirely of current San Jose State University students; however, it was diverse in age, gender, and ethnicity. Of the participants who responded to the surveys, 83.3% were between 18 and 25 years old, and this percentage was identical for both the long introduction survey and the short introduction survey. In terms of gender, 61.1% of the respondents to the long introduction survey were female and 38.9% were male; for the short introduction survey, 66.7% of the respondents were female and 25% were male. The ethnic identities of respondents to the long introduction survey were very diverse: that survey drew roughly the same number of responses from Caucasian, Asian American, and Hispanic/Latino students, and only a very small percentage of its respondents were African American. In contrast, exactly half of the respondents to the short introduction survey identified as Hispanic/Latino, with Asian American, African American, and Caucasian students making up the remaining half.
Data Collection
Our team originally planned to set up a booth at San Jose State University and invite passing students to become participants in our study. Unfortunately, that plan proved logistically difficult, so we shifted to a more personal approach and gathered participants by approaching them individually to explain our research. Our sample was a convenience sample because we reached out to students who were easily accessible at our university. We informed participants that everything in the study would be anonymous and that only their responses would be recorded. We explained that they could take the survey at their leisure, that they could choose not to take it, and that they always had the option of leaving the study. If a participant chose to take the survey, their responses were collected through an online questionnaire divided into four sections. The first section contained the survey instructions, which were either lengthy or short, while the other three sections were identical in both surveys. The second section asked participants a series of questions about their favorite childhood cartoons, and the third section contained demographic questions about age, gender, and race. The final section contained the most important questions, which asked for feedback about the survey. We designed these questions to find out why each participant was motivated to complete the survey, whether they had considered leaving it, and what they would change about it. These questions allowed us to increase the validity of our responses rather than depending solely on the response rates. However, the reliability of the responses may be limited because our sample sizes were small: roughly forty participants per survey cannot represent the thousands of students who attend San Jose State University.
Data Analysis
The response rates of each survey were the most critical statistics for our research. Our study was designed so that each student who agreed to receive the survey participated in the study whether or not they chose to complete it; the participants who chose not to take the survey were just as valuable to us as those who did. To analyze our data, we created an extensive Excel spreadsheet that recorded the number of responses to each question and the percentage of participants who picked each answer. Because the first section of our survey contained questions about childhood cartoons, the results from those questions were irrelevant to our research. However, we examined the statistics from the demographic questions relating to gender, age, and race, which allowed us to discover whether a certain demographic was more inclined toward a particular survey introduction. We also closely examined the statistics from the feedback questions, which allowed us to find out why our participants chose to complete the survey and whether they had wanted to leave it.
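As a rough illustration of the tallies described above, the short Python sketch below reproduces the same kinds of calculations from a hypothetical CSV export of each questionnaire. It is only a sketch of the arithmetic involved; the file names and column names are invented for the example, and our actual analysis was performed in Excel.

import pandas as pd

# Illustrative sketch only; file and column names are assumptions, not our real data files.
invited = {"long": 40, "short": 41}  # invitations sent for each survey version

for version in ("long", "short"):
    # One row per completed survey in the hypothetical export
    responses = pd.read_csv(f"{version}_intro_responses.csv")

    # Response rate = completed surveys / invitations sent
    print(version, "response rate:", round(len(responses) / invited[version], 2))

    # Percentage of respondents who picked each answer to the demographic and feedback questions
    for column in ("age_group", "gender", "ethnicity", "considered_leaving"):
        print(responses[column].value_counts(normalize=True).mul(100).round(1))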
Results
Our team calculated the response rate for each survey by dividing the number of students who completed the survey by the total number of students who had been invited to complete it. The survey with long instructions was sent to 40 participants, and 18 of them completed it, producing a response rate of 0.45. The survey with short instructions was sent to 41 participants, and only 12 of them completed it, producing a response rate of 0.29. These response rates relate directly to our research questions. When we examined section four of both surveys, we found that one out of every three participants in the long instructions survey complained about the length of the instructions at some point, and 40% of the respondents to that survey admitted that they considered leaving when they saw the length of the instructions (see Figure 1). Not one respondent to the short instructions survey complained about the length of the instructions. When students were asked to pick one thing to change about the long instructions survey, 29.4% reported that they would have changed its length (see Figure 2). Seventy-five percent of students participating in the short instructions survey replied that they would not have changed anything (see Figure 3).
The majority of participants for both surveys fell into the age group of 18 to 25 years old. In the long instructions survey, 33% of the participants in this age group completed the survey because they wanted to give feedback on their favorite childhood cartoons and 33% of them completed the survey because they felt like the instructions were simple enough. In the short instructions survey, 50% of the participants in this age group completed the survey because they wanted to give feedback on their favorite childhood cartoons while 30% of them completed the survey because they felt like the instructions were simple enough. None of the other statistical findings regarding demographics were significant.

Figure 1

Figure 2

Figure 3
Conclusion
Both surveys had roughly the same number of participants, but we were surprised to find that more people responded to the longer introduction than to the shorter one. When we began our research, we wanted to know how the length of the introduction would affect response rates and which survey would draw a greater response; our team had privately suspected that the survey with a shorter introduction would yield a higher response rate. In the end, more people responded to the longer version of the survey. The conclusions of Moy and Murphy rang true in our research: the shorter introduction seemed more convenient, but that did not necessarily make it more effective. This raised the question of whether the content of the introduction was more valuable to participants than we had realized.
Our method of comparing shorter and longer introductions was similar to the way Fazekas, Wall, and Krouwel compared simple cover letters to complex ones. They found that simple language produced a higher response rate. Our results did not align with theirs, because a shorter introduction would appear to be the simpler choice, yet it yielded a lower response rate in our study. We found that a large percentage of students between 18 and 25 years old believed that the instructions for both surveys appeared simple.
However, we could not overlook the fact that one out of every three participants in the long instructions survey complained about the length of the instructions at some point during the survey. Although more students completed the long introduction survey, their responses implied that they were less satisfied with their survey experience. From these results, we concluded that a longer, more thorough introduction might actually increase the response rate to a survey, while a shorter introduction might increase participants’ enjoyment of the survey experience. Although we had set out to answer a question directly related to response rates, we stumbled upon this connection to participant satisfaction.
Discussion
Unfortunately, we did face some limitations regarding our sample of students from San Jose State University. Our sample size was too small to represent the thirty thousand students who attend the university. We also realize that many students may have participated because they felt obligated to do so when we asked them to be part of our study, which may have inflated the response rates to both surveys. When we saw the results of our survey, we also had to acknowledge that our choice of topic may have positively skewed the results: many students reported that they chose to complete the survey because they wanted to give feedback on their favorite childhood cartoons. We had built the survey around childhood cartoons to make it relatable to a broad population of college students, but we recognize that such a light-hearted topic probably encouraged some students to complete the survey for fun rather than because of their feelings about the introduction.
Based on the findings of this study, longer, more descriptive introductions may yield higher response rates to surveys. Our team believes that San Jose State University can use this research to improve student participation in several activities. First, thousands of students use surveys on campus every year to complete research for their classes or their clubs. When introducing a survey assignment, professors can use this research to show students how to write questionnaire introductions in a way that increases response rates. It would also be valuable to contact teachers in the communications department and elsewhere to stress the importance of detailed instructions for all assignments and tests in their classes. Clear, thorough instructions can help not only survey researchers but also other areas such as teaching: students often do not read the instructions for assignments, which can result in lower grades. If instructions are clear and detailed enough for students to follow, students may improve their scores because they understand their teachers’ expectations more thoroughly. By implementing these actions, both professors and students can benefit from the findings of this research.
References
Fazekas, Z., Wall, M. T., & Krouwel, A. (2014). Is it what you say, or how you say it? An experimental analysis of the effects of invitation wording for online panel surveys. International Journal of Public Opinion Research, 26(2), 235-244.
Hsu, J. W., Schmeiser, M. D., Haggerty, C., & Nelson, S. (2017). The effect of large monetary incentives on survey completion: Evidence from a randomized experiment with the survey of consumer finances. Public Opinion Quarterly, 81(3), 736-747. doi:10.1093/poq/nfx006
Moy, P., & Murphy, J. (2016). Problems and prospects in survey research. Journalism & Mass Communication Quarterly, 93(1), 16-37. doi:10.1177/1077699016631108


