SAES-422 Multistate Research Activity Accomplishments Report

Status: Approved

Basic Information

Participants

Steven Swinford (swinford@montana.edu), Montana State University; Ginny Lesser (lesser@science.oregonstate.edu), Oregon State University; Bill Stewart (wstewart@illinois.edu), University of Illinois; Carena van Riper (cvanripe@illinois.edu), University of Illinois; Hua Qin (qinh@missouri.edu), University of Missouri; Zhengyuan Zhu (zhuz@iastate.edu), Iowa State University; Melissa Constantine (cons0026@umn.edu), University of Minnesota; Todd Rockwood (rockw001@umn.edu), University of Minnesota; Fern Willits (fkw@psu.edu), Pennsylvania State University; Emily Perdue (emily.perdue@mail.wvu.edu), West Virginia University; Jason McKibben (Jason.mckibben@mail.wvu.edu), West Virginia University; Ashley Yopp (ayopp@tamu.edu), Texas A&M University; Billy McKim (brmckim@tamu.edu), Texas A&M University; Tobin Redwine (tredwine@tamu.edu), Texas A&M University; Gladys Walter (gladysw@tamu.edu), Texas A&M University; Glenn Israel (gdisrael@ufl.edu), University of Florida; Jessica Goldberger (jgoldberger@wsu.edu), Washington State University; Don Dillman (dillman@wsu.edu), Washington State University

Call to order at 8:20 a.m.

The meeting began with introductions and meeting logistics.

A brief history of the group was provided for new attendees, and Chair Glenn Israel noted that more details are included in the introduction to the special issue of the Journal of Rural Social Sciences. The group started as a methodology research project, W-183, in the rural social sciences. Joint publication of replicated experiments was the primary motivation behind its organization.

Israel added that the current WERA-1010 project runs through September 2018. An inquiry about renewal will be posed to Lou Swanson, Administrative Advisor.

Improving Communication with Potential Respondents

The committee began the substantive meeting by discussing theoretical approaches to improving communication with potential respondents. Don Dillman (Washington) discussed plans for beginning a new line of experimentation in 2017 to evaluate alternative communication designs for encouraging response to surveys. The experiment is scheduled to go into the field in March 2017. A detailed rationale for designing and implementing this study was presented along with draft materials to be used in the experiment, for review and critique by committee members.

Dillman presented ideas for improving response rates through better communications. First, he noted that RDD is no longer working: response rates are poor and coverage is problematic, requiring extra questions to assess coverage. Web-push surveys are the likely replacement. One can make an initial contact by postal mail (a good contact) and follow up with other modes and requests. Web-push is now better at getting people to respond by web (which cuts costs), but it does not always work. The American Community Survey is now all web-push; Dillman cited several other major national surveys using this method, which are also obtaining good response rates.

Dillman discussed what is known about what does and does not work in web-push surveys. We do not know how improved communications could increase the effectiveness of web-push. We need to understand how and when communication occurs; communication extends beyond the letters and graphics, and not much work has been done on these elements. There was discussion of what makes a communication sequence effective and what detracts from it. There are many stages in this process, so there is no single solution to the issue.

Dillman discussed a new approach to communication, noting that there are four aspects to communicate: presentation, content, letter, and questionnaire. He also reviewed pre-suasion concepts from Robert Cialdini, which inspired the new approach: 1) establishing trust initially can increase compliance later, 2) creating a privileged moment, 3) focal arguments in one’s mind tend to be causal, and 4) using normative appeals to magnetize attention to the focal argument. Dillman asked, “How might these concepts be built into a sequence of letters intended to elicit response?” and “How should one use connective language relevant to the community?” In the planned sequence, the first communication explains why WSU is conducting the study; the second contact extends connections; the third contact emphasizes local questions and provides feedback on results to date; and the fourth contact is a reminder postcard. The concepts were built into the questionnaire in several ways: the cover page connects to the community (of two examples, one emphasizes location and the other does not); questions on the local community appear up front; and transitions are used to explain questions throughout.

The survey was developed for the two experimental conditions. In addition, the cover letter explained why it was coming to people in West Virginia from WSU. Everyone receives a $2 incentive at Week 1 (evidence shows such incentives work); this is not an experimental factor.

A 2x2 design is planned, crossing the persuasion letter versus the standard letter with the persuasion questionnaire versus the standard questionnaire. The research team is aiming for a response rate of about 40%.
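As a hedged illustration only (not the research team's analysis plan), the sketch below shows how response outcomes from such a 2x2 factorial could be examined with a logistic regression that includes both treatment indicators and their interaction. The column names and simulated data are hypothetical.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data standing in for the 2x2 experiment: persuasion vs. standard
# letter crossed with persuasion vs. standard questionnaire.
rng = np.random.default_rng(42)
n = 2000  # illustrative sample size, not the actual mailing size
df = pd.DataFrame({
    "persuasion_letter": rng.integers(0, 2, n),
    "persuasion_questionnaire": rng.integers(0, 2, n),
})
# Simulate response around a roughly 40% baseline with small treatment effects.
logit_p = -0.4 + 0.15 * df["persuasion_letter"] + 0.10 * df["persuasion_questionnaire"]
df["responded"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Logistic regression with main effects and their interaction.
model = smf.logit("responded ~ persuasion_letter * persuasion_questionnaire",
                  data=df).fit(disp=False)
print(model.summary())

# Cell-level response rates for a quick factorial table.
print(df.groupby(["persuasion_letter", "persuasion_questionnaire"])["responded"].mean())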

Comparison of online opt-in panels with address-based sample surveys

A second area discussed was the use of online panels and how they compare with probability samples. Ginny Lesser (Oregon) talked about one of her studies this year comparing a probability sample and an opt-in panel using the same questionnaire. She first contacted Knowledge Networks for an online probability sample of n = 2,000, but this would have cost $45,000.

The “standard” probability sample had 3,750 addresses in the mail-only condition and 3,750 in the web+mail condition, drawn from the USPS Delivery Sequence File. In addition, Lesser experimented with an “I Love Oregon” sticker (state outline, green heart) as an incentive insert. The gains from token incentives like this are modest; they do not work well. A comparison of four versus five contacts in the web+mail condition showed a 4%-6% bump from the fifth mailing; the overall response rate was in the low to mid 20s. Over the past decade, the share of respondents completing by web has grown to about half of all completed questionnaires.

For the opt-in nonprobability panel, Lesser used a Qualtrics commercial panel, whose members are recruited from Qualtrics' business partners to take opt-in surveys. Qualtrics recruits the respondents; Lesser asked for Oregon residents with specific demographics. Qualtrics matched the demographic request, invited people to participate, and gave “points” incentives that can be cashed in for gift cards. To reach the target of 500 completes, Qualtrics needed to contact 7,250 individuals. Over four contacts, 457 completed the survey, so Qualtrics ran another sample of 790 to reach the 500 completes contracted for initially. Qualtrics does not provide the list of people who were contacted. The overall response rate was 6.3%. Lesser noted that Qualtrics removes “speeders” -- people who just click down a column or complete the survey too quickly.

Lesser compared the opt-in panel responses with those from the probability sample, using the point estimate for the nonprobability Qualtrics sample and 95% confidence intervals for the probability sample. She also weighted the responses. Across the 265 questions, 63% of the panel estimates fell outside the confidence limits of the probability sample. Lesser also compared item nonresponse: there was a higher percentage of missing answers in the probability sample. It may be that panel members, in order to earn reward points, cannot leave as many “no answer” responses.
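To make the comparison procedure concrete, here is a minimal sketch, assuming proportion-type estimates: for each question, compute a 95% confidence interval around the probability-sample estimate and flag panel point estimates that fall outside it. The question names and numbers are placeholders, not Lesser's data.

import numpy as np

def ci_for_proportion(p_hat, n, z=1.96):
    # Normal-approximation 95% confidence interval for a proportion.
    se = np.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# Each entry: (probability-sample estimate, probability-sample n, panel estimate).
questions = {
    "q1": (0.42, 850, 0.55),
    "q2": (0.63, 850, 0.60),
    "q3": (0.18, 850, 0.31),
}

outside = 0
for name, (p_prob, n_prob, p_panel) in questions.items():
    low, high = ci_for_proportion(p_prob, n_prob)
    if not (low <= p_panel <= high):
        outside += 1

print(f"{outside} of {len(questions)} panel estimates fall outside the 95% limits")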

Steve Swinford (Montana) discussed his experiences using Qualtrics panels in several studies at the Center for Health and Safety Culture. He has used them more than ten times, mostly for pre-testing new items but sometimes for reaching very specific groups. These studies usually collected about 75 completed surveys and cost about $600 each. He noted that results can sometimes be obtained within six hours. He also reported a problem with getting rural samples, because Qualtrics does not have the database for very specific rural samples (e.g., Utah, rural, aged 18-44).

Swinford reported that the cost was normally about $6-$8 per completion, which is both faster and less expensive than mail. He shared several examples of recent studies:

  • Idaho traffic safety study
    • Used the panel to pretest the instrument a couple of times
    • Conducted both a household mailout and the online panel
    • Results were “close” in the end
  • Oregon health care providers
    • Paid $50-$75 per completion, but obtained the needed completes
  • Dating violence in high schools
    • Needed 18-year-olds still in high school; got 8

Swinford commented that the approach is not perfect but is another tool to use. He noted that he has not used panels as the primary data collection method, but the results have consistently been close to what is obtained using conventional random samples.

Zhengyuan Zhu (Iowa) reported on the AVMA Pet Ownership and Demographic Survey. He conducted an address-based pilot survey of dog ownership in 2015; two subsequent surveys, the Pet Demographic Survey (PDS) and the Metropolitan Market Survey (MMS), were done online using panels. The PDS is conducted every five years to estimate the percentage of households that have different types of pets. The ISU study for 2017 is using SSI with a goal of 50,000 completes. The MMS focuses on one type of pet, adding detail about that pet type. Zhu examined the 2012 data (the previous panel data) to discern the sampling and weighting used and to assess the impact of eliminating the split-sample design that was previously used. There was a claim that the sampling in 2012 was representative, but no data were given to ISU to verify this, nor was the documentation clear about the procedures used to manipulate the data. It appears that the construction of the panel makes its representativeness questionable. No demographic information was asked in the questionnaire, so it all comes from the original profile information. Zhu also estimated standard errors for the 2012 data, though there is a lack of certainty about the estimates overall; he did this to arrive at a scheme for doing so in 2017.
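As a hedged illustration of the kind of weighted estimation and standard error approximation involved (not the AVMA or ISU procedure), the sketch below computes a weighted proportion and approximates its standard error using Kish's effective sample size. The weights and responses are simulated.

import numpy as np

rng = np.random.default_rng(7)
owns_dog = rng.integers(0, 2, 1000)        # 1 = household owns a dog (simulated)
weights = rng.uniform(0.5, 3.0, 1000)      # assumed post-stratification weights

p_hat = np.average(owns_dog, weights=weights)        # weighted proportion
n_eff = weights.sum() ** 2 / (weights ** 2).sum()    # Kish effective sample size
se = np.sqrt(p_hat * (1 - p_hat) / n_eff)            # approximate standard error

print(f"weighted estimate = {p_hat:.3f}, effective n = {n_eff:.0f}, SE approx. {se:.3f}")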

Zhu also reported on using a Google survey; it had one question and cost 15 cents per complete. He reported that the estimates were close to the adjusted estimates from a Qualtrics survey. There was no conclusion to draw; he simply noted the estimates this approach yielded.

Glenn Israel (Florida) reported on an opportunity to compare a Qualtrics nonprobability quota sample with a probability address-based sample for a survey on climate change in Florida. The online Qualtrics survey was completed in eight days, with 514 respondents completing it in November 2016. He contracted at $5 per complete. With a few exceptions, the online and mail questionnaires were the same. About 800 people accessed the survey to yield the 514 completes.

A mail survey was started about the same time. There was no pre-letter; data collection began with an initial packet containing a cover letter, questionnaire, and postage-paid return envelope, followed by a reminder postcard, then a second questionnaire and a third questionnaire to nonrespondents. To date, the response rate is 16.3% on the mail survey with a sample size of 1,500. A second replicate of 500 is in the field using a mixed-mode protocol. Israel will report results at the next meeting.

 

State Reports

Bill Stewart and Carena van Riper (Illinois) shared information about their pursuit of high response rates as part of the Parks and Environmental Behavior Work Group. They reported on research examining sense of place and place making, with a theme of community and belonging. The study was conducted in an urban context (Chicago’s south side) and addressed land vacancy. There were 25,000-30,000 vacant lots (a common urban problem), and leaders were attempting to redevelop these spaces; there is hope for the neighborhood and redevelopment. The study was intended to measure residents’ perceived impact of vacant-lot buying on their community and, hence, was a social assessment of community engagement. The researchers worked with partners that included several NGOs and neighborhood associations, and the survey of large-lot owners achieved a 71% response rate. Data collection included a first mailing of the questionnaire, a postcard reminder, a second questionnaire, phone calls to nonrespondents by the city, and a third replacement questionnaire. About 58% responded before the phone call. Possible reasons for the strong response rate include issue salience, relationships built within the policy chain (an introductory letter from the city, the promise that survey responses would be a voice in the decision-making process, word of mouth generated by focus groups, and phone calling from the city prior to the third wave of questionnaires), and the $1 incentive enclosed with the first questionnaire (respondents basically got their dollar back). Good buzz, or word of mouth, was generated during the data collection process.

Stewart and van Riper also discussed a study on community resilience in protected grasslands. Rural communities face challenges to development, and protected grasslands are part of this. The researchers are attempting to understand changes in the social and economic conditions of rural communities near protected grasslands. The study will focus on two counties in Illinois and Iowa, with bison reintroduction as part of the issue. The survey will focus on trade-offs among future growth scenarios and will be administered via mail. It will include a stated choice experiment to assess the relative importance of attributes in a design; this will involve determining the relative importance of 6 or 9 attributes identified in focus groups and then developing profiles (subgroups) defined by community attachment. Pilot testing is planned for May 2017, and the main data collection for late summer or fall. Discussion focused on acquiring the sample, the sampling frame and methods, and estimating nonresponse bias.
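The sketch below gives a rough sense of how candidate profiles for such a stated choice experiment might be assembled from attribute levels and paired into choice sets. The attribute names and levels are placeholders, not those identified in the focus groups.

import itertools
import random

# Hypothetical attributes and levels for illustration only.
attributes = {
    "housing_growth": ["none", "moderate", "high"],
    "grassland_acreage": ["current", "expanded"],
    "tourism_jobs": ["few", "some", "many"],
}

# Full factorial of profiles; in practice a fractional design would be used.
profiles = [dict(zip(attributes, combo))
            for combo in itertools.product(*attributes.values())]

random.seed(1)
random.shuffle(profiles)

# Pair profiles into choice sets of two alternatives each.
choice_sets = [profiles[i:i + 2] for i in range(0, len(profiles) - 1, 2)]
for k, cs in enumerate(choice_sets[:3], start=1):
    print(f"Choice set {k}: {cs}")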

 

Friday, February 16, 2017

Call to order at 8:05 a.m.

Steve Swinford (Montana) described a study of alcohol use at university events: SAFE (Substance Abuse Free Environment). He described the methods used and the purpose of the study; multiple surveys were conducted in multiple modes. The work was funded internally and originally focused on football tailgating at home games; it was then expanded to all public campus events involving alcohol sales or distribution. The methods included: 1) observation work (done), 2) interviews with key stakeholders, 3) analysis of policies at peer institutions, and 4) a survey of community members. He planned a sample of 1,200 and intends to use online data collection and mail with four contacts: a pre-letter, a letter with the survey, a reminder, and a replacement questionnaire.

Ginny Lesser (Oregon) reported that each year the survey center conducts about eight surveys for the state of Oregon. The Oregon DOT study has measured satisfaction with highway maintenance since 2000. Lesser noted that general versus specific ordering of items has been studied and published. The study uses a probability sample of about 4,130 licensed drivers (half all mail-back, half web+mail). Lesser used four contacts for mail only and five for web+mail (pre-letter, first mailing, postcard, second mailing, and a third mailing for web+mail only). Response rates were 34% overall, 27% for web+mail with four mailings, and 31% for web+mail with five mailings. The fifth mailing boosted the response rate, but those returns came back mainly on paper. Lesser also reported on a Control of Litter survey and found no consistent differences between all-mail and web+mail respondents in answers to specific or general questions. She will look at this again in two years, when the same study will be repeated. Lesser noted that urban areas responded more via web and rural areas more via mail.

Zhengyuan Zhu (Iowa) reported briefly on the 2015 and 2016 Iowa Nutrient Management Survey and noted that the 2017 version goes out in early March. The focus was on the water quality impacts of agricultural nutrients; the team looked at farmer knowledge, barriers to reducing nutrient loss, and changes over time. The sampling design identified priority watersheds. The 2015 study had n = 1,746 and a response rate of 47%. Zhu provided a brief description of logistic regression analysis results and discussed the measurement of knowledge.

Hua Qin (Missouri) reported on conducting a systematic review and meta-analysis of survey research. He sampled eleven papers dealing with survey research methods and discussed them broadly. Qin said this provided an opportunity to develop a meta-analysis of survey methods on agricultural and natural resource topics, defining an area that might have enough treatments to analyze. He suggested this may become a future endeavor of the WERA group. Don Dillman noted that there has been some previous integrative work on related topics, for example on forced-choice versus check-all-that-apply questions.

Todd Rockwood (Minnesota) briefly talked about an issue of knowledge about internal organs. He observed that health is only known with respect to disease, which creates challenges for other measurement approaches.

Fern Willits (Pennsylvania) suggested organizing one or more sessions at the upcoming annual meeting of the Rural Sociological Society this July to share and extend to a larger audience the fruits of the group's discussions concerning the use of commercial survey providers. The WERA attendees suggested aiming for two panel discussion sessions: 1) "An Overview of Commercial Survey Service Providers," for which representatives of several such service organizations would be invited to describe their services concerning sampling procedures, data collection options, and survey consultation; and 2) a second panel, tentatively titled "Experiences and Lessons Learned Using Commercial Survey Service Providers," which would engage various WERA participants and (hopefully) audience members in sharing the pros and cons of their experiences. Todd Rockwood and Melissa Constantine agreed to work with Willits to try to make this happen.

Willits also described her interest in engaging other WERA members in analysis that would examine the consistency (or not) of observed relationships between survey variables when the analysis is carried out at different stages of data collection and follow-up. There has been considerable research on differences between the characteristics of early and later responders, but she did not know of research addressing whether observed relationships between or among variables differ depending on whether subjects responded early or later to solicitations for participation. To assess this idea, one needs access to substantive survey data sets that include the date of response for each subject as well as information on the variables of interest. The analysis would then examine the relationships between or among selected variables using only cases responding to the first wave of contact, and repeat the analysis after one or more later waves. She did not have any such data available and requested that anyone who might have such information and would be willing to share it (or join forces with her) to pursue such analysis contact her. Ginny Lesser (Oregon) pointed out that proprietary constraints in some studies might be a problem, but these may not be insurmountable.
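As a sketch of the analysis Willits proposed (using made-up data and variable names), one could estimate the same bivariate relationship on first-wave respondents only and again on all respondents, then compare the coefficients:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 600
df = pd.DataFrame({
    "wave": rng.choice([1, 2, 3], size=n, p=[0.5, 0.3, 0.2]),  # wave in which each case responded
    "community_attachment": rng.normal(0, 1, n),
})
df["satisfaction"] = 0.5 * df["community_attachment"] + rng.normal(0, 1, n)

early = df[df["wave"] == 1]           # first-wave respondents only
b_early = smf.ols("satisfaction ~ community_attachment", data=early).fit().params
b_full = smf.ols("satisfaction ~ community_attachment", data=df).fit().params

print("slope, early respondents only:", round(b_early["community_attachment"], 3))
print("slope, all respondents:       ", round(b_full["community_attachment"], 3))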

Jessica Goldberger (Washington) discussed a series of survey data sources she has worked with over her career. The first was a 2007 survey of certified organic producers, which asked producers about biodegradable plastic mulch and barriers to use of the technology. She heads the technology adoption working group on a five-year USDA project. A second survey, of strawberry growers, focused on the various benefits and costs of using the plastic film. A biodegradable option does exist, but it comes with some costs, and none of the current products meet the standards for organic production; some strawberry producers are therefore not able to use the technology. Goldberger described the questionnaire design and sample: 1,553 growers in six states were sampled, using a mailing list purchased from Meister Media (n = 1,357) supplemented with Oregon and Washington names (n = 196). Data collection used four contacts: a pre-letter, a first full mailing, a reminder postcard, and a second full mailing. A web option was provided in all mailings. The initial response rate was about 18%, and phone calls were made to 290 nonrespondents. There was a higher proportion of ineligibles than anticipated, resulting in an adjusted response rate of about 21% and 227 usable questionnaires. Another farmer survey is planned for fall 2017, and WERA members discussed how to improve the response rate on this next survey.
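The short sketch below shows one simple way an ineligible-adjusted response rate like the one reported here can be computed; the count of ineligibles is a placeholder, since it was not recorded in the minutes.

sampled = 1553       # growers sampled (reported)
usable = 227         # usable questionnaires (reported)
ineligible = 450     # hypothetical number of sample members found ineligible

adjusted_rr = usable / (sampled - ineligible)
print(f"adjusted response rate: {adjusted_rr:.1%}")  # roughly 21% with these placeholder inputs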

Glenn Israel (Florida) reported on follow-up data collection in a presentation titled “Can Clarifying Instructions Influence Response to Numerical Open-Ended Questions in Self-Administered Surveys?” Last year, he reported that 20% of his respondents were using phones and 10% were using tablets, which led to reformatting the questionnaire to improve navigation. Given this, for the Q10 and Q11 experiment in 2016, specific information on what the individual had obtained from Extension was mail-merged into that individual’s questionnaire. The results suggested the instructions helped, but they were not as clear-cut as those for 2015, when the instructions were not individualized.

Glenn Israel also reported on an additional experiment that examined the effects of stem and response order on response patterns in satisfaction ratings. A 2x2 experiment manipulated the order of satisfied and dissatisfied in the question stem and the order of the response options (very satisfied to very dissatisfied). The 2016 experiment tested Q7 (overall satisfaction with the Extension office). A large response order effect was found, and this finding has been replicated in other studies in Florida and Nebraska. In addition, the column format seemed to enhance the magnitude of the effect in 2016. In summary, there is clear evidence of response order effects, and satisfied/dissatisfied items should list the positive answers first, as this is consistent with the heuristics used by respondents.
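A minimal sketch of how such a response order effect might be tested, assuming hypothetical cell counts (these are not the Florida data), is a chi-square test comparing the rating distributions across the two order conditions:

import numpy as np
from scipy.stats import chi2_contingency

# Rows: order condition (satisfied-first vs. dissatisfied-first).
# Columns: very satisfied, satisfied, dissatisfied, very dissatisfied.
counts = np.array([
    [220, 140, 30, 10],   # satisfied-first condition (hypothetical)
    [170, 150, 55, 25],   # dissatisfied-first condition (hypothetical)
])

chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")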

The next meeting is scheduled for February 22-23, 2018, in Tucson.

Adjourned at 3:15 pm

 

Accomplishments

An agenda was developed, and the coordinating committee held its annual meeting in February 2017. Participating members discussed several important topics affecting error in agricultural and rural surveys, including effective survey communication and the comparison of online opt-in panels with address-based sample surveys. In addition, members reported on survey research studies being conducted or planned in their states and provided feedback to others. Members from several states discussed plans for studies on comparing nonprobability samples in online surveys with address-based probability samples using mail and/or mixed-mode surveys in order to assess the strengths and weaknesses of these approaches, assessing the utility of different theories for inviting people to respond to a survey, assessing the order of concepts in question stems and responses for satisfaction items, and assessing the effects of follow-up contacts on sample characteristics and substantive research findings, as well as other topics.

During the year, work was completed on a special issue of the Journal of Rural Social Sciences focusing on survey research methods. Coordinating committee chair Israel led efforts to solicit manuscripts and served as guest editor for the special issue, which contained an introductory article, six research articles, and a commentary. Each article involved committee members as an author, co-author, or reviewer. In addition to the special issue, committee members were active during the past year in publishing research in journal articles, presenting papers and posters at relevant conferences, and developing educational materials available to Extension professionals and the public. This included publishing 12 survey methods-related articles, updating 5 publications for Extension and outreach audiences, and giving 13 presentations at professional conferences where attendees are members of the target audience for this project. In addition, the member from Florida conducted a two-hour demonstration workshop on optimizing mixed-mode surveys for respondents using mobile technology at the American Evaluation Association annual meeting and trained 75 Florida Extension professionals on the use of online survey tools, which incorporated research of the coordinating committee. Members from Florida, Minnesota, Oregon, and Pennsylvania participated in panel sessions at the annual meeting of the Rural Sociological Society in July 2017. One panel, Contributions and Issues Related to the Use of Commercial Survey, involved participants from three firms, while the second panel, Promise and Pitfalls: Experiences and Lessons Learned from Using Commercial Survey Services, comprised WERA-1010 members. Attendees at these panel sessions learned about available services for conducting surveys as well as the pros and cons of using some of these services.

Impacts

  1. Recipients of the research findings and outreach activities of coordinating committee members have more accurate information for making decisions about conducting surveys and/or assessing the strengths and weaknesses of survey data. This, in turn, can contribute to appropriate project- and policy-level decisions.

Publications

1.      Battaglia, M., Dillman, D. A., Frankel, M. R., Harter, R., Buskirk, T. D., McPhee, C. B., DeMatteis, B., Montaquila, J., & Yancey, T. 2016. Sampling, Data Collection, and Weighting Procedures for Address-Based Sample Surveys. Journal of Survey Statistics and Methodology, 4(4): 476-500.

2.      Dillman, D. A. 2016. Moving Survey Methodology Forward in our Rapidly Changing World: A Commentary. Journal of Rural Social Sciences, 31(3): 160-174.

3.      Dillman, D. A., & Edwards, M. L. 2016. Chapter 17. Designing a Mixed-Mode Survey. In C. Wolf, D. Joye, T. W. Smith, & Y. Fu (eds.), The Sage Handbook of Survey Methodology (pp. 255-268). Thousand Oaks, CA: Sage Publications.

4.      Dillman, D. A., Hao, F., & Millar, M. M. 2016. Chapter 13. Improving the Effectiveness of Online Data Collection by Mixing Survey Modes. In N. Fielding, R. M. Lee, & G. Blank (eds.), The Sage Handbook of Online Research Methods, 2nd edition (pp. 220-237). London: Sage Publications.

5.      Flint, C. G., Mascher, C., Oldroyd, Z., Valle, P. A., Wynn, E., Cannon, Q., Brown, A., & Unger, B. 2016. Public Intercept Interviews and Surveys for Gathering Place-Based Perceptions: Observations from Community Water Research in Utah. Journal of Rural Social Sciences, 31(3), 105-125.

6.      Harter, R., Battaglia, M. P., Buskirk, T. D., Dillman, D. A., English, N., Mansour, F., Frankel, M. R., Kennel, T., McMichael, J. P., McPhee, C. B., Montaquila, J., Yancey, T., & Zukerberg, A. L. 2016. Address-based Sampling. American Association for Public Opinion Research Task Force Report. http://www.aapor.org/getattachment/Education-Resources/Reports/AAPOR_Report_1_7_16_CLEAN-COPY-FINAL-(2).pdf.aspx. 140 pages.

7.      Israel, G. D. 2016. Advances in Survey and Data Analysis Methods for Rural Social Scientists: An Introduction. Journal of Rural Social Sciences, 31(3), 1-6.

8.      Jackson-Smith, D., Flint, C. G., Dolan, M., Trentelman, C. K., Holyoak, G., Thomas, B., & Ma, G. 2016. Effectiveness of the Drop-Off/Pick-Up Survey Methodology in Different Neighborhood Types. Journal of Rural Social Sciences, 31(3), 35-67.

9.      Kumar Chaudhary, A., & Israel, G. D. 2016. Influence of Importance Statements and Box Size on Response Rate and Response Quality of Open-ended Questions in Web/Mail Mixed-Mode Surveys. Journal of Rural Social Sciences, 31(3), 140-159.

10.  Lesser, V. M., Newton, L. D., Yang, D. K., & Sifneos, J. C. 2016. Mixed-Mode Surveys Compared with Single Mode Surveys: Trends in Responses and Methods to Improve Completion. Journal of Rural Social Sciences, 31(3), 7-34.

11.  Trentelman, C. K., Irwin, J., Petersen, K. A., Ruiz, N., & Szalay, C. S. 2016. The Case for Personal Interaction: Drop-Off/Pick-Up Methodology for Survey Research. Journal of Rural Social Sciences, 31(3), 68-104.

12.  Willits, F. K., Theodori, G. L., & Luloff, A. E. 2016. Another Look at Likert Scales. Journal of Rural Social Sciences, 31(3), 126-139.
    
