WERA1010: Improving Data Quality from Sample Surveys to foster Agricultural and Community Development in Rural America

(Multistate Research Coordinating Committee and Information Exchange Group)

Status: Active

Duration: 10/01/2024 to 09/30/2029

Administrative Advisor(s):


NIFA Reps:


Non-Technical Summary

Surveys are a vital tool for collecting data about people: the social sciences, the humanities, and transdisciplinary teams rely on survey research to test theories, make statistically valid generalizations to populations of interest, measure trends, predict futures, and much more. However, the survey research landscape is constantly in flux. Paper mail surveys were once the de facto mode; internet surveys now dominate. This change in mode has driven major shifts in best practices, and it is only one of several historical shifts. The meteoric rise of marketing surveys, the proliferation of smartphones, the inundation of scam surveys, a precipitous drop in response rates, and shifting demographics have all played key roles in shaping the current survey landscape.


Agricultural and rural communities present their own challenges, with major differences in internet access, population density, average age, and more creating a need for survey methodology adapted to these particularities. 


The goals of WERA1010 include addressing these challenges through:



  1. Collaborating to assure quality surveys

  2. Exploring the evolving survey landscape

  3. Researching evolving survey approaches

  4. Investigating changes in measuring rural populations

  5. Writing and publishing joint research

  6. Disseminating research through teaching and Extension


These objectives are broad enough to encompass a variety of projects of the type WERA 1010 members have historically engaged in. Specific examples include comparing response quality from random samples and panel samples; conducting diary tracking studies to better understand the kinds of surveys the public receives and how people decide which to complete; determining the impact of different types of incentives on response rates; and exploring whether having the head of a University Extension service send contact requests improves response rates. The overall goals and objectives in this proposal are intentionally left broad because of the flexible and often ad hoc nature of these experiments, which are frequently added on to externally grant-funded research programs. For example, WERA 1010 members may have the opportunity to incorporate an incentive experiment into a survey they are completing for a USDA NIFA grant on farmers’ perceptions of conservation drainage practices. We outline major issues of interest in the Statement of Issues and Justification section; it is our intention that survey research and experiments conducted by WERA 1010 members will address these particularly important and relevant topical areas.


Through these objectives, target audiences, including university researchers, Extension educators, students, and practitioners, will push the cutting edge of survey research and apply what they learn to their own work with the agricultural and rural communities that stand to benefit from it. Our goals are accomplished via annual meetings; ongoing joint consultation, collaboration, and experimentation; and ad hoc activities (e.g., conference presentations) as needed.

Statement of Issues and Justification

While major challenges have always existed in surveying rural and agricultural populations, the nature of these challenges has evolved over time. Three major emerging challenges are: 1) Respondent Fatigue - potential respondents receive more frequent requests to participate in surveys, a result of the ease of creating and sending plug-and-play style surveys, and are less likely to respond to research surveys; 2) Changing and Inequitable Populations - the rural and agricultural landscape is experiencing shifts in population and demographics along with a sharp divide in technology accessibility; and 3) Decreasing Trust in Institutions - government and educational institutions face growing scrutiny, with repercussions for trust in surveys from these institutions. These concomitant shifts in the survey and rural landscapes have greatly affected, and will continue to affect, what is achievable through survey research with rural communities and agricultural stakeholders. Survey response rates continue to decline sharply (Leeper, 2019; Kennedy and Hartig, 2019; Stedman et al., 2019), which, combined with decreasingly available data on rural populations (Mueller and Santos-Lozada, 2023), contributes to the possibility that these groups will be underrepresented, misunderstood, and underserved by research, Extension, policy makers, and more.


 


Therefore, WERA 1010 is particularly interested in conducting research (e.g., surveys about surveys, qualitative interviews with populations of interest such as farmers) to better understand how the survey landscape has changed, how the public’s perception of and willingness to take surveys has evolved, and what differences in these conditions result from rural versus urban contexts. Beyond this ‘research on the research landscape’ approach, we will also continue our long tradition of conducting survey experiments aimed at improving survey methodology, design, and approaches. These commonly take the form of experiments added to pre-existing survey projects, such as varying contact mode and assessing differences in response rates among boat owners. We are especially interested in issues of survey burden, representative samples, and response rates, given that these are major areas affected by the changing survey landscape.


 


Additionally, we are interested in better understanding existing sources of survey data on rural populations, how these have changed, and how the above priorities and activities are vital for filling existing knowledge gaps specific to rural communities. As noted in the 2013 WERA 1010 proposal, the Decennial Census Long Form, used throughout the 20th century to collect essential data on population characteristics from 1 in every 6 households throughout the United States, was discontinued after the 2000 Census. As a result, there is no longer a regular source of reliable data on the characteristics of people who live in each county and community of the United States, particularly those on the rural end of the scale. 


 


Its replacement, the American Community Survey, collects data from about two million U.S. households each year, making it possible to produce acceptable city and urban county estimates for population characteristics including age, education, income, occupation, commute time to work, and other essential indicators of human capital and well-being. Data from this survey can also be accumulated across rural counties over multiple years so that acceptable estimates for rural regions containing a number of counties may be obtained. However, if one’s interest is in a specific rural county, the sub-county level, or even a small cluster of rural counties, no reliable data now exist. This is a particular problem for sparsely populated regions of the Western United States. The absence of such data becomes a major concern when professionals and businesses are trying to assess the economic development potential of such areas.


 


The void produced by the loss of the long-form Census data is not filled by other national surveys. The sample sizes of most nationwide surveys are too small to reliably measure attributes of specific rural areas, because the variability is so large; in many cases, sample sizes are too small even for use at the state level. If data on rural places and people, including their human capital characteristics and other relevant information such as farm and agricultural group activities and interests, are going to be available, they must be collected in other ways.


 


The further we get from our last county-level benchmark data, obtained in the 2000 Census, the more challenging it is to know with reasonable certainty what is happening demographically, as well as what economic development issues are facing specific rural areas of the United States. It is important to continue developing methods that allow geographically specific projects to obtain household data suitable for guiding economic development and other well-being decisions.


 


Understanding attitudes, behaviors, and demographic characteristics of the general public is only part of the data problem that now prevails among agricultural and rural populations. One major change in the existing survey landscape is the decline of representative samples. Sample surveys, many of which have been conducted by professionals in Agricultural Research and Extension programs, have long been used to provide specific data of interest to local municipalities as a needed supplement to the questions formerly asked in the Decennial Census. In addition, sample surveys have been used to regularly and efficiently obtain information from agricultural production groups, rural interest groups, and others to describe problems and identify solutions for which no official statistics are available. 


 


The capability that sets sample surveys apart from other methods of collecting data is that only a few hundred or a few thousand questionnaires, collected from a carefully designed random sample of a specific population (e.g., a rural community), allow one to estimate characteristics (from attitudes to behaviors) within a few percentage points of the actual population parameters at a high level of statistical confidence (Dillman, Smyth, & Christian, 2014). No other social science data collection method has this capability. However, this capability is in danger of not being realized because rapid technological and other changes have made traditional data collection methods less effective. At the same time, we are increasingly interested in the potential of non-random-sample survey methodologies, how they are evolving, their utility, and the specific circumstances under which they may be useful or even preferable. Better understanding the dominance of non-representative surveys relative to sample surveys, and conducting experiments to determine differences in the quality and utility of the resulting data, is one example of an activity WERA 1010 intends to engage in to achieve our primary mission: finding ways to improve sample survey methods so they can be used to meet rural and agricultural data needs, the proposed purpose of this coordinating committee.
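
To make the ‘few percentage points’ figure concrete, the short sketch below (an illustrative calculation, not part of any WERA 1010 protocol) computes the approximate 95% margin of error for a proportion estimated from a simple random sample, with an optional finite population correction for a small community; the sample sizes shown are assumptions chosen only for demonstration.

```python
import math

def margin_of_error(n, p=0.5, z=1.96, population=None):
    """Approximate margin of error for a proportion from a simple random sample.

    n          -- number of completed questionnaires
    p          -- assumed population proportion (0.5 is the most conservative choice)
    z          -- critical value (1.96 for 95% confidence)
    population -- population size, if a finite population correction is desired
    """
    se = math.sqrt(p * (1 - p) / n)
    if population is not None and population > n:
        # The finite population correction matters for small rural communities.
        se *= math.sqrt((population - n) / (population - 1))
    return z * se

# A few hundred to a few thousand completes yield estimates within a few
# percentage points, as described above.
for n in (400, 1000, 2500):
    print(f"n = {n:>4}: +/- {margin_of_error(n) * 100:.1f} points")

# Example: 400 completes from a rural community of 5,000 households.
print(f"n = 400, N = 5000: +/- {margin_of_error(400, population=5000) * 100:.1f} points")
```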


 


Examples of knowledge our team has already generated in this area include documentation of the increase in probability panels created by organizations such as KnowledgePanel (from IPSOS) and AmeriSpeak (from NORC at the University of Chicago). The rise of ‘Big Data’ and data linkages, non-probability panels, and data commons poses interesting questions about the need for random sample survey data and the situations in which data sources may be substituted or combined. Combining probability survey data with a non-probability or administrative data source in order to improve the accuracy of the probability survey data is an active area of survey research. Rural places, additionally, are underrepresented in panels (such as those provided by Qualtrics), have minimal access to Big Data sources, and have different needs for public-facing and linked data sources. Survey capabilities are changing, and we are beginning to move beyond a one-size-fits-all methodology for survey research; we must discover new ways of matching needs, populations, modes, and methodologies to be most effective with the resources we have and the research we wish to conduct.
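
As one hedged illustration of what such combining can look like in practice, the sketch below post-stratifies a small hypothetical non-probability panel to a rural/urban benchmark share (which might come from a probability source such as the ACS) before estimating a proportion; all variable names, values, and the benchmark share are assumptions for demonstration, and this is a simplified example of calibration weighting rather than a method endorsed by WERA 1010.

```python
import pandas as pd

# Hypothetical non-probability panel responses: each row is one respondent.
panel = pd.DataFrame({
    "rural":   [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],   # rural respondents over-represented
    "adopted": [1, 1, 0, 1, 0, 0, 1, 0, 0, 1],   # e.g., adopted a conservation practice
})

# Benchmark population shares, e.g., from a probability source such as the ACS (hypothetical).
benchmark = {1: 0.30, 0: 0.70}  # 30% rural, 70% urban

# Post-stratification: weight = population share / sample share within each stratum.
sample_share = panel["rural"].value_counts(normalize=True)
panel["weight"] = panel["rural"].map(lambda g: benchmark[g] / sample_share[g])

unweighted = panel["adopted"].mean()
weighted = (panel["adopted"] * panel["weight"]).sum() / panel["weight"].sum()
print(f"Unweighted estimate:      {unweighted:.2f}")
print(f"Post-stratified estimate: {weighted:.2f}")
```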


 

Objectives

  1. Collaboration to assure quality surveys and data provisioning
    Comments: Continue the interaction and collaboration of researchers and Extension faculty using sample (and other) survey methods in order to arrive at a better understanding of how to assure quality surveys and data provisioning during this period of rapid change in survey systems and technologies and the loss of national survey data describing sparsely populated counties and small rural communities. Example Activities: Conduct ad hoc survey experiments on existing survey projects to determine evolving best practices for reducing survey burden, increasing response rates, and ensuring quality data. For instance, experiment with different incentive types, such as cash versus commemorative stickers, and their impact on response rates; our University of Vermont representative plans to run this exact experiment as part of her future AES work (see the analysis sketch after this list). We will report our results to WERA 1010 members as well as through other educational efforts, outlined below.
  2. Explore evolving survey landscape
    Comments: Continue the exploration of how our survey landscape is evolving, particularly with respect to survey burden and its relationship to marketing surveys, academic surveys, increasingly frequent requests for personal information and feedback, trust, and how respondents think about the purpose and utility of surveys in deciding whether to respond. Example Activities: Expand on previous WERA 1010 Survey Diary Experiments, in which respondents were asked to record every survey request they received over a one-month period, in order to better understand survey frequency, source, legitimacy, quality, and factors impacting response rates. Conduct ‘surveys on surveys’, i.e., surveying people on why they do or do not typically complete surveys from different entities. For instance, a WERA 1010 member at Cornell recently surveyed hunters in New York State, asking why they chose to respond to a survey on hunting harvest, what factors would keep them from responding to a survey, and what would make it easier to complete a survey. Members at Iowa State University are planning to add a similar question set to a survey of organic vegetable growers in the coming year.
  3. Research evolving survey approaches
    Comments: Conduct research on how surveys continue to evolve, including examining data collected from less expensive non-probability approaches to determine whether these approaches can be combined with probability methods to obtain unbiased estimates with improved precision at lower cost, exploring the use and utility of Big Data and data linkages, investigating the potential of data commons for rural populations, and testing theories about communication and response, including how these may be linked to the increasingly diverse survey landscape. Example Activities: Conduct survey experiments and research relevant to new and evolving survey approaches, for instance replicating the same survey with a random sample and a non-probability panel to determine differences in key responses of interest, data quality, and respondent demographics. There are currently plans to conduct experiments with probability versus non-probability samples by WERA 1010 members at Oregon State University (with the ODOT Transportation Needs and Issues Survey 2024) and with drop-off/pick-up modes at Cornell, Utah State, and Michigan State on the topic of rural opinions on solar power siting.
  4. Investigate changes in measuring rural populations
    Comments: Investigate how measuring rural populations has changed and continues to evolve, particularly with respect to how rurality is and can be defined, what this means for representation and surveying, what existing data sources can be drawn from and how accessible and linkable they are, and how rural populations may have specific needs in terms of survey burden, modality, panel representation, and other issues. Example Activities: Review and assess currently available data sources specific to rural populations, create a resource base, and publish articles on how to access these data sources (e.g., see Pilgeram et al. 2020). Conduct research activities, as outlined in Objectives 1-3, specifically with rural populations and longitudinally, to discover their particular contexts and needs.
  5. Write and publish joint research
    Comments: Encourage and facilitate the joint writing and publication of research results by members of the coordinating committee. Example Activities: Co-author reports and articles on our collaborative survey diary experiment (e.g., see Wallen et al. 2021); co-author an assessment of sample sources for surveys with agricultural populations (e.g., see Ulrich-Schad et al. 2022).
  6. Disseminate research (Teaching and Extension)
    Comments: Disseminate research findings through teaching, seminars, applied publications, and Extension in-service training by members of the coordinating committee to reach survey developers and consumers in the land grant system. Example Activities: Present the example activities described above at outlets such as the Rural Sociological Society, AAPOR, and Utah State University Extension Annual Meeting workshops. Work with the Western Regional Evaluators Network to facilitate reporting results to Extension and to develop in-service trainings.
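
Many of the example activities above reduce to comparing response rates between two randomly assigned groups (incentive types, contact modes, or sample sources). As a minimal sketch of how such an experiment might be analyzed, the following illustration uses hypothetical counts (not data from any WERA 1010 study) and a standard two-proportion z-test; the group labels and numbers are assumptions for demonstration only.

```python
import math
from statistics import NormalDist

# Hypothetical counts from an incentive experiment: completes vs. mailed invitations.
groups = {
    "cash_incentive":    {"completed": 210, "invited": 600},
    "sticker_incentive": {"completed": 168, "invited": 600},
}

n1 = groups["cash_incentive"]["invited"]
n2 = groups["sticker_incentive"]["invited"]
p1 = groups["cash_incentive"]["completed"] / n1
p2 = groups["sticker_incentive"]["completed"] / n2

# Two-proportion z-test using the pooled response rate under the null hypothesis.
pooled = (groups["cash_incentive"]["completed"] +
          groups["sticker_incentive"]["completed"]) / (n1 + n2)
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"Cash response rate:    {p1:.1%}")
print(f"Sticker response rate: {p2:.1%}")
print(f"z = {z:.2f}, two-sided p = {p_value:.3f}")
```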

Procedures and Activities

WERA 1010 meets annually for researchers to share their activities and plans for the coming year. All participants are encouraged to present tentative plans for their future studies in order to obtain advice and comment from other participants. One typical outcome of these discussions is that other participants are encouraged to develop a new test, and/or to repeat a test conducted by colleagues on the committee, in their own survey work. Previous work under WERA and its W-183 predecessor has shown that members are particularly effective in developing new tests because their roles in experiment stations, Extension, and other work locations involve helping design surveys.


 


WERA members are often consulted by other Agricultural Experiment Station and Extension employees, as well as by other university faculty, staff, and students, about improving their survey designs. Opportunities come up each year to conduct experiments by convincing these professionals that including an experiment will help them design a better survey. In the past, joint work between committee members has been conducted on address-based sampling, the use of mail with web data collection methods, and combining probability and non-probability sample data. A recent project involving all interested members of WERA 1010 has focused on survey burden, laying the groundwork for better understanding the number, type, and quality of surveys the general public receives in an average month, along with how they evaluate and choose (not) to respond to these requests. Preliminary findings include a vast number of marketing and administrative surveys received by research participants; marketing surveys had extremely low response rates, and response rates were also influenced by the perceived social and personal utility of answering the survey. Plans are underway to expand this research, with individual WERA 1010 members conducting experiments with their own research populations of interest and combining data and observations in a cross-collaboration.


 


Because survey methodology is changing so rapidly, and because there is a strong desire to be cost-effective through use of the web, it is difficult to anticipate the exact experiments that members of the committee will conduct during the life of the committee. The typical time lag between development of a new idea and getting it tested by WERA members in this area of research is several months to a year. Committee members report at the annual meeting, typically held in February of each year. When the results of a new idea tested during the year appear promising, another committee member will find a way to provide a further test (and sometimes an exact replication) the same year. Thus, the committee is quite dynamic in its operation. We expect this philosophy of operation to continue under renewal of the coordinating committee. We are also interested in exploring the utility of smaller ‘research interest groups’ that would coordinate purpose-driven virtual meetings on a quarterly basis in order to bridge the time between annual meetings.


 


Finally, we are interested in how to better coordinate with other survey research groups and organizations. We have historically been involved with and presented at AAPOR and its regional meetings and are interested in strengthening this relationship, possibly through encouraging WERA 1010 members to present the results of their research at AAPOR and/or its regional meetings. We also actively pursue ad hoc opportunities to present on our work; for instance, four WERA 1010 members are coordinating a panel on “Surveying rural populations: Challenges, strategies, and lessons learned” at the 2024 Rural Sociological Society Meeting in Madison, WI. 


 


Additionally, while evaluation research is fundamentally different from random sample survey research, we are interested in starting a conversation with the Western Region Evaluation Network (WREN). Their expertise in designing consistent surveys to obtain measurable benchmarks nicely complements WERA 1010’s focus on general rural and agricultural populations, and the two groups may be able to share best practices for survey design, particularly as it relates to Extension services, as well as approaches to disseminating these best practices, including train-the-trainer approaches. Michele Walsh at the University of Arizona is a member of WREN and has previously attended WERA 1010 meetings; she would be an ideal contact to begin these discussions. WREN also has committee members from many of the same institutions that are part of WERA 1010, including the University of Idaho, Oregon State University, and Washington State University, which could facilitate further collaboration.


 

Expected Outcomes and Impacts

  • Improved member knowledge and innovation in survey research Comments: Introduce members to innovative ideas for improving survey quality being tested by individual members; provide feedback and encourage replication and collaboration in these research areas
  • Improved member-conducted surveys Comments: Critique proposed survey designs and instruments at least annually, and through follow-up among individuals, in order to improve one another's experiments.
  • Conduct and report on joint experiments Comments: Coordinate proposed experimental designs and the dissemination of results across states and agencies. Facilitate, when appropriate, the joint write-up and publication of research results.
  • Updated surveying best practices Comments: Update best practices for conducting surveys of the general public (especially those in rural areas) which use appropriate technologies.
  • Improved land grant system survey capacity and quality Comments: Increase capacity of units in the land grant system for conducting surveys that yield scientifically accurate data for use in assessing needs of client groups, evaluating the effectiveness of extension programs, and understanding issues facing agriculture and rural America.
  • Improved survey curriculum Comments: Infuse research findings on best practices into graduate-level curriculum and non-formal professional development programs including Extension.
  • Identified, tested, and improved innovative survey methodologies Comments: Identify innovative methodologies in evolving survey research (e.g., gamification), evaluate and test their potential application to rural research methods, connect relevant actors in the field, and provide public-facing recommendations, best practices, and guideline reports to facilitate appropriate use of evolving methodologies.
  • Increased accessibility of survey methodology to diverse participants Comments: Increase accessibility of information on survey methodology, and particularly membership in WERA, through expanding outreach to more diverse participants including those from HBCUs, Minority Serving Institutions, and Tribal Colleges.
  • Improved training of future survey researchers Comments: Train students in conducting high-quality surveys.

Projected Participation

View Appendix E: Participation

Educational Plan

We will provide educational outreach to professionals and graduate students involved in agricultural and rural research. WERA committee members have conducted presentations, workshops, and short courses at conferences attended by agricultural and rural researchers, including those of the Rural Sociological Society, the Southern Association of Agricultural Scientists, the American Association for Public Opinion Research, and the American Statistical Association, to inform participants about the latest methodological findings of the committee and their application. We plan to continue our outreach efforts with these and other relevant groups.


 


For instance, since 2019, the University of Idaho has directly supported the design and implementation of survey research at the Idaho Department of Fish and Game through a joint faculty position. In addition, that support has included survey workshops and professional development curriculum to enhance the internal survey research capacity of the agency. Similarly, the Cornell Center for Conservation Social Sciences has a multi-year Memorandum of Understanding with the New York State Department of Environmental Conservation to research survey fatigue and the costs/benefits of web and mail survey modes. 


 


We also provide outreach and training to Extension professionals working in the field. County agents and state specialists frequently conduct surveys to assess needs and evaluate programs while dealing with the constraints of limited resources and access. Committee members will conduct in-service training workshops that incorporate WERA research findings into practical steps participants can use to conduct cost-effective, credible surveys. WERA members may develop brief fact sheets on selected topics to provide user-friendly advice about survey procedures to county agents and specialists. To date, a WERA member at the University of Florida has developed 22 fact sheets for the Savvy Survey Series, and these fact sheets have received 111,555 views since the series began in 2013. One member provided a workshop and resources for Extension specialists and county agents at the annual state Extension conference. Finally, WERA committee members will assist, consult, and collaborate with others to conduct surveys of Extension audiences (e.g., Singletary & Smith, 2006). In these instances, methods developed by the committee are embedded into the survey design and implementation. Our proposed increased collaboration with WREN would additionally facilitate these sorts of Extension-focused activities.


 


WERA members will also conduct outreach to survey methodologists, evaluators, and other relevant professional groups. WERA members regularly conduct presentations and workshops at professional conferences for survey methodologists (e.g., the American Association for Public Opinion Research). In addition, members periodically present findings to other relevant groups, such as participants at the American Evaluation Association conference (e.g., Israel, 2012b; Kumar Chaudhary & Israel, 2016b). This facilitates the diffusion of research findings to a broad array of practitioners involved in surveys, including those working in the agricultural and rural development fields.


 


WERA members’ research often leads to publication of findings in peer-reviewed journals. WERA committee members will publish study results in leading journals in the survey methodology field as well as applied journals used by the committee’s primary stakeholders. The committee has established a long-standing record of productivity in well-respected journals, including Public Opinion Quarterly, Rural Sociology, Society & Natural Resources, the Journal of Official Statistics, and numerous other journals, and this work is frequently cited by other survey scholars. The committee plans to continue publishing in these venues.


 


We will also infuse our knowledge into undergraduate and graduate curricula. WERA committee members have continued to integrate members’ work into undergraduate and graduate coursework. Utah State University (USU) faculty offer a graduate course in advanced survey methods every other year, and many MA and PhD students who take the course subsequently integrate the information and skills developed by this group into their own research. Students in this class also often get hands-on experience designing and conducting a mail+web survey from start to finish (see, for example, Schad, Braddock, and Lancaster 2023). USU faculty also integrate survey methods from this group into undergraduate research methods courses. To date, more than 300 students at Texas A&M University (TAMU) have completed at least one undergraduate research methods course that integrates results from the annual WERA meetings and Dillman et al. (2014). These students also participated in data collection activities that contribute to one or more goals of WERA 1010. Since 2012, WERA committee members have mentored more than 19 undergraduate students who completed the Undergraduate Research Scholars (URS) program at TAMU and used survey data collected from WERA 1010 research efforts. Of the former undergraduate students who have completed the URS program, 12 have enrolled in or completed a graduate program. Further, several former URS participants have secured research jobs, including one at the Congressional Research Service (Library of Congress) and another in Google’s media research division. Each of these undergraduate students has directly benefited from the research of WERA 1010. These formal undergraduate education efforts will be continued and potentially expanded to other WERA members’ campuses.


 


The foundation set by Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (Dillman, Smyth, & Christian, 2014) will be adapted in two forthcoming textbooks relevant to survey research in rural contexts: Human Dimensions of Natural Resources (Huff & Wallen, expected 2024) and Human Dimensions of Wildlife Management (Chizinski & Wallen, expected 2025). Since 2019, the University of Idaho has used Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method and the Savvy Survey Series within its undergraduate research methods curriculum (e.g., NRS 310), and Utah State University has used the same resources in its graduate research methods curriculum (e.g., SOC 7100).


 


The Human Ecology Learning & Problem Solving (HELPS) Lab at Montana State University Bozeman frequently surveys rural populations, typically in Montana but also throughout the Mountain West. Lab staff are also active in the classroom and can bring applied examples and information about recent trends to students. Participation in WERA will expand this knowledge base significantly and will allow dissemination of recent findings and improved practices through both undergraduate and graduate courses, affecting approximately 50 undergraduate and 15 graduate students per year. Students in these courses frequently work with publicly available data produced by the HELPS Lab.


 

Organization/Governance

A chair and secretary will be elected annually. The chair will be responsible for developing an agenda for the annual meeting, and facilitating communication among participants throughout the year. The secretary will be responsible for taking minutes and emailing them to the Administrative Advisor and members.

Literature Cited

(References include those cited plus recent additional work by participants that provides selective background for the proposed coordinating committee activities completed under WERA-1001 and WERA-1010).


 


Asiu, B. W., Antons, C. M., & Fultz, M. L. (1998). Undergraduate perceptions of survey participation: Improving response rates and validity. AIR 1998 Annual Forum. Retrieved August 11, 2022 from https://files.eric.ed.gov/fulltext/ED422805.pdf


Avemegah, Edem, Wei Gu, Abdelrahim Abulbasher, Kristen Koci, Ayorinde Ogunyiola, Joyce Eduful, Shuang Li, Kylie Barington, Tong Wang, Deepthi Kolady, Lora Perkins, A. Joshua Leffler, Péter Kovács, Jason D. Clark, David E. Clay, and Jessica D. Ulrich-Schad. 2020. “An Examination of Best Practices for Survey Research with Agricultural Producers.” Society and Natural Resources. DOI: 10.1080/08941920.2020.1804651.


Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, N.J.: Prentice Hall.


Bandura, A. (2001). Social cognitive theory of mass communication. Media Psychology, 3(3), 265-299. doi: 10.1207/S1532785XMEP0303_03


Battaglia, Michael, Dillman, Don A., Frankel, Martin R., Harter, Rachel, Buskirk, Trent D., McPhee, Cameron Brook, DeMatteis, Jill Montaquila, and Yancey, Tracey. 2016. Sampling, data collection, and weighting procedures for address-based sample surveys. Journal of Survey Statistics and Methodology 4(4): 476-500.


Beebe, T.J., M.E. Davern, D.D. McAlpine, K.T. Call, and T.H. Rockwood. 2005. Increasing Response Rates in a Survey of Medicaid Enrollees: The Effect of a Prepaid Monetary Incentive and Mixed Modes (Mail and Telephone). Medical Care. 3(4):411-4.


Blumberg, S.J. and J.V. Luke. May 2023. Wireless substitution: Early release of estimates from the National Health Interview Survey, July-December 2022. National Center for Health Statistics. DOI: https://doi.org/10.15620/cdc:127524


Brenner, Joanna and Lee Rainie. 2012. Pew Internet: Broadband. Pew Internet & American Life Project, Pew Research Center. Accessed June 3, 2012 at: http://pewinternet.org/Commentary/2012/May/Pew-Internet-Broadband.aspx


Christian, L. and D.A. Dillman. 2004. The Influence of Symbolic and Graphical Language Manipulations on Answers to Paper Self-Administered Questionnaires. Public Opinion Quarterly. 68(1):57-80.


Conger, R. D., Stockdale, G. D., Song, H., Robins, R. W., & Widaman, K. F. 2011. Predicting change in substance use and substance use cognitions of Mexican origin youth during the transition from childhood to early adolescence. In Y. F. Thomas, L. N. Price, & A. V. Lybrand (Eds.), Drug use trajectories among African American and Hispanic youth. New York: Springer.


Dillman, D.A., V. Lesser, R. Mason, J. Carlson, F. Willits, R. Robertson, and B. Burke. 2007. Personalization of Mail Surveys for General Public and Populations with a Group Identity: Results from Nine Studies. Rural sociology, 72(4), 632-646.


Dillman, D.A. 2006. Why Choice of Survey Mode Makes a Difference. Public Health Reports, 121(1):11-13.


Dillman, Don A. 2015.  Future Surveys. Bureau of Labor Statistics Monthly Labor Review. November.   http://www.bls.gov/opub/mlr/2015/article/future-surveys.htm.


Dillman, Don A. 2016. Moving Survey Methodology Forward in our Rapidly Changing World: A Commentary, Journal of Rural Social Sciences, 31(3): 160-174


Dillman, Don A. 2017. The promise and challenge of pushing respondents to the Web in mixed-mode surveys. Survey Methodology, Statistics Canada, Catalogue No. 12-001-X, Vol. 43, No. 1. Paper available as PDF (English): http://www.statcan.gc.ca/pub/12-001-x/2017001/article/14836-eng.pdf


Dillman, Don A. and Michelle L. Edwards.  2016. Chapter 17. Designing a Mixed-Mode Survey. In Wolfe, Christof, Joye, Dominique, Smith, Tom W. and Fu, Yang-chih (eds.) Sage Handbook of Survey Methodology. Sage Publications Wolf, Joye, Smith and Fu. Thousand Oaks. CA pp.255-268.


Dillman, Don A., Feng Hao, Morgan M. Millar. 2016. Chapter 13.  Improving the Effectiveness of Online Data Collection by Mixing Survey Modes. In Fielding, Nigel, Raymond M. Lee and Grant Blank (eds.).   The Sage handbook of Online Research Methods, 2nd edition. Pp.220-237 Sage Publications, London.


Dillman, D.A., A. Gertseva, and T. Mahon-Haft. 2005. Achieving Usability in Establishment Surveys Through the Application of Visual Design Principles. Journal of Official Statistics, 21(2):183-214.


Dillman, D.A. and L.M. Christian. 2005. Survey Mode as a Source of Instability Across Surveys. Field Methods, 17(1):30-52.


Dillman, D. A., Smyth, J. D., & Christian, L. M. 2014. Internet, phone, mail, and mixed-mode surveys: The tailored design method. (4th ed.). Hoboken, NJ: John Wiley and Sons.


Edwards, Michelle L., Don A. Dillman and Jolene D. Smyth. 2014. An Experimental Test of the Effects of Survey Sponsorship on Internet and Mail Survey Response. Public Opinion Quarterly, 78 (3): 734-750.


European Society for Opinion and Marketing Research. (2018). Global market research report 2018: An ESOMAR industry report. Retrieved August 11, 2022 from https://shop.esomar.org/knowledge-center/library-2021/Global-Market-Research-2018-pub2898


Farias, K., McKim, B. R., Yopp, A. M., & Hernandez, F. (2015, August). Using video diaries as an alternative to mail diaries: Engaging Millennials in hard-to-reach populations. Proceedings of the of the 2015 Annual Meeting of the Rural Sociological Society. Madison, WI.


Goyder, J. (1986). Surveys on surveys: Limitations and potentialities. Public Opinion Quarterly, 50(1), 27–41. https://doi.org/10.1086/268957


Groves, R. M. (2011). Three eras of survey research. Public Opinion Quarterly, 75(5), 861–871. https://doi.org/10.1093/poq/nfr057


Greenberg, P., & Dillman, D. A. 2023. Mail Communications and Survey Response: A Test of Social Exchange Versus Pre-Suasion Theory for Improving Response Rates and Data Quality. Journal of Survey Statistics and Methodology, 11, 1–22


Harter, Rachel, Battaglia, Michael P., Buskirk, Trent D., Dillman, Don A., English, Ned, Fahimi, Mansour, Frankel, Martin R., Kennel, Timothy, McMichael, Joseph, McPhee, Cameron Brook, Montaquila, Jill, Yancey, Tracie, and Zukerberg, Andrew L. 2016. Address-based Sampling. American Association for Public Opinion Research Task Force Report. http://www.aapor.org/getattachment/Education-Resources/Reports/AAPOR_Report_1_7_16_CLEAN-COPY-FINAL-(2).pdf.aspx 140 pages.


Hill, J., Mobly, M., & McKim, B. R. (2015, February) Reaching Millennials: Implications for advertisers of competitive sporting events that use animals. Proceedings of the 2015 Agricultural Communications section of the Annual Meeting of the Southern Association of Agricultural Scientists. Atlanta, GA.


Hill, J. S., Mobly, M., & McKim, B. R. (2016). Reaching Millennials: Implications for Advertisers of Competitive Sporting Events that Use Animals. Journal of Applied Communications.


Israel, G. D. 2006. Visual Cues and Response Format Effects in Mail Surveys. Paper presented at Southern Rural Sociological Association, Orlando, FL, February.


Israel, G. D. 2009a. Obtaining Responses by Mail or Web: Response Rates and Data Consequences. Survey practice, June. Available at: http://surveypractice.org/2009/06/29/mail-vs- web/.


Israel, G. D. 2009b. Obtaining Responses by Mail or Web: Response Rates and Data Consequences. JSM Proceedings, Survey Research Methods Section. 5940-5954. Available at: http://www.amstat.org/Sections/Srms/Proceedings/.


Israel, G. D. 2010a. Effects of Answer Space Size on Responses to Open-ended Questions in Mail Surveys. Journal of official statistics, 26(2), 271-285.


Israel, G. D. 2010b. Using Web Surveys to Obtain Responses from Extension Clients: A Cautionary Tale. Journal of extension, 48(4), available at: http://www.joe.org/joe/2010august/a8.php.


Israel, G. D. 2011. Strategies for Obtaining Survey Responses from Extension Clients: Exploring the Role of E-mail Requests. Journal of extension, 49(3), available at: http://www.joe.org/joe/2011june/a7.php.


Israel, G. D. 2012a. Combining Mail and E-mail Contacts to Facilitate Participation in Mixed-Mode Surveys. Social Science Computer Review. Published online November 28, 2012 at http://ssc.sagepub.com/content/early/2012/11/26/0894439312464942. doi: 10.1177/0894439312464942


Israel, G. D. 2012b. Mixed-Mode Methods for Follow-up Surveys of Program Participants. Demonstration presented at the Annual Conference of the American Evaluation Association, Minneapolis, MN. October.


Israel, G. D. Lessons Learned While Planning and Conducting a Survey of Florida Residents about Climate Change Opinions. Presented at the annual meeting of the Rural Sociological Society, Columbus, OH, July, 2017.


Israel, G. D. When Does Support from Organizational Leaders Improve Survey Response Rates? Studies Comparing of Three Protocols. Paper presented at the annual meeting of the Rural Sociological Society, Westminster, Colorado, August 2022.


Israel, G. D., & Galindo-Gonzalez, S. 2010. Getting Optimal Answers to Open-ended Questions: An Experiment with Verbal Prompts and Visual Cues. Paper presented at the annual meeting of the Rural Sociological Society, Atlanta, GA, August.


Israel, G. D., & Lamm, A. J. 2012. Item Non-response in a Client Survey of the General Public. Survey Practice, April. Available at: http://surveypractice.wordpress.com/2012/04/17/item- nonresponse-in-a-client-survey-of-the-general-public/#more-6070


Israel, G. D., Newberry, III, M. G., & Lamm, A. J. Climate Change Knowledge and Perceptions of Florida Residents: Challenges and Opportunities for Florida Master Naturalists. Paper presented at the International Symposium for Society and Resource Management, Ümea, Sweden, June, 2017.


Israel, G. D., Wilson, K. L., & Haller, W. T. Amount and Timing of Cash Incentives on Response to a Mail Survey. Paper presented at the annual meeting of the Rural Sociological Society, New York, NY, August, 2013.


Junod, A. N., & Jacquet, J. B. (2023). Insights for the Drop-off/Pick-up Method to Improve Data Collection. Society & Natural Resources, 36(1), 76-88.


Kennedy, C. and H. Hartig. 2019.  Response rates in telephone surveys have resumed their decline.   https://www.pewresearch.org/short-reads/2019/02/27/response-rates-in-telephone-surveys-have-resumed-their-decline/


Kumar Chaudhary, A., & Israel, G. D. 2016a. Influence of Importance Statements and Box Size on Response Rate and Response Quality of Open-ended Questions in Web/Mail Mixed-Mode Surveys. Journal of Rural Social Sciences, 31(3), 140-159.


Kumar Chaudhary, A., & Israel, G. D. 2016b. A Demonstration on Optimizing Mixed-Mode Surveys to Address Device Variability in Program Evaluation. Demonstration presented at the American Evaluation Association, Atlanta, GA, October, 2016.


Leeper, T. 2019. Where Have the Respondents Gone? Perhaps We Ate Them All. Public Opinion Quarterly, Volume 83, Issue S1, 2019, Pages 280–288, https://doi.org/10.1093/poq/nfz010


Lesser, V.M., & L. Newton. 2007a. Comparison of Delivery Methods in a Survey Distributed by Internet, Mail, and Telephone. Proceedings of the International Statistics Institute Meetings. Lisbon, Portugal, August.


Lesser, V.M., & L. Newton. 2007b. Effects of Mail Questionnaire formats on answers to Open- Ended Questions. Unpublished paper presented at annual meetings of the American Statistical Association, Salt Lake City, Utah. August 3, 2007.


Lesser, V.M., K. Hunter-Zaworski, L. Newton and D. Yang. Using Multiple Survey Modes in a Study of Individuals with Disabilities. Presented at the American Association for Public Opinion Research, New Orleans, Louisiana, May, 2008a.


Lesser, V.M., L. Newton, and D. Yang. Evaluating Frames and Modes of Contact in a Study of Individuals with Disabilities. Presented at the American Statistical Association Meetings, Denver, Colorado, August, 2008b.


Lesser, V.M. and D. Yang. Alternatives to Phone Surveys: a study comparing Random Digit Dialing with Mail and Web using the Postal Delivery Sequence File. Presented at the American Association for Public Opinion Research, Hollywood, Florida, May, 2009.


Lesser, V.M., L. Newton, and D. Yang. Does Providing a Choice of Survey Modes Influence Response? Presented at the American Association for Public Opinion Research, Chicago, Illinois, May, 2010.


Lesser, V.M., L. Newton, and D. Yang. Evaluating Methodologies to Increase Internet Responses in Mixed-Mode Surveys. Proceedings of the International Statistics Institute Meetings, Dublin, Ireland, August, 2011a.


Lesser, V.M., A. Olstad, D. Yang, L. Newton. Comparing Item Nonresponse and Responses Across Modes in General Population Surveys. Presented at the American Association for Public Opinion Research, Phoenix, Arizona, May, 2011b.


Lesser, V. M., Newton, L. A., & Yang, D. 2012. Comparing item nonresponse across different delivery modes in general population surveys. Survey Practice, April. Available at: http://surveypractice.wordpress.com/2012/04/17/comparing-item-nonresponse-across-different- delivery-modes-in-general-population-surveys-2/#more-6026


Lesser, V. M., Newton, L. D., Yang, D. K., & Sifneos, J. C. 2016. Mixed-Mode Surveys Compared with Single Mode Surveys: Trends in Responses and Methods to Improve Completion. Journal of Rural Social Sciences, 31(3), 7-34.


Lesser, Virginia M., Nawrocki, Kerri, & Newton, Lydia. 2017a. Improving Response in Multimode and Single Mode Probability Based Surveys Compared to a Non-probability Survey. Presented at the annual meeting of the European Survey Research Association, Lisbon, Portugal, July, 2017.


Lesser, Virginia M., Nawrocki, Kerri, & Newton, Lydia. 2017b. Promises and Pitfalls: Experiences and Lessons Learned from Using Commercial Survey Services. Presented at the annual meeting of the Rural Sociological Society, Columbus, OH, July, 2017.


Loosveldt, G., & Joye, D. (2016). Defining and assessing survey climate. In C. Wolf, D. Joye, T. W. Smith (Eds.), The SAGE handbook of survey methodology (pp. 67–76). SAGE Publications Ltd. https://dx.doi.org/10.4135/9781473957893.n6


Lorenz, F., L. Hildreth, V.M. Lesser, & U. Genshel. 2010. General-specific questions in survey research: a confirmatory factor analysis approach. Presented at the Annual Meeting of the Rural Sociology Society, August.


Mahon-Haft, T. A., & Dillman, D. A. 2010. Does Visual Appeal Matter? Effects of Web Survey Aesthetics on Survey Quality. Survey research methods, 4(1), 43-59.


McCarthy, J. S., Beckler, D. G., & Qualey, S. M. (2006). An analysis of the relationship between survey burden and nonresponse: If we bother them more, are they less cooperative? Journal of Official Statistics, 22(1), 97–112.


McKim, B. R., Specht, A. R., & Stewart, A. Y. (2015, June). Video ethnography: An approach to collecting, archiving, and sharing data and results. Proceedings of the 2015 NACTA Conference. Athens, GA.


McKim, B. R., Stewart, A. Y., & Bishop, D. M. (2015, August). An experiment testing variations of the home delivery survey method. Proceedings of the 2015 Annual Meeting of the Rural Sociological Society. Madison, WI.


McMaster, Hope Seib, Cynthia A. LeardMann, Steven Speigle and Don A. Dillman. 2017.  An Experimental Comparison of Web-push vs. Paper-only survey Procedures for Conducting an In-Depth health Survey of Military Spouses.  BMC Medical Research Methodology.   https://bmcmedresmethodol.biomedcentral.com/articles/10.1186/s12874-017-0337-1 . April 26, 9 pages.


Messer, B. L., & Dillman, D. A. 2011. Using address-based sampling to survey the general public by mail vs. Web plus mail. Public Opinion Quarterly, 75(3), 429-457. doi: 10.1093/poq/nfr021


Messer, B. L., Edwards, M. L., & Dillman, D. A. 2012. Determinants of item nonresponse to web and mail respondents in three address-based mixed-mode surveys of the general public. Survey practice, April. Available at: http://surveypractice.wordpress.com/2012/04/17/determinants-of-item- nonresponse-to-web-and-mail-respondents-in-three-address-based-mixed-mode-surveys-of-the- general-public/#more-5998


Messer, Benjamin L. 2012. Pushing households to the web: Experiments of a web+mail methodology for conducting general public surveys. Dissertation. Pullman, WA: Washington State University.


Millar, M. M., & Dillman, D. A. 2011. Improving response to Web and mixed-mode surveys. Public Opinion Quarterly, 75(2), 249-269. doi: 10.1093/poq/nfr003.


Millar, M. M., & Dillman, D. A. 2012. Do Mail and Internet Surveys Produce Different Item Nonresponse Rates? An Experiment Using Random Mode Assignment. Survey practice, April. Available at: http://surveypractice.wordpress.com/2012/04/17/do-mail-and-internet-surveys- produce-different-item-nonresponse-rates-an-experiment-using-random-mode-assignment/


Mueller, J. T., Santos-Lozada, A. R. 2023. The 2020 U.S. Census differential privacy method introduces disproportionate discrepancies for rural and non-white populations. Population Research and Policy Review. DOI: 10.1007/s11113-022-09698-3


Newberry, III, M. G., & Israel, G. D. 2017. Comparing Two Web/Mail Mixed-Mode Contact Protocols to a Unimode Mail Survey. Field Methods, 29(3), 281-298. 


Olson, C. A. (2014). Survey burden, response rates, and the tragedy of the commons. Journal of Continuing Education in the Health Professions, 34(2), 93–95. https://doi.org/10.1002/chp.21238


Pew Research Center. (2015, June 10). Three technology revolutions. Retrieved August 11, 2022 from  https://www.pewresearch.org/internet/three-technology-revolutions/


Pilgeram, R., Dentzman, K., Lewin, P., & Conley, K. (2020). How the USDA changed the way women farmers are counted in the census of agriculture. Choices, 35(1), 1-10.


Redline, C.D., D.A. Dillman, A. Dajani, and M.A. Scaggs. 2003. Improving Navigational Performance in U.S. Census 2000 By Altering the Visual Languages of Branching Instructions. Journal of Official Statistics. 19(4):403-420.


Rookey, Brian, & Dillman, Don A. 2008. Do Web and Mail Respondents Give Different Answers in Panel Surveys. Unpublished paper prepared for Annual Conference of the American Association for Public Opinion Research. New Orleans, LA.


Savage, M., & Burrows, R. (2007). The coming crisis of empirical sociology. Sociology, 41(5), 885–899. https://doi.org/10.1177/0038038507080443


Singletary L. and M. Smith. 2006. Nevada Agriculture Producer Research and Education Needs: Results of 2006 Statewide Needs Assessment. University of Nevada Cooperative Extension, EB- 06-02. pp. 118.


Schad, Jessica; Braddock, Sadie; and Lancaster, Cole, "2023 Utah People & Environment Poll Descriptive Report" (2023). CANRI Projects. Paper 2. https://digitalcommons.usu.edu/canri_projects/2


Smyth, J.D., D.A. Dillman, L.M. Christian, & M.J. Stern. 2006a. Comparing Check-All and Forced-Choice Question Formats in Web Surveys. Public Opinion Quarterly. 70(1):66-77.


Smyth, J.D., D.A. Dillman, L.M. Christian, & M.J. Stern. 2006b. Effects of Using Visual Design Principles to Group Response Options in Web Surveys. International Journal of Internet Science. 1(1):5-15.


Smyth, J.D., & Dillman, D.A. 2007. Open-ended Questions in Mail, Web and Telephone Surveys. Unpublished paper presented at annual meeting of the American Statistical Association, Salt Lake City, Utah. August.


Smyth, J. D., Dillman, D. A., Christian, L. M., & O'Neill, A. C. 2010. Using the Internet to survey small towns and communities: Limitations and possibilities in the early 21st century. American Behavioral Scientist, 53(9):325-37.


Smyth, J. D., Israel, G. D., Newberry, III, M. G., & Hull, R. G. 2019. Effects of Stem and Response Order on Response Patterns in Satisfaction Rating. Field Methods, 31(3), 260-276.


Stedman, R. C., Connelly, N. A., Heberlein, T. A., Decker, D. J., & Allred, S. B. 2019. The end of the (research) world as we know it? Understanding and coping with declining response rates to mail surveys. Society & Natural Resources, 32(10), 1139-1154.


Stern, Michael J., Ipek Bilgen and Don A. Dillman.  2014. The State of Survey Methodology: Challenges, Dilemmas and new Frontiers in the Era of the Tailored Design. Field Methods, (August) 26: 284-301


Swinford, Stephen. 2007. How Answer Spaces Affect Answers to Open-Ended Questions in Mail Surveys; Results from Multiple Experiments. Unpublished paper presented at Annual Meeting of the American Statistical Association, Salt Lake City, Utah. August.


Toepoel, V., & Dillman, D. A. 2011. Words, Numbers, and Visual Heuristics in Web Surveys: Is There a Hierarchy of Importance? Social Science Computer Review, 29(2), 193-207.


Ulrich-Schad, Jessica D., Caroline Brock, and Linda S. Prokopy. 2017. “A Comparison of Awareness, Attitudes, and Usage of Water Quality Conservation Practices between Amish and non-Amish Farmers.”  Society and Natural Resources 30(12): 1476-1490.


Ulrich-Schad, Jessica D., Shuang Li, J. G. Arbuckle, Edem Avemegah, Kathryn J. Brasier, Morey Burnham, Anil Kumar Chaudhary, Weston M. Eaton, Wei Gu, Tonya Haigh, Douglas Jackson-Smith, Alexander L. Metcalf, Amit Pradhananga, Linda S. Prokopy, Matthew Sanderson, Emma Wade, & Adam Wilke. 2022. “An Inventory and Assessment of Sample Sources for Survey Research with Agricultural Producers in the U.S.” Society and Natural Resources 35(7): 804-812.


Ulrich-Schad, Jessica D., Jennifer E. Givens, and Mitchell Beacham.  2022. “Preventive Behaviors Along the Rural-Urban Continuum in Utah During the COVID-19 Pandemic.”  Journal of Rural Social Science Special Issue on Space, Place, and COVID-19 37(2):4.


Wallen, K. E., Hammell, A. E., & Dentzman, K. E. (2021). Exploratory diary study of survey request frequency among research professionals. Completed as a project for WERA 1010: Improving Data Quality from Sample Surveys to foster Agricultural and Community Development in Rural America. SocArXiv. http://doi.org/10.31235/osf.io/mrebz


Wardropper, C. B., Dayer, A. A., Goebel, M. S., & Martin, V. Y. (2021). Conducting conservation social science surveys online. Conservation Biology, 35(5), 1650-1658.


Wilcox, A. S., Giuliano, W. M., & Israel, G. D. 2010. Response Rate, Nonresponse Error, and Item Nonresponse Effects When Using Financial Incentives in Wildlife Questionnaire Surveys. Human dimensions of wildlife, 15(4), 288-295.


Zickhur, Kathryn, & Smith, Aaron. 2012. Digital differences. Pew Internet & American Life Project, Pew Research Center. Accessed June 4 2012 at: http://pewinternet.org/~/media//Files/Reports/2012/PIP_Digital_differences_041312.pdf


Zhu, T., L. Xue, and V.M. Lesser. Improved population parameter estimation by integrating probability and nonprobability sampling. To be presented at the American Statistical Association Meetings, Toronto, Ontario, Canada, August, 2023.


 

Attachments

Land Grant Participating States/Institutions

IA, IL, MA, MO, MT, NV, NY, OR, PA, SC, UT, VT, WA

Non Land Grant Participating States/Institutions

Michigan State University, Middle Tennessee State University, Oregon State University, University of Arizona, University of Idaho, University of Minnesota