SAES-422 Multistate Research Activity Accomplishments Report

Status: Approved

Basic Information

Participants

Swinford, Steve (Swinford@montana.edu) – Montana State; Lesser, Virginia (lesser@science.oregonstate.edu) – Oregon State; Willits, Fern (FKW@PSU.edu) – Penn State; Qin, Hua (qinh@missouri.edu) – Missouri; Kyle, Gerald (gkyle@tamu.edu) – Texas A&M; van Riper, Carena (cvanripe@illinois.edu) – Illinois; McKibben, Jason – West Virginia University; Yopp, Ashley – Texas A&M; McKim, Billy (brmckim@tamu.edu) – Texas A&M; Israel, Glenn (gdisrael@ufl.edu) – University of Florida; Dillman, Don (DIllman@wsu.edu) – Washington State

The committee began the meeting by discussing the impact of smartphones on survey procedures and data quality. Glenn Israel reported that Qualtrics collects metadata on the platform used by each respondent; he found that 20% of respondents used a smartphone and 10% used a tablet to access the 2015 Florida Cooperative Extension customer satisfaction survey on the web. The discussion focused on concerns about error in multi-mode data sets. Additional questions included, “Where are respondents when they are using a mobile device?” and, “Are they focused on the task of completing the instrument or distracted by other things?” Some clues might be found by looking at dropout rates by age, along with data on speed of response. Obtaining responses from young people has always been difficult, and the committee discussed innovations to address this, including whether there is an opportunity to adjust the framing of surveys to increase buy-in. Topics mentioned included gamification, perceptions of legitimacy, and providing feedback on responses. A similar concern was raised about the use of pictures on questionnaires. The committee also discussed the problem of respondents sharing their access ID for an online questionnaire with others, so that one respondent turns into many, particularly when the topic is “politically charged” and a stuffing-the-ballot-box situation occurs.

 

Glenn Israel (Florida) reported on two studies. The first examined how clarifying instructions can improve item response quality for numerical open-ended questions. Two such questions asked respondents how many times they had contacted FCES in the past 12 months and how many years they had used Extension services. The addition of clarifying instructions significantly reduced the percentage of missing and incorrectly formatted responses for both questions. The second study explored the interaction of stem order and response order effects on satisfaction rating questions; the experimental factors were the order of the response categories and the order of the options in the question stem in a 2-by-2 design. Israel reported clear evidence of response order effects consistent with theories of satisficing and respondent heuristics, but inconclusive evidence of stem order moderating these effects. He suggested that respondent heuristics influence communication: respondents assume that positive response categories will begin at the left for horizontal scales and at the top for vertical scales.
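
The report does not specify how the 2-by-2 experiment was analyzed. As a minimal sketch of one way such a design could be tested, the snippet below fits a logistic regression with an interaction term to respondent-level data; the file and variable names are illustrative assumptions, not details from the Israel study.

    # Hedged illustration: testing response-order and stem-order effects in a
    # 2-by-2 satisfaction-rating experiment. Data and variable names are
    # hypothetical, not from the Israel study.
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical respondent-level file with:
    #   resp_order: "positive_first" or "negative_first" (response category order)
    #   stem_order: "positive_first" or "negative_first" (order of options in the stem)
    #   top_box:    1 if the most positive category was selected, else 0
    df = pd.read_csv("satisfaction_experiment.csv")

    # Main effects plus interaction; a significant resp_order effect with a
    # non-significant interaction would mirror the reported pattern (clear
    # response order effects, inconclusive moderation by stem order).
    model = smf.logit("top_box ~ C(resp_order) * C(stem_order)", data=df).fit()
    print(model.summary())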

 

Hua Qin (Missouri) studied the applicability of partially correlated longitudinal data for examining community change. Community surveys have been widely used to investigate local residents’ perceptions and behaviors related to natural resource issues, including residents’ attitudes about rapid growth induced by energy development or amenity migration, public perspectives on wildfire and fuel management, and community risk perception and response to forest insect disturbance. Although community can be conceived as a dynamic process of interaction and collective action, most existing community survey research relies on cross-sectional data and is thus unable to capture the temporal dynamics of community processes. Longitudinal analysis has received increasing interest in the recent natural resource social science literature, and trend and panel studies are two typical approaches in longitudinal community survey research. Due to limited sampling frames, research design, and respondent attrition, longitudinal community surveys often involve both matched (paired) and uncorrelated (independent) observations across different waves. Using previous re-survey data on community response to forest insect disturbance in Alaska as an example, this research note shows that the corrected z-test is a more appropriate approach for comparing partially correlated samples than conventional statistical techniques such as the paired and independent t-tests.
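
As a rough sketch of how a corrected z-test combines matched pairs with the unmatched observations from each survey wave, the function below implements one common formulation for partially overlapping samples; it is an illustrative assumption, not the exact code, data, or formulation from Qin's analysis.

    # Hedged sketch of a corrected z-test for partially overlapping samples:
    # some respondents answered both waves (matched pairs), others only one wave.
    # Variable names and the simulated data are hypothetical.
    import numpy as np
    from scipy.stats import norm

    def corrected_z(x_paired, y_paired, x_only, y_only):
        """Compare wave-1 (x) and wave-2 (y) means using both the matched pairs
        and the independent observations. Returns (z, two-sided p-value)."""
        n_pairs = len(x_paired)
        x_all = np.concatenate([x_paired, x_only])   # all wave-1 responses
        y_all = np.concatenate([y_paired, y_only])   # all wave-2 responses
        nx, ny = len(x_all), len(y_all)
        sx2, sy2 = x_all.var(ddof=1), y_all.var(ddof=1)
        sxy = np.cov(x_paired, y_paired)[0, 1]       # covariance of the pairs
        se = np.sqrt(sx2 / nx + sy2 / ny - 2 * n_pairs * sxy / (nx * ny))
        z = (x_all.mean() - y_all.mean()) / se
        return z, 2 * norm.sf(abs(z))

    # Simulated example: 60 matched pairs plus 40 and 50 unmatched respondents
    rng = np.random.default_rng(0)
    x_p = rng.normal(3.0, 1.0, 60); y_p = x_p + rng.normal(0.3, 0.8, 60)
    x_o = rng.normal(3.0, 1.0, 40); y_o = rng.normal(3.3, 1.0, 50)
    print(corrected_z(x_p, y_p, x_o, y_o))

Under this formulation the statistic reduces to the independent-samples z-test when there are no matched pairs and to a paired comparison when there are no unmatched observations, which is why it suits samples that are only partially correlated.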

 

Ginny Lesser (Oregon) reported on ODOT multi-mode studies conducted during 2006-2014, covering unit response rates and the percentage of respondents moving to the web, with several sub-studies of design components. A fourth contact in 2012 and 2014 reminded respondents of the web response option, and a 2010 statement about saving the state money generated much discussion of its effects. Comparisons with the American Community Survey examined the percentage of males, income (higher for the web group), and education (higher for the web group). Lesser also reported on response rates for a monthly customer satisfaction survey, which have been declining since 1994, although changes did occur to the instrument and design over that period. Finally, in a 2014 Off-Highway Vehicle Survey, the mail response rate was 31% and the web response rate was 29%; overall, 56% of total respondents responded by web, 17% of them by tablet and 8% by smartphone.

 

Don Dillman (Washington) discussed his work on address-based sampling to request responses over the web from household samples. He sees these methods expanding because telephone interviewing is in rapid decline, the result of its no longer fitting with the culture: people don’t answer phone calls from unknown numbers and parties, and if they do it once, they won’t do it again for follow-ups. In addition, the only good household frame now available for general public surveys is postal service address-based samples. He has just completed his contribution to an AAPOR task force on evaluating the use of U.S. Postal Service address-based samples; that report is now being published, and a follow-up journal manuscript is ready for submission to the Journal of Survey Statistics and Methodology. He also is making a major effort to help improve the communications used to request survey responses in the U.S. Decennial Census as well as the American Community Survey. In October 2015, he discussed current issues and challenges at a National Academies of Sciences seminar on the roll-out of 2020 Decennial Census methods, and he will make a similar presentation to a National Academies committee in May 2016 on those procedures.

 

Steve Swinford (Montana) continued work on transportation research as well as applying cultural models to adolescent behavior and child abuse prevention studies. A cannabis study will investigate users’ and non-users’ perceptions of impairment to drive a motor vehicle after use, and a Utah seatbelt safety study will collect local information to inform policy changes. He also conducted two small-scale community needs assessment surveys for Dillon and Valier (Montana cities).

 

Gerald Kyle (Texas) discussed four treatments on normative appeals in cover letters for a study he will be conducting with Texas boaters as the target audience. The committee discussed reordering and whether the treatment gets lost in the format, and members made suggestions on wording.

 

Billy McKim (Texas) studied heuristic matrix effects with Texas Extension agents. The study measured perceived ability on a task and the importance of that task to the job as two dimensions. Having the responses side by side (as opposed to on separate pages) made a difference: respondents used their first answer to answer the second factor. McKim’s Rodeo Austin study was an economic impact and customer satisfaction survey. A total of 2,173 contacts were made and 1,473 usable completed questionnaires were obtained with an iPad intercept methodology using Qualtrics’ offline add-on function. An incentive of a token for concessions was offered (the study received $10,000 in tokens for the survey), and respondents at the $8 and $6 incentive levels also spent more time answering questions. McKim also reported on a third study, Reaching the Public – Personas for Marketing Agricultural Organizations to Target Audiences. He discussed how one size fits all does not work with incentives, and the same holds for messaging; a “persona” was used in the marketing to increase response. The topic addressed in the study was animal treatment. The researchers created a number of statements reflecting the values of the organization and used the Q-sort method, in which a respondent places a set number of statements along a continuum to reflect his or her values.
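
As a small illustration of the Q-sort format, the snippet below checks a respondent’s sort against a 9-point forced quasi-normal grid; the grid shape and statement counts are assumptions for illustration, not the distribution used in the McKim study.

    # Hedged illustration of a forced-distribution Q sort. The grid shape and
    # statement counts are hypothetical, not those from the McKim study.
    from collections import Counter

    # Column value (-4 = "least like my values" ... +4 = "most like my values")
    # mapped to how many statements must be placed in that column (34 total).
    FORCED_GRID = {-4: 2, -3: 3, -2: 4, -1: 5, 0: 6, 1: 5, 2: 4, 3: 3, 4: 2}

    def is_valid_sort(sort):
        """sort maps statement id -> column placement; the sort is usable only
        if its column counts exactly match the forced distribution."""
        return dict(Counter(sort.values())) == FORCED_GRID

    # Example: is_valid_sort({"s1": -4, "s2": -4, "s3": -3, ...}) returns True
    # only when the respondent's placements fill the grid exactly.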

Accomplishments

An agenda was developed and the coordinating committee held its annual meeting in February 2016. Participating members discussed several important topics affecting error in agricultural and rural surveys, including the impact of smartphones on survey procedures and data quality; cultural, technological, and generational challenges and opportunities for survey engagement; and IRB-surveyor issues. In addition, members reported on survey research studies being conducted or planned in their states and provided feedback to others. Members from several states discussed plans for coordinated studies comparing nonprobability samples in on-line surveys with address-based probability samples using mail and mixed-mode surveys in order to assess the strengths and weaknesses of these technologies.

 

In addition, committee members have been active during the past year in publishing research in journal articles, presenting papers and posters at relevant conferences, and developing educational materials available to Extension professionals and the public. This includes 6 publications for Extension and outreach audiences and 24 presentations at professional conferences where attendees are members of the target audience for this project. The member from Florida also conducted a 4-hour workshop on conducting on-line surveys for Extension professionals, which incorporated research of the coordinating committee. The member from Washington conducted a workshop for approximately 15 individuals at the annual meeting of the Rural Sociological Society in July 2016.

Impacts

  1. Recipients of the research findings and outreach activities of coordinating committee members have more accurate information for making decisions about conducting surveys and/or assessing the strengths and weaknesses of survey data. This, in turn, can contribute to appropriate project- and policy-level decisions.

Publications

  1. Landon, A.C., van Riper, C.J., Angeli, N.F., Fitzgerald, D.B., & Neam, K.D. 2015. Growing transdisciplinary roots in the Peruvian Amazon: Lessons from the field. The Journal of Transdisciplinary Environmental Studies, 14(1), 2-12.

 

  2. Qin, H. and T. F. Liao. 2015. The association between rural-urban migration flows and urban air quality in China. Regional Environmental Change (in press). doi:10.1007/s10113-015-0865-3

 

  3. Qin, H., P. Romero-Lankao, J. Hardoy, and A. Rosas-Huerta. 2015. Household responses to climate-related hazards in four Latin American cities: A conceptual framework and exploratory analysis. Urban Climate 14(Part 1): 94-110.

 

  4. Qin, H. 2015. Comparing newer and long-time residents’ perceptions and actions in response to forest insect disturbance on Alaska’s Kenai Peninsula: A longitudinal perspective. Journal of Rural Studies 39: 51-62.

 

  5. Qin, H., C. G. Flint, and A.E. Luloff. 2015. Tracing temporal changes in the human dimensions of forest insect disturbance on the Kenai Peninsula, Alaska. Human Ecology 43(1): 43-59.

 

  6. Wallen, K., Kyle, G., & van Riper, C.J. Carrying capacity and commercial services in the Southern Sierra Nevada. (Prepared for the U.S.D.A. Forest Service.) College Station, TX: Texas AgriLife Research.

 

  7. Schuett, M.A., Kyle, G.T., Dudensing, R., Ding, C., van Riper, C., & Park, J. 2015. Attitudes, behavior, and management preferences of Texas artificial reef users. (Prepared for the Artificial Reef Program, Texas Parks and Wildlife Department.) College Station, TX: Texas AgriLife Research.

 
