SAES-422 Multistate Research Activity Accomplishments Report
Sections
Status: Approved
Basic Information
- Project No. and Title: WERA_OLD1010 : Reduction of Error in Rural and Agricultural Surveys
- Period Covered: 10/01/2011 to 09/01/2012
- Date of Report: 03/10/2012
- Annual Meeting Dates: 02/23/2012 to 02/24/2012
Participants
Minutes
WERA1010: Reduction of Error in Rural and Agricultural Surveys
February 23-24, 2012
The 2012 annual meeting of WERA 1010 was convened by Chair Ginny Lesser at 8:15am on Thursday, February 23 at Tucson InnSuites.
Present were:
Virginia Lesser (Chair: Oregon) lesser@stat.orst.edu
Lou Swanson, Administrative advisor (Colorado State) Louis.Swanson@ColoState.edu
Gerard Kyle (Texas A & M) gkyle@tamu.edu
Billy McKim (Texas A & M) brmckim@tamu.edu
Rob Robertson (New Hampshire) robrobertson@unh.edu
Don Dillman (Washington) dillman@wsu.edu
Glenn Israel (Florida) gdisrael@ufl.edu
Fred Lorenz (Secretary: Iowa) folorenz@iastate.edu
Todd Rockwood (Minnesota) rockw001@umn.edu
Steve Swinford (Montana) swinford@montana.edu
Still interested but couldn't make it:
Patricia Hipple (USDA) PHIPPLE@nifa.usda.gov
John Allen (Utah State) john.allen@usu.edu
Shorna Broussard Allred (New York (Cornell)) srb237@cornell.edu
Dave Peters (Iowa) dpeters@iastate.edu
Marilyn Smith (Nevada) smithm@unce.unr.edu
Fern Willits (Penn State) fkw@psu.edu
Angela Mertig (Tennessee: Middle Tennessee State) amertig@mtsu.edu
Nick Place (University of Maryland) nplace@umd.edu
Wuyang Hu (University of Kentucky) wuyang.hu@uky.edu
Courtney Flint (University of Illinois (Champaign-Urbana)) cflint@uiuc.edu
Retired members:
Jim Christenson (AES Advisor: Arizona) jimc@ag.arizona.edu
John Saltiel (Montana) jsaltiel@gmail.com
Deceased:
Robie Sangster (BLS), 2009
Bob Mason (Oregon State), 2011
Opening details: Ginny opened the meeting. She noted that we are being charged for the conference room this year; the cost will be divided 10 ways ($27/person). We will pay Ginny, who will pay the bill and sign our registration receipts. Dinner Thursday evening for approximately 14 will be at Jim Christenson's country club. We need to pay Don, who will settle with Jim.
The agenda for the meeting was approved. In brief, we will review state reports and plan for future meetings.
Special request: Lou Swanson, Administrative Advisor, needs three volunteers to discuss the demographers' WERA proposal. (Three from the committee volunteered: Glenn, Fred & Steve.) Lou is Vice President for Engagement and Director of Extension at Colorado State University. He requested that we meet at another time to avoid a conflict he has in Colorado. We looked into moving the next meeting a week earlier, and in fact have moved it to February 14-15, 2013.
State reports.
Don Dillman began by passing around three articles that have been published and appear in the references. Don is now interested in using address-based samples and then convincing people to go to the internet to respond. This is the objective because telephone interviewing is not doing well, with coverage at about 70%. Don discussed different approaches to combining mail and internet. The strategy: mail is expensive and the web is cheap for large samples, so push people to the internet. Also, the demographic groups who are willing to do internet surveys differ (younger, better educated) from those who use mail only, so increasing web responses may reduce non-response bias.
Don also talked about a new methodological study that is underway but not yet published. It involves Pennsylvania, Alabama and Washington, and is an attempt to extend the techniques that were effective for pushing respondents to the web at a state level in Washington to other states. Don discussed the design and results so far, noting that Alabama has the lowest response rate, probably due to low aggregate education levels. Treatments are mail only, mail + web, etc. The incentives are $4 plus a 2nd $2 that accompanies the mail+web treatment after several contacts. Sending the 2nd incentive was important. Mail remains highly effective in Alabama and Pennsylvania, but the push towards the web is less effective than Don had hoped. This study also examines a new, stronger push to the web and its influence on the proportion of respondents using the web. Discussion followed, with one important point: incentives lower non-response error; the issue is not response rates but non-response error.
Don currently has another study ready to go to the field. It includes Nebraska and examines whether trust in the sponsor, which seemed to be responsible for lowering use of the web in the PA vs. WA study, has a significant influence on web response. The design is to send questionnaires from Washington State University to respondents in Washington and Nebraska, and to send questionnaires from Nebraska to Washington and Nebraska. Jolene Smyth at the University of Nebraska is a collaborator on this study. The hypothesis is that internet responses may suffer from lack of identification with the state university, as evidenced by an expected cross-over difference. Discussion followed about how best to conceive of incentives: are they payments, part of an economic exchange? One consistent finding: the law of diminishing returns is at work; more incentive gets higher returns, but in diminishing amounts.
A question was raised and discussed: What's the effect of the length of the URL? Ginny noted that Oregon has a formula for naming URLs. Don predicted that mail surveys will likely be around for a long time due to computer illiteracy. Todd asked: Why push the web, since mail alone gets a significantly higher response rate? Ginny: in part it is about expectations and perception. It is only when N is greater than 1000 that it pays to do web-based surveys; the web is still much more expensive for smaller surveys. Don noted that he is identified with mail, so he cannot credibly advocate just for mail; if he advocates for the web, it lends credibility to mail. Discussion also was directed toward item non-response in mail and web surveys. Todd recalled that item non-response is higher on the web; in the American Community Survey, 10% of items were missing on the web versus 3% on mail. Ginny observed that mail is better than the web, but as more people move to the web, survey organizations are getting better at web surveys and expenses are dropping. Glenn: the web has the advantage that people volunteer more open-ended comments.
Don noted that in his various experiments the web has a lower item non-response rate than mail, but not by much, so it is not a major consideration in deciding whether to push more people to the web vs. mail. He has edited a special issue of Survey Practice, which will appear in June. It includes four papers by members of this committee and was developed out of a session he organized for the 2010 American Association for Public Opinion Research (AAPOR) annual conference.
Don turned next to data from the National Park Service in Idaho, covering all park populations. Analysis over 20 years indicates response rates of 75% with little evidence of decline. Tommy Brown showed declines, but with more questions and more questions per page. Don is finishing a paper with Bryan Rookey and others on response rate trends in mail surveys over a 20-year period, retesting some of the ideas developed by Brown with a quite different data set. It has been tentatively accepted, pending some revisions, by Social Science Research.
Glenn Israel reported working on a new mixed-mode survey for the 4th year, and he is still working on open-ended questions. Glenn reported that a new 3x3 experiment is in the field that crosses amount and timing of incentives. Amounts are $0, $2 and $5; timing is incentive with the pre-letter, with the 1st questionnaire, or with the 2nd questionnaire. No results yet. The survey is about aquatic invasive species, and the respondents are freshwater boaters (vessels under 20 feet, based on registrations) and fishing license holders.
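As a rough illustration of how such a crossed design can be fielded (a minimal sketch, not Glenn's actual protocol; only the amounts and timings named above are taken from the report):

```python
import random
from itertools import cycle

# The 3x3 incentive experiment: amount ($0, $2, $5) crossed with timing
# (pre-letter, 1st questionnaire, 2nd questionnaire) gives nine cells.
AMOUNTS = [0, 2, 5]
TIMINGS = ["pre-letter", "1st questionnaire", "2nd questionnaire"]

def assign_cells(sample_ids, seed=42):
    """Balanced random assignment of sample members to the nine crossed cells."""
    rng = random.Random(seed)
    ids = list(sample_ids)
    rng.shuffle(ids)  # randomize order so cycling the cells gives random, balanced groups
    cells = [(amount, timing) for amount in AMOUNTS for timing in TIMINGS]
    return {sid: cell for sid, cell in zip(ids, cycle(cells))}

assignments = assign_cells(range(900))  # hypothetical sample size
```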
Glenn also reported on a series of mixed-mode surveys in which the motive was to find high response rates with minimal bias using alternative contacts. The 1st experiment compared mail only, mail/web choice, and web preference. Choice included both a mail questionnaire and a URL; web preference had only a URL but offered a choice on the last contact.
Basic results: mail only 65%; mail/web choice 51% (mail) + 8% (web); and web preference 29% (web) + 23% (mail).
The 2nd study provided both postal & email addresses, with treatments of mail only, web preference, and email. Glenn laid out the alternatives. The essential results: mail only obtained 53%, while email + mail was higher, with email obtaining 54% and mail 8%. Due to small sample size, total response rates were not statistically different.
The 3rd study had postal & email contact information, with mail only, email preference, email complement and email only treatments. The postal-only and email-only differences were compared. One comparison: the email-only group had high rates of non-contact (14% and 17%) in the two treatments. Glenn displayed differences by contact and response modes. Email/web only had a lower overall response rate than mail only and the mixtures. Demographic profiles did not differ among treatment groups having both postal and email addresses.
The 4th (2011) study data are in, with essentially the same set of mixtures. There are problems with email addresses. Glenn showed response rates for groups with both postal and email addresses, where P = postal, E = email, and the letters give the order of contacts: all postal contacts (PPPP) = 67%; PEEP = 58%; P..P = 38%; EEEP = 54%; ...P = 36%. Those with only a postal address (PPPP) obtained 59%, and those with only an email address (EEEE) 39.8%. Responses were affected slightly by education; otherwise there were few differences. On item nonresponse: the web gets more complete surveys, especially on open-ended items. Demographics were more complete in mail, but open-ended answers were more extensive on the web.
Glenn also reported on 2011 open-ended questions. One item asked for an explanation (what the problem was, what was done with the information, and what the result was) and a second item asked how services could be improved. The experiment tested whether a verbal prompt ("it's very important for us to learn how . . .") increased the likelihood of a response and the number of words provided when there was a response. The importance cue didn't work for the explanation item but did for the improvement item, and there was an interaction effect on the average number of words for the improvement item, with the verbal cue having a large impact only with web responses.
Steve Swinford is on sabbatical this year. He reported on his First Year Experience survey, which is linked to other data sets on students at Montana State. This will be presented in Hawaii and Vancouver this coming year. The Montana Crime Victimization Study is a statewide study with 5,000 people randomly selected from the general population. The study had a 55% response rate and 10% dead/returned addresses. The study included a pre-postcard, 2 mailings, an 8-page questionnaire and no incentive. After adjusting, they had close to a 60% effective response rate. Steve works with the Center for Health and Safety Culture, whose work has included the Idaho Transportation Department, plus agencies in Oregon, Minnesota, Alaska and Ontario. He analyzed his data, including pre- and post-tests, in response to evaluations requested by the funding agency. Finally, he is working with 10 communities in Minnesota on interventions surrounding alcohol abuse problems (with non-monetary incentives). One interesting idea: at one school, all students have iPads, so it is doing a web-based survey, to be conducted on a given day during a given period. The rest of the communities are doing pen & paper. This work does not specifically deal with experiments, but Steve can make gender-by-grade comparisons.
Todd Rockwood reported that he has been doing variations on respondent-driven sampling (a variant of snowball sampling), on which NIH has underwritten research. Respondent-driven sampling uses network theory to develop sampling weights. They have tried it with prostitutes in the Twin Cities, and also with Latinos, Native Americans, children with childhood diseases, and samples from HIV-positive populations. It seems to work very well in representing populations. The sampling goal is to develop long referral chains, and they want to extend this to develop social-based and cultural-based sampling. Sometimes it works well; sometimes not. During the past year they have done a lot in the UK, where it is quicker at getting address corrections. Todd and his group also do work around translation; using focus groups, they have found that some immigrants try to appear more, and sometimes less, acculturated than they are. Their research is now moving away from focus groups into cognitive and ethnographic work, translating from five different languages. Regarding their research on health and sexual functioning among the elderly, they had a lot of non-response among 80-year-olds regarding sexual functioning, especially its emotional aspects. Further, when responding to long lists of yes/no items in questionnaires, respondents often mark only the occasional "yes" that applies and skip the rest (they don't answer "no"); they seem to interpret such lists as check-all-that-apply.
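For readers unfamiliar with how network theory yields weights in respondent-driven sampling: a common estimator (Volz-Heckathorn) weights each respondent inversely to their reported network size, down-weighting people whom many referral chains can reach. A minimal sketch with hypothetical numbers, not Todd's actual procedure:

```python
def vh_estimate(values, degrees):
    """Volz-Heckathorn-style RDS estimate: an inverse-degree weighted mean.

    values  -- the outcome reported by each respondent (e.g., 1/0 for a trait)
    degrees -- each respondent's reported network size (must be positive)
    """
    weights = [1.0 / d for d in degrees]
    return sum(w * y for w, y in zip(weights, values)) / sum(weights)

# Hypothetical example: respondents with large networks are over-recruited
# by long referral chains, so their observations count for less.
print(vh_estimate(values=[1, 0, 1, 1], degrees=[25, 5, 10, 50]))
```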
Rob Robertson reported his active involvement with a variety of stakeholders in applied social science studies. For example, New Hampshire is actively engaged with two projects focused on the management of fish and wildlife resources in rural areas. The first is a mail and web-based survey of residents of four NH communities; the instruments focus on residents' attitudes towards a variety of bear management programs and policies. The second study is an evaluation of the NH Fish and Game (NHFG) Volunteer Turkey Monitoring Program, making use of a web-based evaluation tool. A third project being initiated will focus on the management and marketing of farmers markets in Strafford and Rockingham counties; it will evaluate the effectiveness of web-based survey instruments and software. A fourth study in the planning stage will collect data necessary for the development of the NH Route 1a/1b Corridor Plan. The principal investigator completed a similar project about 10 years ago in cooperation with Rockingham Regional Planning and the NH Department of Transportation. This project will include the establishment and direction of a NH Route 1a/1b Corridor Advisory Committee and the collection of data from key stakeholder groups, including tourists visiting the corridor, residents of the corridor, and the management/policy makers responsible for the development of the Corridor Plan. It will also compare the effectiveness of a pen-and-paper survey versus iPad technology for collecting data from visitors to the corridor.
We adjourned about 4:30 and reconvened at 8:15 on Friday, February 24.
Don passed around the series of articles to be published in Survey Practice, an electronic journal of AAPOR, which will appear in June 2012.
Ginny Lesser continued the state reports by discussing two studies completed and two studies in the field. The 1st deals with a questionnaire program (like Survey Monkey) called LimeSurvey, which is free. It has excellent choices for types of survey questions and provides the ability to assign PINs. Data are also collected on your own server.
Ginny also reported doing a survey on hazardous materials carried on Oregon highways. It followed a restricted, stratified cluster sample design; the frame is trucks entering Oregon weigh stations. Hazardous class was treated as the stratum and companies were the clusters. Companies were selected and sent a questionnaire to identify the routes traveled on the selected trip. As a note of information, diamond-shaped logos on trucks indicate hazardous materials; higher numbers in the diamond indicate greater hazard (this does not include radioactive hazards). They collected data over two 6-month intervals. The data are to be used to identify the types of materials transported on specific highways so that Hazmat teams are prepared for any accidents involving hazardous material.
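A minimal sketch of the selection step as described (hazard classes as strata, companies as clusters); the frame layout and field names are assumptions for illustration, not the study's actual files:

```python
import random
from collections import defaultdict

def sample_companies(frame, n_per_stratum, seed=1):
    """Stratified cluster sample: strata are hazard classes, clusters are companies.

    frame -- iterable of (hazard_class, company) pairs, e.g. weigh-station records
    """
    rng = random.Random(seed)
    strata = defaultdict(set)
    for hazard_class, company in frame:
        strata[hazard_class].add(company)  # each company counted once per stratum
    return {h: rng.sample(sorted(companies), min(n_per_stratum, len(companies)))
            for h, companies in strata.items()}

# Hypothetical frame: records = [(3, "Acme Freight"), (3, "Acme Freight"), (8, "RoadCo")]
# sample_companies(records, n_per_stratum=2) selects up to 2 companies per hazard class.
```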
Ginny's second study is in collaboration with the National Oceanic and Atmospheric Administration (NOAA) on estimating (sampling) the fish industry. The tradition is to use intercept studies on the docks, which does not work well to estimate fish caught at night. A pilot study is currently underway using diaries to collect a panel of data over a year. This was done before in Australia, with over 90% retention over a year. A key to success is coordinating interviewers who develop a relationship with the interviewee to maintain a high response rate. This pilot study is currently in the field.
Ginny has two studies either in the field or going out. One is the ODOT needs study, which will be mail (N = 2738) and web&mail (N = 2738) with up to 5 contacts and variations on the front cover. The 2nd study (with Steve) is on underage drinking: mail (n = 900) and web&mail (n = 900), with up to 5 contacts. No incentive was used because it was too complicated with the IRB.
Ginny updated us on her ongoing study of ODOT response rates, to be presented at AAPOR. She has monthly time series data collected since 1994; since 2001, data on age and gender have been collected along with response rate. She reported a piecewise regression that takes into account changes in protocol, including increasing the length of the questionnaire; there was also a change in the visual format of the questionnaire. She reported on the segmented time series. The interpretation is that prior to 2001, response rates declined 1.4% per year; they then declined at a 7% rate between 2001 and 2003, immediately after the questionnaire length was increased; and they declined at 0.55% per year after 2003. Response rates are higher for females and vary by age, with older respondents having higher response rates.
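A segmented (piecewise) regression of this kind can be expressed with hinge terms at the known protocol-change years; a minimal sketch, assuming monthly data and breakpoints at 2001 and 2003 (Ginny's actual model presumably also adjusts for covariates such as age, gender and format changes):

```python
import numpy as np

def segmented_fit(t, y, breaks=(2001.0, 2003.0)):
    """Least-squares piecewise-linear fit with known breakpoints.

    t, y   -- arrays of time (decimal years) and response rate (%)
    breaks -- years at which the slope is allowed to change
    """
    X = np.column_stack([np.ones_like(t), t] +
                        [np.maximum(0.0, t - b) for b in breaks])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    slopes = np.cumsum(coef[1:])  # slope within each successive segment
    return coef, slopes

# Hypothetical usage: t = np.linspace(1994, 2012, 217); y = observed monthly rates.
```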
Fred Lorenz reported on general-specific questions using 5 waves of ODOT data and an Iowa community mail questionnaire and telephone interview. He noted that most research on general-specific questions, including research done by our group (Willits and Saltiel, Rural Sociology, 1995; Willits & Ke, Public Opinion Quarterly, 1995), has focused on the relationship between the general question and the specific items. Fred sought to replicate previous results but also to examine the relationship of the specific items to each other. He found that their study has the same patterns of responses as previous studies. In addition, he found evidence of 1st-order serial correlations between adjacent items; that is, answers to the 2nd item on the list of specific items were conditioned by the 1st, the 3rd by the 2nd, etc. Using structural equations and a permutation test developed specifically for this study, they found evidence that correlating the residuals of adjacent items consistently improved the fit of the model to the data to a greater degree than would be expected when freeing any random set of residuals. The procedure and SAS program for doing the permutation test will appear in Structural Equation Modeling.
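The logic of the permutation test can be illustrated apart from the SEM machinery (the published SAS program is the authoritative version): compare the average absolute residual correlation among adjacent item pairs with its distribution when the same number of item pairs is drawn at random. A minimal, self-contained sketch:

```python
import numpy as np

def adjacent_vs_random(resid_corr, n_perm=5000, seed=0):
    """One-sided permutation p-value: do adjacent specific items have larger
    residual correlations than randomly chosen item pairs?

    resid_corr -- k x k symmetric matrix of residual correlations among items
    """
    rng = np.random.default_rng(seed)
    k = resid_corr.shape[0]
    adjacent = [(i, i + 1) for i in range(k - 1)]
    observed = np.mean([abs(resid_corr[i, j]) for i, j in adjacent])

    all_pairs = [(i, j) for i in range(k) for j in range(i + 1, k)]
    hits = 0
    for _ in range(n_perm):
        draw = rng.choice(len(all_pairs), size=len(adjacent), replace=False)
        stat = np.mean([abs(resid_corr[all_pairs[p]]) for p in draw])
        hits += stat >= observed
    return (hits + 1) / (n_perm + 1)
```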
Gerard Kyle is new to our research group and reported on his activities that relate to our research. National Geographic sponsors trips, and they evaluate the trips using on-site questionnaires, with response rates of about 65% based on a sample of 300. The past trip was in Saguaro National Park near Tucson, and the next will be at Rocky Mountain. A 2nd study is on Texas lakes, where boat ramp users and shoreline residents are interviewed; it is a web/mail combination. A 3rd study is with the Texas Parks & Wildlife Department, using license lists; again a mail/web combination. About 5% buy licenses by mail, which gives more information and provides a better basis for conducting web surveys. The last time (2009) they had low and declining response rates. Also, Gerard is working with a colleague on studies on the Channel Islands off Los Angeles, to replicate an earlier study at Hinchinbrook, Australia. The study has to do with values: respondents are asked to situate values spatially on a map relative to each other and to assign weights to the values. This is very difficult data to assemble, and he is uncomfortable with the design. Question: Is it valid? Todd: there is some literature. Don: suggest cognitive interviews.
Administrative details: Before continuing state reports after lunch on Friday, we discussed editing the draft of the minutes for the annual report, which is due with the list of publications about April 1, 2012. We also agreed to continue with another submission, working with advisor Lou Swanson.
Billy McKim is also new to the group, and he reported on his research at Texas A & M. His work is split between extension and external research, including research for state agencies. First, he did a 2-year evaluation of disaster case management for homeland security in the wake of Hurricane Ike (2008). He did a client survey, which FEMA had not used before. He looked at 34 counties, including Houston, but also very rural areas. He worked with regional councils of governments, plus faith-based organizations. He proposed and implemented a 3-phase approach. Data were collected from multiple sources and linked by case ID number. Privacy Act restrictions required that respondents couldn't be identified by name or street address, so questionnaires were sent by case ID number and, when the data were assembled, no one could match ID numbers with names. The questionnaires were designed to be read by people with 3rd- to 5th-grade reading skills. The disaster case management project served 20,308 individuals. They took a stratified random sample of 7,000 invited respondents, which had a 21% frame error, thus reducing the effective sample size to about 5,500. The US Postal Service refused to deliver to several zip codes for unknown reasons. There were 2,139 responses, for a response rate of 39%. Their process was a postcard pre-notice, a packet, a reminder, a replacement packet, and then another replacement packet. He showed the homeland security template cover letter, as approved by the state and released by each NGO, and the approved questionnaire. One was the client questionnaire (version 10 as approved), and many questions dealt with satisfaction with FEMA and the state. Billy reported differences in satisfaction by type of organization, and he showed other data.
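The arithmetic behind those figures, for readers checking along (the rounding to 5,500 is in the original report):

```python
invited, frame_error, completes = 7000, 0.21, 2139
eligible = invited * (1 - frame_error)   # 7000 * 0.79 = 5530, reported as ~5,500
response_rate = completes / eligible     # 2139 / 5530 = 0.387, i.e. about 39%
print(round(eligible), f"{response_rate:.1%}")
```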
Billy also studies extension and ag teachers in 37 states, all by web. He did an experiment comparing radio buttons versus sliders. There were comments: Don noted that Randall Thomas compared sliders to other scales, and they may not be worth it. Billy also did another study of teachers in 5 states; only 1 wanted a web survey; all the rest wanted a packet. He also had an H1N1 mixed-mode survey: they sent 10,000 mail questionnaires and used a Qualtrics panel for the web and RDD for the telephone questionnaire. They wanted 10,000 because they expected a 10% response rate; they ended up with a 19.5% response rate. The study was about avoiding H1N1 and how respondents managed it: what messages did you get, and where from?
Miscellaneous discussion followed Billy's report. We ended on an especially encouraging note: collaboration seems possible, especially between Gerard, Billy and Glenn on experiments dealing with open-ended questions and common characteristics of experiment station and extension work.
Outreach: Most of our discussion focused on fundamental research issues. However, Glenn Israel has been active in extension and will incorporate WERA 1010 results into his extension reports. Glenn gave a seminar on mixed-mode methods for five participants, including several from the Survey Research Center of the Bureau of Economic and Business Research, University of Florida.
Our next meeting is scheduled for February 14th and 15th, 2013 at the Best Western Tucson Inn Suites on North Oracle Road. Meeting adjourned at 3:30pm.
Minutes respectfully submitted March 6, 2012
Frederick O. Lorenz, Secretary, WERA1010
Publications released during calendar 2011
Conger, R. D., Cui, M., & Lorenz, F. O. (2011). Economic conditions in family of origin and offspring's romantic relationships in emerging adulthood (pp. 101-122). In F. D. Fincham & M. Cui (Eds.), Romantic relationships in emerging adulthood. New York: Cambridge University Press.
Cui, M., Wickrama, K. A. S., Lorenz, F. O., & Conger, R. D. (2011). Linking parental divorce and marital discord to the timing of young adults' marriage and cohabitation (pp. 123-141). In F. D. Fincham & M. Cui (Eds.), Romantic relationships in emerging adulthood. New York: Cambridge University Press.
Durtschi, J. A., Fincham, F. D., Cui, M., Lorenz, F. O., & Conger, R. D. (2011). Dyadic processes in early marriage: Attributions, behavior, and marital quality. Family Relations, 60, 421-434.
Israel, G. D. 2011. Strategies for Obtaining Survey Responses from Extension Clients: Exploring the Role of E-mail Requests. Journal of Extension, 49(3), available at: http://www.joe.org/joe/2011june/a7.php.
Lesser, V.M., D.K. Yang, and L. Newton. 2011. Assessing Opinions Based on a Mail and a Mixed-Mode Survey. Human Dimensions of Wildlife 16(3).
McKim, B. R., & Saucier, P. R. (2011). Agricultural mechanics laboratory management professional development needs of Wyoming secondary agriculture teachers. Journal of Agricultural Education, 52(3), 75-86. doi: 10.5032/jae.2011.03075
McKim, B. R., Rutherford, T. A., Torres, R. M., & Murphy, T. H. (2011). Organizational climate of the American Association for Agricultural Education. Journal of Agricultural Education, 52(3), 87-99. doi: 10.5032/jae.2011.03087
McKim, B. R., & Torres, R. M. (2011). Perceptions of Missouri 4-H youth development personnel regarding interorganizational cooperative behavior. Journal of Extension, 49(4). Available at http://www.joe.org/joe/2011august/a9.php
McKim, B. R., & Saucier, P. R. (2011). An Excel-based mean weighted discrepancy score calculator. Journal of Extension, 49(2). Available at http://www.joe.org/joe/2011april/tt8.php
Messer, Benjamin L. and Don A. Dillman. 2011. "Chapter 14. Comparing Urban and Rural Quality of Life in Washington," in Marans, Robert M. and Robert Stimson, Urban Quality of Life: Implications for Policy, Planning and Research. Springer Books.
Messer, Benjamin L. and Don A. Dillman. 2011. Surveying the General Public Over the Internet Using Address-Based Sampling and Mail Contact Procedures. Public Opinion Quarterly 75 (3): 429-457.
Millar, Morgan M. and Don A. Dillman. 2011. Improving Response to Web and Mixed-Mode Surveys. Public Opinion Quarterly 75 (2): 249-269.
Munoz-Hernandez, B., V.M. Lesser, and Ruben Smith. 2011. Applying Multiple Imputation with Geostatistic Models to Account for Item Nonresponse in Environmental Data. Journal of Modern Applied Statistical Methods 9(2).
Saucier, P. R., & McKim, B. R. (2011). Assessing the learning needs of student teachers in Texas regarding management of the agricultural mechanics laboratory: Implications for the professional development of early career teachers in agricultural education. Journal of Agricultural Education, 52(4), 24-43. doi: 10.5032/jae.2011.04024
Surjadi, F. F., Lorenz, F. O., Wickrama, K. A. S. & Conger, R. D. (2011). Parental support, partner support, and the trajectories of mastery from adolescence to early adulthood. Journal of Adolescence, 34, 619-628. PMC3043113
Toepoel, Vera and Don A. Dillman. 2011. Words, Numbers and Visual Heuristics in Web Surveys: Is there a Hierarchy of Importance? Social Science Computer Review 29 (2): 193-207.
Yang, Yongwei, Mario Callegaro, Dennison S. Bhola, and Don A. Dillman. 2011. Comparing IVR and Web administration in structured interviews utilizing rating scales: Exploring the role of motivation as a moderator to mode effects. International Journal of Social Research Methodology 14 (1): 1-15.