SAES-422 Multistate Research Activity Accomplishments Report

Status: Approved

Basic Information

Participants

Minutes WERA 1010: Error Reduction in Rural and Agricultural Experiments, February 25-26, 2010

The 2010 annual meeting of WERA 1010 was convened by Chair Ginny Lesser at 8:15 am on Thursday, February 25, at Tucson InnSuites.

Present were:
Virginia Lesser (Chair: Oregon) lesser@stat.orst.edu
Jim Christenson (AES Advisor: Arizona) jimc@ag.arizona.edu
Patricia Hipple (USDA) PHIPPLE@nifa.usda.gov
John Allen (Utah State) johna@ext.usu.edu
Shorna Broussard Allred (New York (Cornell)) srb237@cornell.edu
Don Dillman (Washington) dillman@wsu.edu
Glenn Israel (Florida) disrael@ufl.edu
Fred Lorenz (Secretary: Iowa) folorenz@iastate.edu
Dave Peters (Iowa) dpeters@iastate.edu
Marilyn Smith (Nevada) smithm@unce.unr.edu
Steve Swinford (Montana) swinford@montana.edu
Fern Willits (Penn State) fkw@psu.edu

Still interested but unable to attend:
Angela Mertig (Tennessee: Middle Tennessee State) amertig@mtsu.edu
Nick Place (University of Maryland)
Wuyang Hu (University of Kentucky)
Rob Robertson (New Hampshire) robertr@cisuix.unh.edu
Todd Rockwood (Minnesota) trockwood@mn.rr.com
John Saltiel (Montana) jsaltiel@gmail.com
Courtney Flint (University of Illinois (Champaign-Urbana))
Bob Mason (Oregon State) masonr@stat.orst.edu

Deceased (2009): Robie Sangster (BLS)

Agenda:
- Welcome
- Studies on mixed-mode surveys
- Studies on open-ended questions
- General/specific questions
- Report on the National Institute of Food and Agriculture (NIFA)
- Mail response rate trends
- Remaining state reports
- Discussion of outreach activities
- Other activities

Introductions. Committee members introduced themselves and their affiliations. A special welcome back was extended to Fern (Bunny) Willits, who failed in her attempt to retire.

Mixed-mode surveys. Don Dillman reported on response rates by mode of contact, referring to his first handout, whose first slide is labeled "Using address-based sampling to conduct mail and web surveys: Results from three studies." Don noted that address-based sampling (ABS) is likely providing better coverage than random digit dialing (RDD). He then introduced his studies comparing mail and web surveys, focusing on the most recent of the three, the Washington Economic Survey, conducted in the fall of 2009, which had a 12-page questionnaire with 46 questions and up to 95 responses. It had 6 treatment groups and tested the effect of two $5 incentives, one with the initial request and a second at the time the replacement questionnaire or web access information was circulated. It also tested the effect of priority mail. The treatments were: (1) web preference (n = 700); (2) web preference & priority mail; (3) web preference, priority mail & second $5 incentive; (4) mail only; (5) priority mail only; (6) priority mail only & second $5 incentive. The sample sizes were 600 for the first three groups and 700 for the last three groups. Among the results:
- The mail only group using priority mail plus a second $5 incentive had a 68% response rate, whereas the web preference group with priority mail plus the second $5 incentive had 34% by web and an additional 18% by mail follow-up (52% total).
- The initial $5 incentive sent with the survey request had been shown to be important in a 2008 survey he summarized. In that study, a mail preference treatment was increased 13.6 percentage points by the use of a $5 incentive, and the web preference group was increased by 20.6 percentage points.
- When priority mail was sent with the additional incentive in the current experiment, the response rate increased another 4.4% in the web preference group and 9.6% in the mail only groups, suggesting this treatment could be used in an effort to maximize response rates.

Don pointed out that web and mail together bring in a larger array of respondents. However, both web and mail surveys under-represent lower-education and lower-income respondents, although mail appears to come closest to the general public. Ginny advocated for sample weighting to correct for biases associated with education and income; Don will do this as analysis continues, to the extent sample sizes allow.
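As an illustration of the weighting adjustment Ginny suggested, a minimal post-stratification sketch in Python; the education-by-income cells, population shares, and counts below are hypothetical, not figures from Don's study:

    # Post-stratification weighting sketch: each respondent's weight is the
    # population share of their education-by-income cell divided by the
    # cell's share of the realized sample. All numbers are illustrative.
    population_share = {
        ("high school or less", "lower income"): 0.30,
        ("high school or less", "higher income"): 0.10,
        ("some college or more", "lower income"): 0.20,
        ("some college or more", "higher income"): 0.40,
    }
    sample_counts = {
        ("high school or less", "lower income"): 90,
        ("high school or less", "higher income"): 40,
        ("some college or more", "lower income"): 170,
        ("some college or more", "higher income"): 400,
    }

    n = sum(sample_counts.values())
    weights = {cell: population_share[cell] / (sample_counts[cell] / n)
               for cell in sample_counts}

    for cell, w in sorted(weights.items()):
        # cells under-represented in the sample (e.g., lower education) get weights > 1
        print(cell, round(w, 2))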
A general discussion followed. One comment was that incentives save money and make studies more valid by improving representativeness. Incentives have to be used ahead of time; they don't work if they are promised once the questionnaire is returned. Ginny said she can't use incentives in state agency surveys in Oregon for political reasons, at least at this time. OMB is allowing incentives but regulates how much as well as the way they can be used. We also noted that you can approach people at random by telephone (a public utility) and by mail (a government agency) but not by email, which is private. This is the reason that address-based sampling using the U.S. Postal Service Delivery Sequence File is so important. Ginny noted from her experience that incentives increase response rates by 20%, but Don added that they may not reduce non-response error.

Don went on to discuss choice between mail and web, using a 2nd handout, "Improving response for web and mail mixed-mode surveys: the effects of mode choice, offering modes in sequence, and adding email contacts." What happens when you give a choice? Summary: if you give a choice, respondents go with mail. If you give people too many choices, it gets more complicated and participation goes down. In Don's study, one concern was that he needed a sample that had email, so he used random samples of university students. Don outlined the experiment; see the 4 treatments (each of 700 students) on page 2 of the 2nd handout: (1) postal mail request to respond by mode of choice, web or mail; (2) postal mail request to respond by mail; (3) postal mail request to respond by web; and (4) postal request to respond by web, with a link to the website sent by email 3 days later. Then, when the response rate flattened, so that few new responses were coming in, Don added a mode switch in which the choice group (group 1) received another request to participate, the mail group (group 2) received a request to participate via mail, and groups 3 and 4 received a request to participate by mail. Over the whole experiment, primary response rates were 50.5%, and the mode switch added another 4.7%, to 55.2%. The largest increase came in the 3rd group, when the web group was asked to participate by mail (7.5%).

The next experiment, the Fall 2009 treatment groups, contact methods and incentives, included a wider range of treatment groups, from email contact with no incentive to the use of intermingled postal and email contacts. The response rate by email only (without an incentive) was 20%. Use of an incentive delivered by an initial postal contact brought response up to 38%. The intermingling of contacts by mail and web brought response up to 46%. The best response came from offering choice with e-mail augmentation, i.e., a sequence of postal pre-notice, postal request, email follow-up, postal replacement questionnaire, and email follow-up. However, this opportunity to respond by either mail or web combined with email augmentation was not significantly higher than conducting the survey by mail questionnaire alone. Discussion followed.

Ginny Lesser continued the discussion of response rates by different modes in her Department of Transportation studies. Ginny reported on 7 mixed-mode surveys, none of which offered incentives because all were conducted for state agencies that did not allow the use of incentives. She had phone, mail, and web/mail approaches in the 2006 and 2008 ODOT studies. The phone mode was dropped after 2008 because the response rates were low and the telephone method was expensive. For each survey, sample sizes were 1,000 in each group. Among her results, response rates in 2006 and 2008 were around 30%. ODOT pre-letters are worth about 4-5%. Telephone is too costly and was dropped. For web and mail, letter instructions improve responses (28.5%) over special inserts, like the cards used by Don in one of his studies (22.8%).
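Many of the comparisons reported at the meeting come down to whether two observed response rates differ by more than sampling error. A minimal sketch of the standard two-proportion z-test, using hypothetical counts rather than the actual ODOT cell sizes:

    from math import sqrt
    from scipy.stats import norm

    def two_proportion_ztest(x1, n1, x2, n2):
        """Large-sample z-test for a difference between two response rates."""
        p1, p2 = x1 / n1, x2 / n2
        p_pool = (x1 + x2) / (n1 + n2)                  # pooled rate under the null
        se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_value = 2 * norm.sf(abs(z))                   # two-sided p-value
        return z, p_value

    # Hypothetical: 285/1000 with letter instructions vs. 228/1000 with an insert card.
    z, p = two_proportion_ztest(285, 1000, 228, 1000)
    print(round(z, 2), round(p, 4))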
Two studies were of licensed boat owners, where boats are either longer or shorter than 27 feet (Ns are over 3,300 in each group). The objective: determine the annual amount of gasoline consumed by boats. Ginny outlined alternative modes, including mail, web/mail, and web/mail-option (paper or web). One objective was to keep the number of contacts in each group the same; thus, if a respondent had no internet access, that respondent also received 2 contacts with a paper questionnaire. Ginny showed complex results. Response rates by web alone are very low, and mail with the web/mail option brings in slightly fewer responses than just mail. When the number of contacts to the groups was kept the same, the mail method provided the highest response rates. When the number of contacts was increased by one mailing for the web/mail group, that mode provided the highest response rates. Overall, there are no significant differences between mailings when all the combinations are included. The bottom line: to get people to the web, you need a sequential strategy. One point is that for surveys with small sample sizes, using the web may not be the most efficient approach given the cost of putting the questionnaire on the web.

What is the difference between Don's and Ginny's results? Don: the numbers seem to be in the same ballpark, and the patterns are the same in several studies. John asked: how long is the questionnaire? Ginny: two pages. Discussion followed. Shorna asked about backlash. They have been monitoring, and multiple contacts draw more negative comments. Shorna noted that a clause saying that if you respond there will be no more follow-ups worked to increase response.

Glenn Israel followed with mixed-mode experiments. Glenn reminded us about his 2008 experiment (n = 1,318): mail only, mail/web choice, and web preference. The studies are of cooperative extension service (CES) clients, many of whom are regular email users. Responses are similar to other studies: mail only was highest (65%), followed by mail with web choice (59.2%) and web preference (52.6%). The 2009 survey was mail only, e-mail preference (letter to alert; e-mail invitation with link; email reminder; reminder letter + paper questionnaire), and web preference (letter; invitation letter with URL and PIN; standard reminder; reminder letter with URL and replacement questionnaire). The total sample size was over 1,400, but only 430 provided an e-mail address. The e-mail preference group had the highest response rate (63.5%); mail only was 56.3% and the web preference group was 48.2%. There are some modest differences in who responds, with different profiles by mode of survey; for example, sex and residence predict early response. People who respond early reported having visited CES websites. People who do not respond early, or who do not visit CES websites, tend to be older and female. Implication: for people who provide an e-mail address, go with an e-mail invitation and follow with paper; these respondents tend to be younger.
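Glenn's early-versus-late comparison is one standard check on non-response bias. A minimal sketch of that kind of analysis, assuming a hypothetical respondent file with an early-response flag; the variable names and simulated values are illustrative, not Glenn's data:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 400                                   # hypothetical respondent file
    df = pd.DataFrame({
        "female": rng.integers(0, 2, n),
        "rural": rng.integers(0, 2, n),
        "visited_ces": rng.integers(0, 2, n),
    })
    # Simulate an early-response flag loosely related to the covariates.
    logit_p = -0.3 + 0.4 * df["visited_ces"] - 0.3 * df["female"] + 0.2 * df["rural"]
    df["early"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    # Logistic regression: do sex, residence, and CES website use predict early response?
    model = smf.logit("early ~ female + rural + visited_ces", data=df).fit(disp=False)
    print(model.summary())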
Glenn moved next to open-ended questions. He had a handout with questionnaires in Spanish and English. There were 4 forms of a two-page questionnaire, with experiments built into questions 5 (1st page) and 11 (2nd page). He noted that more Hispanics responded in English than in Spanish. Glenn framed his discussion in terms of Grice's (1975) maxims: relation, quality, quantity, and manner. He is thinking about how verbal and visual elements in open-ended questions can serve as devices to create a conversation. There was a paper by Sudman, Bradburn & Schwarz (1996) in which the maxims refer to quantity, completeness, and manner. The idea: ask questions that elicit responses.

Form A was a standard box and basic question (Q5b: "Please explain why it did or did not solve the problem or answer your question."); Form B elaborated the question to get at norms of relevance and quantity (Q5b: "Please explain what your information need or problem was, what you did with the information, and what the results were."); Form C addressed visual design by breaking Q5b down into 3 distinct question-and-answer spaces; and Form D repeated the general question in Form A but added a verbal prompt ("It is very important for us to understand as much as we can about the use of information provided by the Extension office."). Question 11 was designed as a parallel test, where Form A asked the question "What can we do to improve our services?" Data were collected this past summer. To gauge how good the answers are, Glenn created an optimality index (Q = quantity, M = manner, S = structure, R = relation): Index = R*(Q + M + S). The analysis is not yet done. About 60% responded to the open-ended questions. Discussion followed about where to go from here. Glenn is working on a paper for the Rural Sociological Society meetings in August 2010.
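A minimal sketch of how the optimality index might be computed once open-ended answers have been coded; the 0-2 scoring of quantity, manner, and structure and the example answers are assumptions for illustration, not Glenn's coding scheme:

    def optimality_index(relation, quantity, manner, structure):
        """Glenn's index: Index = R * (Q + M + S).

        relation is coded 0/1 (is the answer on topic at all?); the other
        components are assumed here to be coded 0-2 by the analyst.
        """
        return relation * (quantity + manner + structure)

    # Hypothetical coded answers to one open-ended question.
    coded_answers = [
        {"relation": 1, "quantity": 2, "manner": 1, "structure": 2},
        {"relation": 1, "quantity": 1, "manner": 1, "structure": 0},
        {"relation": 0, "quantity": 2, "manner": 2, "structure": 2},  # off-topic -> 0
    ]

    scores = [optimality_index(**a) for a in coded_answers]
    print(scores)                     # [5, 2, 0]
    print(sum(scores) / len(scores))  # mean index across answers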
Shorna Broussard Allred discussed two experiments from her research on distance learning and forestry education. One is a watershed study in New York comparing those who live close to Wappinger Creek with those farther away. Shorna showed examples of the questionnaires and provided a summary table of response rates. There was random assignment to questionnaire treatments. One concern: better response rates from upstate than from downstate NY. The results are in the handout. Using AAPOR Response Rate #6, the overall response rate was low (26%); the rates were higher for riparian owners (28.9% vs. 23.4%), but color vs. black & white questionnaire design made no significant difference. Overall, color had about a 1 percentage point effect (26.5% vs. 25.5%). Black & white booklets were less expensive, and smaller envelopes cost less. A discussion of color followed: Ginny recalled some surveys where there is an effect due to color (up to 8 percent), but most differences were pretty small. Shorna's 2nd study is on distance learning and forestry education. The purpose was to evaluate distance learning, and the database included 1,099 names. The survey was done by web only, and she got 522 responses (46%). Shorna provided a handout, where Table 1 offered a summary of the methodological design. Of the 1,099 names, some had e-mail addresses (see Table 1). The results are in Table 2. Notice the differences between groups 1 and 2: email only resulted in a 44% response rate; email + advance postcard + reminder postcard yielded a 54% response rate. Conclusion: you can boost response rates by using e-mail plus postal advance contacts. Discussion followed. Don noted that if there are data in the register, then we can do some demographics to see whether respondents differ from non-respondents.

John Allen did not have new experiments, but he discussed his studies on barriers to adopting energy-efficient methods. One study is about knowledge of available technologies. Another study is with industry, and they can't get hold of industry executives through mail. John has one contract with the Department of Transportation, going with indigenous plants on the median due to water concerns; no incentive is allowed. Another contract is in Peru. Biggest challenge: response rates in the face of budget cuts.

Fred Lorenz discussed general-specific experiments, using data from Iowa communities and four replications of Iowa Department of Transportation data. He presented material from last year and then added a SEM model in which the specific items are treated as manifestations of a latent variable. He then compared models that looked at the effects of the specific latent variable on the general question, depending on whether the general question preceded or followed the specific items. The results were not significant, indicating that the extent to which the general question was explained by the specific items was not sensitive to question order. One extension of the model is to look at the 1st-order correlations between specific items. He found that, for Oregon DOT data, there were strong 1st-order correlations, suggesting that the 2nd item on the specific list is influenced by the response to the 1st item, the 3rd item is influenced by the 2nd item, and so on. This work is continuing and will be presented at the Rural Sociological Society meetings in August 2010.

Steve Swinford presented his state report. He has no new data, but he has outreach. First, Steve is working with Most of Us, a social-norm program relating to alcohol and drug use (i.e., "Most of Us don't do drugs"). This study is run through the schools, and Steve is involved. General-specific questions are a possibility, particularly on the web site. Second, they are worried about honesty, so they ask about a fictitious drug. Third, there are social and epidemiological measures of drug use, and they are interested in correlating the two. Steve is also involved in outreach. One project is a survey of pay levels of city officials. They are also interested in surveying a wide range of government officials, especially through the associations of county governments and municipalities. Steve will continue to work on the design aspects of the studies. He also had students who presented papers in Chicago (Midwest Sociological Society), and one student did an experiment on male student masculinity and reports of sexual activity.

Dave Peters talked about labor vacancy studies (NRI) in cooperation with the University of Nebraska, NDSU, and SDSU. The surveys address labor market shortages in rural communities (e.g., western Nebraska). They looked at selected communities where the demand for jobs exceeds supply. The idea: develop recruiting strategies. The survey instrument was developed from BLS items and identified 4 occupational groups: professional & management; production; administrative, sales and services; and healthcare support. Questions were on job openings, benefits, recruitment efforts, retention, etc.; many were open-ended. The discussion turned to the design of the questionnaire. One problem with the survey was that the 2-column layout of the questionnaire was too complicated. Don suggested some spacing techniques for making the questions flow better. Data were collected by mail to employers, using 5 contacts, including telephone calls to the 25 largest employers. The study also used community advisory committee (CAC) lists to identify missing employers and key people within organizations. Response rates varied by community, from 100% of 4 respondents in one community down to single digits, often for mid-size firms, which is where there are often a lot of jobs. There were high levels of non-response from city and county government; restaurants and banks were examples of groups that were slow to cooperate and did not respond. Small employers without vacancies often did not respond. Measurement error resulted from ambiguity about how to define occupation (e.g., are nurses professional or healthcare support?).
There is also an effect of the recession on responses, which may have made the survey atypical. Dave will work with an extension survey this summer. John: what about ownership, local owners vs. large corporations? Don: can we go back, and can we improve response rates? Approach it differently. The BLS surveys start by making telephone calls to employers; they are careful from the beginning, and then you can get good rates. To handle companies without vacancies, tell them up front that their response is still important. Ginny reiterated by underscoring initial phone calls to find out whom to send the questionnaires to.

Marilyn Smith reported on her cooperative extension work in Nevada, especially the local application of surveys. She provided several examples of reports, including one award-winning report, that show how survey research results were being disseminated. Researchers are interested in needs assessment in communities that are largely mining rather than traditional agriculture. Marilyn assists younger professors. Marilyn also talked about impact evaluations (see "Involving youth in community emergency preparedness: Impact of a multistate initiative"). In this study, she looked at immediate, 6-month, and year-long impacts; see especially Tables 2 and 3. One implication: survey research has an important role in cooperative extension evaluation. One handout (on 4-H Bootstraps) concerns youths ages 18-25 who are put to work on public lands. Many have dropped out of school, and the program encourages re-entry into school; notice the knowledge-gained schedules (Table 2), and then what happens (subsequent tables). Discussion followed. Ginny's comment: Marilyn provides a much-needed outlet. Marilyn answered questions on the context and goals of the extension-based programs to involve more and more non-profits.

Patricia Hipple reported on changes in the USDA, as summarized by the National Institute of Food and Agriculture (NIFA) factsheet that she provided. By way of background, the USDA is undergoing major changes as a result of the 2008 farm bill. It did away with the National Research Initiative (NRI), which was languishing, at least in comparison with NSF and NIH. The Danforth study suggested that the NRI be drawn out of the USDA and become a separate institute, the place to fund outstanding research, but concern about formula funds and Hatch dollars led to CREATE-21. Overall, NIFA is replacing CSREES, and the proposal is to build NIFA into a competitive research institute with exponential growth. The Agriculture and Food Research Initiative (AFRI) will be different from the NRI. The NRI had 31 competitive programs that were roughly discipline-specific. Over the years, they tried to include extension and education, and that is the model for AFRI, a set of integrated competitive opportunities. There will be 7 RFAs, released in mid-March. The handout identifies 5 of the 7: global food security and hunger; climate change; sustainable energy; childhood obesity; and food safety. The last two are not yet known, but one is likely on fundamental plant/animal research. The 7 areas are broad and society-based. As an important departure from the past, individual scholars will not submit proposals. The awards will be huge ($2 to $10 million), and they will go to teams. There will be hundreds, rather than thousands, of applications. Each will be expected to have a significant social science component. Social scientists are well positioned to lead projects, compared to bench scientists.
The AFRI is addressing societal problems. On the bottom of the handout, note that "form does not follow function." All staff members of AFRI are being reorganized into 4 institutes and one center, not directly aligned with the 5 RFAs. The institutes and center are the Institute of Food Production and Sustainability; the Institute of Bioenergy, Climate, and Environment; the Institute of Food Safety and Nutrition; the Institute of Youth, Family and Community; and the Center for International Programs. There are still questions about coordinating committees, Hatch dollars, etc. Patricia will be assigned to one of the 4 institutes. Shorna: what is happening to Hatch dollars, extension allocations, etc.? Patricia: they are being negotiated. The Center for International Programs was in its ascendancy because one of the undersecretaries was pushing to feed the world, but that undersecretary has since become head of USAID. Best advice: work through people you know. Discussion followed. Patricia noted that our research group (WERA 1010) is precariously positioned: it can either take leadership or be servants for the biophysical scientists. Thus, this group could guide NIFA and others in the land-grant system so that surveys are done with high quality. John: Hatch funds? The lobbies have been effective, and the Hatch funds are secure for a while, but all land grants received a letter directing them to align with the priorities. Time was spent discussing the implications and strategies for getting ahead as social and survey scientists. One theme: survey research is important; we need to provide guidelines to ensure that it is done well. Ginny suggested that we go to the OMB website on standardized practices for surveys. It's done. We agreed that as a group we should distill the 29-page OMB document; Jim Christenson and Patricia will focus it. The way it is expressed is in terms of the human and social implications of the research.

The last topic was a return to declining response rates. Ginny showed response rates for the Oregon Department of Motor Vehicles (DMV) for each month since April 1994. The data show declines, with response rates starting from about 70%. She used a time series analysis and fit a piecewise linear trend with 3 pieces. The 1st segment (prior to February 2001) used a one-page questionnaire with 4 mailing contacts; the 2nd segment (March 2001-July 2003) moved to a two-page questionnaire and used a 3-contact approach, dropping the postcard and changing the first contact to a pre-letter; and the 3rd segment returned to 4 contacts (bringing the postcards back in). The model also incorporates a 2nd-order autoregressive error term. Other factors were not significant, including minor consent, number of questions, and a question about identification of the respondent. So what is the decline? Prior to 2001, a 1.42% decline in response rates per year; between 2001 and 2003, a 7% decline per year; and now a 0.6% decline per year. The trends apply to all groups: men have lower response rates than women, young people have lower response rates than older individuals, etc. There are minor variations if you add seasonal components to the time series.
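As an illustration of the model Ginny described, the sketch below fits a piecewise linear trend with breaks near February 2001 and July 2003 and 2nd-order autoregressive errors. The monthly response rates are simulated placeholders (with per-year declines roughly mimicking the figures above), and the statsmodels SARIMAX call is one possible way to fit such a model, not necessarily the software Ginny used:

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # Monthly series starting April 1994; the values are simulated, not the DMV data.
    months = pd.date_range("1994-04-01", periods=180, freq="MS")
    t = np.arange(len(months))
    k1 = months.get_loc("2001-02-01")        # first break: questionnaire/contact change
    k2 = months.get_loc("2003-07-01")        # second break: return to 4 contacts

    rng = np.random.default_rng(1)
    y = (70 - 0.12 * t                       # roughly 1.4 points/year before 2001
         - 0.46 * np.maximum(t - k1, 0)      # roughly 7 points/year 2001-2003
         + 0.53 * np.maximum(t - k2, 0)      # roughly 0.6 points/year after mid-2003
         + rng.normal(0, 1.5, len(t)))

    # Piecewise linear trend: the slope is allowed to change at each break point.
    X = np.column_stack([t, np.maximum(t - k1, 0), np.maximum(t - k2, 0)])

    # Regression on the trend terms with AR(2) errors.
    result = SARIMAX(y, exog=X, order=(2, 0, 0), trend="c").fit(disp=False)
    print(result.params)   # intercept, slope terms, AR(1), AR(2), error variance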
Don argues that response rates may not be declining. His data are from national parks, using in-person delivery and mail-only response from visitors to National Parks. Don described the survey, for which procedures remained the same over twenty years, but sponsors increased the number of items, the number of pages (12 to 16), and the number of items per page over that period. They also increased the number of replacement questionnaires to two. Response rates correlated negatively with the number of pages and items. Average response rates have declined from 80% in the late 1980s to about 70% in recent years; the overall mean response across 20 years has been 76%. When measures of salience are brought in, about 46% of the variance is explained, and year adds 4%. Because of differences in pages, number of items, and the use of replacements, it is difficult to write up a definitive analysis. Work on this paper will continue.

The meeting ended with a discussion of topics for next year. A list of publications printed in 2009 is given in Appendix A. The meeting adjourned. The next meeting will be February 24-25, 2011. Minutes submitted by Fred Lorenz.

Accomplishments

Publications listed below show the accomplishments of the group. The meeting ended with a discussion of future work, publications, and impacts.

Impacts

  1. Higher quality survey research
  2. Setting standards for survey response rate expectations

Publications

2009 Publication List

1. Martin, Elizabeth Ann and Don A. Dillman. 2008 (published in 2009). "Does a Final Coverage Check Identify and Reduce Census Coverage Errors?" Journal of Official Statistics 24(4): 571-589.
2. Rookey, Bryan D., Steve Hanway, and Don A. Dillman. 2008 (published in 2009). "Does a Probability-Based Household Panel Benefit from Assignment to Postal Response as an Alternative to Internet-Only?" Public Opinion Quarterly 72(5): 962-984.
3. Dillman, Don A., Jolene D. Smyth, and Leah Melani Christian. 2009. Internet, Mail and Mixed-Mode Surveys: The Tailored Design Method, 3rd edition. Hoboken, NJ: John Wiley. 499 pp.
4. Dillman, D.A., G. Phelps, R. Tortora, K. Swift, J. Kohrell, J. Berck, and B.L. Messer. 2009. "Response Rate and Measurement Differences in Mixed-Mode Surveys Using Mail, Telephone, Interactive Voice Response, and the Internet." Social Science Research 38(1): 1-18.
5. Christian, Leah Melani, Nicholas L. Parsons, and Don A. Dillman. 2009. "Measurement in Web Surveys: The Importance of Visual Layout and Design." Sociological Methods and Research 37(3): 393-425.
6. Dillman, Don A. 2009. "Some Consequences of Survey Mode Changes in Longitudinal Surveys." Chapter 8 in Lynn, Peter et al. (eds.), Methodology of Longitudinal Surveys. London: John Wiley. Pp. 127-137.
7. Millar, Morgan M., Allison C. O'Neill, and Don A. Dillman. 2009. Are Mode Preferences Real? Technical Report 09-003. Washington State University Social and Economic Sciences Research Center. Pullman: Washington State University. 52 pp.
8. Smyth, Jolene, Don A. Dillman, Leah Melani Christian, and Mallory McBride. 2009. "Open-Ended Questions in Web Surveys: Can Increasing the Size of Answer Boxes and Providing Extra Verbal Instructions Improve Response Quality?" Public Opinion Quarterly 73 (Summer): 325-337.
9. Munoz-Hernandez, B., V.M. Lesser, J. Dorney, and R. Savage. 2009. "Survey Methodology for Assessing the Map Accuracy of Geographically Isolated Wetlands." Environmental Monitoring and Assessment 150: 53-64.