SAES-422 Multistate Research Activity Accomplishments Report
Sections
Status: Approved
Basic Information
- Project No. and Title: WERA_OLD1010 : Reduction of Error in Rural and Agricultural Surveys
- Period Covered: 10/01/2009 to 10/01/2010
- Date of Report: 04/05/2011
- Annual Meeting Dates: 02/24/2011 to 02/25/2011
Participants
Virginia Lesser (Chair: Oregon) lesser@stat.orst.edu
Jim Christenson (AES Advisor: Arizona) jimc@ag.arizona.edu
Don Dillman (Washington) dillman@wsu.edu
Glenn Israel (Florida) gisrael@ufl.edu
Fred Lorenz (Secretary: Iowa) folorenz@iastate.edu
Todd Rockwood (Minnesota) rockw001@mn.edu
Steve Swinford (Montana) swinford@montana.edu
Members still interested but unable to attend:
Patricia Hipple (USDA) PHIPPLE@nifa.usda.gov
John Allen (Utah State) johna@ext.usu.edu
Shorna Broussard Allred (New York (Cornell)) srb237@cornell.edu
Dave Peters (Iowa) dpeters@iastate.edu
Marilyn Smith (Nevada) smithm@unce.unr.edu
Fern Willits (Penn State) fkw@psu.edu
Angela Mertig (Tennessee: Middle Tennessee State) amertig@mtsu.edu
Nick Place (University of Maryland) nplace@umd.edu
Wuyang Hu (University of Kentucky) wuyang.hu@uky.edu
Rob Robertson (New Hampshire) robertr@cisuix.unh.edu (email does not go through)
John Saltiel (Montana) jsaltiel@gmail.com
Courtney Flint (University of Illinois (Champaign-Urbana)) cflint@uiuc.edu
Deceased:
Robie Sangster (BLS), 2009
Bob Mason (Oregon State), 2011
Agenda:
- Welcome - Ginny Lesser
- Administrative advisor - Jim Christenson
- Studies on mixed-mode surveys - Dillman, Lesser, Israel
- General/specific questions - Fred Lorenz
- Studies on open-ended questions - Glenn Israel
- Nonresponse paper for AAPOR - Dillman, Lesser, Israel
- Mail response rate trends - Ginny Lesser
- Remaining state reports
- Discussion of outreach activities
- Todd Rockwood on respondent-driven sampling
- Other activities
Ginny Lesser opened the meeting; we remembered Bob Mason, a founding member of this committee and an active participant through our February 2009 meeting. Our administrative advisor, Jim Christenson, announced that he is planning to retire in December, so we need to find another administrative advisor. Several names were suggested. The committee expressed its appreciation to Jim for his interest in our work and his hospitality.
The time and place of next year's meeting is being negotiated.
The notes that follow summarize the important points in each PowerPoint presentation.
The discussion began with Don Dillman's studies of mixed-mode surveys, especially his efforts to meld web and mail surveys. Don discussed several results drawn from three papers, two in Public Opinion Quarterly and one in the American Behavioral Scientist. These papers are concerned with strategies for combining mail and web, often using email augmentation to push respondents toward web responses. The basic idea is to first send the questionnaire by mail and then follow up with an email containing an electronic link. He also provided incentives to one group. In one experiment, Don reported that when college student respondents received only postal contacts, the response rate to a web survey was 41%; when the postal contacts were augmented with an email, the response rate increased to 54%. The essential strategy that emerged is to send incentives via mail and then augment with emails.
A second study compared web and mail surveys using variations on mail and web contacts. One result was that in three separate surveys - one in the Lewiston/Clarkston region, the Washington community survey (2008), and the Washington economic survey (2009) - the mail-only mode had the highest response rates (57% - 71%). When the first contact was by web, response rates were lower (31% - 41%) but increased when a later, fourth contact was made by mail (46% - 55%). The $5 incentive increased web + mail responses by 20.6% and mail-only responses by 13.5%. In looking more closely at who responds, Don noted that, compared with mail follow-up, web respondents tend to be younger and better educated, with higher incomes, and are more likely to be married, have children, and have internet access at home. However, when combined, the web + mail respondents are very similar to mail-only respondents. In summary, Don noted that there are good coverage, response rate, and nonresponse error reasons for using mixed-mode designs. Further, measurement differences may result owing to visual and question format differences. A subsequent discussion raised questions about going further, to smartphones, iPods, etc. These are new areas to explore.
Don reported on one more study, currently being conducted in Pennsylvania, Alabama, and Washington, on how people want their electricity (a tailored design study). The methodological twist was to use questionnaire covers tailored to each state and to push harder to move respondents to the web. The design used four treatments, each with four contacts, ranging from mail only to a small web push and a strong web push. The routine was: (1) a mailing inviting participants to go to the web, with $5; (2) a reminder; (3) introduction of a paper questionnaire, with $5. The goal is to get a higher proportion of respondents to the web; if the web demographics can be made to better represent the population, the mail mode could eventually be dropped. Discussion followed.
Following up on Don's experiments, Glenn Israel reported on obtaining responses by mail and web. His first slides reviewed the sequence of studies on extension customer satisfaction. Glenn's treatments were: mail only (standard practice since 2003); mail with web choice; and web preference (the initial request included web only, but follow-ups provided a choice of web or mail). The mail-only treatment had a 64.5% response rate, compared to mail with web choice (51.4% by mail and 7.8% by web) and web preference (23.4% by mail and 29.2% by web). Glenn elaborated on this in a second survey in which he either had, or did not have, an email address for sample members. Among those for whom he had email addresses, some received only a mail questionnaire, and the response rate was 52.3%; those selected for web preference responded by either mail (12.4%) or web (35.8%), and those selected for email preference responded by mail (7.7%) and web (55.8%). If an email address was not provided, 56.3% responded when they received only mail contacts, compared with 50% total for those given the web preference (28.6% by mail and 21.4% by web). Glenn's slides also discussed related issues dealing with response rates by combinations of contacts. His results indicated small differences by mode in respondent reports of satisfaction with extension services; mail only seemed to have the lowest proportion reporting they were very satisfied, as opposed to satisfied or less. Time was spent discussing Glenn's slides.
Ginny Lesser continued with her state report, which included two mixed-mode studies: one testing cover letter details and one testing the usefulness of a fifth contact made with either an additional mailing or a postcard. Her state report also covered a summary of mixed-mode surveys done at Oregon State and a trace of response rates over time. For Study 1, Ginny discussed the effectiveness of cover letter statements pointing out the efficiency of completing the questionnaire on the web. One version of the cover letter included an additional sentence encouraging respondents to use the internet to save money. The mail-only treatment resulted in a 41% return. She then compared response rates by treatment (standard cover letter vs. a cover letter emphasizing the savings of completing the questionnaire by web). For the web-followed-by-mail contact, 8.5% responded by web with the standard letter and 10.9% responded by web with the cost-savings letter. When the sample group was offered the option of completing the survey by web or mail, 3.9% responded by web with the standard letter and 7.4% with the cost-savings letter.
Ginny's second study, of motorized boat owners, again compared combinations of mailings. The sample of motorized boat owners was first asked to complete the questionnaire by web; nonrespondents were then sent mail follow-ups. The fifth contact differed between the two treatment groups: in one group, the last mailing was a postcard, and in the other it included another copy of the questionnaire along with another cover letter. The final response rates showed that the postcard was not as effective as a final mailing that included a questionnaire and cover letter: 56% for the mailing with a questionnaire and cover letter vs. 49% for the postcard.
Overall, Ginny summarized the seven mixed-mode surveys she has conducted over the past years and reviewed her summary points: if you want to drive people to complete a questionnaire by web, the first contact should provide only a web link and should not offer a mail option; in a web-followed-by-mail approach, use five contacts, and make the fifth contact a letter rather than a postcard; do not add extra instructions about accessing the web; add a line in the cover letter noting the cost savings of responding by web; and use a colorful front cover for the mail questionnaire.
Ginny's discussion was followed by a telephone conference call to Don's students, Benjamin Messer and Michelle Edwards, on issues relating to item nonresponse in web and mail responses to general public surveys. Don provided a handout summarizing nonresponse. The concern: where do we go with mixed mode? Should we mix high- and low-response methods? A series of tables was passed around comparing response rates by mode, including regression estimates showing sources of responses. The tables are to become part of a presentation at the AAPOR meetings later this year, and members around the table offered Don and his students suggestions on how best to present the data. One suggestion was to focus more attention on the descriptive statistics, which tell an interesting story once the results are simplified.
The committee adjourned Thursday evening at 5pm and reconvened Friday morning.
Todd Rockwood talked about the difficulties of sampling special and sensitive populations when the number of referrals is limited. He briefly described respondent-driven sampling (RDS) in slides co-authored with Melissa Constantine. His research focuses on networks of special groups in the Minneapolis area. The argument of RDS is that respondents provide names of others in their network, and information about those networks can be used to obtain population parameter estimates. This is done by having initial members of a target population act as seeds who nominate others (up to 3), with both seeds and referred persons receiving incentives. In this approach, sampling weights are calculated using information about each respondent's network size. Obtaining the weights requires referral chains of at least 4 - 5 waves and an estimate of each respondent's network size. Todd described the protocol as they apply it to the Hmong populations of Minneapolis and concluded by comparing some of the strengths and weaknesses of the approach. Discussion followed, with Ginny suggesting several publications by Thompson on closely related sampling procedures.
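To illustrate the weighting logic Todd described, the sketch below computes an inverse-degree, RDS-style estimate of a population proportion. The data and variable names are hypothetical; this is a minimal sketch of the general idea, not Todd's protocol.

```python
# Minimal sketch of respondent-driven sampling (RDS) weighting:
# each respondent is weighted inversely to the size of their reported
# network (degree), on the logic that well-connected people are more
# likely to be recruited. Data below are hypothetical.

respondents = [
    # (reported network size, has_attribute_of_interest)
    (12, True),
    (4,  False),
    (25, True),
    (8,  False),
    (6,  True),
]

# Inverse-degree weights, normalized to sum to 1.
weights = [1.0 / degree for degree, _ in respondents]
total = sum(weights)
weights = [w / total for w in weights]

# Weighted estimate of the proportion with the attribute of interest.
estimate = sum(w for w, (_, has_attr) in zip(weights, respondents) if has_attr)
print(f"RDS-weighted proportion estimate: {estimate:.3f}")
```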
Steve Swinford reviewed his projects, including his paper at the Midwest Sociological Society meetings in April 2010. Steve conducts a census of a school district on a given day and asks about drug and alcohol use (an epidemiology model) over the last 30 days or over the past year. The two reference periods produce different estimates, with the epidemiological (30-day) measure being higher when extrapolated across months. The survey is implemented annually; they plan to drop some of the questions but are still deciding which ones, and they are trying to close the gap between the measures. They do not have a panel, but they collect data every year, and he continues to work on this study. Steve also reviewed other studies underway. He is working on a statewide, eight-page mail survey of crime victimization. He has money to send a letter and a survey in one wave, but may not have enough to do a second mail-out. Ginny made a concrete suggestion: use a double sampling design, with a subsample of the nonrespondents, which would give a better estimate. Don asked about incentives; Steve said they cannot be used. The study is modeled after studies in other states, including Minnesota. Steve will be on sabbatical next year, and he outlined his sabbatical agenda.
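Ginny's double-sampling suggestion amounts to the usual two-phase estimator for nonresponse: intensively follow up a random subsample of the nonrespondents and combine the two groups, weighting each by its share of the original sample. A minimal sketch with hypothetical numbers:

```python
# Hypothetical illustration of a double-sampling estimator for nonresponse.
# Phase 1: the full mailing; Phase 2: an intensive follow-up of a random
# subsample of phase-1 nonrespondents. The combined estimate weights each
# group by its share of the original sample.

n_sample = 2000          # original mail-out (hypothetical)
n_respondents = 800      # phase-1 respondents
n_nonrespondents = n_sample - n_respondents

mean_respondents = 0.42        # e.g., proportion reporting victimization (hypothetical)
mean_nonresp_subsample = 0.31  # estimated from the intensively followed subsample

w_resp = n_respondents / n_sample
w_nonresp = n_nonrespondents / n_sample

combined_estimate = w_resp * mean_respondents + w_nonresp * mean_nonresp_subsample
print(f"Double-sampling estimate: {combined_estimate:.3f}")
```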
Fred Lorenz outlined his research on part-whole questions, in which a general question either precedes or follows a series of related specific items. He drew from two data sources: four replications of the Oregon DOT surveys, which Ginny referred to earlier, and mail and telephone versions of an Iowa study of satisfaction with local governments. The core of Fred's study was to look for evidence of response sets by conceiving of the specific items as manifestations, or symptoms, of an underlying latent variable. From this perspective, variance in each specific item is partitioned into common variance and error variance using confirmatory factor methods, and patterns in the systematic error variance may provide insight into how respondents answer questionnaires. Lorenz estimated a model in which the error variance of the second specific item was correlated with that of the first, the third with the second, and so on. The result, replicated across samples, was to improve the fit of the model more than would be expected by chance, as judged by a specially designed randomization test. The results suggest that respondents work their way through a questionnaire in a systematic fashion, such that the response to one specific item shapes the response to the next item. Discussion followed, with Don and Todd providing citations that could link this work back to earlier work on response sets.
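As a rough illustration of the randomization logic (not Fred's actual model or data), the sketch below fits a single-factor model, computes the average correlation between the residuals of adjacent items, and compares it with the distribution obtained when the item ordering is shuffled. The library choice and simulated data are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Hypothetical data: n respondents answering k specific items that share one
# latent factor, plus a small carry-over from each item to the next item
# (the "response set" pattern of interest).
n, k = 500, 8
latent = rng.normal(size=(n, 1))
items = latent + rng.normal(scale=0.8, size=(n, k))
items[:, 1:] += 0.3 * items[:, :-1]  # adjacent-item carry-over

# Residuals after removing the single common factor.
fa = FactorAnalysis(n_components=1).fit(items)
resid = (items - fa.mean_) - fa.transform(items) @ fa.components_

def mean_adjacent_corr(residuals, order):
    """Average correlation between residuals of consecutive items in `order`."""
    r = np.corrcoef(residuals[:, order], rowvar=False)
    return np.mean([r[i, i + 1] for i in range(len(order) - 1)])

observed = mean_adjacent_corr(resid, np.arange(k))

# Randomization test: is the adjacent-item residual correlation in the actual
# questionnaire order larger than in random orderings of the same items?
null = np.array([mean_adjacent_corr(resid, rng.permutation(k)) for _ in range(2000)])
p_value = np.mean(null >= observed)
print(f"observed adjacent residual correlation = {observed:.3f}, p = {p_value:.4f}")
```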
Friday afternoon we returned to several themes initiated earlier Friday morning or on Thursday, including a lengthy discussion of how best to frame subsequent research questions about mixed-mode surveys. One theme that recurred repeatedly in discussions of mixed-mode surveys was data quality, and one measure of data quality is item nonresponse. Glenn provided two slides demonstrating close parallels between mail and web surveys in counts of item nonresponse. In one case, about 30% of mail and 45% of web questionnaires had no missing item data. Among the 21 possible items in one questionnaire, Glenn reported that very few returned questionnaires had more than 5 missing items. The largest quantities of missing data were for open-ended questions.
Glenn also provided a handout on getting optimal answers to open-ended questions. He has a co-author who is skilled with qualitative data and who thinks of open-ended questions as narrative components, building on Sudman's application of Grice's maxims. They will continue their work on this theme.
Ginny continued with her state report (started the previous day with a discussion of her two experiments). She updated her work on response rates to surveys conducted by the OSU Survey Research Center since 1994. Overall, there has been a decline in response rates over time, from an average of about 70% in April 1994 to about 40% in October 2010. Looking more closely at the data, Ginny fit a spline and recorded slow monthly declines between 2000 and 2005, a precipitous decline between 2005 and 2008, and a less dramatic decline between 2008 and 2010. She concluded that response rates are falling for both males and females; they are lowest for males and for the younger age groups, and declining for all age groups.
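A fit of the kind Ginny described could be sketched as a piecewise-linear spline with knots near the reported change points (2005 and 2008). The data below are synthetic placeholders, not her actual response-rate series, and the knot placement is an assumption taken from the summary above.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

# Synthetic placeholder data: one observation per survey, response rate by year.
# (Ginny's actual series runs from about 70% in 1994 to about 40% in 2010.)
rng = np.random.default_rng(1)
years = np.sort(rng.uniform(1994, 2010, size=120))
trend = np.piecewise(
    years,
    [years < 2000, (years >= 2000) & (years < 2005),
     (years >= 2005) & (years < 2008), years >= 2008],
    [lambda t: 70 - 0.5 * (t - 1994),
     lambda t: 67 - 1.0 * (t - 2000),
     lambda t: 62 - 5.0 * (t - 2005),
     lambda t: 47 - 2.0 * (t - 2008)],
)
rates = trend + rng.normal(scale=3, size=years.size)

# Piecewise-linear spline (degree 1) with interior knots at the change points.
spline = LSQUnivariateSpline(years, rates, t=[2000, 2005, 2008], k=1)

for a, b in [(1994, 2000), (2000, 2005), (2005, 2008), (2008, 2010)]:
    slope = float(spline(b) - spline(a)) / (b - a)
    print(f"{a}-{b}: about {slope:+.1f} percentage points per year")
```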
Accomplishments
The publications listed in the publications section show the accomplishments of the group. The meeting ended with a discussion of future work, publications, and the impacts of the group's research.
Impacts
- Outreach: Most of the experiments reported on in these minutes arise from surveys conducted by WERA 1010 participants for university, state or community agencies. Examples include Glenn's evaluation of extension and his Your Florida Yard and Use Survey, Ginny's surveys for the ODOT, and Don's community surveys and evaluations of university services. Continued collaboration with these agencies motivates new experiments and provides empirically based advice for users of surveys.
Publications
2010 Publication List
Callegaro, Mario, Yang, Yongwei, Bhola, Dennison S., Dillman, Don A., and Chin, Tzu-Yun. 2009. Response Latency as an Indicator of Optimizing in Online Questionnaires. Survey Methodology Bulletin, No. 103: 5-25.
Constantine, M. L., Todd H. Rockwood, B. A. Schillo, J. W. Castellanos, S. S. Foldes, & J. E. Saul (2009). The relationship between acculturation and knowledge of health harms and benefits associated with smoking in the Latino population of Minnesota. Addictive Behaviors, 34, 980-983.
Constantine, M. L., Todd H. Rockwood, B. A. Schillo, N. Alesci, S. S. Foldes, T. Foldes, Y. Chhith, & J. E. Saul (2010). Exploring the relationship between acculturation and smoking behavior within four Southeast Asian communities of Minnesota. Nicotine & Tobacco Research, 12, 715 - 723.
Davern, M., D. McAlpine, T. J. Beebe, J. Ziegenfuss, Todd Rockwood, & K. T. Call (2010). Are lower response rates hazardous to your health survey? An analysis of three state telephone health surveys. Health Services Research, 45, 1324-1344.
Dillman, Don A., Ulf-Dietrich Reips and Uwe Matzat. 2010. Advice in Surveying the General Public Over the Internet. International Journal of Internet Science 5 (1): 1-4.
Dillman, Don A. and Benjamin L. Messer. 2010. Chapter 17: Mixed-Mode Surveys, in Peter Marsden and James Wright (eds.), Handbook of Survey Research. Emerald Publishing Limited: Bingley, United Kingdom. Pp. 551-574.
Cui, Ming, Jared A. Durtschi, M. Brent Donnellan, Frederick O. Lorenz, & Rand D. Conger (2010). Intergenerational transmission of relationship aggression: A prospective longitudinal study of observed behavior. Journal of Family Psychology, 24, 688-697.
Israel, Glenn D. (2010). Using web surveys to obtain responses from extension clients: A cautionary tale. Journal of Extension, 48, available at: http://www.joe.org/joe/2010august/a8.php.
Israel, G. D. 2010. Effects of Answer Space Size on Responses to Open-ended Questions in Mail Surveys. Journal of Official Statistics, 26(2), 271-285.
Israel, G. D. 2009. Obtaining Responses by Mail or Web: Response Rates and Data Consequences. JSM Proceedings, Survey Research Methods Section. 5940-5954. Available at: http://www.amstat.org/Sections/Srms/Proceedings/.
Mahon-Haft, Taj and Don A. Dillman. 2010. Does Visual Appeal Matter? Effects of Web Survey Screen Design on Survey Quality. Survey Research Methods, 4(1): 43-59.
Meier, A., Smith, M., and Usinger, J. (2010). Environmental Project Provides Work Experience for Rural Youth. Journal of Extension, 48(3). [Article No. 3IAW3] Article posted on-line June 2010. http://www.joe.org/joe/2010june/iw3.php.
Messer, Benjamin L. and Don A. Dillman. 2010. Using Address-Based Sampling to Survey the General Public by Mail vs. Web plus Mail. Technical Report 10-13. Washington State University Social and Economic Sciences Research Center, Pullman.
Morrison, Rebecca, Don A. Dillman and Leah Melani Christian. 2010. Questionnaire Guidelines for Establishment Surveys. Journal of Official Statistics. 26 (1): 43-85.
Rockwood, Todd & M. Constantine (2009). Item and instrument development to assess sexual function and satisfaction in outcome research. International Urogynecology Journal, 20 supplement 1: S57 - 64.
Smyth, J. D., Dillman, D. A., Christian, L. M., & O'Neill, A. 2010. Using the Internet to survey small towns and communities: Limitations and possibilities in the early 21st century. American Behavioral Scientist, 53: 1423-1448.
Toepoel, Vera and Don A. Dillman. 2010. Chapter 7. How Visual Design Affects the Interpretability of Survey Questions, in Das, Marcel, Peter Ester and Lars Kaczmirek (eds.), Social Research and the Internet: Advances in Applied Methods and New Research Strategies. Pp.165-190.
Usinger, J. and Smith, M. (2010). Career Development in the Context of Self-Construction during Adolescence. Journal of Vocational Behavior, 76: 580-591.
Wickrama, K. A. S., R. D. Conger, F. F. Surjadi, & F. O. Lorenz (2010). Linking early family adversity to young adult mental disorders. In W. Avison, C. S. Aneshensel, S. Schieman, & B. Wheaton (Eds.), Recent Advances in Stress Research: Essays in Honor of Leonard I. Pearlin. New York: Springer.
Wilcox, A. S., Giuliano, W. M., & Israel, G. D. 2010. Response Rate, Nonresponse Error, and Item Nonresponse Effects When Using Financial Incentives in Wildlife Questionnaire Surveys. Human Dimensions of Wildlife, 15(4), 288-295.