SAES-422 Multistate Research Activity Accomplishments Report

Status: Approved

Basic Information

Participants

Vicki McCracken (Administrative Advisor: Washington) mccracke@wsu.edu; Don Dillman (Chair: Washington) dillman@wsu.edu; Fred Lorenz (Secretary: Iowa) folorenz@iastate.edu; Rob Robertson (New Hampshire) robertr@cisuix.unh.edu; Steve Swinford (Montana) swinford@montana.edu; Glenn Israel (Florida) gdi@ifas.ufl.edu; Virginia Lesser (Oregon) lesser@stat.orst.edu; Angela Mertig (Middle Tennessee State) amertig@mtsu.edu; Marilyn Smith (Nevada) smithm@unce.unr.edu; Bob Mason (Oregon) masonr@stat.orst.edu; Brad Gentner (National Oceanic and Atmospheric Administration) brad.gentner@noaa.gov

The following were not in attendance but are members or still have interest:

John Allen (Utah State) johna@ext.usu.edu; Tommy Brown (Cornell) tlb4@cornell.edu; Shorna Broussard (Purdue) srb@fnr.purdue.edu; Fern Willits (Pennsylvania) fkw@psu.edu; Todd Rockwood (Minnesota) trockwood@mn.rr.com; Robie Sangster (BLS) Sangster_R@bls.gov; Brian Meekins (BLS) Meekins_B@bls.gov; Loretta Singletary (UN, Reno) singletary1@unce.unr.edu; John Saltiel (Montana) jsaltiel@gmail.com

Minutes

WERA1001: Experiments in Survey

March 1 - 2, 2007

The 2007 annual meeting of WERA1001 was convened by Chair Don Dillman at 8:15am on March 1 at Tucson InnSuites.

Opening comments and announcements:

The meeting was called to order by Chair Don Dillman. Don reported on the absence of Fern Willits (staying close to home), Robie Sangster and Brian Meekins (budget problems at BLS), Tommy Brown (approaching retirement), Todd Rockwood (another meeting), and John Allen (other meetings). Loretta Singletary is absent but Nevada is represented by Marilyn Smith. No word from Shorna Broussard at Purdue. Coleen McCracken (Washington State University) attended the second day of the meetings. Don made the following announcements: (1) This is our 5th year as a coordinating committee, and we have to discuss whether to continue. If we want to continue, we will need to submit a new proposal. This group has been meeting since about 1987, and our objective is to improve rural and agricultural surveys.

(2) Accomplishment: the personalization paper has been accepted by Rural Sociology.

(3) There will be a session on experiments in open-ended questions at the Joint Statistical Meetings (JSM) in Salt Lake City this year (August 2007).

Administrative advisor Vicki McCracken commented on changes in the USDA and reported on USDA reorganization proposals currently being considered by various groups. The Danforth proposal suggests restructuring to create a "National Institute of Food and Agriculture" within USDA that would have a funding structure like NIH. Another suggestion ("Create 21") is to reorganize units within the USDA under a new National Institute for Food and Agriculture; this approach would move much of what exists into new agencies within the new NIFA. Some of these issues are being discussed within the new farm bill. Overall, these proposals are friendly to multi-state regional projects. There were no Special Grants funded this year; the funds that had been distributed in the previous year for Special Grants were put into CSREES in the Hatch funds category, with the intent of being distributed and used as Hatch funds. She noted that CSREES budgets have been distributed in a way that affects land grant universities differentially.

Discussion of open-ended questions: Don began the discussion by providing background on open-ended questions. He noted that, with the rise of electronic questionnaires, open-ended questions are making a comeback, and that responses to open-ended questions are sensitive to the size of the space allowed.

The first report on open-ended experiments was from Ginny Lesser. In one study, she looked at two factors - colored vs. black & white questionnaires and an open-ended comparison (lines vs. no lines) - across 8 studies with 4 versions of the questionnaires. A second study, for the Oregon Dept. of Transportation (ODOT), made the same comparisons (mail survey with 4 contacts; black & white with lines and no lines). Ginny showed response rates by color of paper. For the ODOT study, the response rates were 44.5% and 43.3% for black & white and color, respectively. In one of the 8 studies, responses were filled out by staff rather than consumers of services, and that study had response rates of 70% and 47.7%. On average over the 8 studies, the response rate with colored paper was 30% and with black & white was 37%. Ginny's findings largely rule out color as an important factor affecting response rates; color is much more expensive and does not improve (and may hurt) response rates. For the open-ended questions, the average numbers of words were 21.1, 18.2 and 12.1 for black & white and 18.8, 23.0 and 12.6 for color questionnaires (no pattern of significant differences). Question from Angela: why is color an issue? Ginny noted that in a survey done in 1999, green bubble sheets generated 60% response rates while white instruments, where respondents circled boxes in the usual way, returned a much smaller fraction.

Glenn Israel reviewed the experiments he has done so far. He used two open-ended box sizes for two questions, 1.12 and 0.28 inches, in 2004; this was replicated in 2005 and 2006 with an expanded range of box sizes. He showed a table with box heights ranging from 0.28 to 1.68 inches. He looked at several outcomes. First, there were no differences in whether respondents provided any open-ended answers. Second, the number of words increased with box size (from 10.3 to 24.8 words). Third, lines of words also increased, for example from an average of 1.3 to 2.9 lines. Finally, as box size increased, higher percentages of comments stayed in the space provided by the box (from 81.6% to 95.4%). Glenn provided a handout to demonstrate a scheme they are using to code the substance of the open-ended answers. The objective is to see whether box size affects the substance of what is said.

Steve Swinford reported on surveys in Montana. One was the Bozeman Study Commission survey (n = 1000 sent out; 356 completed). The open-ended question "Please provide any additional comments" was answered by about 35%. The experiment compared comments in a box vs. open space (no box). Most comments were short; 55 and 52 respondents offered comments in the box and no-box conditions, respectively. The 2nd survey instrument was sent to municipal clerks in Montana, and the differences were suggestive (23/46 (50%) and 17/53 (70%) comments in the box vs. no-box conditions). A question was raised by Bob Mason: do boxes lead to the elaboration of prior answers, or do they elicit additional themes? A 3rd survey was done for Petit Jean State Park in Arkansas. Data were collected by face-to-face interviews and asked about cigarette litter; again, box vs. no box. Question: do interviewers shorten or lengthen the answers given to them? Analyses are underway. The 4th experiment was about the cleanliness of Petit Jean State Park; the question about cleanliness was given before or after questions about cigarette litter. The 5th study was from the NorthEast neighborhood in Bozeman and involved both face-to-face interviews (n = 88; no experiments) and a box vs. no-box drop-off experiment (131 responded out of 250), with 6 open-ended follow-ups on contentious issues. The 6th study was a "women in engineering" survey at another school; this one had box (n = 53 responses), no-box (47) and lines (54) conditions. Here too, analyses are in process. The 7th study was on sexual assault, given to students by a student. There were 81 box and 182 no-box responses, but the denominators aren't known at this time; analyses continue.

Don continued the discussion of open-ended questions, starting with a presentation titled "Open-ended questions in web and telephone surveys" with Jolene Smyth and others. They explored size of box for paper, telephone and web surveys, and looked at words, themes, elaboration, response time, and item non-response. They did this with 4 surveys and multiple forms of the instruments. Conclusion: more words, more themes and more elaboration with a larger box in the mail questionnaire. For web surveys, box size makes no difference in words, themes, elaboration or response time. This suggests that the size of the answer space conveys a different meaning on paper than on the web. Why? Two hypotheses: scrollable boxes on web surveys allow for more text, and differences in handwriting vs. typing may make box size more salient on paper. One interesting result is that respondents provide longer answers on the web. The next experiment added an instruction that said "you are not limited in your answers by the size of the box," even though the boxes were of different sizes. Results: with the instruction, there were more words and more elaboration, but the number of themes did not change. Non-response is low in any case, and the mean response time goes up, which is good in this case. Next, survey 3 elaborated the preamble to encourage respondents to "take their time." Results: again, more words and elaboration but no more themes. Item non-response (percent offering comments) went up, but among those who responded, average response time increased. The open-ended question in this survey included a probe, "is there anything else," at the end. The probe did not get anything; there were no probe effects. Survey 4 contained both the "important" and "take your time" treatments, plus the combination. Using both worked best, although not significantly better than either of the two treatments alone. When the probe "can you tell me more" was used instead of "is there anything else," there were significantly more themes, more elaboration, and longer response times.

Digressing to response rates: how do you get a 60% response rate via the web? Use alternative contact modes, including a letter with a $2 incentive to the mailing address, including home addresses. Then follow immediately with e-mail, and then follow up with additional postal contacts (to those without an e-mail response) and e-mail contacts (to those without a mail response). One result: with only postal contacts (with incentive), the response rate was 46%, but with postal and e-mail it was 61.4%. This was from 2004; now most e-mail addresses work.

Summary of the open-ended discussion: we have 4 different papers on box size for open-ended questions for the JSM in August. They address different aspects of open-ended questions.

A discussion followed on internet access and telephone surveys. One concern of survey researchers is the future of telephone surveys, with people switching from in-home phones to cell phones. Don presented data on internet access in 2006. Among those under 30 years of age, 88% have internet access, compared with 32% among those over 65. There were no differences in internet access between men and women, or among white, black and Hispanic respondents.

Presentation of state reports (Part I: Thursday afternoon): WERA1001 members (Brad, Marilyn, Fred, Bob and Ginny) reported on their activities. Brad Gentner discussed recreational surveys regarding fishing. Fishermen are asked about fishing at the end of the day after fishing (n = 58,000 "intercept" interviews per year on the east coast; n = 45,000 on the gulf coast; fewer on the west coast and in Hawaii). These data are used to estimate catch retained and thrown back. In addition to catch, Brad asked about expenditures in a telephone follow-up survey; this year it was done by mail questionnaire. About 20,000 respondents to the intercept interviews agreed to give an address, and another 20,000 addresses were obtained from a license frame, and they were followed up. The focus was on durable expenditures. In Florida, where the sample is large, Brad divided the sample into mail and telephone modes to look at measurement issues related to reporting catch and expenditures. Discussion followed regarding non-response bias. The question was raised: are there opportunities for experimentation? Yes, a comparison of telephone and mail surveys. There will be a national angler registry, and the value of this frame might be compared with the intercept surveys. Response rates are running in the mid-40s.

Marilyn Smith reported on her work with Loretta Singletary on "Nevada Agriculture Producer Research and Educational Needs," completed for several producer groups. The monograph reported the results of a survey that covered a range of issues, including water, range land, and agricultural uses and needs. The target audience is the users themselves, with data presented in percentages by priority areas. The top 4 issues all relate to water; the 5th is about noxious weeds. This survey is done with the Nevada Agricultural Statistical Service (NASS); Marilyn sends envelopes and questionnaires to NASS, which then sends out the questionnaires. The response rate was 23%.

Fred Lorenz reported on two points. (1) He is moving to psychology, and psychology has a student pool for conducting research; he has access to this group, which may offer opportunities for conducting experiments. (2) The Center for Rural Population Surveys proposal has been resubmitted and will be reviewed. This is a second submission: the proposal wasn't funded last year, but they addressed the reviewers' concerns and are mildly optimistic. If it is funded, the plan is to provide advanced sampling frames for detailed rural surveys. Fred also described the work he is doing with panel studies of rural families through the ISU Institute for Social & Behavioral Research (ISBR).

Bob Mason talked about ways to code open-ended questions in a very simple way so that it can be done on-line. His suggestions are documented in a chapter he wrote for an edited book that will appear this coming year.

Ginny reported on her research dealing with mode comparisons, declining response rates, non-response, and future activities. The mode experiment was done with the Oregon Department of Transportation (ODOT) division of motor vehicles (DMV). Each month they pull 400 names from people who visited a DMV office in the past month. The survey is by mail, with a mailing and a follow-up. The questionnaire is always about the same length, and the 1st page stays the same, but there are minor changes on the 2nd page. Starting in July 2006 there are now three mailings. Ginny showed response rates by age and gender. The adjusted response rate through Nov 06 has an intercept (June 2001) of 0.5307 and a slope of 0.0024 (Rsq = 69%); in July 06 there was an additional postcard and a small spike. If the slope is per month, 0.0024 x 12 is roughly 0.029, meaning responses are declining at an annual rate of about 2.9 percentage points. The rates are dropping for both males and females, with women and older respondents having the highest response rates. One possible reason for the declining response rate may be compositional: the composition of respondents may be changing because younger and better educated respondents may be renewing their annual automobile tags on the web. One comment (Don): this is the strongest evidence yet that response rates are going down; the composition of respondents and their reasons for visiting the office may be changing.

Ginny also talked about non-response error research in the Environmental Monitoring Assessment Program (EMAP). Missing data result when an owner doesn't allow you to enter the land. Two approaches are adjusting for missing data and multiple imputation incorporating a geo-statistical model. Ginny's paper (under review) compared these two methods. Imputation is based on auxiliary measures (location, elevation, tree cover, etc.) and is like a regression model where you use existing data to predict missing data. Multiple imputation does the imputation several times to get an estimate of imputation error (the standard combining rules are sketched after the renewal discussion below). In her study, Ginny added geo-statistical data that allow spatial variation to be taken into account. Ginny estimated coho salmon spawners using observed data, single imputation, hot deck, mean imputation, multiple imputation, and hot deck multiple imputation. Multiple imputation had the largest standard errors. The estimates of coho were larger when using imputation. Why? Because the missing sites had the largest concentrations of coho salmon. The auxiliary variable was location.

Ginny is also doing a survey for the National Center for Accessible Transportation. She worked with the National Multiple Sclerosis Society for lists of people with disabilities, with a supplemental RDD sample of people who have traveled by plane in the past 10 years and have a disability. Respondents will respond by mail, telephone or web, with 500 from each sample and 1/3 assigned to each mode. Results should be available next year.

Discussion of whether to renew WERA1001 for another 5 years: Vicki opened the discussion of renewal. She handed out the WERA1001 proposal, Appendices B and E, and Attachment 1 for the directions, which outlines the requirements. Do we want to go forward as a project, and if so, as a Coordinating Committee or as a Western Education and Research Activities (WERA) group? The difference isn't important; there are no particular advantages either way. Because of the way money comes to experiment stations, and with the obligation to do 25% regional research, there does not seem to be a mind set to reduce regional research. Do we want to continue? Consensus around the table is to continue.
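For reference on the multiple-imputation point above: the report does not give the formulas used in Ginny's EMAP work, but the standard rules for combining estimates across m imputed data sets (Rubin's rules), with per-imputation estimates \(\hat{Q}_j\) and within-imputation variances \(U_j\) (notation assumed here, not taken from the report), are

\[
\bar{Q} = \frac{1}{m}\sum_{j=1}^{m}\hat{Q}_j, \qquad
B = \frac{1}{m-1}\sum_{j=1}^{m}\bigl(\hat{Q}_j-\bar{Q}\bigr)^2, \qquad
T = \bar{U} + \Bigl(1+\frac{1}{m}\Bigr)B,
\]

where \(\bar{U}\) is the average of the \(U_j\). Because the total variance \(T\) includes the between-imputation component \(B\), multiple-imputation standard errors are generally larger than those from a single imputation, which is consistent with the finding that MI had the largest standard errors.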

Due date: the proposal should be done by May 15 for the RCIC meetings in July.

Presentation of State Reports (Part II: Friday): Members of the committee (Ginny, Don, Glenn, Steve, Rob, Don) reported on their activities. Ginny reported on her mode comparison study. ODOT conducts a needs survey every other year; in the past, it has been a 22-minute RDD phone survey that covers a range of issues. This study did an RDD survey, a mail survey, and a web/mail survey. The design: (1) a telephone sample from Genesys with 15 attempts, 5 regions in the state x 200 per region; (2) mail using the delivery sequence file (DSF; postal); and (3) web/mail, also from the DSF file. The mail version used a preletter, a 1st mailing, and a 2nd mailing to nonrespondents. The web/mail version used a preletter, a 1st mailing directing respondents to the website, and then a follow-up with a paper questionnaire. One problem is that the DSF files have addresses but no names. The response rates:

Telephone (n = 1000): 30.0% response rate; male = 39%, employed = 62%, BS+ = 33%, $75k+ = 25%

Mail (n = 500): male = 53%, employed = 54%, BS+ = 36%, $75k+ = 18%
    ODOT preletter: 34.0%
    SRC preletter: 29.8%

Web/Mail (n = 400): male = 50%, employed = 61%, BS+ = 35%, $75k+ = 21%
    ODOT preletter: 24.5%
    SRC preletter: 20.3%

Percent white (90%) and age (56-57) were about the same for all three groups. Ginny compared responses to questions depending on the group respondents were in. The data were weighted to reflect the 2.9 million people living in Oregon, and there was a non-response adjustment and a post-stratification adjustment. Result: 40 of the 67 variables showed significant differences across modes (p < 0.05). In 36 cases, the phone was different, with more telephone respondents (by about 11%) selecting the 1st response category. In contrast, when ignoring the telephone and comparing mail and web/mail, only 1 question was significant. Don: this is consistent; telephone gets more extreme responses. The DSF and DMV addresses allowed for the study of non-deliverables: 11.5% were undeliverable using DMV addresses and 9% for the DSF files, where the postal carrier decides what is undeliverable.

Don reported a study that paralleled Ginny's. Don wants an alternative to strictly telephone surveys because 15% of households will soon be cell-phone only. A place to start addressing this issue may be by tailoring the questionnaires to regions. For example, the 2005 Lewiston/Clarkston survey got a 69% response rate with a $2 incentive. From this study, 73% had access to the internet (55% from home), so we should be able to design for it. Who don't you get? Those over age 65 (32%). So a study was designed with the following sequence: (1) mail with questionnaire and $2; (2) mailed questionnaire; (3) mail reminder; (4) mailed questionnaire; (5) e-mail link. Results: mail first, 81%, eventually up to 82%; e-mail first, 42%, eventually up to 70%. Based on this experience, Don proposed the following possible study design: (1) mail only, all contacts; (2) mail until the final contacts and then announce the web option; (3) web preference with a switch to mail; (4) mail and web with emphasis on respondent choice (Don has details in a PowerPoint). All arms assume a $2 incentive.

One thought for improving web survey response is to experiment with web page design. Don showed a series of slides that used different colors and layouts on web pages. The results are that unaesthetic designs (discordant colors) yielded the highest response rates among college students. Idea: if we want to test website design, design one form with good design and alternatives with strange color and layout designs, and see what happens.

Glenn reported results from his Florida Cooperative Extension customer satisfaction survey. He had 4 versions of a 2-page survey. Pooling over 4 years, response rates are about 60%. Among the results: first, when you ask for year of birth, the important thing is showing 4 cells (_ _ _ _). Second, Glenn rearranged the order of the place-of-residence categories from farm-to-city and from city-to-farm. Last year there was a difference between the city categories in the residence question; this year he renamed the categories, and respondents may not have distinguished "rural, nonfarm" from "farm" when "downtown" and "subdivision of a town" came first and "rural nonfarm" preceded "farm." Perhaps respondents anticipate response categories and don't read carefully. Third, Glenn experimented with horizontal vs. vertical layout and reversed response categories. Effects due to reversal were not significant when the response framework was vertical. He speculated that respondents expect (1) positive responses first and (2) consistency in the response framework throughout a set of questions; difficulties result for respondents when the frameworks are mixed. This idea was discussed further.
Bob suggested we refer back to two books, "Thinking About Answers" and "Answering Questions," by Sudman, Schwarz and Bradburn.

Steve reported on his plans for 2007. His activities include giving a seminar at the Keep America Beautiful (KAB) conference and presenting at the Midwest Sociological Society meeting in Chicago. He will do a Montana Salary Survey at the municipal and county levels and a 3-year longitudinal mail Rural Health Survey for a nursing study. He is co-PI on an NSF grant on immigrant labor dealing with domestic workers; the surveys will be in Spanish, so there is an opportunity to implement experiments in another language. He will also participate in a 5-year Montana Rural Poverty extension study and a second assessment of the Sibling Interaction Scale.

Rob presented material from New Hampshire. He has projects related to the NH SCORP plan, Coastal Communities Responding to Change, and public access to private lands. A survey on Public Access to Private Lands in Northern Forests (with Tommy Brown) is in the field right now (see www.uvm.edu//tourismresearch/Private_Landprivate_land.httm). They are also doing a web survey for the Lamprey River Study. They oversampled the watershed (3,500 surveys with one mailing and a 33% response rate with an ACE Hardware coupon); the questionnaire was 14 pages long. They also connected with Public Radio and used a postcard mailing to gain publicity about the survey and to check non-response bias. In another study Rob is looking at tourism and the interests of tourism stakeholders, with contacts through NPR, connections to Rob's website, and through space links. Similarly, other surveys are on Harmful Algal Blooms (HAB) and for the Washington (DC) sanitary services commission (WSSC). These studies are at various stages of implementation and development. One form of website survey is implemented through the MySpace system students use to network with one another. Other studies were summarized.

Angie Mertig listed the projects in which she is involved. In one study, they looked at the source of the survey, Michigan State or the DNR, and there was no difference. Studies were done on wolves in northern Lower Michigan, and phone cards were given as an incentive; the response rates were 60%. At Middle Tennessee, she is involved with surveys (1) of farmers, with a 50% response rate after 4 contacts, (2) of pharmacists' attitudes toward birth control pills, with no incentive but a 40% response rate at this time, and (3) of rodent control in California, with a questionnaire in both English and Spanish.

Don handed out his state report (dated March 1, 2007). He also handed out a paper (Public Opinion Quarterly 70(1): 66-77) comparing check-all and forced-choice question formats. A new edition of his book on mail and internet surveys is in press. Don also returned to the discussion of primacy (see the RS 1996 study). He notes that most of the time there is no primacy effect. However, he hypothesized that primacy may relate to satisficing, and three factors have been associated with satisficing: difficulty of the task, ability to perform the task, and motivation to perform the task. The goal is to isolate some causes and address them. Some of the causes may be in question stems and response options.

Some hypotheses: primacy may be more likely (H1) when reporting about generalized others; (H2) when information is not readily accessible, as when comparing accessible (yesterday) vs. inaccessible (on average) reference periods; (H3) when questions encourage socially desirable answers; (H4) when there are indistinct categories (e.g., every day and most days; excellent and terrific); (H5) when questions do not have specific quantifiers; (H6) when a "don't know" option is offered, which provides a way out; and (H7) when "don't know" is located near the top rather than the bottom. Discussion followed about how these hypotheses could be tested.

Administrative finale: we discussed the renewal, noting there will be e-mail correspondence so that we can complete the paperwork needed to continue the project. We also conducted elections. Ginny Lesser was nominated and elected chair and will begin in August after the new proposal is approved. Fred Lorenz was re-elected secretary. The next meeting of WERA1001 will be February 28-29, 2008.

Accomplishments

Work conducted by WERA participants provides contrasting perspectives on how important it is for the direction (positive to negative or vice versa) in which scales are printed in a survey to be consistent. Work in Washington showed that direction makes no difference in people's answers when it is used consistently. However, follow-up work in Florida showed that answers were affected when the direction of scales was changed within a single survey. Evidence suggests that these differences in results occur because respondents expect scales to be presented in the same way, and inconsistent display leads to unintentional errors. Results from these studies provide a practical guideline for survey designers: always present scales in the same direction.

Results from nine experiments in five states (New Hampshire, Pennsylvania, Washington, Oregon, and Idaho, a previous participant), forthcoming in Rural Sociology, showed that personalization remains effective for improving response to mail surveys, but is of questionable effectiveness in surveys of groups with special identities (e.g., users of extension information, forest land owners, and recreational use groups).

In Oregon, a modified Horvitz-Thompson non-response estimator for the population total was developed to adjust for nonresponse. By using this weighting-class adjustment, the impact of bias due to non-ignorable missing data is reduced.
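The exact form of the Oregon estimator is not given in this report; as an illustration only, a generic weighting-class adjustment to the Horvitz-Thompson estimator of a population total (notation assumed here) is

\[
\hat{T}_{wc} = \sum_{c=1}^{C}\sum_{i \in r_c} \frac{y_i}{\pi_i\,\hat{\phi}_c},
\qquad
\hat{\phi}_c = \frac{\sum_{i \in r_c} 1/\pi_i}{\sum_{i \in s_c} 1/\pi_i},
\]

where \(s_c\) is the sampled set in weighting class \(c\), \(r_c\) the set of respondents in that class, \(\pi_i\) the inclusion probability of unit \(i\), \(y_i\) the survey variable, and \(\hat{\phi}_c\) the estimated response rate for the class. Dividing each respondent's weight by \(\hat{\phi}_c\) inflates it so that respondents represent the full sampled share of their class, which is how a weighting-class adjustment reduces bias when response propensity varies across classes.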

Impacts

  1. Visual design principles were developed and applied to redesigning the USDA-NASS annual Agricultural and Resource Management Survey, making it possible to convert this form from being interview-administered only to being mostly administered by mail. This survey, which provides essential data on farm income and expenses by agricultural sector, was administered by mail in 2005, obtaining a mail-back response of 43% prior to sending enumerators to individual farms, who were able to raise response to 73%. By eliminating more than half of the enumerator contacts, this method of administration resulted in considerable cost savings for data collection.
  2. A series of web survey experiments on ways of communicating more clearly, using graphical and symbolic features in conjunction with answer spaces (rather than just in the question stem), dramatically lowered the incidence of "error messages" and increased the likelihood that respondents will not terminate answering mid-survey, thus improving data quality. Results from this experiment (published in Spring 2007 in Public Opinion Quarterly) have been adopted for use in multiple surveys, including the National Science Foundation's annual College Graduate and Earned Doctorate surveys.

Publications

Burdge, R.J. and R.A. Robertson. 2006. Social Impact Assessment and the Public Involvement Process. In R.J. Burdge (ed.), A Conceptual Approach to Social Impact Assessment. Middletown, WI: Social Ecology Press. pp. 177-187.
Claesson, S., R.A. Robertson and M. Hall-Arber. 2006. Fishing Heritage Festivals, Tourism and Community Development in the Gulf of Maine. Proceedings of the Northeastern Recreation Research Symposium, compiled and edited by Rudy Schuster. Gen. Tech. Rep. NE-302. Newtown Square, PA: U.S. Department of Agriculture, Forest Service, Northeastern Research Station. pp. 15-21.
Dillman, Don A. 2007. Mail and Internet Surveys: The Tailored Design, Second Edition-2007 Update. New York: John Wiley. 565 pp. ISBN: 0-470-03856-x.
Dillman, Don A. 2006. Why Choice of Survey Mode Makes a Difference. Public Health Reports. 121(1):11-13.
Dillman, Don A., T. Mahon-Haft, S. Cook, and K. Wright. 2006. Cognitive Evaluations of Alternative Questions for Determining Field of Study for Bachelors Degree(s) as a Follow-up to Highest Degree Question in the American Community Survey. Social and Economic Sciences Research Center Technical Report 06-037. Washington State University: Pullman. 82 pp.
Dillman, Don A., T. Mahon-Haft, and K. Wright. 2006. Effects of Question Structure on Answers to Field of Study Question Proposed for the American Community Survey: A Follow-up Test. Social and Economic Sciences Research Center Technical Report 06-041. Washington State University: Pullman. 20 pp.
Harrod, L.A. and V.M. Lesser. 2006. The Use of Propensity Scores to Adjust for Nonignorable Nonresponse Bias. Proceedings of the Survey Research Section, American Statistical Association Meetings.
Hartley, Troy W. and R.A. Robertson. 2006. Stakeholder Engagement, Cooperative Fisheries Research, and Democratic Science: The Case of the Northeast Consortium. Human Ecology Review. 13(2):161-171.
Hartley, Troy W. and R.A. Robertson. 2006. Emergence of Multi-Stakeholder Driven Cooperative Research in the Northwest Atlantic: The Case of the Northeast Consortium. Marine Policy. 30(5):580-592.
Heleski, Camie R., A.G. Mertig, and A.J. Zanella. 2006. Stakeholder Attitudes Toward Farm Animal Welfare. Anthrozoos. 19(4):290-307.
Kane, R.L., T. Rockwood, K. Hyer, K. Desjardins, A. Brassard, C. Gessert, R. Kane, and C. Mueller. 2006. Nursing Home Staff's Perceived Ability to Influence Quality of Life. Journal of Nursing Care Quality. 21(3):248-255.
Lorenz, F.O., K.A.S. Wickrama, R.D. Conger, and G.H. Elder, Jr. 2006. The Short Term and Decade Long Effects of Divorce on Women's Midlife Health. Journal of Health and Social Behavior. 47:111-125.
Mason, Robert and S. Amer. 2006. A Dual Process that Disables the Persuasive Impact of Mass Media Appeals to Obey Tax Laws. In Belinda Brooks-Gordon and Michael Freeman (eds.), Law and Psychology. New York: Oxford University Press.
Massey, Matt, S. Newbold, and B. Gentner. 2006. Valuing Water Quality Changes Using a Bioeconomic Model of a Coastal Recreational Fishery. Journal of Environmental Economics and Management. 52(1):482-500.
Munoz-Hernandez, B. and V.M. Lesser. 2005. Adjustment Procedures to Account for Non-Ignorable Missing Data in Environmental Surveys. Environmetrics. 16:1-10.
Peterson, M. Nils, A.G. Mertig, and J. Liu. 2006. Effects of Zoonotic Disease Attributes on Public Attitudes toward Wildlife Management. Journal of Wildlife Management. 70(6):1746-1753.
Robertson, R.A. and S. Claesson. 2006. Commercial Fishing, the Fishery Crisis and Coastal Tourism: What are the Links and Potential? In Micallef A., A. Vallallo, and M. Cassar (eds.), Proceedings for the Second International Conference on the Management of Coastal Recreation Resources - Beaches, Yachting and Coastal Ecotourism - 25-27 October 2006, Gozo, Malta. Euro-Mediterranean Center on Insular Coastal Dynamics, Foundation for International Studies: Valletta, Malta. pp. 417-425.
Singletary, L., M. Smith, and W. Evans. 2006. Self-Perceived 4-H Leader Competencies and Their Relation to the Skills Youth Learn Through 4-H Youth Development Programs. Journal of Extension. 44(4), Article #4RIB2.
Singletary, L. and M. Smith. 2006. Nevada Agriculture Producer Research and Education Needs: Results of 2006 Statewide Needs Assessment. University of Nevada Cooperative Extension, EB-06-02. 118 pp.
Smyth, Jolene D., D.A. Dillman, L.M. Christian, and M.J. Stern. 2006. Comparing Check-All and Forced-Choice Question Formats in Web Surveys. Public Opinion Quarterly. 70(1):66-77.
Smyth, Jolene D., D.A. Dillman, L.M. Christian, and M.J. Stern. 2006. Effects of Using Visual Design Principles to Group Response Options in Web Surveys. International Journal of Internet Science. 1(1):5-15.
Stern, Michael J. and D.A. Dillman. 2006. Community Participation, Social Ties and Use of the Internet. City and Community. 5(4):409-424.
Wickrama, K.A.S., F.O. Lorenz, R.D. Conger, and G.H. Elder, Jr. 2006. Changes in Family Circumstances and the Physical Health of Married and Recently Divorced Mothers. Social Science and Medicine. 63:123-136.
Yeh, H., F.O. Lorenz, K.A.S. Wickrama, R.D. Conger, and G.H. Elder, Jr. 2006. Relationships Between Sexual Satisfaction, Marital Satisfaction and Marital Instability at Midlife. Journal of Family Psychology. 20:339-343.