Monthly Archives: November 2011

A Challenging Question

What are the “unobtrusive measures” we might use to capture the results of civic and community engagement?

For anyone interested in creative ways to document results, St. Cloud State University president and Minnesota Campus Compact board vice chair Earl Potter recommends Unobtrusive Measures, a classic book by psychologist Eugene J. Webb and several colleagues. (Originally published in 1966, it is still in print and available from Sage Publications.) While the authors consider surveys and interviews “probably the most flexible and generally useful devices we have for gathering information,” the book emerges from concern that:

The dominant mass of social science research is based upon interviews and questionnaires. We lament this overdependence upon a single, fallible method. Interviews and questionnaires intrude as a foreign element into the social setting they would describe, they create as well as measure attitudes, they elicit atypical roles and responses, they are limited to those who are accessible and will cooperate, and the responses obtained are produced in part by dimensions of individual differences irrelevant to the topic at hand.

But the principal objection is that they are used alone. No research method is without bias. Interviews and questionnaires must be supplemented by methods testing the same social science variables but having different methodological weaknesses.

Most civic and community engagement practitioners are well aware that self-reported data can be dismissed as unreliable evidence of real change. Students may answer questions selectively because they know they are being assessed and that the institution hopes they will respond in certain ways. Fewer students may respond at all because of survey fatigue, as an article in the Chronicle of Higher Education suggested this fall, and different groups of students may be more or less likely to respond. Surveys of students, community partners, and other stakeholders need not be abandoned, but finding additional ways to document impact could triangulate results and make a stronger case.

So how might we collect meaningful data without directly asking people for it? A few examples from the book may help spark creative ideas:

  • tracking how often floor tiles in different parts of a museum need to be replaced can measure the exhibits’ relative popularity;
  • counting empty bottles in trash cans can indicate how much alcohol is consumed by residents of an officially “dry” town; and
  • observing the dilation of a person’s pupils can suggest fear of, or interest in, something.

Some public records offer information about civic behaviors, such as registering to vote and actually voting. The number of complaints local police receive about loud parties might indicate how responsible and considerate students are as neighbors. Similarly, the number of dormitory residents requesting a new roommate might measure how skilled students are at working through differences. The number and content of letters to the editor or online comments on news sites might reflect how engaged students are with public issues and how effectively they articulate a position. Obituaries could probably tell us a great deal about the behaviors and values of our alumni, though few of us would want to wait that long to assess the results of our efforts. Obituaries, like other public and private records, are also shaped by social expectations and may not be kept consistently over time.

Another technique is simple observation. Perhaps we could be trained to recognize contempt, fear, and other emotions in people’s facial expressions. We might note how often recyclable items are placed in regular trash cans, or how many bicycles are parked on campus, to gauge people’s commitment to environmental sustainability. These examples are just the initial results of a little brainstorming. What measures can you imagine would be useful in capturing the outcomes of campuses’ civic and community engagement efforts?

One final point to keep in mind as we strive to improve assessment and accountability: Mike Newman of the Travelers Foundation says he looks for evidence of contribution rather than attribution. Many people do not expect us to use experimental methods or to prove a causal effect isolated from all other factors. Evidence of success matters, but so do trust, open communication, and strong relationships, whether we want to gain support or to achieve our core goals.
— Julie Plaut