This is taken from a speech I gave at a pre-conference workshop on research at the 14th National Metropolis Conference of CERIS where I was asked to speak to the capacity of community agencies to do primary research. I took it as an opportunity to address academics and policy-makers.
Photo Credit: Bard Azima, Livingface Photography, The Conference Publishers
As a director of research at a large community agency, I am admittedly a bit biased on this question.
Community-based primary research is good, not because of the much-touted participatory processes it uses, nor because it fulfills some evaluation criteria of funders, nor because this research is somehow more “authentic.”
Community-based research is important because it addresses material realities and it seeks real solutions.
This is not to denounce those who do the hard, theoretical thinking that some of these wicked social problems require. But, if we are committed to social change and to wider ideas of justice, then we must address the grounded (and gritty) realities faced by those around us. This is an argument made by Critical Race Theorists, a place I call my intellectual home.
I am tired of research which is simply a walk in the park, describing its surroundings, commenting on them, and noting perhaps an “oddity” or two. Occasionally, such research deteriorates into awful-izing a situation, describing it in gory and pornographic detail.
My charge is that community-based research cannot afford to do that. To be honest researchers, we must look for change and find solutions.
“Research fatigue” emerges, I contend, when researchers spend too much time talking, dealing with process, and missing the end game. Community members tire of too much talk.
Admittedly, the field of action research has emerged to address this folly, but the solutions can be too simplistic, missing opportunities to make a change at the individual, program, organizational, sectoral and system levels. Instead, innumerable reports descend into a few “Try harder” recommendations.
To ensure they are accountable for the use of government and donor dollars, non-profit community agencies track an enormous amount of data. However, this is usually either administrative data useful for ongoing monitoring or program data for evaluation. Common categories include:
• Client identifiers (d.o.b., sex, status)
• Client location
• Client concern
• “Dosage” or program participation
Some follow-up is also done to get a longer-term picture of the impact.
These minutiae are a big industry. In our agency, one unit has its 23 staff spend every Friday afternoon writing case notes – time during which clients cannot access the service. That data must then be checked by multiple managers and reviewed by a director before it is sent to funding program officers, who later return to do file audits – all of it rolled into, I don’t know, giving us all a strong audit trail.
Good community research has a different flavour. It is:
Inclusive: More likely to include the unusual suspects (as the theory of creative teams suggests), but not so process-oriented that it cannot reach an end goal.
Solution focused: Awful-izing is easy; figuring out how to implement a solution is golden.
Asset-based: Such as the reports on resiliency in children and youth by Doorsteps Neighbourhood Services and Toronto Public Health which found how strongly tied children in “disadvantaged” neighbourhoods are to their families.
Analytical at the structural level: Reports like Social Planning Toronto’s report on fees and fundraising in schools, David Hulchanski’s work on neighbourhood gentrification done in partnership with St. Chris House (often ignored), or any of John Stapleton’s policy work look at the underlying triggers.
Sensitive to complexity: Two of the best recent examples which tried to capture this dynamic interplay are the reports produced for the Strong Neighbourhoods Task Force, which mapped needs and service levels to determine which neighbourhoods were the most under-served, and the TDSB student census, which combined grades, socio-economic status, and students’ experiential/self-report data.
Unpredictable: The recent TWIG report on the new hourglass-shaped labour force created a new frame for thinking about poverty and inequality.
Outside the box: United Way of Toronto produced two reports within two years of each other on the topic of neighbourhood poverty. Only the keenest among you will have heard of Decade of Decline. But two years later, in 2004, Poverty by Postal Code made huge waves. The difference? GIS had evolved enough that maps could be produced, showing the same data visually.
This is work that can and is done by community agencies. I am proud of the theory-driven, knowledge transfer model we used in the Toronto East LIP. Over the past two years, we were able to:
• Map local services (in detail beyond 211 Toronto)
• Map staff languages
• Map the social networks among partnership members
• Map levels of agency coordination and the gaps
• Map settlement pathways for immigrants and refugees
• Develop Toolkits for English Conversation Circles
• Develop Toolkits for grant-writing
• Produce information sheets on private career college accreditation
• Produce fact sheets on frauds/scams Canadian newcomers may face
• Report local labour market information
• Begin to test the merits of place-based delivery of services (community hubs) vs. outreach
• Set the groundwork for child care cooperatives and other parent supports
• Collaborate with grassroots groups and agencies to launch a report measuring the scope of the local underground economy
I think we proved community research can be both solid and solution-focused.