Unlike the previous seminar, which focused more on research and its value in widening participation, the second in the series of joint SRHE and UALL seminars had a distinct focus on the nature of outreach work and how to evaluate its success. What stood out across the three papers and the plenary discussions was the heterogeneous nature of the work being done and of how institutions judge its success. In the opening, Annette Hayton posed the issue of how reporting, evaluation and research can often form discrete entities that become divorced from practice. Many issues were raised over the day, but I will pull out some of the key ones along with my reflections on them.
The first paper, given by Colin McCaig, explored his content and discourse analysis of Access Agreements from 2006 and 2012 across 10 pre-1992 and 10 post-1992 universities. His paper addressed many key issues, but the most striking was how much of the focus at primary level came from pre-1992 institutions. I would suggest that this is in some ways problematic, as the overwhelming discourse from the pre-1992 institutions was that they wanted to work with ‘the brightest’, ‘the best’ and ‘high attainers’. Does this mean that in some cases these primary children may be developing a sense that if they do not make it to a pre-1992 university they are in some way deficient or have failed? *
Also interesting in Colin’s paper were the shifts in language in Access Agreements between 2006 and 2012. Changes were more prevalent in post-1992 institutions, which moved from talking about institutionally focused issues to those centered on individuals. In pre-1992 institutions there was less change, but the main shift in focus was around the role of widening participation as a civic virtue. This marked change in post-1992s, and the more static approach in pre-1992s, may offer some insights into the thinking of institutional leadership teams as to why they should do widening participation work.
The second paper was a whistle-stop tour by Carole Leathwood through a DfE-funded study into school and college strategies to raise aspirations. I was very pleased to hear Carole’s reflections on the problematic nature of the steering group’s insistence on using the term ‘raising aspirations’. As I have written before, the issue is not one of low aspirations but of mismatched or poorly channeled aspiration, which Carole helpfully termed an ‘Expectation Gap’. What worries me, however, is that despite a growing acknowledgement that the discourse of raising aspirations is problematic, there seems to be a reluctance to abandon it.
The study was relatively extensive, covering 400 schools and 100 colleges through a telephone survey, with 9 schools and 2 colleges as case studies. Interestingly, 98% of colleges and 97% of schools said they did some ‘aspiration raising work’, and there was a unanimous message that all options should be discussed, not just Higher Education. This resonates very much with feedback from teachers about events we have run: they value the fact that we discuss all the options open to young people, and the absence of this is often their criticism of other widening participation events they have participated in. This is something worth all practitioners in the field taking some time to reflect on.
The study cited many elements that make a difference and I think it is useful to summarise them here:
- Whole school and college culture that engages in this work
- Well organized and structured programme
- Advice on subject choice
- Student finance advice
- Dedicated specialist staff in schools
- Visiting speakers / Alumni
- Personalised one to one support
- University visits and Summer Schools
Overwhelmingly, it was these visits that both students and teachers cited as having the most profound impact, although they also noted that cost could be a barrier to attending them. There was clearly a London effect, with London schools doing more, and transport could be a key factor in this. I would also suggest that getting student ambassadors into London schools is a far easier prospect than into a rural school with limited transport links or at a distance from a university campus. Carole’s paper raised many more issues, and I would highly recommend that practitioners spend some time reading the full report.
The final paper was a case study of the University of Bath’s attempts to create a framework for evaluating widening participation work. They identified clear problems with a previous lack of alignment between aims and objectives, which made it hard to find a focus for evaluation. For each level of intervention they identified five dimensions: know, choose, become, practice and understand. They then ensured that there were objectives for each, which created a much more systematic way of approaching evaluation. It was interesting to reflect on how much evaluation practitioners may do in a tacit or ad-hoc way, as part of their own reflective practice through debriefs, but that may never be systematically recorded. Also useful to reflect upon was the diverse approach to evaluation, measuring understanding through quizzes, surveys, peer evaluation of work and focus groups. One of the challenges when we talk about evaluation can be the narrow conception of it as a paper form; the multifaceted dimensions of evaluation, and how to capture them effectively and systematically, are worth reflecting on.
This second seminar offered much to reflect upon, both in terms of what we do and how we measure what is done in widening participation. There can be a tendency to reduce evaluation to a must-do task, reducible to easily reportable numerical data on participation, target groups and engagement, but this can miss the transformative power of widening participation, which is in many ways the real value of this type of work. It isn’t simply how many students an institution can work with but how many lives this type of work can transform.
(* This is a different interpretation from the original post, which misinterpreted the data.)
Full presentations now available here