Tracing the ripples to make waves: The challenge of generating meaningful evidence of our impact 

Authors: Rachael Harding & Dr Mark Bassett, Auckland University of Technology (Te Wānanga Aronui o Tāmaki Makau Rau)

Higher education is constantly experiencing extreme waves of change. Rather than being sandcastles inevitably washed away by these waves, we learning advisors/developers/strategists (hereafter, LAs) ought to be making waves of our own. In a recent facilitated discussion, seventeen participants (from a range of institutions in Aotearoa New Zealand, Australia, Canada, Scotland, and the UK) explored and compared current and potential methods for collecting evidence of the impact of our practice, as well as who this evidence is reported to and how. In this blog post, we present a summary of what participants shared.

Like surfers waiting for the ‘perfect set’ of waves, the group collectively felt that there could be more effective methods for collecting evidence to demonstrate the impact of our work. Rather than focusing on immediate results or numbers, most participants felt that sustainable, long-term success requires longer-term evidence collection involving enhanced collaboration and longitudinal studies.

Current evidence types

Figures 1 and 2 show that – despite participants mentioning dissatisfaction with feedback mechanisms, low uptake, and feedback fatigue – student and staff perceptions, gathered through questionnaires and feedback, were by far the most common form of evidence collected (similar to the study by Bassett & Macnaught, 2024: https://doi.org/10.1080/13562517.2024.2354280). Perceptions dwarfed all other forms of evidence; the next most common were online content usage, followed by student attendance. Other forms of evidence, such as changes in student writing, academic performance, and course completion/retention, were negligible.

Figure 1: Types of reported impacts

Figure 2: Data types used as evidence of impacts

Collection of evidence was typically conducted by individuals within departments and sometimes by other institutional teams, such as an Assurance of Learning team. Common difficulties included not only how to report certain types of activities or feedback but also how receptive audiences were to such reports. In some cases, there was no systematic collection or reporting of evidence at all, leaving individuals to collect evidence themselves in order to gauge their impact more meaningfully than their institutions required.

In terms of reporting (Table 1), most reporting occurred within departments and teams; reporting to faculties and upwards within institutions occurred less frequently:

Table 1: Reporting

Reporting to | Number of activities reported
Department | 30
Institution | 14
Team | 13
Faculty | 9
Individual | 4
Government | 1

As shown in Table 2, the dissemination of evidence of impact to the wider community (for example, by publishing evaluations of practice or research articles, or by presenting at conferences) was mentioned by only a few participants. Some participants stated that they conduct research but did not specify how they disseminate findings; we classified these as ‘Unspecified outputs’.

Table 2: Dissemination

Dissemination type | Mentions
Unspecified outputs | 3
Research articles | 2
Blog posts | 2
Conference presentations | 2

Participant feedback suggested that inconsistencies in evidence collection processes, along with a lack of awareness of the role of LAs, contribute to questions about professional recognition and credibility within their institutions. Additionally, some participants were wary of a perceived institutional preference for quantitative evidence over qualitative, which they felt undervalued their impact and obscured its visibility. These examples support the case for identifying more varied data types that can be triangulated in various combinations depending on the motivations for communicating impact.

Generating more meaningful evidence – How can we show we are making a difference?

With research and data on current practices clearly showing the dominance of student and staff perceptions in reporting impact, participants were next invited to imagine what a more comprehensive picture of impact might include, given the luxury of unlimited time and funding. In general, responses (Table 3) highlighted the need to diversify and triangulate multiple data types and mixed methods to communicate impact more effectively.

Table 3: Alternative types of evidence / evidence reporting

Evidence type | Details
Course-specific / programme-specific data | Identify impacts of embedding through increased specificity related to individual courses and programmes
Comparative studies | Compare interventions or groups of students for: changes in student writing; academic performance; changes in students’ attitudes to communication; staff responses to students’ writing
Implementation of feedback or interventions | Document how students apply feedback or teaching from workshops
Longitudinal studies | Track individual students for: how, when, and why they engage with LA teaching and resources throughout their higher education careers; impacts on their academic performance and writing; connections between assessments and workplace tasks after graduation
Holistic impact | Gather culturally appropriate qualitative data from students that tell their stories about their learning experiences

After asking participants about the types of evidence they would ideally like to collect, we asked what barriers there were to doing this and how those barriers could be overcome. The barriers related to the following interrelated issues:

  • LA skills and knowledge about how to identify and manage relevant data
  • Access to student data (including ethical approval to use it for research purposes)
  • Access to lecturers / curriculum
  • Low credibility / profile / awareness of LA work among staff and students

Encouragingly, participants shared some practical ways of overcoming these barriers. To enable better access to student data and to raise the profile of LA work, we should engage in strategic cross-departmental collaboration with staff, as well as with students. These collaborations ought to form the basis for published evaluations of practice and research reports that can be shared both internally and externally.

To increase LA skills and knowledge in identifying and managing relevant student data, we should leverage staff collaborations within and across our institutions. These staff might be lecturers or other LAs who can provide relevant skills and knowledge that can be adapted to various LA team contexts. Again, such collaborations ought to generate internal and external forms of reporting and dissemination.

Let’s make some waves!

Clearly, if we want to make waves by generating meaningful evidence of our impact, we must trace the ripples by collecting various types of data and triangulating them effectively.

Our 2025 ICALLD Symposium sessions comprised just seventeen LAs from across the world, yet we could see how similar we are in what evidence we collect and how we communicate the impact of our work. It was also clear that we are frustrated by systemic institutional practices and perceptions that LAs alone can do little to change. The waves of pressure on our profession are always there, and they are only going to increase as artificial intelligence automates more and more interactions between institutions and students.

To bring about change, we need to be strategic in collaborating with staff and students over time to identify our impact and make a stronger case for our work. We also need to ensure that we regularly report on and disseminate our practices. Surely, we don’t want to be the sandcastles that get washed away. We want to be making waves of our own.

This is why we are keen to collaborate with LAs from across the world on how to generate and communicate meaningful evidence of our impact. So, we would like to hear from you about what you are doing – you can contact us at:

rachael.harding@aut.ac.nz

mark.bassett@aut.ac.nz

