
Should institutions review their student engagement?

All institutions involve their students in decision-making processes in various ways and to varying degrees. Would it be useful now for them to ask themselves critical questions about where and how they are engaging their students? Sectors overseas have found this to be a valuable step as institutions and their student cohorts progress towards student partnership.

During my OLT project we saw examples where institutions lacked an overview of where their students were involved in decision-making. Institutional websites with difficult-to-navigate information about opportunities for engagement are part of that story. Inconsistencies between survey responses and anecdotal evidence of engagement provided to me at conferences, and supported by subsequent follow-up, told another part. It is clear that many people are doing good things, but it is a shame that the message isn't permeating as thoroughly as it could.

The other side of this dilemma is the question of how students are engaged, and there is plenty of evidence that institutions often still see ex post facto consultative models as engagement.

Although it requires an investment of time, a good starting point on the road to effective and sustainable partnership is for institutions to examine where and how they are engaging students through a review or audit process. It would be useful to ask all members of the institution where and how they believe students are engaged in decision-making, whether academic or otherwise. That input could then be compared with documentary evidence in policies, websites and statutory instruments. Parameters to consider include the number of students engaged, how they are recruited, the duration of their engagement, how they are expected to interface with other institutional stakeholders in the process, what training and support they receive, and how successful that engagement is. On the question of success, measures such as students turning up, actively participating and following through on assigned tasks (if there are any) are useful.

Simply sending out a survey is not enough, although it is a start. Surveys are typically plagued by poor response rates. It would be of more use to delve deeper and to prioritise this activity as more than just another piece of busywork. The time commitment required of each individual need not be significant, but the value of gaining comprehensive feedback would be. Opportunities such as student flash pizza sessions (to borrow from our friends in Adelaide) provide a good way to gather evidence. Brainstorming sessions during faculty, division or school meetings are another.

Armed with this information, institutions could start to map how they are interacting with their students and where the gaps are. A picture would form of the styles of engagement in use and of where attempts at engagement are not working. This would open channels for dialogue around how things could be done better.

Opportunities also arise to consider how student engagement is communicated, and again gaps can be identified. Institutions would benefit from turning the spotlight on innovative and effective practices occurring in their midst that may have gone largely unnoticed.

The opportunities for improvement, for removing ineffective and time-wasting processes and replacing them with more effective options, for learning from one another and, above all, for enhancing the student experience would, I believe, make the investment in this process well worthwhile.

Finally, although I have described this process from an institutional perspective, ideally the review of student engagement should be carried out as a partnership between all stakeholders, ensuring that the process is robust and respected.

Sally Varnham
13 November 2017
