Delving deeper into student engagement: Conversations between academics and analytics

Introduction

After developing new resources for their Interact2 subjects, Nicole Sugden and Jasmine MacDonald (PSY208 Biopsychology) and Robyn Brunton (PSY203 Social Psychology) set out to conduct research evaluating how students engaged with these resources and with their subject sites.

Note: See our related post for information on these resources and the findings of the evaluative research.

They contacted the CSU analytics team to find out what data could be provided beyond the basic site analytics. This led to a research collaboration between the psychology academics and the analytics team.

What resulted was the combination of traditional quantitative survey data and qualitative interview/focus group data with detailed analytics data. Bringing these three data sources together has allowed the team to delve deeper into what students are doing in the subject sites.

What’s important about this learning and teaching story?


We are all familiar with the analytics available on our Interact2 sites in the Performance Dashboard and Retention Centre. These analytics give us a basic insight into whether our students are accessing our subject sites, how frequently they do so, and whether they are meeting key deadlines.

Whilst this is great, often we want to know MORE about how our students are engaging with our sites.

Working with the analytics team allowed the subject conveners of the psychology subjects to gain detailed insight into what was happening in their subjects. Some of the questions the team was able to answer included:

  • Basic site analytics tell us how many students have accessed the forums, BUT are these students actively posting, or are they lurking?
  • Basic analytics tell us whether students have accessed content pages, BUT do students access these pages/topics in the order we expect them to? Are students going back through topics to revise?
  • Basic analytics tell us when students access our sites and how long they spend on them, BUT can we uncover how efficiently students use that time? (e.g. does spending more time online result in higher grades? Do students with higher grades spend less time online but use it more efficiently?) A rough sketch of this kind of check appears after this list.
  • Does qualitative student feedback about how they study reflect their actual online activity?

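As a rough illustration only, the sketch below shows how a de-identified analytics extract could be checked against final grades. The file name ("activity_export.csv") and column names ("minutes_online", "final_grade") are invented for the example; this is not the team's actual data or code.

```python
import pandas as pd
from scipy.stats import spearmanr

# One row per de-identified student: total minutes on the subject site and final grade.
df = pd.read_csv("activity_export.csv")

# Does more time online go with higher grades? (rank correlation)
rho, p_value = spearmanr(df["minutes_online"], df["final_grade"])
print(f"Time online vs grade: rho = {rho:.2f}, p = {p_value:.3f}")

# A crude "efficiency" proxy: grade achieved per hour spent online.
df["grade_per_hour"] = df["final_grade"] / (df["minutes_online"] / 60)
print(df[["minutes_online", "final_grade", "grade_per_hour"]].describe())
```
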
In answering these questions using combined survey, interview, focus group and analytics data, this cross-division collaboration has resulted in dynamic and rich conversations about student engagement.

Future collaborations like this will allow us to better understand our students’ needs and adapt our teaching accordingly.

What did it look like?

Students enrolled in biopsychology and social psychology completed an anonymous survey using Qualtrics at the end of session. They were asked questions about their online study habits in the two subjects, whether they accessed certain activities/resources, and their perceptions of these resources (i.e. ease of use, enjoyment, whether the activity facilitated understanding, and motivation to engage in peer-to-peer interactions).

Students volunteered to take part in qualitative one-on-one interviews (n = 6) or two small focus groups (n = 7 and n = 9). These students were asked to discuss their online study behaviour and their perceptions of the Interact2 subject sites/resources.

The analytics team collated data about patterns of student activity on the Interact2 sites, as well as student grades. This data was used in the following ways:

  • The interview/focus group data was linked to the analytics data to determine whether students' interview responses about how they studied matched their actual online activity.
  • The analytics data was used to look at how students used the forums (i.e. active versus lurker activity, and whether social networks formed between students or interactions were predominantly teacher-learner).
  • Analytics data assessed how students navigated through the site (i.e. whether students progressed through the topics in the specified order, whether they returned to earlier topics for revision, and whether they accessed the forums via links embedded within study modules or via the main discussion board link).
  • Analytics data was used to cluster students into activity profiles, to look at how online activity related to student grades (i.e. did students with higher grades access the forum more often? Did they spend more or less time online?). A rough sketch of this kind of clustering appears below.

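To make that last point concrete, the sketch below clusters students into activity profiles and compares grades across them. It is an assumed illustration, not the team's pipeline: the file ("activity_export.csv"), the column names ("forum_posts", "forum_views", "minutes_online", "content_page_views", "final_grade") and the choice of three clusters are all hypothetical.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("activity_export.csv")
features = ["forum_posts", "forum_views", "minutes_online", "content_page_views"]

# Standardise the activity measures so no single metric dominates the clustering.
X = StandardScaler().fit_transform(df[features])

# Group students into three activity profiles.
df["profile"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Compare average activity and grades across profiles, e.g. do "lurker"-like
# profiles (many views, few posts) earn different grades from active posters?
print(df.groupby("profile")[features + ["final_grade"]].mean())
```
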
How can I make this happen?

The analytics team can retrieve this type of data if requested by academics. The conversations that come out of this data can be very fruitful!

We are more than happy to discuss our research methodology and findings with anyone interested.

IMPORTANT NOTE: We sought ethics approval for the collection of this data, as we plan on publishing the research findings. As such, all data was de-identified so that the subject coordinators could not identify individual students or their responses.