Site statistics (course statistics) are logged automatically in Yorkshare and can give a crude picture of student engagement: log-in frequencies and the number of visits to specific areas of a course, whether content areas (folders or items) or tools that you have set up within your module site. Tracking can also be enabled for specific content items to observe how widely they are accessed by students.
The statistics will reveal general patterns of student engagement and activity within the module site: how often students visited the site and the range of resources they used. They are less useful, however, for gauging depth of engagement with those resources, for example time on task in completing study activities.
Case study: Environment and Health (Environment)
Course statistics were used to track student activity for the VLE site for this third year undergraduate module. Student log-in patterns were varied across the module, with peaks of high activity recorded at the time of practical sessions and deadlines for assessed coursework. 33% of hits occurred on Mondays – the day before the weekly lecture. The statistics tracked student access to content areas within the site, providing an insight into usage patterns. The group work area attracted the most hits, with 76% of the total (9347 hits). The practical area accounted for 14.5% of hits (1781 hits) and the course materials for 5% (630 hits).
Interaction with content
Structured VLE site with online resources as significant component to learning on the module
Pierre Delmelle, Environment
View Environment & Health Case Study [PDF]
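The percentage breakdown reported in the case study above can be reproduced from the raw hit counts that the course statistics tool exports. A minimal sketch in Python, loosely based on the case-study figures; the "Other areas" count is an illustrative filler so the shares sum to 100%, and the area names are assumptions rather than the export's actual labels:

```python
# Illustrative hit counts per content area. "Other areas" is a
# hypothetical remainder, not a figure from the case study.
hits = {
    "Group work area": 9347,
    "Practical area": 1781,
    "Course materials": 630,
    "Other areas": 541,
}

total = sum(hits.values())

# Each area's share of the total hits, rounded to one decimal place.
shares = {area: round(100 * n / total, 1) for area, n in hits.items()}

for area, pct in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{area}: {pct}% of {total} hits")
```

With these figures the group work area comes out at 76.0% and the practical area at 14.5%, matching the proportions quoted in the case study.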
Contribution statistics may also be generated by the learning tools themselves: for example, the number of discussion forum posts made by an individual, which might be compared with the total number of visits to the forum, or views of other posts, to judge the overall level of student engagement with a discussion activity.
Contribution statistics are also available in wiki tools, focusing on the number of edits made and the percentage breakdown of individual contributions to a group wiki report. Again, these statistics need to be treated with caution, as they will not reflect the complete picture of student learning, i.e. what takes place outside the formal learning environment. If students view the wiki as the formal publishing location for the activity, do not ignore the informal learning processes, such as drafting and discussion, that may be taking place in alternative (student-controlled) tools.
Use Case: Evolutionary Ecology (Biology)
The screenshot above shows an undergraduate Biology module where contribution statistics have been generated by a group wiki tool, presenting the number of page saves (the frequency and percentage of saves by each individual for the group report as a whole) and the total number of lines modified by each individual. These statistics give a rough indication of an individual’s contribution, although they will not reflect the interaction and informal learning that shapes the group’s use of the tool. The quality of an individual’s learning contribution is therefore harder to assess, although the page history option can be used to track the changes made to the report.
The extract from the evaluation report below shows that some conclusions can be drawn from the data on the nature of group activity, focusing on the distribution of roles within the group.
In three groups, 50% of the wiki modifications were made by one student, with comments indicating:
- allocation of report writing to individuals
- collaborative research
- a mix of communication methods, including Facebook, face-to-face discussion and blogs
The data can tell us whether the final editing of reports was a task shared between group members or one assigned to a particular individual. To judge the extent of individual contributions to the drafting of the report and the research of findings, though, we may need other evidence if these activities have not been captured through the use of the wiki.
Example of analysis of student contribution to a collaborative task
Dr Peter Mayhew, Biology
View Evolutionary Ecology Case Study [PDF]
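The kind of analysis in the evaluation extract above can be sketched directly from per-student edit counts. A minimal illustration, assuming hypothetical page-save counts per group (the group labels, student names and numbers are invented for illustration, not taken from the module data):

```python
# Hypothetical per-student page-save counts for three wiki groups.
groups = {
    "Group A": {"student1": 18, "student2": 3, "student3": 2},
    "Group B": {"student4": 7, "student5": 6, "student6": 8},
    "Group C": {"student7": 12, "student8": 1, "student9": 1},
}

def contribution_shares(saves):
    """Return each student's percentage share of the group's page saves."""
    total = sum(saves.values())
    return {s: round(100 * n / total, 1) for s, n in saves.items()}

# Flag groups where a single student made at least half the saves,
# mirroring the 50%-by-one-student pattern noted in the evaluation.
for name, saves in groups.items():
    shares = contribution_shares(saves)
    dominant = [s for s, pct in shares.items() if pct >= 50.0]
    print(name, shares, "dominant editor:", dominant or "none")
```

A dominant editor may simply mean the group delegated final editing to one member, so, as noted above, this flag is a prompt for further evidence rather than a verdict on contribution.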
Lecture Recording Statistics
Using the statistics tools within Panopto, you can review viewership trends and habits for either an individual video or an entire VLE module site. Users attached to an associated VLE module site as an ‘Instructor’ can interrogate and download statistical data and graphs for any associated Panopto folders.
Statistics can be tailored to encompass a given date range, or focused on a specific time period. Within an individual recording, there is also a granular breakdown of views: you can ascertain how many minutes a particular user has watched, see which parts of the capture have been viewed most, and see when most users accessed the recording.
Using this information, an instructor may deduce viewership trends within their cohort, or gauge which taught concepts students feel the most need to revisit. If the data suggests that viewership clusters around a particular concept, for example, this may point to a shared knowledge gap, and could be used to shape the content of future contact time.
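The "most-viewed parts of a recording" idea can be sketched as a per-minute tally of viewing sessions. This is an illustrative sketch only: the (start_minute, end_minute) pairs are an assumed shape for the data, not Panopto's actual export format.

```python
from collections import Counter

# Hypothetical viewing sessions as (start_minute, end_minute) pairs.
sessions = [(0, 10), (5, 15), (5, 15), (40, 50), (42, 48)]

def minute_heatmap(sessions):
    """Count how many sessions covered each minute of the recording."""
    counts = Counter()
    for start, end in sessions:
        for minute in range(start, end):
            counts[minute] += 1
    return counts

heatmap = minute_heatmap(sessions)

# The most-rewatched stretch suggests where students felt they needed
# to revisit the material -- a possible shared knowledge gap.
peak_minute, peak_views = max(heatmap.items(), key=lambda kv: kv[1])
print(f"Minute {peak_minute} was viewed in {peak_views} sessions")
```

Mapping peak minutes back to the lecture's topic timeline is what turns this tally into the "which concept needs revisiting" judgement described above.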
Use Case: Extend The Lecture With Personal Capture (Economics)
This use case describes an intervention deployed in a Year 1 core undergraduate Statistics module. Faced with mixed levels of understanding of core concepts in the student intake, the instructor used Personal Video Capture software to respond to areas of difficulty that were being highlighted through frequent email traffic.
To assess the effectiveness of the venture, the instructor was keen to know how (and, indeed, whether) the content was being consumed. The video statistics indicated that 136 unique (353 cumulative) views were achieved during the first week after publication, and this peak in viewership coincided with a noticeable drop-off in ‘the same old questions’ arriving by email.
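The distinction between unique and cumulative views above is simply "distinct viewers" versus "total playbacks", and can be computed from a raw view log. A tiny sketch, assuming a hypothetical log of one user ID per playback event (not the actual Panopto export format):

```python
# Hypothetical view log: one user ID per playback event.
view_log = ["u1", "u2", "u1", "u3", "u2", "u1", "u4"]

cumulative_views = len(view_log)    # every playback counts
unique_views = len(set(view_log))   # each viewer counted once

print(f"{unique_views} unique ({cumulative_views} cumulative) views")
```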
- Course Reports [help.blackboard.com]
- Item Analysis (for Yorkshare Tests only) [help.blackboard.com]
- Performance Dashboard (suggested for Discussion Board evaluation only) [help.blackboard.com]
- Accessing Panopto Viewing Statistics via Yorkshare [Google Doc, UoY Login Required]
- Using the Replay At-Desk recorder [Google Doc, UoY Login Required]