Evaluation of Student Feedback Within a MOOC Using Sentiment Analysis and Target Groups
Published | September 2020 |
Journal | International Review of Research in Open and Distributed Learning Volume 21, Issue 3, Pages 140-156 |
Country | Australia, Oceania |
ABSTRACT
Many course designers trying to evaluate the experience of participants in a MOOC will find it difficult to track and analyse the online actions and interactions of students because there may be thousands of learners enrolled in courses that sometimes last only a few weeks. This study explores the use of automated sentiment analysis in assessing student experience in a beginner computer programming MOOC. A dataset of more than 25,000 online posts made by participants during the course was analysed and compared to student feedback. The results were further analysed by grouping participants according to their prior knowledge of the subject: beginner, experienced, and unknown. In this study, the average sentiment expressed through online posts reflected the feedback statements. Beginners, the target group for the MOOC, were more positive about the course than experienced participants, largely due to the extra assistance they received. Many experienced participants had expected to learn about topics that were beyond the scope of the MOOC. The results suggest that MOOC designers should consider using sentiment analysis to evaluate student feedback and inform MOOC design.
Keywords | MOOC · teaching programming · sentiment analysis · target group · feedback · learner analytics |
Language | English |
ISSN | 1492-3831 |
Refereed | Yes |
Rights | CC BY |
DOI | 10.19173/irrodl.v21i3.4783 |
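The record does not specify which sentiment analysis tooling the study used. As a loose illustration of the approach the abstract describes, averaging post sentiment per target group (beginner, experienced, unknown) could be sketched with a minimal lexicon-based scorer; the lexicon, function names, and sample posts below are invented for illustration only.

```python
# Minimal lexicon-based sentiment sketch (illustrative; not the study's tooling).
POSITIVE = {"great", "helpful", "clear", "enjoyed", "thanks"}
NEGATIVE = {"confusing", "hard", "missing", "frustrating"}

def post_sentiment(text):
    """Score one post: (positive hits - negative hits) / word count."""
    words = text.lower().split()
    if not words:
        return 0.0
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return score / len(words)

def average_by_group(posts):
    """posts: list of (group, text) pairs; returns mean sentiment per group."""
    by_group = {}
    for group, text in posts:
        by_group.setdefault(group, []).append(post_sentiment(text))
    return {g: sum(scores) / len(scores) for g, scores in by_group.items()}

posts = [
    ("beginner", "great course very helpful thanks"),
    ("experienced", "key topics missing and confusing"),
]
print(average_by_group(posts))
```

A production analysis would replace the toy lexicon with a trained model or a standard lexicon-based tool, but the grouping step (average sentiment per prior-knowledge cohort) is the part the study's comparison rests on.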
SIMILAR RECORDS
Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies
Means, Barb; Toyama, Yukie; Murphy, Robert; Bakia, Marianne; U.S. Department of Education, Office of Planning, Evaluation, and Policy Development
A systematic search of the research literature from 1996 through July 2008 identified more than a thousand empirical studies of online learning. Analysts screened these studies to find those that (a) contrasted an ...
Match: evaluation
MOOCs: A systematic study of the published literature 2008-2012
Liyanagunawardena, Tharindu; Adams, Andrew; Williams, Shirley; McGreal, Rory; Conrad, Dianne
Massive open online courses (MOOCs) are a recent addition to the range of online learning options. Since 2008, MOOCs have been run by a variety of public and elite universities, especially in North America. Many ...
Match: liyanagunawardena, tharindu; mooc
Enhanced peer assessment in MOOC evaluation through assignment and review analysis
Alcarria, Ramón; Bordel, Borja; de Andrés, Diego Martín; Robles, Tomás
The rapid evolution of MOOCs in recent years has produced a change in the education of students and in the development of professional skills. There is an increasing pressure on universities to establish procedures for ...
Match: evaluation; mooc
OERu context evaluation
Murphy, Angela
A survey was developed and launched in May and June 2012 to assess the extent to which OERu members are ready for participating in the pilot of the OERu model and determine which challenges organisations are currently ...
Match: evaluation
Meaningful learner information for MOOC instructors examined through a contextualized evaluation framework
Douglas, Kerrie; Zielinski, Mitchell; Merzdorf, Hillary; Diefes-Dux, Heidi; Bermel, Peter
Improving STEM MOOC evaluation requires an understanding of the current state of STEM MOOC evaluation, as perceived by all stakeholders. To this end, we investigated what kinds of information STEM MOOC instructors ...
Match: evaluation
Analyzing the discourse on open educational resources on Twitter: a sentiment analysis approach
Bhagat, Kaushal Kumar; Mishra, Sanjaya; Parida, Ashis Kumar; Samal, Alyal; et al.
This study investigated the sentiment of Twitter discourse on Open Educational Resources (OER). We collected 124,126 tweets containing hashtags related to OER posted from January 2017 to December 2021. We performed ...
Match: sentiment; sentiment analysis
MOOC quality evaluation system: Tomsk State University experience
Dyomin, Victor; Mozhaeva, Galina; Babanskaya, Olesya; Zakharova, Ulyana; et al.
E-learning development comes with an increased attention to its quality that is managed via the control over not only the learners' knowledge but over the learning process, its organization and applied tools. This paper ...
Match: evaluation
Formative evaluation of Hong Kong's first open textbooks
Yuen, Kin Sun; Li, Kam Cheong; Wong, Billy Tak Ming
Twelve open textbooks for the local school curriculum – the first set ever developed in Hong Kong – were completed in 2015. During the development process, formative feedback was gathered from primary and secondary ...
Match: evaluation
Evaluation of the UNED MOOCs implementation: Demographics, learners' opinions and completion rates
Gil-Jaurena, Inés; Callejo-Gallego, Javier; Agudo, Yolanda
The paper is a study about the MOOC experience at the Spanish National University of Distance Education (UNED), where we have collected initial and final information about learners' profiles and opinions, as well as ...
Match: evaluation; mooc
Evaluation criteria for interactive e-books for open and distance learning
Bozkurt, Aras; Bozkaya, Mujgan; McGreal, Rory; Conrad, Dianne
The aim of this mixed method study is to identify evaluation criteria for interactive e-books. To find answers for the research questions of the study, both quantitative and qualitative data were collected through a ...
Match: evaluation