Evaluation of Student Feedback Within a MOOC Using Sentiment Analysis and Target Groups
Published | September 2020 |
Journal | International Review of Research in Open and Distributed Learning Volume 21, Issue 3, Pages 140-156 |
Country | Australia, Oceania |
ABSTRACT
Many course designers trying to evaluate the experience of participants in a MOOC will find it difficult to track and analyse the online actions and interactions of students because there may be thousands of learners enrolled in courses that sometimes last only a few weeks. This study explores the use of automated sentiment analysis in assessing student experience in a beginner computer programming MOOC. A dataset of more than 25,000 online posts made by participants during the course was analysed and compared to student feedback. The results were further analysed by grouping participants according to their prior knowledge of the subject: beginner, experienced, and unknown. In this study, the average sentiment expressed through online posts reflected the feedback statements. Beginners, the target group for the MOOC, were more positive about the course than experienced participants, largely due to the extra assistance they received. Many experienced participants had expected to learn about topics that were beyond the scope of the MOOC. The results suggest that MOOC designers should consider using sentiment analysis to evaluate student feedback and inform MOOC design.
Keywords | MOOC · teaching programming · sentiment analysis · target group · feedback · learner analytics |
Language | English |
ISSN | 1492-3831 |
Refereed | Yes |
Rights | CC BY |
DOI | 10.19173/irrodl.v21i3.4783 |
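A minimal sketch of the kind of group-level sentiment aggregation the abstract describes, assuming NLTK's VADER analyzer (the paper does not name a specific tool) and a hypothetical list of posts, each tagged with the participant's prior-knowledge group:

    import nltk
    from collections import defaultdict
    from nltk.sentiment.vader import SentimentIntensityAnalyzer

    # One-time download of the VADER lexicon used by the analyzer.
    nltk.download("vader_lexicon", quiet=True)

    # Hypothetical sample posts; in the study each post would carry the
    # participant's group (beginner, experienced, or unknown).
    posts = [
        {"group": "beginner", "text": "The extra help with loops was great!"},
        {"group": "experienced", "text": "I expected far more advanced material."},
        {"group": "unknown", "text": "The weekly quizzes were fine."},
    ]

    sia = SentimentIntensityAnalyzer()
    scores = defaultdict(list)
    for post in posts:
        # VADER's compound score ranges from -1 (most negative) to +1 (most positive).
        scores[post["group"]].append(sia.polarity_scores(post["text"])["compound"])

    for group, values in sorted(scores.items()):
        print(f"{group}: mean sentiment = {sum(values) / len(values):+.2f}")

Averaging the compound scores per group is one simple way to compare, for example, the overall tone of beginner versus experienced participants against their end-of-course feedback.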
SIMILAR RECORDS
Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies
Means, Barb; Toyama, Yukie; Murphy, Robert; Bakia, Marianne; U.S. Department of Education, Office of Planning, Evaluation, and Policy Development
A systematic search of the research literature from 1996 through July 2008 identified more than a thousand empirical studies of online learning. Analysts screened these studies to find those that (a) contrasted an ...
Match: evaluation
Enhanced peer assessment in MOOC evaluation through assignment and review analysis
Alcarria, Ramón; Bordel, Borja; de Andrés, Diego Martín; Robles, Tomás
The rapid evolution of MOOCs in recent years has produced a change in the education of students and in the development of professional skills. There is an increasing pressure on universities to establish procedures for ...
Match: evaluation; mooc
MOOCs: A systematic study of the published literature 2008-2012
Liyanagunawardena, Tharindu; Adams, Andrew; Williams, Shirley; McGreal, Rory; Conrad, Dianne
Massive open online courses (MOOCs) are a recent addition to the range of online learning options. Since 2008, MOOCs have been run by a variety of public and elite universities, especially in North America. Many ...
Match: liyanagunawardena, tharindu; mooc
OERu context evaluation
Murphy, Angela
A survey was developed and launched in May and June 2012 to assess the extent to which OERu members are ready for participating in the pilot of the OERu model and determine which challenges organisations are currently ...
Match: evaluation
Teacher Perspective on MOOC Evaluation and Competency-Based Open Learning
Chang, Wen-Li; Sun, Jerry Chih-Yuan
Quality MOOCs (massive open online courses) ensure open learning under the top-down guidance of established criteria and standards. With an evaluative approach, course providers can use the guiding frameworks in ...
Match: evaluation
Evaluation of the UNED MOOCs implementation: Demographics, learners' opinions and completion rates
Gil-Jaurena, Inés; Callejo-Gallego, Javier; Agudo, Yolanda
The paper is a study about the MOOC experience at the Spanish National University of Distance Education (UNED), where we have collected initial and final information about learners' profiles and opinions, as well as ...
Match: evaluation; mooc
Massive open online courses: A review of usage and evaluation
Sinclair, Jane; Boyatt, Russell; Rocks, Claire; Joy, Mike
The massive open online course (MOOC) has seen a dramatic rise in prominence over the last five years and is heralded by some as disrupting existing pedagogy and practices within the education sector, while others are ...
Match: evaluation
Quality assurance in the open: An evaluation of OER repositories
Atenas, Javiera; Havemann, Leo
The World OER Declaration 2012 recommends that States join efforts to facilitate finding, retrieving and sharing OER. The OER movement has thus far spurred the creation of numerous repository initiatives worldwide with ...
Match: evaluation
Meaningful learner information for MOOC instructors examined through a contextualized evaluation framework
Douglas, Kerrie; Zielinski, Mitchell; Merzdorf, Hillary; Diefes-Dux, Heidi; Bermel, Peter
Improving STEM MOOC evaluation requires an understanding of the current state of STEM MOOC evaluation, as perceived by all stakeholders. To this end, we investigated what kinds of information STEM MOOC instructors ...
Match: evaluation
Scaffolding-informed design of open educational resources in Chinese secondary school mathematics: insights from multi-cycle formative evaluation
Huang, Xiaowei; Lo, Chung Kwan; He, Jiaju; Xu, Simin; Kinshuk
In the post-pandemic world, open educational resources (OER) have the potential to ensure educational equity by providing all students with access to learning materials and by supporting teachers’ instructional ...
Match: evaluation