Evaluation of Student Feedback Within a MOOC Using Sentiment Analysis and Target Groups
| Published | September 2020 |
| Journal | International Review of Research in Open and Distributed Learning, Volume 21, Issue 3, Pages 140-156 |
| Country | Australia, Oceania |
ABSTRACT
Many course designers trying to evaluate the experience of participants in a MOOC will find it difficult to track and analyse the online actions and interactions of students because there may be thousands of learners enrolled in courses that sometimes last only a few weeks. This study explores the use of automated sentiment analysis in assessing student experience in a beginner computer programming MOOC. A dataset of more than 25,000 online posts made by participants during the course was analysed and compared to student feedback. The results were further analysed by grouping participants according to their prior knowledge of the subject: beginner, experienced, and unknown. In this study, the average sentiment expressed through online posts reflected the feedback statements. Beginners, the target group for the MOOC, were more positive about the course than experienced participants, largely due to the extra assistance they received. Many experienced participants had expected to learn about topics that were beyond the scope of the MOOC. The results suggest that MOOC designers should consider using sentiment analysis to evaluate student feedback and inform MOOC design.
| Keywords | MOOC · teaching programming · sentiment analysis · target group · feedback · learner analytics |
| Language | English |
| ISSN | 1492-3831 |
| Refereed | Yes |
| Rights | CC BY |
| DOI | 10.19173/irrodl.v21i3.4783 |
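As a rough illustration of the approach described in the abstract, the sketch below computes the average sentiment of forum posts per prior-knowledge group (beginner, experienced, unknown). The abstract does not name the sentiment tool or data format used in the study, so this example assumes NLTK's VADER analyser and a hypothetical CSV file (`posts.csv` with `post_text` and `prior_knowledge` columns); it is a minimal sketch, not the authors' actual pipeline.

```python
# Minimal sketch: average sentiment of MOOC forum posts per prior-knowledge group.
# The CSV file name and column names are hypothetical; VADER stands in for
# whatever sentiment tool the study actually used.
from collections import defaultdict
import csv

import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download
sia = SentimentIntensityAnalyzer()

scores = defaultdict(list)
with open("posts.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # VADER's compound score ranges from -1 (most negative) to +1 (most positive)
        compound = sia.polarity_scores(row["post_text"])["compound"]
        scores[row["prior_knowledge"]].append(compound)

for group, values in scores.items():
    mean = sum(values) / len(values)
    print(f"{group}: mean sentiment = {mean:+.3f} (n={len(values)})")
```

Comparing the per-group means in this way mirrors the paper's finding that the target group (beginners) posted more positively than experienced participants.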
SIMILAR RECORDS
Enhanced peer assessment in MOOC evaluation through assignment and review analysis
Alcarria, Ramón; Bordel, Borja; de Andrés, Diego Martín; Robles, Tomás
The rapid evolution of MOOCs in recent years has produced a change in the education of students and in the development of professional skills. There is an increasing pressure on universities to establish procedures for ...
Match: evaluation; mooc
Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies
Means, Barb; Toyama, Yukie; Murphy, Robert; Bakia, Marianne; U.S. Department of Education, Office of Planning, Evaluation, and Policy Development
A systematic search of the research literature from 1996 through July 2008 identified more than a thousand empirical studies of online learning. Analysts screened these studies to find those that (a) contrasted an ...
Match: evaluation
MOOCs: A systematic study of the published literature 2008-2012
Liyanagunawardena, Tharindu; Adams, Andrew; Williams, Shirley; McGreal, Rory; Conrad, Dianne
Massive open online courses (MOOCs) are a recent addition to the range of online learning options. Since 2008, MOOCs have been run by a variety of public and elite universities, especially in North America. Many ...
Match: liyanagunawardena, tharindu; mooc
OERu context evaluation
Murphy, Angela
A survey was developed and launched in May and June 2012 to assess the extent to which OERu members are ready for participating in the pilot of the OERu model and determine which challenges organisations are currently ...
Match: evaluation
OERRH evaluation framework
Perryman, L.-A.
The OERRH Evaluation Framework provides guidelines that allow project processes, outputs and outcomes to be evaluated in ways appropriate to the individual concerns of the various work packages, while at the same time ...
Match: evaluation
Evaluation of the UNED MOOCs implementation: Demographics, learners' opinions and completion rates
Gil-Jaurena, Inés; Callejo-Gallego, Javier; Agudo, Yolanda
The paper is a study about the MOOC experience at the Spanish National University of Distance Education (UNED), where we have collected initial and final information about learners' profiles and opinions, as well as ...
Match: evaluation; mooc
Critical evaluation of quality criteria and quality instruments in OER repositories for the encouragement of effective teacher engagement
Connell, Marina; Connell, John
This paper offers a short evaluation of the variety of quality criteria used in Open Educational Resources and some of the methods and practices in use to ensure quality. The paper surveys and reviews effective ...
Match: evaluation
Evaluation criteria for interactive e-books for open and distance learning
Bozkurt, Aras; Bozkaya, Mujgan; McGreal, Rory; Conrad, Dianne
The aim of this mixed method study is to identify evaluation criteria for interactive e-books. To find answers for the research questions of the study, both quantitative and qualitative data were collected through a ...
Match: evaluation
Teacher Perspective on MOOC Evaluation and Competency-Based Open Learning
Chang, Wen-Li; Sun, Jerry Chih-Yuan
Quality MOOCs (massive open online courses) ensure open learning under the top-down guidance of established criteria and standards. With an evaluative approach, course providers can use the guiding frameworks in ...
Match: evaluation
Quality assurance in the open: An evaluation of OER repositories
Atenas, Javiera; Havemann, Leo
The World OER Declaration 2012 recommends that States join efforts to facilitate finding, retrieving and sharing OER. The OER movement has thus far spurred the creation of numerous repository initiatives worldwide with ...
Match: evaluation