Automated essay scoring using incremental latent semantic analysis

Mingqing Zhang, Shudong Hao, Yanyan Xu, Dengfeng Ke, Hengli Peng

Research output: Contribution to journal › Article › peer-review

Abstract

Writing is increasingly regarded by language testers as an important indicator of a test taker's language proficiency. As such tests grow in popularity and the number of test takers increases, scoring the resulting volume of essays becomes a heavy burden for human raters. Many methods have been applied to this problem, the traditional one being Latent Semantic Analysis (LSA). In this paper, we introduce a new incremental LSA method for scoring essays effectively when the dataset is massive. Comparing the traditional method with our incremental method in terms of running time and memory usage, the experimental results show that the incremental method has a clear advantage. Furthermore, we use real corpora of essays submitted to the MHK test (the Chinese Proficiency Test for Minorities) to demonstrate that the incremental method is not only efficient but also effective in performing LSA. The results also show that with incremental LSA, the scoring accuracy reaches 88.8%.
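
The abstract describes scoring essays with LSA and an incremental update that avoids recomputing the full decomposition on massive datasets. As a rough illustration only, and not the authors' algorithm, the following Python sketch builds an LSA space with a truncated SVD, folds a new essay into that space without redoing the SVD, and assigns it the score of its most similar training essay; the toy matrix, the dimension k, and the nearest-neighbour scoring rule are all illustrative assumptions.

```python
# Illustrative sketch of LSA-based essay scoring with a simple incremental
# "folding-in" update. Not the paper's exact method; corpus, k, and the
# nearest-neighbour scoring rule are assumptions made for this example.

import numpy as np

def build_lsa_space(term_doc_matrix, k):
    """Truncated SVD of the term-document matrix: A ~ U_k S_k V_k^T."""
    U, s, Vt = np.linalg.svd(term_doc_matrix, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k, :]

def fold_in(doc_term_vector, U_k, s_k):
    """Project a new essay into the existing latent space without
    recomputing the SVD: d_hat = d^T U_k S_k^{-1}."""
    return doc_term_vector @ U_k / s_k

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def score_essay(new_doc_vec, U_k, s_k, Vt_k, train_scores):
    """Give the new essay the score of its most similar training essay
    in latent space (a deliberately simple nearest-neighbour rule)."""
    d_hat = fold_in(new_doc_vec, U_k, s_k)
    sims = [cosine(d_hat, Vt_k[:, j]) for j in range(Vt_k.shape[1])]
    return train_scores[int(np.argmax(sims))]

# Toy usage: 5 terms x 4 scored training essays, one unscored essay.
A = np.random.rand(5, 4)          # term-document matrix (e.g. tf-idf counts)
scores = [60, 75, 88, 92]         # human scores of the training essays
U_k, s_k, Vt_k = build_lsa_space(A, k=2)
new_essay = np.random.rand(5)     # term vector of the new essay
print(score_essay(new_essay, U_k, s_k, Vt_k, scores))
```

Folding in only approximates the latent space that a full recomputation would give, which is why an incremental SVD update, as studied in the paper, matters when many new essays arrive over time.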

Original language: English
Pages (from-to): 429-436
Number of pages: 8
Journal: Journal of Software
Volume: 9
Issue number: 2
DOIs
State: Published - 2014

Keywords

  • Automated essay scoring
  • Incremental latent semantic analysis
  • Singular value decomposition
