Since I arrived at UTS, we've been researching and prototyping our way forward, in close collaboration with academics here, to understand the potential of the (sometimes controversial) practice of providing students with automated feedback on their writing. Two pieces forthcoming this year exemplify how we are tackling this exciting but complex challenge: the first is a pilot with Civil Law students using a parser tuned to analytical writing; the second, with Pharmacy students, uses a parser tuned to the very different qualities of reflective writing. Numerous other pilots are also underway, which we look forward to reporting on in due course.
If this topic interests you, tune into the Writing Analytics community. The next event is in Vancouver next month, where we will focus on what it might mean to be literate with Writing Analytics, which – we would argue – entails a close coupling of pedagogy, assessment and analytics.
This work is intensely collaborative and interdisciplinary – perhaps transdisciplinary – and I'm deeply grateful to my team and everyone we work with for the spirit in which we're able to work together.
Gibson, A., Aitken, A., Sándor, Á., Buckingham Shum, S., Tsingos-Lucas, C., & Knight, S. (2017). Reflective Writing Analytics for Actionable Feedback. In Proceedings of LAK17: 7th International Conference on Learning Analytics & Knowledge (March 13-17, 2017, Vancouver, BC, Canada). ACM Press. DOI: http://dx.doi.org/10.1145/3027385.3027436 [Preprint]
Abstract: Reflective writing can provide a powerful way for students to integrate professional experience and academic learning. However, writing reflectively requires high quality actionable feedback, which is time-consuming to provide at scale. This paper reports progress on the design, implementation, and validation of a Reflective Writing Analytics platform to provide actionable feedback within a tertiary authentic assessment context. The contributions are: (1) a new conceptual framework for reflective writing; (2) a computational approach to modelling reflective writing, deriving analytics, and providing feedback; (3) the pedagogical and user experience rationale for platform design decisions; and (4) a pilot in a student learning context, with preliminary data on educator and student acceptance, and the extent to which we can evidence that the software provided actionable feedback for reflective writing.
Knight, S., Buckingham Shum, S., Ryan, P., Sándor, Á., & Wang, X. (In Press). Designing Academic Writing Analytics for Civil Law Student Self-Assessment. International Journal of Artificial Intelligence in Education (Special Issue on Multidisciplinary Approaches to Reading and Writing Integrated with Disciplinary Education, Eds. D. McNamara, S. Muresan, R. Passonneau & D. Perin). Open Access Reprint: http://dx.doi.org/10.1007/s40593-016-0121-0
Abstract: Research into the teaching and assessment of student writing shows that many students find academic writing a challenge to learn, with legal writing no exception. Improving the availability and quality of timely formative feedback is an important aim. However, the time-consuming nature of assessing writing makes it impractical for instructors to provide rapid, detailed feedback on hundreds of draft texts which might be improved prior to submission. This paper describes the design of a natural language processing (NLP) tool to provide such support. We report progress in the development of a web application called AWA (Academic Writing Analytics), which has been piloted in a Civil Law degree. We describe: the underlying NLP platform and the participatory design process through which the law academic and analytics team tested and refined an existing rhetorical parser for the discipline; the user interface design and evaluation process; and feedback from students, which was broadly positive, but also identified important issues to address. We discuss how our approach is positioned in relation to concerns regarding automated essay grading, and ways in which AWA might provide more actionable feedback to students. We conclude by considering how this design process addresses the challenge of making explicit to learners and educators the underlying mode of action in analytic devices such as our rhetorical parser, which we term algorithmic accountability.
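To give a concrete flavour of the kind of analysis a rhetorical parser performs, here is a minimal sketch of cue-phrase matching: it flags sentences containing surface signals of rhetorical moves such as contrast, emphasis, or an author's position. The cue lists and move labels are illustrative assumptions on my part, not the rules inside AWA, whose parser works over much deeper linguistic analysis.

```python
import re

# Illustrative only: toy cue lists, not the patterns used by AWA's parser.
MOVE_CUES = {
    "CONTRAST": [r"\bhowever\b", r"\bon the other hand\b", r"\bin contrast\b"],
    "EMPHASIS": [r"\bimportant(ly)?\b", r"\bnotabl[ye]\b", r"\bcrucial(ly)?\b"],
    "POSITION": [r"\bwe argue\b", r"\bthis paper (reports|describes)\b"],
}

def tag_moves(text: str) -> list[tuple[str, list[str]]]:
    """Split text into sentences and label each with any rhetorical moves
    whose cue phrases it contains (case-insensitive)."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    tagged = []
    for sentence in sentences:
        moves = [
            move
            for move, cues in MOVE_CUES.items()
            if any(re.search(cue, sentence, re.IGNORECASE) for cue in cues)
        ]
        tagged.append((sentence, moves))
    return tagged

if __name__ == "__main__":
    sample = ("Many students find academic writing a challenge. "
              "However, timely formative feedback is crucially important. "
              "We argue that analytics can help provide it.")
    for sentence, moves in tag_moves(sample):
        print(moves or ["-"], sentence)
```

The point of the sketch is simply the pipeline shape: segment the text, test each sentence against move patterns, and surface the flagged sentences back to the writer as feedback on their draft.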