Introduction
Vancouver, Full-Day Workshop – March 14th, 2017
Broadly defined, writing analytics involves the measurement and analysis of written texts with the aim of understanding writing processes and products in their educational contexts. Writing analytics is ultimately aimed at improving the educational contexts in which writing is most prominent. Its principal goal is to move beyond the assessment of texts divorced from their contexts, transitioning instead to a more nuanced investigation of how analytics may be effectively deployed in different writing contexts. Writing analytics thus aims to employ learning analytics to develop a deeper understanding of writing skills.
There is untapped potential in achieving the full impact of learning analytics through the integration of tools into practical pedagogic contexts. To realize this potential, more work must be done to support educators in developing learning analytics literacy. The proposed workshop addresses this need by building capacity in the learning analytics community and developing resources for building ‘writing analytics literacy’.
The workshop will be targeted at:
- Providing a tutorial regarding key tools for writing analytics research and practice, highlighting existing tools, resources, and practices
- Building a resource bank of sample datasets from which learning vignettes might be developed
- Creating a ‘wish list’ of resources to support practitioners in their learning analytics literacy around writing, including developing a framework describing the kinds of pedagogic contexts in which particular tools might be integrated.
The workshop thus combines hands-on tutorial elements with resource creation.
Photo credit: Vancouver sunrise 2014 by Gord McKenna, under a CC BY-NC-ND license
CfP & Important Dates
This workshop will be of interest to a wide range of LAK delegates including: students and researchers actively engaged in writing research, text analytics or writing analytics specifically; educators in schools, universities and businesses; leaders and policymakers; and companies active or potentially active in the field.
We particularly invite participants who are thinking about the question: “How can I use learning analytics to improve student writing?” – whether as technologists or teachers.
Submission Guidelines
Participation: Those interested in participating are invited to submit a short application (1 page maximum) for one of the following roles, briefly describing the role in which they would like to participate at the workshop and what they would bring to it (e.g., past experience working with student writing or writing analytics, how this area relates to their research interests). The workshop chairs will select contributions to ensure a balance of roles and a successful workshop.
Participation roles include the following:
– Pedagogic presenters: those who submit short documentation of a specific learning context in which writing analytics could be applied
– Data presenters: those who provide a salient dataset for discussion of its properties and potential analyses
– Analytic presenters: those who provide an analytic technique or tool that has been developed, along with resources describing particular pedagogic contexts in which it might be integrated
– Commentators: those who commit to reading, and responding to, at least one of the above kinds of submission – note that we welcome commentators from all backgrounds and levels of experience
Applications to participate should be submitted via email to sjgknight@gmail.com
Important Dates
- December 18, 2016: Submission deadline
- January 8, 2017: Notifications of acceptance sent out (prior to the early-bird registration deadline of January 13th)
- March 13-17, 2017: LAK17 Conference
Rationale
Writing analytics literacy
The ability to communicate in writing is a key literacy, central to participation in society and thus to all educational contexts [6, 7]. There is a long-standing interest in the development and use of natural language processing (NLP) tools to analyze this writing [e.g., 5, 8], with tools emerging from both research and commercial spaces to support formative assessment of student writing.
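To make this concrete, the sketch below is a minimal, purely illustrative example (not drawn from any of the tools discussed at this workshop) of the kind of surface features that NLP-based writing tools often compute as a first step; the feature names and the naive sentence splitting are our own assumptions.

```python
# Illustrative sketch: surface features of the kind NLP writing tools often
# build on. Standard library only; not taken from any specific tool.
import re

def surface_features(text: str) -> dict:
    """Compute simple descriptive features of a piece of student writing."""
    # Naive sentence split on terminal punctuation (real tools use trained models).
    sentences = [s for s in re.split(r"[.!?]+\s*", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text.lower())
    return {
        "n_sentences": len(sentences),
        "n_words": len(words),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        # Type-token ratio: a rough proxy for lexical diversity.
        "type_token_ratio": len(set(words)) / max(len(words), 1),
    }

if __name__ == "__main__":
    sample = ("Writing analytics can support formative feedback. "
              "Simple features like these are only a starting point!")
    for name, value in surface_features(sample).items():
        print(f"{name}: {value:.2f}" if isinstance(value, float) else f"{name}: {value}")
```

Formative-feedback tools layer much richer linguistic and semantic analysis on top of such measures, but even these simple counts illustrate the kind of signal that writing data makes available.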
Writing Analytics is a developing sub-domain of learning analytics with a specific focus on supporting writing practices. Research in this field has the potential to improve formative feedback in writing exercises and to provide insights to both educators and students (see previous workshop [1]). Despite this strong potential, adoption of writing analytics tools has not been widespread.
There is untapped potential in supporting educators to make effective use of such tools. However, ‘writing analytics literacy’ in this sense must go beyond knowing how to operate tools or access results through a user interface, and beyond tools that simply output numbers without actionable feedback. Rather, there is a need to engage educators with resources that support them in designing meaningful tasks, selecting appropriate tools to support those tasks, and interpreting the data arising from them. To do this, educators must consider the desired outcomes of assigned tasks (e.g., demonstrating knowledge of key topics, citing correctly, using creative language), and understand the potential – and pitfalls – of NLP for addressing those needs. We thus see writing analytics literacy as positioning analytics and writing-assessment literacies synergistically. Through building such literacies, we aim to:
- Develop a synergistic model of writing analytics literacy and writing assessment literacy
- Engage practitioners in thinking about (and researching) how writing assignments in their teaching might provide meaningful data for learning insights
- Develop students’ writing analytics literacy (and, by extension, their writing) through interaction with appropriate tools
Developing these literacies will require a multi-faceted approach, including continued development of research technologies, and innovation around new approaches to writing instruction. For these endeavors to have impact, practitioners must integrate them into pedagogic contexts in which they guide action [2].
Learners face a number of challenges in interpreting analytics for action [12]: they must connect the analytics to the processes and overall goals of the learning task (contextual issues); they must evaluate the quality of the analytic, understanding how it is developed and how it can inform their learning (trust issues); they must select which information – both present in and absent from the analytic – to attend to, and where to devote their time (priority issues); and they must decide how, as individuals, to respond to analytics and what they represent (individual issues).
Learners must then make decisions based on these interpretations. Thus, analytics must [12]: present possible options, empowering learners to decide; provide actionable information that students can do something with; and afford students autonomy, such that analytics help them identify their own learning patterns rather than supplying this information for them.
Wise et al. propose the ‘align design framework’, in which educators integrate analytics as “an integral element in the learning process tied to their goals, expectations, and planned learning process” [12], with students given agency to “engage with analytics as a tool to inform their actions, as opposed to analytics being something with which students must comply” [12]. These principles frame the activity, with context added through a reference frame that provides an action-oriented comparator for interpreting the analytic, and a principle of dialogue/audience that describes discussion around students’ learning goals and processes. For such designs to be implemented, there is a need to support educators in connecting analytic design, pedagogy, and theory [4, 13].
Learning Analytics Carpentry
Other increasingly data-driven fields have grappled with developing both researcher and practitioner knowledge. Data Carpentry workshops (http://www.datacarpentry.org/), developed from the Software Carpentry bootcamps [11] (http://software-carpentry.org/), are short workshops designed to teach “basic concepts, skills, and tools for working with data so researchers can get more done in less time and with less pain” [9]. They are designed to give novices the starting toolkit to begin working programmatically with data in their own research. Data Carpentry sessions focus on example datasets targeted at particular domains of relevance to the learners, with no prior knowledge assumed.
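In that carpentry spirit, a first hands-on exercise for writing analytics might be as minimal as the following sketch: loading a dataset of student submissions and summarising essay lengths per assignment. The file name (essays.csv) and its columns (“assignment”, “text”) are hypothetical, invented here for illustration.

```python
# Carpentry-style starter exercise: summarise a (hypothetical) dataset of
# student essays with columns "assignment" and "text". Standard library only.
import csv
from collections import defaultdict
from statistics import mean

word_counts = defaultdict(list)
with open("essays.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Record a naive word count for each essay, grouped by assignment.
        word_counts[row["assignment"]].append(len(row["text"].split()))

for assignment, counts in sorted(word_counts.items()):
    print(f"{assignment}: {len(counts)} essays, mean length {mean(counts):.0f} words")
```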
For example, from an R hackathon in population genetics, a community website of vignettes has been developed [3], alongside proposals for ‘a collaborative training infrastructure for bioinformatics’, including openly co-developed resources and a carpentry-based teaching model that blends formal and informal elements with ongoing peer support [10].
We propose to adopt a ‘learning analytics carpentry’ model to: (1) develop capability among LAK researchers in the analysis of writing data; (2) connect this knowledge to practitioner contexts; and (3) begin to build resources for carpentry-based learning of writing analytics.
Existing work in this area (e.g., the 2014 edX ‘Data, Analytics and Learning’ course, the 2016 LASI ‘topic modeling’ workshop, etc.) has focused on building researcher confidence in particular techniques, with a primary focus on the analytic rather than its integration. The proposed workshop aims to develop resources that will both build capacity in learning analytics techniques and support the targeting of those analytics at particular pedagogic contexts.
Program
Contributors:
A range of contributors have signed up for the workshop, including those presenting particular tools and approaches:
Elena Cotos (Iowa State University) – Research Writing Tutor (RWT) – A corpus-based platform for data-driven writing (webinar discussing RWT)
Noureddine Elouazizi, et al. (University of British Columbia) – A Formal Semantics-informed Learning Analytics Technology for Analyzing Written Argumentation (pdf)
Christian Rapp (Zurich University of Applied Sciences) – Thesis Writer (TW) – an Intelligent Tutoring System for Writing Instruction and its Study (Thesis Writer abstract (pdf); and webinar discussion of the tool)
Amna Liaqat, Zahia Marzouk, Mladen Rakovic, Philip Winne (Simon Fraser University) – Interactive Writing Analytics (pdf)
Anne Gere, Ginger Shultz, Chris Teplovs, Dave Harlan (University of Michigan) – M-Write: A Large-Scale Laboratory for Writing Analytics Research (pdf)
Bahar Sateli and Rene Witte (Concordia University) – Personal Research Assistants for Young Researchers (pdf).
Danielle McNamara and Laura Allen (Arizona State University) – The Writing Pal: A Writing Strategy Intelligent Tutoring System (ITS) (SoLET Lab)
Andrew Gibson, Simon Knight, Simon Buckingham Shum (Univ. Technology Sydney) and Ágnes Sándor (Xerox Research Centre Europe) – Automated Analytics for Reflective Writing (AWA project & LAK17 paper session Wed 3pm)
Keynote: Peter Foltz (Univ. Colorado Boulder / Pearson) – Automated feedback in large-scale formative writing: Lessons learned and best practices (Abstract & Bio)
Outline Schedule
8:30-10:00
Introduction to writing analytics and the workshop
Thinking like a student – the features of good writing
Identifying the obstacles to good writing
10:00-10:30 – BREAK
10:30-12:30
Identifying interventions and augmenting them with analytics
Tool overviews (5 minutes!)
Designing resources for future practitioner-oriented tutorials on the writing obstacles your tools address
12:30-1:15 – LUNCH
1:15-2:30 – continuation of pre-lunch session
2:30-3:00 – BREAK
3:00-3:45 – Thinking like an educator – using analytics to augment practice, what literacies are needed?
3:45-4:45 – Keynote: Peter Foltz – Automated feedback in large-scale formative writing: Lessons learned and best practices
4:45-5:00 – Paths forward to build writing analytics literacy
Discussion on the day will include breakout groups with members from the different roles, including commentators for each group. This design aims to facilitate an inclusive and effective discussion on the day and beyond. Prior to the workshop, some core themes and questions will be identified for discussion. Participants will be accepted on the basis of their contribution to the theme, with a desire to move forward both methodological and conceptual work in the area. We anticipate this theme bringing new people into the learning analytics community, nurturing connections, and developing the area within the community.
Organisers
Simon Knight is a Lecturer in Writing Analytics at the Connected Intelligence Centre, University of Technology Sydney. His research focuses on the relationship of analytics to epistemology, pedagogy and assessment, on discourse analytics, and on epistemic cognition, particularly around information seeking; this work has been presented at LAK and ICLS. He co-chaired the LAK16 writing analytics workshop, the ICLS14 Workshop on Learning Analytics for Learning and Becoming in Practice, and the LAK15 & 16 Workshops on Temporal Analyses of Learning Data.
Laura Allen is a Doctoral Student in the Psychology Department at Arizona State University. The overarching aim of her research is to better understand the cognitive processes involved in language comprehension, writing, knowledge acquisition, and conceptual change, and to apply that understanding to educational practice by developing and testing educational technologies. Her research has been presented at LAK15 & 16 and other conferences related to writing analytics.
Andrew Gibson is a Research Fellow in Writing Analytics at the Connected Intelligence Centre, University of Technology Sydney, and a doctoral candidate and sessional academic in the Information Systems School at the Queensland University of Technology (QUT), Brisbane. His research focuses on reflective writing analytics for psychosocial meaning, and he has written software that utilises a range of natural language processing and machine learning techniques for this purpose. With an additional interest in transdisciplinarity, he works across both educational and computational domains. His work has been presented at LAK14 and LAK15 as well as local educational conferences.
Danielle McNamara is a Professor of Psychology at Arizona State University, where she directs the Science of Learning and Educational Technology Lab. Her research focuses on discovering new methods to improve students’ ability to understand text, learn new information, and convey their thoughts in writing. Her work integrates various approaches and methodologies including the development of intelligent tutoring systems and the development of natural language processing tools.
Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he directs the Connected Intelligence Centre. His research focuses on learning analytics for higher order competencies such as academic writing, argumentation and learning-to-learn. He served as LAK12 Program Co-Chair, and co-chaired the LAK13/14 workshops on Discourse-Centred Learning Analytics.