Learning Analytics for Curriculum and Program Quality Improvement Workshop

Learning Analytics and Knowledge Conference (LAK)
Edinburgh, UK
25 April, 2016

Much of the research in LAK to date has been “student facing”: using data to better understand learners and their needs, or to create interventions that directly support or influence learners. This workshop takes a different perspective, asking how Learning Analytics can drive improvements in teaching practices, instructional and curricular design, and academic program delivery. While such work influences student outcomes in the long term, the data gathered and the evidence generated are more instructor- and administrator-facing. We have seen examples of how Learning Analytics can help build the case for instructional, curricular, or programmatic change, and further how it can be used to foster acceptance of change processes by teachers, administrators, and other stakeholders in the educational enterprise. When successful, these kinds of changes are often associated with educational reform or culture shifts in educational practice.

This workshop will be hosted by the Learning Analytics and Knowledge conference (LAK) 2016.

Call for Papers


This workshop offers those in the LAK community an opportunity to share and explore how educational data, its analysis and visualization, and the evidence derived can change/improve the context of learning. The main research and practice questions to be addressed in this workshop are:

  1. How to provide relevant and actionable information to faculty, teaching assistants, departmental and college administrators to encourage a greater emphasis on student learning and use of evidence-based practices, thus encouraging a continuous improvement approach to teaching and learning.

  2. How to create visualization and data collection tools and approaches that encourage a community of instructors and administrators to engage in making evidence-based decisions to improve student learning whether at the activity, lesson, course, series, department, college or university-wide levels.

  3. How to extract information from the multiple modalities used in instructional environments and help capture, represent and evaluate faculty instructional approaches, student-faculty engagement, student-student interactions and student-technology interactions.

  4. How to represent, summarize and mobilize data from human interactions and student-technology interactions to motivate change and quality improvement. Is there a way to make the data and representations more useful for promoting sustainable change?

  5. How to change the evaluation of instructional activities to be more formative, actionable and multi-dimensional in nature, emphasizing individual and group improvement rather than a one-size-fits-all student survey by which instructors or courses are judged and compared. Ideally such evaluation systems would go well beyond student satisfaction as the sole measure of good teaching and should include learning outcomes, utility of outcomes, applicability to future learning, match or fit between learner and instructor, and more.

  6. How can analytics help the individual instructor examine the success of a course that they teach? Including the analysis of individual courses will keep the interest of all instructors, not just those concerned with larger questions of course sequences and curricula. Curricula are, of course, built from individual courses, and campus authorities at all levels need tools for examining the success and impact of individual courses.

Important Dates

  • January 25, 2016: Submission deadline
  • February 16, 2016: Notification of acceptance
  • March 16, 2016: Camera ready deadline
  • April 25, 2016: Workshop at LAK Conference

Workshop Format

It will be a half-day workshop.

Depending on the number of acceptable submissions, we will consider two possible formats. If there are many high-quality submissions, we will select a small number for plenary presentation with structured discussion, followed by a poster session where all presenters can mingle and discuss. If there are fewer acceptable submissions, a world-café model may be used: themes reflected in the submissions are introduced and form the basis for a number of round-table, smaller-group discussions, with participants shuffled between tables if desired.

To provide common ground on which the different submissions can be compared and evaluated, authors are required to follow the guidelines below when preparing their submissions. While it is not always possible to comply fully with the guidelines, following them as closely as possible will increase the impact of the publication.

Author Guidelines

To contribute to the workshop, authors must submit a paper about past or planned use of Learning Analytics to understand and improve curricula or programs. The paper should describe the proposed solution in a high level of detail.

The maximum length of the paper is 5 pages. The final version should follow the ACM conference guidelines for full papers. Submissions will be reviewed single-blind, so it is not necessary to remove identifying information from the review version.

Each submission will be evaluated by at least two reviewers drawn from a variety of disciplines.

All accepted papers will be published in the CEUR workshop proceedings.

The papers should be submitted using the Easychair platform.


Workshop Chairs

  • Jim Greer, Professor of Computer Science, University of Saskatchewan, Saskatoon, Canada
    Jim Greer is a Senior Strategist, Learning Analytics in the office of the Provost at the University of Saskatchewan. His research has focused on intelligent tutoring systems, learner modelling, semantic web for education, privacy and trust in learning technologies, and learning analytics. In his current role he is a champion for evidence-informed quality improvement in teaching and learning driven by data. The University of Saskatchewan is a founding member of SoLAR and a member of the Bay View Alliance, a consortium of public universities in Canada and the USA dedicated to improving teaching and learning in higher education through culture-change initiatives http://bayviewalliance.org

  • Marco Molinaro, Assistant Vice-Provost for Undergraduate Education and Director of the Educational Effectiveness Hub, University of California, Davis
    Marco Molinaro is the Assistant Vice Provost for Educational Effectiveness at UC Davis. His work focuses on a university-wide effort to improve undergraduate student success through the Educational Effectiveness Hub (an expansion of the former iAMSTEM Hub). As part of this effort, the Hub is working with faculty and staff across the university to: 1) evolve the introductory undergraduate curriculum, 2) understand and measure change with new analytics tools and approaches that guide instructional improvement, and 3) develop actionable student success models. His projects have been funded by the NSF, the NIH and various private foundations such as Gates, Intel and the Helmsley Trust. He is also leading the Tools for Evidence-based Actions (TEA) project, involving members from over 20 US universities.

  • Xavier Ochoa, Professor, Escuela Superior Politécnica del Litoral, Vía Perimetral, Km. 30.5, Guayaquil, Ecuador
    Xavier Ochoa is a Principal Professor at the Faculty of Electrical and Computer Engineering at Escuela Superior Politécnica del Litoral (ESPOL) in Guayaquil, Ecuador. He is the coordinator of the Research Group on Teaching and Learning Technologies (TEA). He is currently a member of the Executive Committee of the Society for Learning Analytics Research (SoLAR). He is also involved in the coordination of the Latin American Community on Learning Objects and Technologies (LACLO), the Latin American Open Textbook Initiative (LATIn) and other regional projects. His main research interests revolve around Learning Technologies, Learning Analytics and Multimodal Analysis.

  • Timothy McKay, Arthur F. Thurnau Professor of Physics, Astronomy, and Education, University of Michigan, 450 Church Street, Ann Arbor, MI, USA
    Professor McKay is the Chair of the Provost's Learning Analytics Task Force, Director of the Honors Program for the College of Literature, Science, and the Arts, and Principal Investigator of the Digital Innovation Greenhouse and the REBUILD STEM Education Projects. He is a data scientist, with extensive and varied experience drawing inference from large data sets. In astrophysics, his main research tools have been the Sloan Digital Sky Survey, the Dark Energy Survey, and the simulations which support them both. In education research, he works to understand and improve postsecondary student outcomes using the rich, extensive, and complex digital data produced in the course of educating students in the 21st century. In 2011, he launched the E2Coach computer-tailored support system, and in 2014, he began the REBUILD project, a college-wide effort to increase the use of evidence-based methods in introductory STEM courses. In 2015, he launched the Digital Innovation Greenhouse, an education technology accelerator within the UM Office of Digital Education and Innovation.

Email contact: lacpworkshop@gmail.com

Program Committee

  • Christopher Brooks, University of Michigan, USA
  • Catherine Uvarov, University of California - Davis, USA
  • Christopher Pagliarulo, University of California - Davis, USA
  • Craig Thompson, University of Saskatchewan, Canada
  • Stephanie Frost, University of Saskatchewan, Canada
  • Katherine Chiluiza, Escuela Superior Politécnica del Litoral, Ecuador
  • Cristian Cechinel, Universidade Federal de Pelotas, Brazil
  • Leah Macfadyen, University of British Columbia, Canada
  • Emily Miller, Association of American Universities, USA
  • Vive Kumar, Athabasca University, Canada
  • Phil Long, University of Texas at Austin, USA


Workshop Schedule

  • 1:00-1:20 Introduction (Organizers)
  • 1:20-1:35 Paper: "Empowering instructors through customizable collection and analyses of actionable information" (Danny Liu)
  • 1:35-1:50 Paper: "Using a Risk Management Approach in Analytics for Curriculum and Program Quality Improvement" (Amy Wong)
  • 1:50-2:05 Paper: "LMS Course Design As Learning Analytics Variable" (John Fritz)
  • 2:05-2:20 Paper: "Simple Metrics for Curriculum Analytics" (Xavier Ochoa)
  • 2:20-3:00 Discussion time
  • 3:00-3:15 Break
  • 3:15-3:30 Paper: "Assessment Analytics for Peer-Assessment: A Model and Implementation" (Blazenka Divjak)
  • 3:30-3:45 Paper: "Data-Driven Programmatic Change at Universities: What works and how" (Jim Greer)
  • 3:45-4:00 Paper: "Promoting Instructor and Department Action via Simple, Actionable Tools and Analyses" (Marco Molinaro)
  • 4:00-4:30 Discussion time and closing


A draft version of the proceedings of the workshop can be downloaded here.