Augmented AI

Course Evaluation LM

Imperial College empowers its faculty with streamlined workflows from Augmented AI

Customer:
Imperial College Business School
Industry:
Higher Education
Country:
UK
Business Background:
The IDEA Lab at Imperial College Business School combines emerging technologies with a vision for transformative education. Through personalized learning experiences and innovative analytics, the team works to revolutionize how future leaders learn and grow. By experimenting with cutting-edge technologies and developing new educational approaches, the IDEA Lab is helping shape the future of business education while creating meaningful impact for both students and industry.
November 8, 2024

Problem

Imperial College London faced a challenge in evaluating course materials against multiple educational frameworks, including Equality, Diversity and Inclusion (EDI) and Sustainable Development Goals (SDGs). Faculty needed to assess their materials against numerous criteria, such as representation of different countries, use of inclusive language, and diversity in visual content. This manual assessment process was time-consuming and potentially inconsistent, especially when faculty wanted to make iterative improvements to their course materials.


"We've transformed our framework assessment process from a manual task into a streamlined workflow where educators can quickly evaluate and improve their course materials while maintaining full academic control."

- Monica Ares, Executive Director, IDEA Lab, Imperial College Business School

Solution

Augmented AI developed a web application that automates the assessment of course materials against defined frameworks. Faculty can upload any course content, including lecture slides, assessments, module outlines, and reading lists, and the system analyzes it using both language and vision AI models. The application processes these materials and presents results through an analytics dashboard that shows summary insights, data visualizations, and detailed assessment results. For transparency, the system displays both high-level findings (like "78% of geographic references are from the Global North") and the underlying analysis that led to these conclusions.
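To illustrate the kind of summary metric the dashboard reports, the snippet below sketches how a "share of Global North references" figure might be computed from country mentions extracted from course materials. This is a minimal sketch under stated assumptions: the country set, function name, and sample data are illustrative and are not Imperial's or Augmented AI's actual implementation.

```python
# Illustrative (hypothetical) subset of Global North countries, lowercase for matching.
GLOBAL_NORTH = {"uk", "usa", "germany", "france", "japan", "canada"}

def global_north_share(mentions):
    """Return the percentage of geographic references that fall in the Global North.

    `mentions` is a list of country names extracted from course materials
    (in a real pipeline, these might come from a language model's output).
    """
    if not mentions:
        return 0.0
    north = sum(1 for m in mentions if m.strip().lower() in GLOBAL_NORTH)
    return round(100 * north / len(mentions), 1)

# Hypothetical sample: 6 of 9 mentions are Global North countries.
sample = ["UK", "USA", "Kenya", "Germany", "Brazil", "France", "Japan", "India", "Canada"]
print(global_north_share(sample))  # 66.7
```

A faculty member could then rebalance the reading list, re-run the check, and watch the percentage move, which is the iterative improve-and-reassess loop the application supports.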

Outcome

The solution transforms framework assessment from a manual process into a streamlined workflow where faculty can quickly evaluate their materials and make improvements. Rather than replacing human judgment, the tool surfaces relevant information and potential issues for faculty to review. Users can iteratively improve their course content by making changes and getting immediate feedback on how well their materials align with each framework. As the system prepares for full deployment, early feedback from faculty has been positive, highlighting the value of a consistent, scalable approach to framework assessment that maintains human oversight while reducing manual effort.