Claritas One
Education · 12 months · UK

Higher Education Platform — 2.1M Learners

Lifted course completion from 38% to 71% with adaptive learning at two-million scale

A UK higher education platform serving 2.1M learners had self-paced completion stalling at thirty-eight per cent with no visibility into at-risk learners until disengagement was irreversible. Twelve months of adaptive learning engineering, real-time educator analytics, and WCAG 2.2 AA certification took completion to seventy-one per cent and enabled a new tier of institutional contracts.

71%
Completion
72h
Risk detection
2.1M
Learners
WCAG
AA certified
Pattern study · Client is anonymised; details may be composited across engagements to preserve commercial confidentiality. Outcomes reflect the kind of result these methods are designed to deliver.

The platform had been founded on an optimistic thesis: if learners could study at their own pace with high-quality recorded content, outcomes would improve and access would widen. The access thesis had been vindicated — 2.1 million learners across the UK and Commonwealth, many from first-in-family university backgrounds. The outcomes thesis had not. Self-paced course completion was thirty-eight per cent, better than the sector median but no longer competitive against institutional rivals who were beginning to bundle human tutoring into their higher-priced programmes. More pointedly, the platform's enterprise customers — large employers who funded employee learning — were now asking for completion guarantees that the platform could not credibly offer at the existing rate.

The platform had already instrumented its learning experience, and the data revealed the shape of the problem clearly. Disengagement almost always began in a specific narrow window — between the third and fifth piece of assessed content within a module — and once it began, it was effectively irreversible by the time educators noticed. Educators typically did not notice until a learner had missed two weekly check-ins, by which point the learner was on average twenty-one days into a disengagement arc that reversed in fewer than four per cent of cases. The feedback loop was structurally too slow.

Our architectural recommendation was a mastery-based adaptive progression layer grafted onto the existing course structure. Rather than asking every learner to progress through a linear sequence of content, the adaptive engine used item response theory to estimate each learner's current mastery on the module's learning objectives and to select the next piece of content — an explanation, a worked example, a practice assessment, a remedial review — that was most likely to advance that learner's mastery given their current state. The engine was implemented on a purpose-built service running on Kubernetes with a Postgres-backed state store and a Redis cache for low-latency next-item selection. The content catalogue was re-indexed against the learning objectives through a combination of instructor tagging and machine-assisted content mapping reviewed by the platform's curriculum team.
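The selection step can be sketched with a simple two-parameter logistic (2PL) IRT model: estimate the probability that a learner at a given mastery level answers an item correctly, then pick the unattempted item carrying the most Fisher information at that level. This is a minimal illustration only — the item parameters, item names, and the information-maximising criterion below are assumptions, not the platform's actual engine.

```python
import math

def p_correct(theta: float, a: float, b: float) -> float:
    """2PL IRT: probability a learner with ability theta answers correctly.
    a = item discrimination, b = item difficulty."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def fisher_information(theta: float, a: float, b: float) -> float:
    """How much an item response would sharpen the mastery estimate at theta."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def select_next_item(theta: float, items: list[dict]) -> dict:
    """Pick the candidate item most informative at the current mastery estimate."""
    return max(items, key=lambda it: fisher_information(theta, it["a"], it["b"]))

# Hypothetical catalogue entries, each tagged with illustrative IRT parameters.
items = [
    {"id": "worked-example-1", "a": 1.2, "b": -0.5},
    {"id": "practice-quiz-3",  "a": 0.9, "b": 0.1},
    {"id": "remedial-review",  "a": 1.5, "b": -1.2},
]

print(select_next_item(0.0, items)["id"])  # → worked-example-1
```

In a production engine the ability estimate `theta` would itself be updated after every response (e.g. by maximum-likelihood or Bayesian updating), and the candidate set would be filtered by content type and learning objective before scoring.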

The educator analytics layer was the second structural change. We built a real-time cohort dashboard that surfaced at-risk learners within seventy-two hours of the first disengagement signal, not twenty-one days later. The dashboard combined behavioural signals — session frequency, assessment attempts, time on content — with mastery signals from the adaptive engine to produce a composite risk score. Educators were given lightweight intervention tools — a pre-drafted message, a suggested office-hours slot, a one-click resource recommendation — that allowed them to respond to at-risk signals in under two minutes. The intervention playbook itself was co-designed with the platform's most experienced educators and validated through a six-week pilot before general rollout.
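The composite scoring described above can be sketched as a weighted blend of behavioural and mastery signals with a flag threshold. The signal names, weights, and threshold here are illustrative assumptions for exposition — the production score would be tuned and validated against historical disengagement outcomes.

```python
from dataclasses import dataclass

@dataclass
class LearnerSignals:
    sessions_last_7d: int       # session frequency
    assessment_attempts: int    # assessment attempts this module
    minutes_on_content: float   # time on content this week
    mastery_estimate: float     # from the adaptive engine, 0..1

def composite_risk(s: LearnerSignals) -> float:
    """Blend behavioural and mastery signals into a 0..1 risk score.
    Each term grows as the signal falls below an illustrative healthy baseline."""
    behaviour = (
        0.4 * max(0.0, 1.0 - s.sessions_last_7d / 3.0) +      # < 3 sessions/week
        0.2 * max(0.0, 1.0 - s.assessment_attempts / 2.0) +   # < 2 attempts
        0.2 * max(0.0, 1.0 - s.minutes_on_content / 90.0)     # < 90 min on content
    )
    mastery = 0.2 * (1.0 - s.mastery_estimate)                # low mastery adds risk
    return min(1.0, behaviour + mastery)

def at_risk(s: LearnerSignals, threshold: float = 0.6) -> bool:
    """True when the composite score should surface the learner to an educator."""
    return composite_risk(s) >= threshold
```

A score like this is cheap to recompute on every event, which is what makes a seventy-two-hour detection window feasible: the dashboard only needs recent behavioural aggregates plus the engine's current mastery estimate, not a long observation history.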

Accessibility was a non-negotiable thread through the engagement. The platform had a significant population of learners with dyslexia, visual impairment, and neurodiversity-related accommodations, and the commercial team had committed to a full WCAG 2.2 AA certification as part of the re-launch. We engaged an accessibility specialist from the beginning of the programme, and accessibility acceptance criteria were included in every user story rather than retrofitted. The final audit by an external accessibility firm returned a compliant result across the full catalogue with zero level-A or level-AA violations.

The outcomes reshaped the platform's commercial position. Completion on adaptive courses rose from thirty-eight per cent to seventy-one per cent, measured on cohorts of sufficient size to be statistically robust. At-risk learner identification fell from an average of twenty-one days to seventy-two hours, and educator intervention rates in response to those signals rose from below five per cent to over seventy per cent. The platform's enterprise sales team used the completion evidence — independently audited by a learning science research group — to renegotiate contracts with four of their largest enterprise customers, lifting contract values by a blended forty-two per cent across those accounts. The platform now uses the adaptive engine as a central part of its commercial positioning and is extending it to the next generation of its postgraduate programmes.

Tags
EdTech · Adaptive · Accessibility · Analytics
Stack
Kubernetes · PostgreSQL · Redis · Item Response Theory · WCAG 2.2

Need a similar outcome for your organisation?

Brief our principals on your current state and target outcome. You will receive a scoped proposal within three business days.

Start a conversation