BPP: Staff Portal

Improving attendance accuracy at scale

OVERVIEW

The Staff Portal is a web-based platform designed to streamline how tutors record and manage student attendance. It replaces fragmented and manual processes with a unified system that integrates QR-based check-ins with tutor-led validation. This project marked a key milestone in improving operational efficiency, data accuracy, and student engagement across the university.

THE PROBLEM

Attendance tracking was inconsistent, manual, and prone to misuse.

  • Tutors relied on multiple methods (QR scans, paper registers, manual reporting)

  • The Attendance Team spent significant time manually consolidating data daily

  • There was limited visibility of attendance patterns, making it difficult to intervene early

  • QR codes could be misused, with no reliable way to validate physical presence

  • Tutors lacked a clear, simple tool to verify and correct attendance in real time

  • No structured way for administrative teams to retrospectively correct attendance
    when students raised disputes, leading to manual workarounds and delays

AS A RESULT:

  • Data quality was unreliable

  • Operational effort was high

  • Opportunities to support at-risk students were missed, as learning support needs were not captured

WHY THIS PROJECT MATTERED

Operational burden

  • Attendance teams were spending 100–200 hours per month manually consolidating MS Forms responses.

  • Tutors had no visibility of class size or attendance status.

  • No way to identify U18 learners or students with learning support needs in the moment.

Compliance & safeguarding risk

  • Students could photograph QR codes and check in remotely, inflating attendance.

  • Tutors had no ability to override incorrect check‑ins.

  • This created risks around legal compliance, funding requirements, and safeguarding.

RESEARCH & DISCOVERY

To ensure the Staff Portal aligned with real teaching workflows, I conducted qualitative user research alongside a Service Designer.

We interviewed 6 tutors across different modules and teaching formats to understand:

  • How attendance was currently being recorded

  • Pain points with QR scanning and manual processes

  • Time pressures during live teaching

  • Common edge cases (late arrivals, students leaving early, or students gaming the system by checking in without actually being present)

These sessions revealed that tutors needed:

  • A fast and low-effort way to validate attendance during class

  • Clear handling of exceptions and edge cases

  • Confidence that the system reflected what was actually happening in the room

These insights directly informed key design decisions, including:

  • Prioritising bulk actions with flexibility for individual edits

  • Designing for real-world interruptions and failures (e.g. QR issues)

  • Keeping the interface simple and scannable under time pressure

MY ROLE & INVOLVEMENT

I led the end-to-end design of the Staff Portal, shaping both the tutor and admin experiences from concept through to MVP delivery.

My key contributions included:

  • Defining core user flows for tutors and admin users

  • Designing the attendance-taking experience, including:

    • QR-first logic with manual override capabilities

    • Bulk and individual editing patterns

  • Translating complex operational needs into intuitive UI patterns

  • Designing edge case scenarios (e.g. QR failures)

  • Creating validation logic and interaction models (e.g. bulk vs individual edits)

  • Collaborating closely with engineering to align UX with system constraints

  • Supporting content design, including FAQs and onboarding walkthroughs

  • Iterating designs based on stakeholder and tutor feedback

WHO I WORKED WITH

This was a highly collaborative effort across multiple teams:

  • Product & Project Leads — defining scope and MVP priorities

  • Engineering Team — implementing APIs, validation logic, and system behaviour

  • Service Designer — partnering on workshops and user research sessions to gather first-hand feedback from tutors on their current attendance process, their pain points, and early reactions to the proposed solution

  • Attendance & Operations Teams — providing real-world workflows and constraints

  • Tutors (end users) — validating usability and real teaching scenarios

THE SOLUTION

We designed a QR-first attendance system with tutor-led validation.

Key features:

  • QR code check-in as the primary attendance method

  • Tutor override capability to correct inaccuracies and capture the reason for the override

  • Clear submission rules (editable until midnight)

  • Real-time class registers synced from scheduling systems

  • Admin tools to review and update past registers (up to 3 months)

We also introduced:

  • A guided walkthrough for onboarding tutors

  • A structured FAQ system to reduce support dependency

 
 
 

 

ADMIN PORTAL (Supporting tool)

The admin portal allowed administrative staff to:

  • Search for classes using identifiers such as CRN or tutor

  • Review historical attendance (up to 3 months)

  • Override student attendance where valid evidence is provided

  • Perform bulk updates in cases such as system issues

This ensured that:

  • Attendance data remained accurate and auditable

  • Tutors were not burdened with retrospective changes

  • Operational teams could efficiently handle student disputes and audits
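
As an illustration of the admin correction flow, a sketch of the three-month window with an audit trail might look like this. The function and field names, and the 90-day approximation of "3 months", are assumptions for the example only:

```python
from datetime import date, timedelta

# Past registers can be amended up to roughly three months back,
# and every change is recorded for auditability.
ADMIN_EDIT_WINDOW = timedelta(days=90)

audit_log: list[dict] = []

def admin_override(class_date: date, student_id: str, new_status: str,
                   evidence_ref: str, admin_id: str, today: date) -> bool:
    """Apply a retrospective correction if within the edit window."""
    if today - class_date > ADMIN_EDIT_WINDOW:
        return False  # outside the correction window; no change made
    audit_log.append({
        "class_date": class_date.isoformat(),
        "student_id": student_id,
        "new_status": new_status,
        "evidence": evidence_ref,
        "changed_by": admin_id,
    })
    return True
```

Keeping every correction in an append-only log is what makes the data auditable while still letting operational teams resolve disputes without involving tutors.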

 
 

Prototype created using Figma Make

ONBOARDING & ADOPTION

Given the scale of the rollout (400+ tutors), onboarding was critical to the product's success. Our approach had three strands:

1. In-product walkthrough
I designed a guided walkthrough within the Staff Portal to:

  • Introduce key features

  • Reduce reliance on external training

  • Allow tutors to revisit guidance at any time

This ensured support was available in context, at the point of need.

2. Interactive FAQ resource
I created a structured FAQ document covering:

  • Core workflows (attendance logging, QR behaviour)

  • Edge cases and teaching scenarios

  • Policy and compliance rules

This acted as a self-service support tool, reducing dependency and giving tutors confidence in the system.

3. Live onboarding sessions
I hosted live onboarding sessions with 400+ tutors, where I:

  • Delivered demonstrations of the platform

  • Walked through key scenarios and workflows

These sessions provided real-time feedback and helped:

  • Identify areas of confusion early

  • Build trust and buy-in from tutors

  • Ensure smoother adoption at launch

Used by 400+ tutors across the university.

USER FEEDBACK

  • Tutors responded positively to the simplicity and clarity of the system

  • The ability to quickly correct attendance was particularly valued

  • The platform is already supporting more proactive student engagement

“For the first time, all the students stayed until the end of the class. No doubt this portal is going to improve student attendance.”

OUTCOME & IMPACT

The Staff Portal is now live and being used across the university.

Early impact:

  • Reduced manual effort for the Attendance Team

  • Improved data consistency across attendance reporting

  • Better visibility of attendance patterns and risks

  • Reduced misuse of QR check-ins

  • Faster, more reliable attendance submission by tutors

WHAT’S NEXT

This launch represents the MVP (Slice 1).

The platform has been designed to scale, with future iterations focusing on:

  • Enhanced student insights and reporting

  • Improved safeguarding and flagging mechanisms

  • Additional admin capabilities and automation

  • Continued iteration based on tutor feedback

REFLECTION

This project reinforced the importance of designing for real teaching conditions — not ideal ones. Balancing automation with human judgment, handling edge cases gracefully, and supporting tutors under pressure were critical to success. It also strengthened my partnership with engineering and my ability to design systems that scale.