Call-AI Accessible Voice Interaction
Designing an Inclusive, AI-Powered Call Assistant
I collaborated with a developer to design an AI call agent powered by a Large Language Model (LLM). The goal was to transition the concept into a functional app prototype, focusing on accessibility and ease of use for users with disabilities.
Working closely together, we created a rough wireframe prototype in Figma, which provided enough structure for early usability testing. From those sessions, we refined the interface, repositioning buttons, improving hierarchy, and aligning layouts with user expectations.
An important step was identifying where the AI could simplify the process. The initial wizard-style setup was replaced with automated call generation, allowing the LLM to handle most of the input work and speed up user workflows.
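To make the idea concrete, here is a minimal sketch of what automated call generation could look like, assuming a generic LLM client. The function names, JSON fields, and the llm_complete helper are illustrative placeholders, not the project's actual implementation.

import json

def llm_complete(prompt: str) -> str:
    # Placeholder for a request to any LLM provider's completion API.
    raise NotImplementedError("Connect this to your LLM provider of choice.")

def generate_call_setup(user_request: str) -> dict:
    # One free-text request replaces the old multi-step wizard:
    # the LLM extracts the structured fields the setup form needs.
    prompt = (
        "Extract call details from the request below and reply with JSON "
        "containing 'recipient', 'purpose', and 'preferred_time'.\n\n"
        f"Request: {user_request}"
    )
    return json.loads(llm_complete(prompt))

# Example: a single sentence pre-fills the whole call setup.
# generate_call_setup("Call my dentist tomorrow morning to reschedule my check-up")
# -> {"recipient": "dentist", "purpose": "reschedule check-up", "preferred_time": "tomorrow morning"}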
This project highlighted how user-centred design and AI automation can work together to make complex tools more intuitive, efficient, and inclusive.
Key Design Areas

User flows version 1 and 2 mapped out.
Focus Areas
AI Integration
Using an LLM to automate repetitive setup tasks and streamline call creation
Accessible UX Design
Ensuring layouts, navigation, and controls were optimised for users with disabilities
Rapid Prototyping
Building and testing a Figma prototype quickly to validate early design ideas
Interface Simplification
Reducing cognitive load through clear hierarchy and predictable flows
User Testing and Iteration
Refining placement and interaction patterns based on real user behaviour
Tools and Techniques
Figma, XML, HTML, CSS, User Testing Platforms, Data Gathering and Analysis


