Tag along with me: Designing Humane Mobile Interfaces for AI Companions
Lessons from rebuilding sarfarazh.xyz as a mobile-first lab where AI guidance feels like a conversation instead of a dashboard.
Reworking this personal site around a mobile-first experience forced me to confront a familiar tension: how do you give visitors access to powerful AI helpers without making them feel like they are filing tickets? The answer, I found, lives in the uncomfortable middle between polished UI systems and the messy reality of conversational models.
Why humane beats clever
Most AI interfaces still fetishize cleverness—animated gradients, modality toggles, and endless prompts. Humane design asks a simpler question: what is the smallest thing a person needs to feel oriented? On mobile, that often means a sentence of reassurance, a context-aware suggestion, and controls that respond within a thumb's reach.
- Lead with the why: the desktop prompt on this site doesn’t block; it explains the benefit, offers a QR code, then gets out of the way.
- Respect rhythm: CTA modules hold to a 44 px minimum tap target and a predictable visual rhythm so visitors never hunt for the next action.
- Surface state honestly: dark mode, session dismissal, and logging all run locally when Supabase is asleep, but the interface tells you when it’s falling back (see the sketch after this list).
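To make that fallback concrete, here is a minimal TypeScript sketch. The names are illustrative rather than the site’s actual code (`createPreferenceStore`, the `/health` and `/prefs/*` endpoints): the store tries the backend once, degrades to sessionStorage, and exposes which mode it is in so the UI can say so out loud.

```typescript
// Illustrative sketch: prefer the backend, degrade to sessionStorage,
// and report which mode the store is actually in.
type StorageMode = "remote" | "local";

interface PreferenceStore {
  mode: StorageMode;
  get(key: string): Promise<string | null>;
  set(key: string, value: string): Promise<void>;
}

async function createPreferenceStore(endpoint: string): Promise<PreferenceStore> {
  let mode: StorageMode = "local";
  try {
    // Cheap health check; a hibernating backend fails fast here.
    const res = await fetch(`${endpoint}/health`);
    if (res.ok) mode = "remote";
  } catch {
    // Unreachable: stay local and never surface the error to the visitor.
  }

  return {
    mode,
    async get(key) {
      if (mode === "remote") {
        const res = await fetch(`${endpoint}/prefs/${key}`);
        if (res.ok) return res.text();
      }
      return sessionStorage.getItem(key);
    },
    async set(key, value) {
      // Always write locally so the UI stays consistent either way.
      sessionStorage.setItem(key, value);
      if (mode === "remote") {
        await fetch(`${endpoint}/prefs/${key}`, { method: "PUT", body: value });
      }
    },
  };
}
```

The UI reads `store.mode` and renders a quiet “saved locally for this session” note instead of pretending the backend heard it.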
Layering AI nudges without noise
The hero CTA that invites you to talk to my digital clone doubles as a quick calibration moment. Behind the scenes, the CTA module rotates based on what I am experimenting with, but it never mutates mid-session. That constraint keeps the AI recommendations from feeling like ads. When everything is an AI suggestion, nothing feels intentional.
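That “pick once, then pin it” behavior is easy to make explicit. Here is a hedged sketch assuming a sessionStorage-pinned variant; the variant names and the `cta-variant` key are hypothetical, not the module’s real identifiers.

```typescript
// Illustrative sketch: rotate the hero CTA between experiments, but pin
// the choice for the whole session so it never mutates mid-visit.
const CTA_VARIANTS = ["talk-to-clone", "prompt-library", "narrative-tour"] as const;
type CtaVariant = (typeof CTA_VARIANTS)[number];

function getSessionCta(): CtaVariant {
  const KEY = "cta-variant"; // hypothetical storage key
  const pinned = sessionStorage.getItem(KEY) as CtaVariant | null;
  if (pinned && CTA_VARIANTS.includes(pinned)) return pinned;

  // First render of the session: rotate, then lock it in.
  const choice = CTA_VARIANTS[Math.floor(Math.random() * CTA_VARIANTS.length)];
  sessionStorage.setItem(KEY, choice);
  return choice;
}
```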
I borrowed a technique from FinTech dashboards: pair every recommendation with a defer action. If the visitor closes the prompt, the dismissal is honored for the entire session. That micro-trust buys permission to surface richer context later, like a prompt library or a suggested narrative flow.
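Honoring the dismissal does not need a backend at all. A minimal sketch, with function names of my own invention, that leans on sessionStorage and falls back to memory when storage is blocked:

```typescript
// Illustrative sketch: remember a dismissal for the rest of the session,
// with an in-memory fallback when sessionStorage is unavailable.
const memoryDismissals = new Map<string, boolean>();

function dismiss(promptId: string): void {
  try {
    sessionStorage.setItem(`dismissed:${promptId}`, "1");
  } catch {
    memoryDismissals.set(promptId, true); // e.g. storage blocked by privacy settings
  }
}

function isDismissed(promptId: string): boolean {
  try {
    return sessionStorage.getItem(`dismissed:${promptId}`) === "1";
  } catch {
    return memoryDismissals.get(promptId) === true;
  }
}
```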
Telemetry without surveillance
Because my Supabase project occasionally hibernates, the mobile experience logging route ships as a stub that can degrade to a silent no-op. When the backend wakes up, it starts accepting anonymized events. That rhythm protects the visitor from 500-level noise and lets me re-enable analytics without changing the client.
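In rough outline the route looks like the sketch below, written against the Web-standard Request/Response handler shape; the `SUPABASE_EVENTS_URL` variable and the payload fields are assumptions for illustration. The key property is that it always answers 204 and only forwards events when an upstream is configured and awake.

```typescript
// Illustrative sketch: accept the event, forward it if the backend is up,
// and never let a 5xx leak back to the visitor's browser.
export async function POST(request: Request): Promise<Response> {
  const event = await request.json().catch(() => null);
  if (!event) return new Response(null, { status: 204 }); // malformed? still quiet

  const upstream = process.env.SUPABASE_EVENTS_URL; // assumed env var
  if (upstream) {
    try {
      await fetch(upstream, {
        method: "POST",
        headers: { "content-type": "application/json" },
        // Only anonymized fields leave this route.
        body: JSON.stringify({ name: event.name, path: event.path, ts: Date.now() }),
      });
    } catch {
      // Backend hibernating: swallow the failure and stay a silent no-op.
    }
  }
  return new Response(null, { status: 204 });
}
```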
If an AI companion feels like it’s auditing you, the design is wrong. It should feel like a co-pilot handing you just enough context to stay in flow.
Field notes
- Ship thumb-first: prototype every flow at 360 × 780 px before you even glance at desktop.
- Honor dismissals: sessionStorage and in-memory stores are enough for trust; you do not need a user table.
- Treat AI as copy: narrative voice and microcopy are the real differentiators—model selection comes later.
Designing humane mobile AI is less about picking the shiniest model and more about respecting human pace. Ask if each pixel invites someone to continue their story. If it doesn’t, the UI is just decoration.