“Feels Like Paper!”: Interfacing AI through Paper
This video is featured in the Designing with AI 2025 playlist.
Summary
“Feels Like Paper!” is a series of prototypes that augment physical paper through AI. A variety of ML models, LLMs, and a mixed reality headset are used to infuse physical paper and ink with properties of the digital world without compromising their physical traits.
Key Insights
- Physical paper retains unique tactile qualities that people naturally gravitate toward for reading, writing, and drawing.
- Augmented reality can capture handwritten math input and project instant, contextual AI-generated answers close to the user’s writing.
- Mixed reality interfaces enable a tight coupling between physical and virtual objects, making digital content feel part of the physical world through shadows and spatial alignment.
- AI agents embodied in physical objects scattered through our environment offer localized, context-aware assistance rather than a single all-knowing AI entity.
- Designing AI interfaces means designing for variable behaviors, since AI outputs are not deterministic and can change with context and prompts.
- Embodied interfaces can sometimes function without visible augmentation, using physical actions such as highlighting, combined with speech, to synchronize digital data.
- Real-time image diffusion used as a co-creative muse on physical drawings creates a hierarchical relationship in which AI inspires but does not dictate final outcomes.
- Smart glasses with egocentric cameras offer new opportunities for natural interaction but raise significant privacy concerns that need careful attention.
- Interaction choreography and microgestures are critical for socially acceptable use of smart glasses in public or professional environments.
- Future interfaces may allow uninterrupted coexistence of physical and virtual content on a single perceptual layer, enhancing sensory saturation and collaboration.
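The “designing for behavior” insight above can be made concrete: because model outputs are sampled, the same prompt can yield different answers, and an interface should treat each response as one draw from a distribution. A minimal, illustrative Python sketch of temperature-controlled sampling (all names and mock candidate answers are hypothetical, not from the talk):

```python
import math
import random

def softmax(scores, temperature):
    """Turn raw candidate scores into a sampling distribution.

    temperature <= 0 degenerates to argmax (deterministic);
    higher temperatures flatten the distribution (more varied replies).
    """
    if temperature <= 0:
        probs = [0.0] * len(scores)
        probs[scores.index(max(scores))] = 1.0
        return probs
    exps = [math.exp(s / temperature) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def sample_response(candidates, scores, temperature, rng=random):
    """One draw from the response distribution, mimicking LLM sampling."""
    probs = softmax(scores, temperature)
    return rng.choices(candidates, weights=probs, k=1)[0]

# Mock candidate answers to the same handwritten-math query.
candidates = ["x = 2", "x = 2 (by factoring)", "x equals two"]
scores = [3.0, 2.0, 1.0]

deterministic = sample_response(candidates, scores, temperature=0.0)
varied = {sample_response(candidates, scores, temperature=1.5) for _ in range(200)}
```

At temperature 0 the interface always shows the top-scoring answer; at higher temperatures repeated queries surface different phrasings, which is exactly the variability a designer must plan around.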
Notable Quotes
"Conventional user interfaces have a very low rate of information exchange between the user and the technology."
"Augmenting our sensors through AI becomes possible with increasingly egocentric perspectives on the user’s context."
"The interaction comes to the foreground mainly when there’s a negotiation of intentionality between the user and the technology."
"Physical paper has a special tactility to it; people still gravitate back to it because of this nature."
"It is very important to be aware of augmentations because otherwise they can feel very intrusive to the user."
"There is a tight coupling of physical and digital objects that makes the physical world more fluid and the virtual world gain physical traits."
"With Draw Dream, the AI acts as a muse, giving inspiration but not the final outcome."
"Realtime model feedback on egocentric inputs really feels like lucid dreaming."
"Designing for AI means designing for behavior because you will not always get the same result for the same question."
"We need to factor in the specifics of the social situation when designing interaction models and gestures for smart glasses."