Fishing for Real Needs: Reimagining Journalism Needs with AI
This video is featured in the Designing with AI 2025 playlist.
Summary
Our team at Gazzetta, a media research lab, is tackling a fundamental challenge in journalism: the disconnect between media output and community needs, particularly in the restricted and distorted information environments of autocracies. Over the past years we have learned that traditional audience research leads to quant-heavy, superficial understanding, ineffective content, and, ultimately, irrelevance. To address this, we developed a three-stage process that uses AI knowledge bases to build empathy, map information needs, and analyze information flows. We apply this process to systematically review multiple information sources and build deep community understanding before product development begins. The methodology has helped us preserve nuance, identify knowledge gaps, and assign confidence levels to findings. Rather than treating AI as a black-box solution, a thoughtful, process-oriented approach helps us better understand and serve information needs, and gradually rebuild relevance.
Key Insights
• AI combined with structured research queries reveals deeper insights than traditional research when direct user access is limited.
• Misalignment often exists between what organizations believe users need and what users actually need, as shown by North Korean fishermen preferring weather forecasts over political news.
• Traditional engagement metrics only measure surface behavior, failing to capture true user information needs and satisfaction.
• A shift from a theory of change (assuming known needs) to a theory of service (starting with understanding actual needs) is fundamental to effective media strategy and user-centered design.
• A three-stage research approach—building empathy, prioritizing information gaps, and analyzing information flows—helps systematically leverage complex and scattered data.
• Confidence rating of sources and insights is essential to prevent false certainty from AI-generated outputs.
• Structured queries and systematic frameworks outperform opportunistic or freeform AI interactions, avoiding common pitfalls like overgeneralization and poor prompt design.
• Human expertise, especially cultural knowledge, is indispensable for interpreting AI outputs and maintaining nuance.
• Research repositories often hold untapped treasure troves of insights that, with structure and AI, can be mined even on shoestring budgets.
• Visual coding of insights by confidence level improves transparency and decision-making among stakeholders.
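The last two insights above—rating confidence and coding it visually—can be sketched in a few lines of Python. This is a minimal illustration, not the team's actual tooling: the `Insight` fields, the source-count heuristic, and the marker symbols are all assumptions introduced for the example.

```python
from dataclasses import dataclass
from enum import Enum

class Confidence(Enum):
    HIGH = "high"      # corroborated by multiple independent sources
    MEDIUM = "medium"  # supported by two sources
    LOW = "low"        # single source or AI-inferred connection

@dataclass
class Insight:
    text: str
    sources: list[str]
    confidence: Confidence

def rate_confidence(sources: list[str]) -> Confidence:
    """Crude heuristic (an assumption for this sketch):
    more independent sources means higher confidence."""
    if len(sources) >= 3:
        return Confidence.HIGH
    if len(sources) == 2:
        return Confidence.MEDIUM
    return Confidence.LOW

# Visual coding: a marker stakeholders can scan at a glance.
MARKERS = {Confidence.HIGH: "●", Confidence.MEDIUM: "◐", Confidence.LOW: "○"}

def render(insight: Insight) -> str:
    """Prefix each insight with its confidence marker."""
    return f"{MARKERS[insight.confidence]} {insight.text}"

sources = ["defector interviews", "radio monitoring", "NGO field report"]
finding = Insight(
    text="Fishermen prioritize reliable weather forecasts over political news.",
    sources=sources,
    confidence=rate_confidence(sources),
)
print(render(finding))
# → "● Fishermen prioritize reliable weather forecasts over political news."
```

The point of the design is that confidence travels with the insight itself, so a speculative AI-generated connection can never be presented to stakeholders with the same visual weight as a triangulated finding.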
Notable Quotes
"Journalists like me are in the business of interrogating reality to get at the truth."
"Those fishermen didn’t want political news—they wanted reliable weather forecasts to stay safe at sea."
"We spend millions on content that nobody wanted and that didn’t actually help people navigate their lives."
"Traditional metrics create a dangerous illusion where we optimize for what we can measure, not what people actually need."
"We’re moving from theory of change to theory of service: starting with what people actually need before creating anything."
"The careful application of structured queries can reveal deeper insights than traditional research alone."
"AI systems can present speculative connections as established facts, so confidence ratings are critical."
"Structure beats freeform interaction—systematic query frameworks are essential to avoid pitfalls."
"Humans remain essential for evaluation, especially to interpret cultural nuance that AI often misses."
"Finding meaningful insights is not just casting a wide net—it requires discipline, structure, and knowing where to fish."