Last night, Joanie asked me something that made me go hmmm: "Can you use AI to help me build a mood board?"
Simple request. She gave me a list of URLs to scrape—furniture images from different catalogs that she wanted arranged thoughtfully on a page with fabric swatches and color palettes to capture a specific mood. The kind of visual thinking that interior designers do instinctively. I took it as a technical challenge. How hard could it be?
Turns out, the technical execution was just the beginning, and even that part took me a couple of hours of working around bot protection.
Breaking Through the Anti-Bot Walls
Most product sites have sophisticated anti-bot detection these days. Cloudflare, rate limiting, JavaScript challenges—the works. But Claude Code and I found a clever workaround pretty quickly.
Instead of trying to scrape directly, I had TypeScript call JavaScript to control my local Safari instance. Multiple tabs, DOM crawling, automated image extraction. The AI even helped detect which images were the actual product shots versus thumbnails or lifestyle photos. Within an hour, we had a React component displaying furniture in a clean grid layout with switching capabilities.
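If you're curious about the plumbing, here's a minimal sketch of how that kind of Safari-driven extraction can be wired up from TypeScript. I'm assuming macOS's osascript with JavaScript for Automation (JXA) and Safari's "Allow JavaScript from Apple Events" developer setting; the function names and the size-based thumbnail filter are illustrative, not the exact code from that session.

```typescript
// Minimal sketch: drive a local Safari instance from TypeScript via macOS's
// osascript and JavaScript for Automation (JXA). Assumes Safari is running
// with at least one window open and "Allow JavaScript from Apple Events" is
// enabled in the Develop menu. Names and thresholds are illustrative.
import { execFileSync } from "node:child_process";

// JavaScript evaluated inside the page: collect likely product shots,
// filtering out small thumbnails by their natural size.
const pageScript = `
  JSON.stringify(
    Array.from(document.images)
      .filter(img => img.naturalWidth >= 600 && img.naturalHeight >= 600)
      .map(img => img.currentSrc || img.src)
  )
`;

function extractImageUrls(url: string): string[] {
  // JXA: open the URL in a new Safari tab, wait for it to settle,
  // then evaluate pageScript in that tab and return its JSON result.
  const jxa = `
    const safari = Application("Safari");
    const win = safari.windows[0];
    const tab = safari.Tab({ url: ${JSON.stringify(url)} });
    win.tabs.push(tab);
    win.currentTab = tab;
    delay(5); // crude wait for the page (and any JS challenge) to finish loading
    safari.doJavaScript(${JSON.stringify(pageScript)}, { in: tab });
  `;
  const out = execFileSync("osascript", ["-l", "JavaScript", "-e", jxa], {
    encoding: "utf8",
  });
  return JSON.parse(out.trim());
}

console.log(extractImageUrls("https://example.com/furniture/chairs"));
```

Driving a real, logged-in browser is what sidesteps most of the anti-bot checks: the requests come from an ordinary Safari session rather than a headless scraper.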
Everything worked, mostly. We got some of the images, built a working React component, and displayed them in a grid. But once I saw the actual layouts, I realized the limits of my approach and called the experiment there.
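For reference, the display side really was the easy part. Something like the sketch below covers a grid with basic catalog switching; the component and prop names are my reconstruction, not the original code.

```tsx
// A minimal sketch of the kind of grid component involved, in React + TypeScript.
// Component, type, and prop names (FurnitureGrid, ImageItem, source) are
// illustrative, not the original project's code.
import React, { useState } from "react";

type ImageItem = { id: string; src: string; source: string };

export function FurnitureGrid({ items }: { items: ImageItem[] }) {
  // "Switching" here just means toggling which catalog's images are shown.
  const [source, setSource] = useState("all");
  const sources = Array.from(new Set(items.map((i) => i.source)));
  const visible = source === "all" ? items : items.filter((i) => i.source === source);

  return (
    <div>
      <nav>
        <button onClick={() => setSource("all")}>All</button>
        {sources.map((s) => (
          <button key={s} onClick={() => setSource(s)}>
            {s}
          </button>
        ))}
      </nav>
      <div style={{ display: "grid", gridTemplateColumns: "repeat(4, 1fr)", gap: 16 }}>
        {visible.map((item) => (
          <img key={item.id} src={item.src} alt="" style={{ width: "100%", objectFit: "cover" }} />
        ))}
      </div>
    </div>
  );
}
```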
Then I showed Joanie the result. She took one look and said, "This isn't a mood board."
The Problem With My Simple Approach
A real mood board isn't just a bunch of pictures of furniture arranged in a grid. As Joanie explained to me, someone who's good at this—an interior decorator with real skill—knows how to remove backgrounds. They see a picture of a chair in a room, capture the photo, erase everything but the chair, then make that chair the right proportion to match a different picture of a table from another room where they've also stripped the background.
Now you have a chair and a table, but they're not just sitting on a white page. Maybe they've taken a strip of wallpaper that might work in that room, and now it's the background for that section of the board. That might be only a quarter of the page, or an eighth. Other areas show lamps, vases, wood tones and grains, specific shapes—all layered thoughtfully across the page.
You're not just collecting inspiration. You're capturing a mood through how these elements are layered, through the color relationships, through the proportions and visual weight of each piece.
What We Can and Can't Automate
This experience crystallized something I've been thinking about across all my work with agentic coding tools. AI excels at execution—scraping data, parsing DOM structures, generating clean layouts. It can even recognize patterns if you feed it enough training data.
But it can't replicate the internal creative process that makes those boards meaningful.
The value isn't in the technical assembly. It's in the designer's ability to synthesize abstract concepts—comfort, sophistication, warmth—into specific visual choices. Understanding that this particular client needs grounding elements because their life feels chaotic, or that another client craves texture because their current space feels sterile.
That synthesis happens through years of observing how people live in spaces, what makes them feel at home, what makes them uncomfortable. It's pattern recognition trained on human behavior, not image datasets.
The Career Lesson Nobody Talks About
What we're really talking about is the difference between automating a task and modeling a thought process. AI can automate technical execution beautifully—image extraction, layout generation, color palette analysis.
But modeling impromptu creative thought? That's where it hits a wall.
A mood board works because it captures intuitive understanding of how visual elements create emotional responses. That understanding is contextual, cultural, and deeply personal. You could train an AI to recognize successful patterns, but teaching it to understand why certain combinations work for specific clients in specific situations requires modeling human psychology, cultural context, and individual taste simultaneously.
What This Means for the Future
I keep thinking about this as I test more agentic coding tools. We're getting incredibly good at automating the execution layer of creative work. Code generation, layout systems, even design variations—all of this is becoming routine.
But the strategic thinking that drives those executions? The understanding of why this solution fits this problem? That remains stubbornly human.
Maybe that's the real lesson here. As AI takes over more of the technical execution, the value shifts to the creative judgment that guides that execution. The mood board isn't valuable because it's hard to arrange images on a page. It's valuable because it represents a designer's unique way of seeing relationships between visual elements.
That seeing—that's what we can't automate yet.
The Final Layout
AI helped me build a technically solid image aggregation tool in a couple of hours. But it couldn't help me understand why one arrangement of furniture images would inspire a client while another would leave them cold.
That gap between execution and intuition isn't a limitation to be solved. It's the space where human creativity lives—and where it's likely to remain valuable, even as everything else gets automated.
For anyone thinking about their career in an AI-dominated future: focus on developing that intuitive understanding of why certain solutions work. The technical execution is getting easier every day. The strategic judgment behind it isn't.
Thanks to Joanie for the inspiration. Interested in more perspectives on where AI excels and where it doesn't? Follow my exploration of agentic coding tools and their practical limitations at hyperdev.matsuoka.com.