The Joy Of Prototyping

If there’s one thread that’s run through my creative journey, it’s prototyping: that middle space between imagination and reality. It’s where ideas start to breathe, where design becomes tangible. Looking back, I realize my entire relationship with technology started through this lens: experimenting, breaking things, and feeling that small rush when something finally worked.

Where it all started...with Flash

My first real exposure to digital design was in high school. We had an entire digital design department, which at the time felt like a portal into the future. That was the era of Macromedia Flash and Dreamweaver. I still remember opening Flash for the first time and feeling completely overwhelmed. It had a steep learning curve, with timelines, keyframes, layers, and this mysterious thing called ActionScript. But once you got a handle on it, it was magic.

I spent hours after school trying to make buttons animate, little characters move across the screen, or music sync just right. It was frustrating, sure, but also pure joy. The joy of experimenting and discovering what design could do when it wasn’t static.

I also dabbled in Dreamweaver, dragging and dropping HTML boxes like I was building my own little corner of the internet. It was exciting but limiting. You could feel how fragile everything was. One misplaced tag and the whole layout would collapse. But that didn’t matter. What mattered was the feeling of possibility. I was creating something interactive, something alive.

Looking back, those late nights in a dark computer lab, testing animations and tweaking scripts, were some of the most formative moments of my life. They’re what made me fall in love with the process of making.

Direct to Code

Fast forward to the early 2010s. The mobile boom was in full swing, and suddenly everything was about touch, gestures, and motion. This was when FramerJS and Facebook’s Origami came onto the scene — tools that brought the excitement of Flash back but with a new language and intent.

FramerJS was tough to learn, but once it clicked, it was exhilarating. I spent countless nights building iOS app prototypes, crafting interactions that mimicked the slick transitions and animations of native apps. It felt like rediscovering that same childhood joy, just in a new form.

I remember working on concepts and ideas tied to products like ReceiptHog and Shoparoo. Sketching, coding, iterating. Those tools blurred the line between design and development, and that’s what I loved most. It wasn’t about perfection; it was about exploration.

The Era of Figma

These days, when I want to move fast, I reach for Figma. There’s something incredibly rewarding about building quick low-fidelity prototypes. It’s low-stakes and high-feedback. You can explore five directions before lunch and still have time to reflect.

It’s that same experimentation energy that first hooked me back in high school. The ability to try, fail, and learn through doing. Figma captures that spirit, even if it’s not about code. Sometimes, the most powerful design work comes from working fast and loose, before precision and polish creep in.

Where Things Get Real

Today, my work looks very different from those early days. I spend my time designing HMI (Human-Machine Interface) software. This could be for drones, medical devices, or other connected systems where design directly influences real-world behavior.

And that's where things get tricky. As Houde and Hill noted, interactive computer systems are inherently complex — they involve "a rich variety of software, hardware, auditory, visual, and interactive features" that users experience as a combined effect. In HMI design, this complexity multiplies.

Prototyping in this space isn't just about screens and pixels; it's about feedback loops, hardware states, sensor data, and real-time coordination. You might be syncing multiple displays, logging telemetry, or testing how physical buttons map to digital actions. But here's where we hit the real gaps:

Multi-Screen Orchestration: Most prototyping tools assume a single screen experience. But HMI systems often involve primary displays, secondary status screens, heads-up displays, and mobile companion apps all working in concert. How do you prototype the handoff between a drone operator's main console and their tablet when they need to switch contexts mid-flight? Current tools make you build each screen separately, losing the critical transitions and state synchronization.
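To make that gap concrete, here's a minimal sketch of the kind of shared-state broadcasting a multi-screen prototype needs: every display subscribes to one source of truth, so a mode change on the console instantly reaches the tablet. The names and structure here are my own invention for illustration, not from any real HMI stack or prototyping tool.

```python
# Sketch: multi-screen state synchronization. One shared store, many
# subscribed "screens". All names here are hypothetical.

from typing import Any, Callable, Dict, List

class SharedState:
    """Single source of truth that broadcasts changes to every screen."""
    def __init__(self) -> None:
        self._state: Dict[str, Any] = {}
        self._subscribers: List[Callable[[str, Any], None]] = []

    def subscribe(self, callback: Callable[[str, Any], None]) -> None:
        self._subscribers.append(callback)

    def set(self, key: str, value: Any) -> None:
        self._state[key] = value
        for notify in self._subscribers:
            notify(key, value)

# Two hypothetical displays reacting to the same state change.
log: List[str] = []
store = SharedState()
store.subscribe(lambda k, v: log.append(f"console: {k}={v}"))
store.subscribe(lambda k, v: log.append(f"tablet: {k}={v}"))

# Operator switches context mid-flight; both screens stay in sync.
store.set("mode", "emergency")
print(log)
```

The point of a sketch like this is the handoff, not the rendering: once state lives in one place, the transition between console and tablet is something you can actually test instead of faking screen by screen.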

No-Screen Interfaces: The future of HMI isn't always visual. Voice commands, haptic feedback, ambient audio cues, and gesture recognition are becoming primary interaction modes. Yet our prototyping tools are still fundamentally screen-centric. How do you prototype the Role of a voice interface that needs to communicate system status during a medical procedure? How do you test the Look and Feel of haptic patterns that convey urgency levels?

Augmented Reality Integration: AR overlays are transforming HMI design, especially in industrial and medical contexts. But prototyping AR interactions that respond to real-world objects, lighting conditions, and spatial relationships requires a completely different approach. The gap between a flat prototype and a spatially-aware AR interface is enormous.

Tools like ProtoPie have become the go-to in this space. It's incredibly powerful: you can simulate variables, connect to hardware, and design logic-heavy prototypes that behave like the real product. But it took me a long time to get comfortable with it. It's almost like coding again. Eventually, I found myself drifting back into code because it gives me the same sense of control and flow that first hooked me with Flash and FramerJS.

But even ProtoPie, powerful as it is, struggles with these emerging paradigms. It's still fundamentally built for screen-based interactions, with hardware connectivity as an add-on rather than a core assumption.

What's Next for Prototyping

Now we're entering a new era shaped by AI. Tools like ProtoPie are starting to bake AI into their workflows, helping automate logic, generate flows, and even simulate responses. Other tools like Cursor, Galileo AI, and Uizard are changing how designers prototype for web and mobile, speeding up the process in ways that would've seemed impossible just a few years ago.

But here's what's fascinating about this moment: AI is forcing us to be more intentional about what we're prototyping and why. The Houde and Hill framework becomes even more relevant, because AI can generate countless variations, but without a clear learning objective, you're just creating more noise.

The real value of prototyping has never been the artifact itself; it's the insights gained from making something tangible and putting it in front of people. AI can accelerate the making, but it can't replace the learning. That still requires human judgment, empathy, and the willingness to be wrong.

In my HMI work, this becomes critical. When you're designing interfaces for drones or medical devices, a beautiful demo isn't enough; you need to understand how the system behaves under stress, how operators will actually use it, and whether the implementation can handle real-world complexity. AI can help simulate these scenarios, but the questions you ask determine the value you get.

But here's the catch: most of these AI-powered tools focus on web and mobile, not hardware. There's still a huge gap in AI-native design tools that understand contextual, physical systems. Tools that can simulate multiple devices in sync, generate interaction logic for HMI, or bridge data between sensors and UI.

That's where the next big opportunity lies, and where play becomes essential. AI could help designers move from static prototypes to living systems. Intelligent mockups that understand state, data, and behavior in real time. Imagine AI that could prototype not just the Look and Feel of an interface, but also its Role in a complex workflow and its Implementation across multiple connected devices.

Bridging the Multi-Screen Gap: AI could orchestrate prototype experiences across multiple displays, automatically syncing state changes and testing handoff scenarios. Instead of building each screen separately, you could describe the workflow ("when the operator switches from console to tablet during emergency mode") and AI could generate the connected experience, complete with realistic data flows and timing.

Prototyping the Invisible: For no-screen interfaces, AI could simulate voice interactions with realistic speech recognition errors, generate haptic feedback patterns that you could feel through connected devices, or create ambient audio prototypes that respond to environmental context. The key is making the invisible tangible enough to test and iterate on.
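As a toy illustration of prototyping the invisible, here's how urgency levels might be mapped to haptic pulse patterns before any hardware exists. The encoding below is invented for the sketch; real haptic APIs and pattern formats vary by device.

```python
# Sketch: encode urgency as a haptic pulse pattern, expressed as
# alternating [vibrate, pause, vibrate, ...] durations in milliseconds.
# The mapping is made up for illustration, not a hardware standard.

from typing import List

def haptic_pattern(urgency: int) -> List[int]:
    """Return a pulse/pause pattern for urgency levels 1 (low) to 3 (high)."""
    if not 1 <= urgency <= 3:
        raise ValueError("urgency must be 1, 2, or 3")
    pulse = 100 * urgency          # higher urgency -> longer pulses
    gap = 400 // urgency           # ...with shorter gaps between them
    return [pulse, gap] * urgency  # ...and more repeats

# Low urgency: one gentle pulse. High urgency: three long, tight pulses.
print(haptic_pattern(1))  # -> [100, 400]
print(haptic_pattern(3))  # -> [300, 133, 300, 133, 300, 133]
```

Even a throwaway encoding like this gives you something to play back on a test rig and iterate on, which is the whole point: making the invisible tangible enough to critique.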

Spatial Intelligence: AI with spatial understanding could prototype AR interfaces that actually respond to real-world geometry, lighting, and occlusion. Instead of flat mockups, you could test how information overlays behave when a technician moves around complex machinery, or how medical AR interfaces adapt to different patient positions.

But here's what excites me most: AI could bring back that spirit of playful experimentation that made Flash so magical. Instead of rigid, robotic interfaces, AI could help us prototype systems that feel more human — interfaces that adapt their personality based on stress levels, that use humor to defuse tense situations, or that know when to step back and let human intuition take over.

The future of prototyping isn't just about faster tools. It's about smarter questions and more human answers. As AI handles more of the mechanical work of building prototypes, we'll need to get better at defining what we're trying to learn and why it matters. But we also need to preserve that willingness to play, to try unexpected combinations, to let serendipity guide us toward solutions we never would have planned.

The best HMI systems don't feel like machines talking to humans; they feel like thoughtful collaborations. And the prototypes that get us there won't be perfect simulations; they'll be playful experiments that help us discover what's possible when technology serves human creativity rather than replacing it.

Coming Full Circle

In a way, I feel like I've come full circle — back to that same spirit of experimentation that started in high school. Whether it was ActionScript in Flash, code in FramerJS, or logic in ProtoPie, it's always been about one thing: bringing ideas to life.

But here's what I've learned over the years: prototyping isn't just about testing — it's about discovery. It's about learning what's possible by making it tangible. In 1997, Stephanie Houde and Charles Hill wrote a seminal paper called "What do Prototypes Prototype?" that fundamentally changed how I think about this practice.

Their framework remains one of the clearest ways to approach prototyping because it forces you to be intentional about what questions you're trying to answer. Too often, I see prototypes that are solution-oriented — beautiful demos with no clear learning objective. Houde and Hill break prototyping into three fundamental categories:

Role: What is this product's purpose in someone's life? When I was building those early iOS prototypes in FramerJS, I wasn't just testing animations — I was exploring whether a gesture-based interface would feel natural or frustrating in daily use.

Look and Feel: What does it feel like to use? This goes beyond visual design to texture, motion, pacing, and emotion. Those late nights in Flash, tweaking timing curves and easing functions, were really about understanding how digital interactions could feel more human.

Implementation: How would it actually be built? This is where my current work with ProtoPie and HMI systems lives — testing whether hardware states, sensor data, and real-time coordination can actually support the concept.
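In that Implementation spirit, even a throwaway state machine can expose whether the hardware modes and transitions a concept assumes actually hold together. Here's a hedged sketch; the states and events are made up for illustration, not from any real drone or device firmware.

```python
# Sketch: a tiny device-mode state machine to smoke-test whether the
# transitions a concept assumes are legal. States/events are invented.

from typing import Dict, Tuple

TRANSITIONS: Dict[Tuple[str, str], str] = {
    ("standby", "arm"): "armed",
    ("armed", "launch"): "in_flight",
    ("in_flight", "emergency"): "return_home",
    ("return_home", "land"): "standby",
}

def step(state: str, event: str) -> str:
    """Apply an event; raise if the concept assumes an illegal jump."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"illegal transition: {state} + {event}")

# Walk a full operator scenario and confirm it round-trips cleanly.
state = "standby"
for event in ["arm", "launch", "emergency", "land"]:
    state = step(state, event)
print(state)  # -> standby
```

The value isn't the code; it's that writing it forces you to ask whether the flow you sketched in screens can survive contact with real system states.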

What makes this framework powerful is that you don't have to prototype everything at once. You pick one lens, focus on it, and let the prototype do its job.

And as AI begins to reshape how we design and build, I hope we don't lose that hands-on spirit, that willingness to play, to explore, to make mistakes. Because that's where creativity lives.

The tools have evolved. From timelines to code, from code to prompts. But the heart of prototyping remains the same: curiosity, iteration, and the joy of seeing your ideas move.

Best,

Craig Aucutt