I didn’t arrive at this moment through a linear career path. I arrived through eras. I was born in 1964, before the digital world existed. Everything that followed had to be learned, adopted, questioned, and—eventually—integrated. That matters, because it means my relationship with technology has never been abstract. It has always been lived.

I remember a time before interfaces. My first computer demanded patience, curiosity, and a tolerance for friction. That lesson has stayed with me.

As personal computing entered culture—through music television, graphic interfaces, and the early marriage of design and software—I watched technology stop being purely functional and start shaping identity. Brands learned to move at the speed of culture. Images, sound, and story became systems of influence. This wasn’t theoretical. It was visible, daily, undeniable.

Then the web went public. Suddenly there was a before and an after. Knowledge flattened. Gatekeepers weakened. Context collapsed. The workplace detached from place. Laptops, Wi-Fi, and email quietly rewired how organisations thought about time, presence, and productivity. Strategy began to accelerate, even as reflection began to thin.

I earned a B.S. in Environmental Design from UC Davis shortly after, and my dream of becoming a hotel designer was put on permanent hold as I moved into this new world, designing for this new future.

By the time social platforms arrived, something fundamental had shifted. Media was no longer something we consumed—it was something we lived inside. Identity became performative. Influence became ambient. Entire industries reorganised themselves around attention. I was already mid-career when this happened. That, too, matters. It meant I could feel the difference between signal and noise. Between tools that genuinely expanded capability and those that merely increased velocity. I had enough pattern recognition to recognise cycles repeating—only faster, louder, and with higher stakes.
When AI entered the mainstream, it didn’t feel like a surprise. It felt like convergence. For the first time, strategy, creativity, and technology began sharing the same interface. Thinking, making, and scaling collapsed into a single workflow. The implications weren’t just technical. They were cultural, organisational, and deeply human.

This is where my work now lives. Not at the edge of novelty, but at the point where experience meets acceleration. Where organisations are trying to move faster without losing their shape. Where leaders are expected to make consequential decisions inside systems that no longer pause.

I don’t approach this work as an optimist or a sceptic. I approach it as someone who has lived through multiple inflection points and learned that progress is rarely about adopting the next thing first. It’s about knowing what to hold steady, what to reinterpret, and when to act.

That’s why I don’t lead with titles or company names. They flatten the story. They obscure the forces that actually shaped how I think. What matters more is this: I’ve lived through the transitions your organisation is now facing—just at different scales and speeds. I understand how technology seeps into culture, how culture reshapes behaviour, and how behaviour ultimately determines whether strategy survives contact with reality.

This isn’t hindsight. It’s knowing where you stand while everything else is in motion. And in a world accelerating this quickly, that perspective is the real advantage.

That perspective doesn’t exist in isolation. It’s shaped and sharpened in rooms where ideas are tested, challenged, and held to account. In communities where the work isn’t just to perform, but to contribute—especially when the industry itself is in transition. That’s why I’m an active member of The Marketing Society and the International Advertising Association. And why I serve as Co-Chair of the IAA Singapore Executive Advisory Board on AI.
Not as titles to list, but as commitments to the craft. To leadership that extends beyond individual projects. And to the responsibility of helping an industry navigate change with more thoughtfulness than fear. Because experience isn’t something you keep to yourself. It’s something you share.

There is one chapter that shaped everything that followed. Early in my career, I spent eight formative years working with Apple during the Steve Jobs era, from 2000 to 2008, arguably one of the most disruptive periods in the company’s history. It was an era when the company was rebuilding itself from the inside out. This wasn’t just a period of growth. It was a deliberate reinvention of how technology, culture, and craft could coexist. I worked across Marketing, Education, Retail, User Groups, and Events, leading agency teams inside a system that was being redesigned in real time.

What made the experience extraordinary was the discipline. Apple, at that moment, was learning how to focus again. Product lines were being cut back. Language was being sharpened. Decisions were being made with an almost uncomfortable clarity. I watched teams argue not about what could be done, but about what should be done—and what needed to be protected at all costs.

This was when design stopped being aesthetic and became structural. Every decision—hardware, software, packaging, retail, messaging—was treated as part of a single experience. Craft wasn’t decorative. It was operational. Ideas didn’t survive unless they held up all the way through execution.

The launch of Mac OS X quietly taught a powerful lesson: that long-term thinking often looks invisible at first. Foundations matter more than features. You build for what comes next, not just what ships now. That mindset—investing early in systems that enable future creativity—stayed with me.

The arrival of the iPod and iTunes changed something else entirely. I saw firsthand how technology becomes meaningful only when it removes friction from human life.
Apple wasn’t selling devices; it was reshaping relationships—with music, with media, with personal identity. Storytelling wasn’t an overlay. It was the organising principle that made the product make sense.

Apple Retail reinforced this again, at human scale. Stores weren’t designed to sell harder, but to explain better. Staff were trained to teach, not persuade. Experience wasn’t a layer added at the end—it was the strategy. Watching that play out taught me that the last mile is never tactical. It’s where trust is earned or lost.

Then came the iPhone. From the inside, it was clear this wasn’t an iteration—it was a collapse of categories. Hardware, software, interface, and narrative fused into a single object. What struck me most wasn’t the technology itself, but the conviction behind it: the willingness to bet everything on a simpler, more intuitive future, even when the market wasn’t asking for it yet.

By the time the App Store arrived, the lesson had expanded again. Apple stopped being just a maker of products and became an orchestrator of ecosystems. Control wasn’t about limitation—it was about coherence. Quality scaled because standards were non-negotiable.

That entire period set my internal bar. It taught me that quality is a discipline, not a preference. That simplicity is earned through hard choices. That speed without judgment is dangerous. And that the most important work often happens before anything visible is produced.

Everything I’ve done since traces back to that apprenticeship—not in imitation, but in principle. It shaped how I approach technology, how I evaluate ideas, how I work with leaders under pressure, and why I care so deeply about clarity when the world accelerates. It’s the reason I’m sceptical of hype, allergic to shortcuts, and committed to work that holds up over time.