Professional Journey

From Sales Operator → Ecosystem Strategist → Founder → Math-First AI Architect

Most of my career was spent in sales—carrying quota, building territories, and leading enterprise teams. That experience taught me to see revenue as a system, organizations as architectures, and execution as a design problem. Today, I apply that same systems thinking to build math-first AI architectures. What follows is the story of that evolution: Sales → Systems Architecture → Math-First AI.

Dimension Data — Where I Grew Up Professionally (and Personally)

My real career began at Dimension Data—and it launched what became nearly 18 years in sales. Carrying quota, building territories, leading teams, closing deals. That wasn't a detour from systems thinking. It was where I learned to see systems. This was the foundation: real revenue, real customers, real complexity.

Sales at Dimension Data was fast-paced, deeply technical, and very much sink-or-swim. You had to perform, you had to learn quickly, and you had to be able to operate without someone constantly telling you what to do. The people around me were sharp and self-sufficient, and that forced me to become the same.

This was the early 2000s, when the industry was evolving at high speed. VPN was just emerging, wireless networking was taking off, IP telephony was reshaping how people communicated, and data centers and networking were modernizing rapidly. Virtualization arrived. Cloud started to move from concept to reality. All of that change hit in roughly the first seven years of my career, which meant I spent those years constantly absorbing new technologies and figuring out how to make them matter to customers.

Dimension Data forced me to master two things that have shaped everything since. First, I had to understand what real business value looked like. It wasn't enough to talk about speeds and feeds or pretty slides; I had to understand what would actually move a client's business forward and how technology fit into that. Second, I had to learn new technology at speed. Every six to twelve months the landscape shifted, and if you didn't keep up you were done. That combination—deepening business understanding and rapid technical learning—taught me how to translate technology into business impact, strip away noise, see patterns across customers and industries, and communicate clearly even when the underlying systems were complex.

While all of that was happening, my life was maturing at the same time. During my years at Dimension Data, I met my wife, bought my home, and started my family. The company wasn't just where I learned how to sell and think—it's where I grew up. It remains one of the most positive chapters of my career and my life.

NTT — From High-Growth to Global Ecosystem Strategy

When Dimension Data was acquired by NTT, my world widened dramatically. The culture shifted from entrepreneurial growth to a more mature, global enterprise. Many people resisted that change. I saw an opportunity.

At NTT, I moved from selling individual solutions to architecting sales systems at scale—designing the organizational structure that would generate $1B in pipeline. NTT had dozens of operating companies, each with its own strengths, sales teams, engineering cultures, and customers. I realized that if we could get those pieces to work together instead of in isolation, we could create something far more powerful than any single business unit on its own. No one asked me to do this. I started picking up the phone and calling people across the NTT ecosystem, learning what they sold, who they served, and where their capabilities overlapped or complemented each other. From there, I began tag-teaming accounts, integrating solutions, and coordinating architectures that spanned multiple operating companies.

That work turned into something more formal. NTT Holdings asked me to do something rare: write the job description for a role that didn't exist yet—and then step into it. That role became Director, Americas Cross-Sell Program. In practice, it was ecosystem architecture at scale. I spent my time aligning 31 operating companies, resolving service overlaps, creating a unified services matrix, and designing a GTM motion that made sense both to executives and to sellers on the ground. I hosted contentious meetings to define who owned what, built the structures that would allow for collaboration instead of conflict, and then took that structure out into the field to train hundreds of sellers on how to use it.

The results were real. In just 22 months, the Americas region alone generated roughly $1B in pipeline from that program. It's hard to communicate what that actually means on a résumé, but in reality it meant sitting at the intersection of leadership, sales, operations, and customers, and designing the system that allowed value to flow between them. That role taught me how to design organizational systems, orchestrate cross-functional behavior, align incentives, and lead without direct authority.

This was the first time sales, systems, and architecture fully converged for me.

This was the moment I realized: Sales isn't just about relationships or persuasion. It's about designing systems—incentive structures, feedback loops, organizational architecture—that allow value to flow predictably. The same thinking I used to align 31 companies would later become the foundation for how I design AI systems.

By that point I had been in the ecosystem for about 18 years. I had hoped to make it to 20, but life and opportunity pulled me in a different direction. Still, I look back on the NTT chapter with a lot of pride. It matured me as a professional in the same way my family was maturing me as a person.

t-emgee Solutions — First Shot at Building Product

After nearly two decades in enterprise technology and sales, I saw the same structural problems across every organization: misaligned incentives, broken feedback loops, systems that couldn't scale. I stepped away from the large-company track and founded t-emgee Solutions, my first serious attempt at building a product company from scratch. It wasn't a big commercial success; in truth, I barely got by. But in terms of learning, it was one of the most valuable periods of my career.

The problem I wanted to solve came directly from my experience in the field. Again and again, I saw the same structural friction: engineering and presales teams would invest huge amounts of time designing solutions and building bills of materials, while procurement teams were measured almost entirely on driving cost down. Procurement often didn't feel the value of the design work—so once they had a BOM, they would shop it, hand it to whoever came in cheapest, and award the deal elsewhere. The original integrator lost the business and all the time they'd sunk into the architecture, and the relationship between integrators and customers eroded.

t-emgee was my attempt to solve that misalignment. The vision was a single pane of glass that unified four things: a procurement marketplace, a solution-architecture Q&A assistant (using early machine learning, before "AI agents" were a thing), full lifecycle and maintenance management of the assets purchased, and a data analytics layer to help offset storage costs through ethical use of aggregated data. In today's terms you could think of it as a SaaS platform that both engineering and procurement could share: a place to design, purchase, track, and analyze enterprise technology in one interface.

Where it fell short was execution. I had never built a software product before. I raised and spent money in the wrong order. I had to learn what APIs looked like in practice, what it meant to architect a web application, how to set up infrastructure, and how to coordinate all of that without the benefit of modern LLM tooling. We never got the full product into production. But I did architect the entire solution end-to-end and captured it in Figma as a complete blueprint: the flows, the components, the integration points, the lifecycle. I ultimately sold that architecture to a regional systems integrator in the Northeast so they could develop it further on their own.

That experience taught me two things. First, I could architect real systems and full platforms, not just talk about them. Second, there was a lot I still needed to learn about the mechanics of building and shipping. But the core insight stuck: the same systems thinking I'd used to design sales organizations—incentive alignment, feedback loops, structural design—applied to product architecture too. Both would matter a lot in the next phase of my career. It was my first exposure to the reality that architecture without execution is theory—and execution without architecture is chaos.

AWS — Hyperscaler Lessons and an AI Inflection Point

AWS showed me that even hyperscalers struggle with the same systems problems I'd seen in sales—incentive misalignment, structural friction, narrow lanes instead of holistic architecture. On the back of my GTM and architecture background, I was recruited to join AWS as the Principal Account Director for MongoDB, responsible for the global relationship. On paper, it looked like the ideal fit: a critical strategic account, access to incredibly sharp colleagues, and a culture that talked nonstop about "raising the bar."

There were real positives. AWS has serious tools and data, talented engineers and field teams, and a global footprint that very few companies can match. During my time there, we worked closely with MongoDB on integrating MongoDB Query Language (MQL) into Amazon CodeWhisperer, explored early LLM-powered developer workflows, and engaged with emerging players like OpenAI and Anthropic before the space fully exploded. It was the first time I saw AI not as an add-on, but as an architectural layer that would reshape how systems were built and experienced.

But I also ran into a deep contradiction. AWS speaks constantly about raising the bar. To me, raising the bar means thinking globally about the account, integrating GTM, product, marketplace, and service delivery into one coherent strategy, and challenging the status quo. In practice, there was structural friction between "sell-to" (growing the MongoDB account itself) and "sell-through" (driving MongoDB via AWS to Mongo's customers). The incentive structures and reporting lines naturally emphasized marketplace and program metrics, which didn't always align cleanly with the long-term health of the overall relationship. The teams were talented and operating effectively inside the lanes the system defined; the tension lived in the architecture of the model, not in the people.

It sharpened a lesson I'd been learning my whole career: if the architecture of the system is misaligned, individual talent can't fix it. You have to change the structure.

After a year, I made the decision to move on. But I didn't leave empty-handed. That period gave me a clear view of hyperscaler strengths and blind spots and, more importantly, made it obvious that AI was about to open a completely new universe of architecture. I knew I didn't just want to sell into that universe. I wanted to build inside it.

AWS was also the place where the limits of prose-based architecture became obvious: decks and documents could describe the right structure, but they couldn't enforce it.

Early AI Startup Attempt — The Seed of a New Architecture

After AWS, my engineer and I began working on what would have been my next startup: an AI-powered home orchestration platform. The idea was to unify the home experience across any smart speaker and then go beyond "smart devices" into real home management.

The vision was simple to describe and extremely hard to build. You'd be able to say something like, "Hey Alexa, schedule the painter to refresh the living room—same color as last time—next Friday," and the system would already know which contractor you use, the paint details, the room's dimensions, your schedule, and your budget. It would handle the orchestration in the background. This was late 2023: Whisper had just arrived, Amazon Lex was painful to work with, and agents as we know them today barely existed.

We started with smart speaker integrations, realized the tooling was holding us back, and pivoted to a custom web front end. Architecturally, we framed it as four layers: a front-end interface, a routing and orchestration back-end, a personalization and learning engine, and a data/analytics layer that could help offset storage costs by ethically using aggregated data. It was a forward-looking design that anticipated where a lot of consumer AI is heading now.

The execution, however, ran into limitations. My engineer eventually hit his ceiling on the complexity. I spent months trying to find a replacement and kept ending up with people who couldn't deliver what was needed. While that was happening, I started teaching myself Python and leaning on AI for help, effectively becoming an early-generation "vibe coder." The experience was painful. AI could generate code, but it couldn't see my architecture. It could write functions, but not robust systems. It failed tests in silent ways. I didn't yet have the mental models to debug everything, and I got hung up on syntax and structure. It was clear that "just have the AI write it" wasn't a serious engineering strategy.

And then something unexpected happened.

In parallel with all of this, I fell into a deep independent study of quantum physics. I'm not a physicist, and I'm not trying to be one. My goal was to understand patterns: how quarks, leptons, bosons, and fields combine to form higher-order structures. As I learned more, I noticed something surprisingly concrete: the way code, services, and architectures compose looks a lot like how nature builds systems from the smallest units up.

Quantum Thinking → Execution Fabric → Lattice → AIVA

The breakthrough came when I realized: the same systems thinking I'd used to design sales organizations could be applied to AI architecture—starting from first principles, building from the smallest units up. Looking at Python and distributed architectures through a quantum lens, I saw both worlds following the same basic pattern: small units, combination rules, layers, and emergent behavior. Functions looked like atoms. Services looked like molecules. Architectures looked like living systems.

That led to a simple but powerful question: What if we could build AI systems the way nature builds matter—starting from the smallest executable units instead of monolithic flows?

From that question emerged the core ideas that now define the TAI ecosystem:

  • An Execution Fabric built around atomic computational units that can execute in parallel, governed by explicit dependency graphs instead of hard-coded sequences.
  • A Lattice Execution Fabric (now the AIVA Execution Fabric), where quantum-inspired execution primitives are composed into arbitrarily complex behaviors, but always in a deterministic, auditable way.
  • A "chemistry layer"—rules for how these executors combine and interact, which became the Lattice Query Language (now the AIVA Query Language).
  • A "biology layer"—a cognitive OS with somatic and autonomic behaviors, supervisory logic, and self-monitoring, which became AOS/CAIO/TAI.

At that point, it stopped feeling like traditional software. It felt more like designing a hierarchical physics of computation.

Another realization followed quickly: the entire AI stack is built on math. Attention mechanisms are calculus. Neural networks are matrix operations and optimization. Classical ML is statistics and probability. Programming languages sit on formal systems. If AI is math, and execution is math, and computation is math, then designing these architectures purely in prose or diagrams wasn't enough.

That led directly to the pipeline that now underpins all my work:

Idea → Math → Proof → Tests → Code

That pipeline eventually became Mathematical Autopsy—a method for designing AI systems that behave like scientific instruments rather than art projects.
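The Tests step of that pipeline can be sketched in miniature: state an invariant as math first, then check it mechanically before trusting the code. The function and invariants below are hypothetical stand-ins, not actual Mathematical Autopsy artifacts—the point is the shape of the discipline, not its content.

```python
# Illustrative sketch of Math -> Tests -> Code: invariants stated first,
# then checked before the implementation is trusted. Hypothetical example.
import math
import random

def normalize(v):
    """Scale a nonzero vector to unit length."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def check_invariants(trials=100):
    """Mechanically verify the stated mathematical properties."""
    rng = random.Random(0)
    for _ in range(trials):
        v = [rng.uniform(-10, 10) for _ in range(4)]
        u = normalize(v)
        # Invariant 1: ||normalize(v)|| = 1
        assert math.isclose(math.sqrt(sum(x * x for x in u)), 1.0)
        # Invariant 2 (idempotence): normalize(normalize(v)) = normalize(v)
        assert all(math.isclose(a, b) for a, b in zip(normalize(u), u))
    return True
```

The implementation is only accepted once `check_invariants()` passes—code is the last step, not the first.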

Resonant Field Storage — Where It All Comes Together

While building out the cognitive OS and orchestration layers, one area kept pulling me deeper: memory. A serious AI system needs memory that is explainable, deterministic, mathematically aligned with the rest of the architecture, and capable of parallel field-level operations.

At first, I thought about it in terms of "holographic memory," because there are analogies to interference patterns and whole-from-parts reconstruction. But the language caused confusion. People heard "optical holograms," not computational fields. The metaphor got in the way of clarity.

So I renamed it Resonant Field Storage (RFS)—a name that actually reflects what it is: a field-based memory system where data is encoded, stored, and recalled through resonance and interference, with exactness and mathematical guarantees. RFS was the first system I built entirely under the Mathematical Autopsy discipline. Every transformation, every invariant, every property lives in the math first, is proved and tested, and only then implemented in code. This was the turning point where engineering and mathematics finally snapped into alignment.
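As a loose analogy for field-based storage, here is a toy "field" memory that stores values by superposing random key patterns and reads them back by correlation. This is not RFS—the keys, dimensions, and encoding here are invented for illustration, and unlike RFS the toy's recall is only approximate (there is crosstalk between keys). It shows only the mechanism of superposition and correlated readout.

```python
# Toy field memory: values superposed onto one shared field via random
# +/-1 key patterns, recalled by correlation. An analogy only, not RFS.
import random

DIM = 4096

def make_key(rng):
    """A random +/-1 pattern; distinct keys are nearly orthogonal."""
    return [rng.choice((-1.0, 1.0)) for _ in range(DIM)]

def store(field, key, value):
    """Superpose value * key onto the shared field (interference)."""
    return [f + value * k for f, k in zip(field, key)]

def recall(field, key):
    """Correlate the field with a key to read its value back out."""
    return sum(f * k for f, k in zip(field, key)) / DIM

rng = random.Random(42)
keys = {name: make_key(rng) for name in ("alpha", "beta", "gamma")}
field = [0.0] * DIM
for name, value in (("alpha", 3.0), ("beta", -1.5), ("gamma", 7.0)):
    field = store(field, keys[name], value)
```

Every stored value lives in the same field, and `recall(field, keys["alpha"])` recovers roughly 3.0 despite the other entries—the crosstalk shrinks as the field dimension grows. RFS, by contrast, is designed so that recall carries exactness and proved guarantees rather than statistical approximation.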

RFS solidified my math-first engineering philosophy and anchored the rest of the SmartHaus Temporal AI architecture: RFS for memory, VFE for semantics, MAIA for intent, AIVA Execution Fabric and Query Language for computation, and TAI/CAIO for orchestration.

The Full Arc — And What It Means for You

This is the arc: Sales → Systems Architecture → Math-First AI

I spent 18 years in enterprise sales, learning to see revenue as a system and organizations as architectures. That systems thinking became the foundation for how I design AI—starting from first principles, building with mathematical guarantees, designing for predictable behavior.

That's what this journey has been building toward.

Sales → Strategy → Ecosystem Architecture → Founder → Math-First AI Architect & Inventor

If you work with me, you get someone who:

  • Understands both revenue and rigor—because I've built both
  • Designs systems the same way, whether it's a sales organization or an AI architecture
  • Starts with math and proof, not prompts and hope
  • Sees around corners because I've seen these patterns before—in sales, in ecosystems, in AI

That's the through-line. That's the pattern. And that's the difference in how I build systems—whether they're revenue engines or AI architectures.