
10X Didn't Come from a Single Transformation. It Came from Years of Evolving Everything at Once.

Al Mays

I helped drive a VC-backed customer journey analytics startup to 10X growth by doing something harder than any single transformation: we continuously evolved our roles, processes, and technology together, for years. Every time the market asked for something bigger, we could not just ship a better version of what we already had. We had to rebuild who did the work, how the work got done, and what the work was done on. I started that run as a pre- and post-sales solution architect and ended it as the CPTO. Along the way, I learned firsthand why continuous evolution is the only strategy that actually scales.

Every major technology wave I have led through (agile, cloud, and now AI) has followed the same pattern. Anyone who tells you any one of them is a transformation you do once and then operate has not lived through enough transformations. This is the pattern. Recognize it, and you can lead it. Miss it, and you will have to manage around it for the rest of the hold period.

10X growth is never one transformation. It is many, overlapping.

When people look back at a company that grew 10X, they tend to compress the story. They see a single arc, a clean before-and-after, a CEO narrative that fits on a slide. That is not how it felt from the inside, nor is it how it actually worked.

We started with a narrow, clean use case: single-channel IVR analytics. A customer had one data source, the IVR logs, and we helped them see patterns in how callers moved through the phone tree. The value was real, but the problem was small. Then, customers asked us to cover the whole contact center. Then, to connect the phone channel to the web channel so they could see the full digital-to-voice handoff. Then, to add retail, so they could trace a customer from a store visit to a call to a website session. Then the really hard stuff: full business process journeys like order-to-activation, where the question was not “what happened on this channel” but “why are we losing this customer, or this renewal, or this upsell.”

Every one of those expansions looked, on the surface, like selling a bigger version of the same product. It was not. Each one required a different organization, a different process, and a different technology stack to be feasible at all. The growth came from recognizing that early and rebuilding ahead of the curve, not from squeezing more out of what we already had.

Data complexity is the leading indicator. Almost everything else is lagging.

The single most reliable signal that something had to change was the number of data sources in a customer engagement. Our early deployments touched one or two sources. Our most complex deployments, the ones with Fortune 500 financial services and telecommunications customers working alongside top global systems integrators and, eventually, a partnership with a top-tier strategy firm, touched twenty or more.

That is not an incremental difference. That is a different business.

With twenty-plus sources, top-of-the-house value on the line, and delivery pressure measured in weeks, you cannot run the same engagement model you used at two sources. The data complexity forces everything else to change. It is the canary. And in my experience, leaders who are paying attention to revenue growth but not to the underlying complexity of what their delivery organization is being asked to handle are always surprised when the wheels come off.

For any PE operating partner reading this: if you want a leading indicator of whether a portfolio company can handle its next growth stage, look at the data complexity of its engagements versus the shape of its delivery team. If the complexity is running ahead of the team structure, you are about to have a scaling problem that revenue will mask for one or two more quarters, and then won’t.

Roles had to evolve before the technology could.

In the early days, we relied on outsourced data engineering and had minimal subject-matter expertise. That worked when the job was scripting a handful of business rules for a simple use case.

It stopped working the moment our customers wanted to see the whole journey. To serve that, we built a team that looked fundamentally different: information architects who sat directly with the customer to translate messy business questions into analytical structures, data architects who owned the target models, and an outsourced data engineering function running 24/7 to deliver at the speed the engagements demanded. Those three roles had to exist in that shape, with those handoffs, before we could take on the work. We did not add the roles because we got bigger. We grew bigger by adding the roles.

The lesson that keeps repeating for me, across the years and across everything that came after, is that organizational design is almost always upstream of technical capability. If the roles are wrong, no amount of tooling rescues the outcome.

Technology followed the roles, not the other way around.

The technology stack went through its own staircase over the years, mirroring the roles almost exactly.

We started by scripting every business rule in Perl. That was fine when one engineer could hold the whole engagement in his or her head. As complexity grew, we moved to traditional ETL tooling, which let a larger team work on the same pipelines without stepping on each other. Then we built our own proprietary ETL language, because the commercial tools were not expressive enough for the patterns we saw repeatedly. And eventually, we productized the tooling itself so that business analysts and customer users could drive the work directly, without routing every change through an engineer.

Each step moved the capability closer to the person who actually understood the business question. That is the arc that matters. The underlying database technology followed the same logic, riding the same industry wave as the rest of the market: traditional relational databases, then MPP columnar stores, and finally Hadoop when the data volumes demanded it. We did not adopt those technologies because they were new. We adopted them because the role structure we had already built demanded them.

The leader has to evolve too, or the company stalls.

I will be honest about this part because it is the piece that gets lost in most transformation stories.

I started at the company as a solution architect working both pre-sales and post-sales. Then I split off to build the information architecture practice. Then I grew into leading all platform delivery. And eventually, I served as CPTO. Each one of those moves was forced by the same thing that forced the technology and role evolutions: the work outgrew the shape of the leadership. If I had stayed in my original role while the company grew around me, I would have become a bottleneck by year three.

That is the part I want every CEO and operating partner to sit with.

If your technology leader is the same person, in the same shape, doing the same job at 10X the revenue, something is wrong. Either the company has not actually grown, or the leader has become a ceiling. Both show up eventually. Both are expensive.

The same pattern is playing out with AI right now.

I have written separately about why AI is not a one-time transformation but a continuous evolution. Those years are why I believe it so strongly.

The companies I see doing well with AI right now are treating it the way we had to treat data complexity. They are watching their leading indicators. They are evolving roles before the tooling forces them to. They are picking technology to match the organization they are building, not the other way around. And their leaders are evolving their own jobs as the work changes underneath them.

The companies struggling with AI are doing the opposite. They are running a one-time AI program, hoping to cross the finish line, preserving roles and leadership structures that were appropriate for the pre-AI version of the business, and wondering why individual tool adoption is not compounding into business outcomes. It will not. It cannot. The pattern does not work that way, and it never has.


Many years inside that 10X run taught me that 10X is not a destination you reach. It is what happens when you get comfortable, as a leader, with the fact that your organization, your processes, and your technology will be different in eighteen months, and different again eighteen months after that, and that your job is to lead that evolution rather than resist it.

AI is speeding up that cycle, not slowing it down. The companies that internalize this now and build the muscle for continuous evolution across people, processes, and technology together will compound an advantage that the companies waiting for a stable end state simply cannot catch up to.

There is no finish line. There never was. That is the good news.

Want to talk through how this applies to your situation?

Send me a note; I'm happy to have a direct conversation.
