The "Ticker Tape" Problem
People are still waiting for AI-driven growth operations to "hit prime time"... but it's already happened.
I watched something this weekend that I haven’t been able to stop thinking about.
It was this Lenny Rachitsky podcast with the first official “vibe coder” at Lovable. The job title alone made me pause, but what really got me was his background. He was hired to be a growth engineer on Elena Verna’s team… with no technical background, having never written a line of code.
His job is to build products. Not mockups. Not requirements documents. Actual working software that ships to users, both internal tools for the growth and marketing teams and external products that Lovable’s customers use. He gets a tool request, builds 3-5 working prototypes in a day, sends it out for feedback, iterates the next morning, and ships. His team pushes multiple new tools every week without reading code and without the support of “cross-functional development squads”.
I’ve been sitting with this for days now because it made something click that I think a lot of us have been dancing around: People have been saying for months that this type of success with AI is possible. Yet, when you actually sit down and try to use these tools yourself, it’s easy to feel like it’s bulls*it.
The code is buggy. Nothing works the way you want. It’s incredibly frustrating. You talk to the specialists, the career engineers and product managers and designers, and many of them are in complete and utter denial that someone without their training could do 85 percent of what they do with an LLM.
Honestly, I understand why people feel this way. The tools are hard to use. The learning curve is brutal. The output isn’t always good. It feels “certain” that this WILL happen… just not yet- that’s a “two years from now” problem to solve later. So they move through each day doing things “the same way they always have”, telling themselves that’s fine- it’s not time to change yet.
But this “vibe coding” job actually shipping product at one of today’s most successful startups means it’s real right now.
The next thing I watched hit me even harder.
I’ve watched this incredible demo of a Meta PM’s app-building workflow four times in the last three days. He is not technical. He has never written a line of code. And he walked through his entire workflow for building, shipping, and improving real products that people pay money for.
He developed a set of Cursor commands that behaves like an AI CTO for his app side business.
- It helped him design a development process. If you’ve ever worked on a product development team, what he built looks almost identical to what you’d do with human engineers. There’s a discovery phase, a planning phase, technical recommendations, a build plan, code review from multiple perspectives.
- Claude writes the initial code.
- “Code review” is run by three “Sr Engineer” agents.
- The three AI “engineers” debate feedback in the pull request stage.
- They reach consensus, make changes, ship, and start production testing.
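The motion above is essentially a review-until-consensus loop. Here’s a minimal sketch of that pattern in Python- the agent functions are plain stubs standing in for real LLM calls (in his setup, prompts wired up as Cursor commands), and every name here is illustrative, not taken from the demo.

```python
# Hypothetical sketch of the "AI CTO" review loop: multiple reviewer
# agents critique a draft, their feedback is merged, the code is
# revised, and the loop repeats until every reviewer approves.
from dataclasses import dataclass

@dataclass
class Review:
    agent: str       # which "senior engineer" produced this review
    approved: bool   # did this reviewer sign off?
    feedback: str    # requested changes, if any

def run_review_round(code, agents):
    """Ask each reviewer agent for a Review of the current code."""
    return [agent(code) for agent in agents]

def review_until_consensus(code, agents, revise, max_rounds=3):
    """Review, collect dissenting feedback, revise, repeat until consensus."""
    for round_num in range(1, max_rounds + 1):
        reviews = run_review_round(code, agents)
        if all(r.approved for r in reviews):
            return code, round_num  # consensus reached: ship it
        feedback = [r.feedback for r in reviews if not r.approved]
        code = revise(code, feedback)  # the "writer" agent applies feedback
    return code, max_rounds  # shipped after the round cap, consensus or not

# --- stub agents for demonstration only ---
def strict_agent(code):
    ok = "error handling" in code
    return Review("sr-eng-1", ok, "add error handling")

def lenient_agent(code):
    return Review("sr-eng-2", True, "")

def revise(code, feedback):
    # Stand-in for an LLM rewrite: just record the feedback it addressed.
    return code + "\n# addressed: " + "; ".join(feedback)
```

In a real setup each stub would be a model call with a distinct reviewer persona; the interesting design choice is that disagreement is resolved by iterating the artifact, not by a human tiebreaker.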
He ran through this entire motion in about 45 minutes. The features he shipped looked great. The hype is real- we live in the future.
You don’t need a “crystal ball” to see what post-AI Growth teams look like.
Remember the “professional vibe coder” at Lovable I mentioned earlier? He was hired by Elena Verna. Elena has led growth for some of the largest and most successful startups in history, and her current message is stark: traditional specialist roles are collapsing. Not might collapse someday. Are collapsing now.
She runs teams today where people execute every function of the Growth stack, from the concept itself to the distribution of the finished product. If you have a good idea, or if someone tells you to go drive a specific KPI, you no longer need a cross-functional pod of specialists to execute. You need one person with good taste and good judgment, someone who knows how to ask the right questions, who has learned to use these tools effectively. These people are staffed in a “loop-based” model- not in siloed, functional teams in an “assembly line” process.
Cursor doesn’t have traditional job titles or processes anymore, either. People own continuous development of specific pieces of the product and ship everything from soup to nuts. They use AI to fill the gaps in functions where they are less adept, and use the collective team’s feedback to tune their work.
These aren’t experiments. They’re production workflows at some of the most successful companies in tech.
I think about the traditional “assembly line” process we use in Growth teams:
- Someone writes user stories and product requirements. They hand it to engineering.
- Engineering goes away for a week or so, comes back with something. Everyone reviews it. There’s hemming and hawing about whether it matches the requirements. Eventually it gets approved.
- It goes to marketing, who puts it on their roadmap for next month. They say, “okay, what do we have here” and start planning distribution. Different people write copy for different channels. That copy goes through three rounds of reviews.
- It goes to design for a week.
- Back to the channel team to get coded and tested and set up in the automation system.
By the time the thing actually ships and hits distribution, it’s 1-2 months later and the team that developed the product requirements is 4-8 people away from the distribution plan.
Even today, before we’re “all the way there” with these new AI-driven systems and structures, that “old” process feels… archaic.
But AI skeptics are still doing things the “old” way, moving through the workday like snails in an Indy car race, unconcerned about the pace of those around them. Why?
This is how major tool shifts always go down with “incumbents”.
Think about how accountants must have felt when Excel hit the scene. At that time, everyone had those giant, noisy calculators on their desks with the ticker tape attached. My mother and grandmother both used these at work religiously when I was a kid. They had to staple the ticker tape to their reports as calculation “backup”. Everyone was saying: “Do it on the computer. Use Excel. It’s 100x faster and 10x more accurate.” They would try- and often found the hype to be unfounded. The software was nowhere near as good or intuitive as it is today, they had no idea how to use it properly, and nobody was paying them to learn how to use it.
During this transition from the “current” manual method to the “new” Excel approach, there were two types of people:
- Those who saw where things were going and learned the new tool, even though it was imperfect, time consuming, frustrating, and often had to be learned in their free time.
- Those who insisted the old way was fine and kept the noisy, printing desk calculator manufacturers in business for a bit longer. Eventually these laggards either adapted or became irrelevant.
Imagine knowing what you know today about how superior Excel is at processing numbers, yet seeing someone still doing the work in 100x the time with 50% of the accuracy in the “pre-Excel” way. It seems absurd- but it was common at the time.
This is what I see people doing today as AI emerges in professional services- and it freaks me out. Very “we’ll figure it out when it gets closer” vibes, with a side of “it will be a while until that happens”- but it already has happened.
I also remember how excited everyone was about “marketing automation” and “automated campaign optimizations” when I started in marketing. I know- “dinosaur” alert.
HubSpot was the hot thing. Companies were launching four-year enterprise projects to implement full-stack Adobe marketing suites. The idea that you could schedule social posts in Buffer and have them go out automatically, or that an algorithm could optimize your Google paid search bids for every keyword, every device, every geo, every 15 minutes without a human checking them- that was considered cutting edge.
Yet, there was resistance. Why would I trust an algorithm to do that? How do I know it’s making the right decisions? What if something goes wrong?
Eventually enough head-to-head tests were run. The algorithms outperformed manual bid management. You didn’t need 15 people to run a paid search campaign for a major advertiser anymore. You needed three people with the right tools, and they performed 20 percent better than the old way. The people who were no longer saddled with the task of optimizing every bid every hour on every ad group got to do other things to drive growth- oftentimes, really cool things.
The difference between those who saw it coming and those who didn’t was whether they were willing to run the experiments themselves- and whether they were even open to trying things differently than they always had.
The more I think about it, the more I realize the skeptics aren’t wrong about the difficulty- they’re wrong about the trajectory.
Yes, the tools are hard. Yes, there’s a steep learning curve. Yes, it’s frustrating to learn something new when you’ve spent years developing expertise the old way.
But this isn’t how people will do this work “in the future”… people are doing it this way right now. If you’re still in the “wait and see” camp, it’s time to see and act- you’re already behind.
What to do?
There’s more to say about this, and I’ll dig into the “roles and teams” topic next week. But I don’t want to leave you with just a bunch of observations, so here’s what I’d actually do if you’re reading this and feeling like you might be behind.
- Watch the interviews I linked above. Seeing someone who has never written code walk through their entire workflow for shipping real products is more convincing than anything I can write here. It makes the abstract concrete.
- If you’re just starting out, don’t begin with Cursor. It’s powerful but has a steep learning curve. Start with something simpler like Lovable or Bolt, where you can describe what you want in plain language and see working software appear in minutes. Get some early wins. Build your confidence. Then graduate to more advanced tools when you’re ready.
- Build something this week. Not next month. This week. It doesn’t have to be good. It doesn’t have to ship to anyone. Just pick a small problem you have, describe it to one of these tools, and see what happens. The gap between “I’ve heard this is possible” and “I’ve actually done it myself” is where all the learning lives.
The old way is becoming older every day. The people at the forefront are already working differently. And eventually, everyone else will have to catch up.