I've been a jack of all trades my entire career. A writer first, I majored in journalism at NYU. Journalism teaches you to parachute into any subject, learn it fast, and write about it with authority. Then do it again next week. My first job was editing books for software engineers. I wasn’t a tech person, but I was curious about everything they built. That curiosity became a pattern.
I moved into marketing, writing copy for an e-commerce company. Copy led to SEO. SEO led to agency work. Agency work was a natural fit. Every new client was a new industry to learn, a new set of problems to solve. I never had to pick a lane. Along the way I picked up enough about design, analytics, strategy, and media buying to do real work in each of them. The same instinct repeated across twenty years: learn enough to do the work, then learn the next thing. Not depth in one direction. Breadth in every direction.
Then the model broke. The agency where I'd worked for nearly ten years started contracting. The work hadn't dried up. The economics had. Big teams, big overhead, big retainers for work that needed fewer and fewer people to do it. I was laid off.
Not ready to trade one leaky canoe for one with a slightly smaller leak, I started my own business. Put everything I'd learned to work for the one client who'd never fire me.
And then reality hit. Even with twenty years of experience across every marketing discipline, I hit the same wall every business owner hits.
Time.
I knew how to do everything. SEO, content, social, email, ads. I could do all of it. I just couldn't do all of it at the same time. One person can hold the strategy for a dozen channels in their head. One person cannot execute a dozen channels simultaneously.
So I used my limited programming skills to build some Python automations. Pulled data from Google Search Console for keyword research. Used AI to help improve my rankings. Just tools I needed.
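To give you a feel for what I mean, here's a minimal sketch of that kind of script, not my actual code, with illustrative column names: take a Search Console export and flag queries that get lots of impressions but few clicks, the classic keyword-opportunity pass.

```python
# Sketch of a keyword-opportunity pass over a Search Console export.
# Assumes rows of (query, clicks, impressions, avg_position) -- the
# names and thresholds are illustrative, not a real API schema.

def opportunity_keywords(rows, min_impressions=500, max_ctr=0.02):
    """Return queries people see often but rarely click: ranking targets."""
    hits = []
    for query, clicks, impressions, position in rows:
        if impressions < min_impressions:
            continue  # not enough search demand to bother with
        ctr = clicks / impressions
        if ctr <= max_ctr:
            hits.append((query, impressions, round(ctr, 4), position))
    # Biggest audiences first
    hits.sort(key=lambda r: r[1], reverse=True)
    return hits

sample = [
    ("marketing automation tools", 40, 12000, 8.2),
    ("what is seo", 900, 15000, 3.1),
    ("diy keyword research", 6, 800, 11.5),
    ("niche query", 2, 90, 4.0),
]
for row in opportunity_keywords(sample):
    print(row)
```

Twenty lines, an afternoon of fiddling, and a task that used to eat a morning runs in seconds. That's the whole pitch.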
That was May of 2025. I haven't stopped building since.
What started as a few scripts became a company. Not because I had some grand vision, but because every time I solved one problem, I realized AI could help me solve the next one too. And the next one. And the next one. Research, strategy, content, design, video, distribution, analytics. Each time I handed something off, I expected the AI to hit a wall.
It kept not hitting the wall.
And somewhere in the middle of all that building, I realized that most people are having the wrong conversation about AI.
Everyone's comparing AI to the Model T
The story goes like this: The automobile replaced the horse. Blacksmiths lost their jobs. Carriage drivers had to retrain. But new jobs appeared. Mechanics, gas station attendants, highway construction workers, truck drivers. Old roles faded, new roles took their place. The nature of work changed, but work itself persisted.
People tell this story to reassure themselves. "AI will take some jobs but create new ones." "Just like the Industrial Revolution." "Just like the internet." Same story, different decade.
I don't think they're wrong about the direction. I think they're catastrophically wrong about the magnitude.
This isn't the Model T. This is teleportation.
The Model T replaced the horse with something faster. Same category. Point A to point B, just quicker. The Model T still needed roads. It still needed fuel. It still needed a driver. It still took time. An entire ecosystem grew up around it: gas stations, highways, traffic laws, suburbs, drive-throughs, road trips.
Teleportation doesn't need any of that.
Teleportation doesn't improve how you get from A to B. It eliminates the concept of distance. Roads become irrelevant. Fuel becomes irrelevant. Drive time, commutes, geography, logistics. Not improved. Gone.
The people comparing AI to the Model T are thinking inside the transportation metaphor. Better, faster, cheaper. Still the same kind of thing.
AI isn't a better, faster version of expertise. It's the elimination of expertise as a scarce resource.
"AI isn't a better, faster version of expertise. It's the elimination of expertise as a scarce resource."
Think about what that means.
The entire modern economy is built on the scarcity of expertise. You hire a lawyer because you can't practice law. You hire a designer because you can't design. You hire a developer because you can't code. You hire a strategist because you can't... actually, you can probably strategize just fine. You just don't have time because you're busy managing all the other specialists.
Every hiring decision, every agency retainer, every freelancer invoice, every career ladder, every university degree exists because expertise is scarce and unevenly distributed. That scarcity is the road. The foundation underneath everything.
And teleportation just arrived.
When one person can access expert-level capability in virtually any domain, the entire infrastructure built around expertise scarcity starts to look like a highway system in a world where everyone can teleport. Not obsolete overnight. But fundamentally, structurally unnecessary.
Universities training specialists for roles that one curious person can fill? Career ladders built on accumulating narrow depth over decades? Freelancer marketplaces matching buyers with sellers of expertise? Agencies bundling six specialists and charging $10,000 a month for the bundle?
Those are roads. And we just invented teleportation.
But not everything disappears
Here's where people get this wrong in the other direction. The doomers think everything dies. That AI replaces all human contribution. I don't believe that either.
Think about Maya Angelou. Could a Maya Angelou exist in the age of AI?
Absolutely. Angelou's work wasn't the competent execution of literary technique. It was the expression of a life no one else had lived, a perspective no one else could have had. Angelou didn't succeed because she was good at writing. She succeeded because she had something to say that only she could say, and the courage to say it.
Now think about the thousands of working artists who make a living producing competent, derivative work. Album covers that echo a trending style. Pop songs built from the same four chords and recycled hooks. Ad campaigns that remix last year's winners. That work exists because creating it requires technical skill, and technical skill is scarce. People pay for it because they can't do it themselves.
That's the work that's going away. Not the savant. The derivative. Not the person with genuine vision. The person executing patterns that someone else invented.
This sounds harsh. It's not meant to be. Most of us do derivative work. I certainly did for most of my career. I was good at my job, but I wasn't inventing new paradigms. I was applying known frameworks to specific situations. That's what professionals do. That's what "expertise" means most of the time.
And that's exactly what AI does now. Better and faster than most of us.
So what actually matters?
I'll tell you what I've learned. Not from studying or reading about this. From living it for the past nine months.
Don't be afraid.
I know that sounds like a bumper sticker. But I mean something specific. Moments where you get to fundamentally redefine what you're capable of don't come along often. Most people will go their entire careers without one. This is one. And the biggest risk isn't that AI takes your job. It's that fear keeps you from seeing the opportunity that just opened up.
The bottleneck in my career was never what I knew. It was time. I could see the whole board, play most of the positions, but not all at once. AI didn't teach me anything new. It gave me the one thing I never had enough of: capacity. The ability to execute across every discipline simultaneously, at a level that used to require a team.
One person with broad curiosity and a $20/month subscription can now do things that used to require a team of ten. I know because I'm doing it. Right now. In real time.
But only because I wasn't too afraid to try.
"Moments where you get to fundamentally redefine what you're capable of don't come along often. Most people will go their entire careers without one. This is one."
What I'd actually tell you to do
Not as a tech person. Not as someone who builds AI. As someone who lost their job, went back to building a business, discovered what these tools could do, and is now building a company around it.
First: sign up for a paid AI platform today. Claude, ChatGPT, doesn't matter which. The paid version. The free tier is a year behind what's actually possible. Judging AI by the free version is like judging a band by their soundcheck.
Second, and this is the one that changed everything for me: change how you think. Every frustration, every tedious task is an opportunity. Every "I wish I could" is a question waiting to be asked. Don't think "I spend three hours every Monday pulling last week's numbers into a report nobody reads." Think "what can I do so this report writes itself?" Then type that question into an AI. Literally. See what comes back.
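To make "this report writes itself" concrete, here's the shape of what an AI will hand you back. A hypothetical sketch with made-up metric names: feed it this week's and last week's numbers, get a Markdown summary.

```python
# Sketch: the "report that writes itself." Metric names and sample
# values are invented for illustration.

def weekly_report(this_week, last_week):
    """Render week-over-week metrics as a Markdown summary."""
    lines = ["# Weekly numbers", ""]
    for metric, value in this_week.items():
        prev = last_week.get(metric)
        if prev:
            change = (value - prev) / prev * 100
            lines.append(f"- **{metric}**: {value:,} ({change:+.1f}% WoW)")
        else:
            lines.append(f"- **{metric}**: {value:,} (new)")
    return "\n".join(lines)

report = weekly_report(
    this_week={"sessions": 4210, "signups": 87},
    last_week={"sessions": 3890, "signups": 92},
)
print(report)
```

Point it at wherever your numbers actually live, schedule it for Monday at 7 a.m., and those three hours come back to you.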
Are you automating away your job and accelerating your demise? Maybe. But if the canoe is already sinking, knowing how to swim isn't the problem. Not knowing is.
That's sending a plant through the teleporter. You're not stepping through yourself yet. You're testing it. Low risk, high information.
Third: stop looking at this through the lens of what you might lose. Most conversations I hear about AI are framed around fear. Fear of losing jobs. Fear of losing relevance. Fear that expertise built over decades is suddenly worthless.
I get it. I lost my job too. The fear is real. So is the hit to your ego.
But fear makes you defensive, and defensive people don't build things. The people who will come out of this better than before aren't the ones with the deepest expertise or the most impressive credentials. They're the ones who looked at the most significant opportunity to redefine themselves in a generation and didn't flinch.
One more thing
I spent my career developing what people call soft skills: clear thinking, clear writing, the ability to frame a problem and articulate what you need. For twenty years I never fully valued those skills because they felt so abstract. You can't point to clear thinking the way you can point to a line of code or a financial model. Writing was just the invisible thread through everything else I did.
It turns out the thread was load-bearing.
How much you get out of AI has less to do with the model than with what you put in. The quality of the question. The precision of the framing. The ability to articulate exactly what you need and recognize when you're not getting it.
Is it any wonder that the interface to the most powerful technology ever created is a text box?