🎨 Less Science, More Art

So Much of Creativity Is Born Out of Spontaneity, So What Happens When We Outsource Thinking to Rules-Based AI Tools? Plus, Earnings!

Welcome to Tuesday Thursday Saturday! Here, I share a snapshot of trending stories across business, tech, and culture three times a week, plus some updates from the daily financial news show I host. - KP

The Big Story: What If AI Kills Original Thinking?

This post was inspired by something a former colleague, Tyler James, wrote recently on her Substack. Tyler is an MBA student at Stanford and a very sharp product manager, and her piece (condensed from a longer whitepaper) unpacks what’s missing from AI-generated creative work.

Her simple observation connected a lot of things I’d been thinking about lately but hadn’t really been able to put into words. If AI is just an amalgamation of everything that’s come before it, organized by rules and logic that predict one word or image or idea after the next, then is anything actually spontaneous or original in that setup? Or do we lose that aspect of creativity entirely?

This is a question that’s been circling in my mind for months. Because if AI lacks intuition, if it has “no body,” as Tyler writes, no lived context, no sense memory, then what is it doing when it creates? And more importantly, if we all start the creative process with tools like ChatGPT or Midjourney or Gemini, what happens to the original spark: that special kernel of an idea that had not existed prior?

Of course, people have always borrowed from one another, and artists are the worst (best?) offenders. Art, music, and writing are all one long conversation across time. Zach Bryan sounds like Bruce Springsteen, who sounds like Bob Dylan.

Zach Bryan, “Sandpaper” (2024)

Bruce Springsteen, “Valentine’s Day” (1987)

No one creates in a vacuum. No one ever has unless you want to get really spiritual about it, which I prefer not to do here. But what’s different now is the scale, the speed, and the invisibility of that borrowing. When you take inspiration from another artist, you usually leave a trail. Even when it’s subtle, there’s a throughline. But with AI, the lineage disappears. The model pulls from thousands of sources and merges them into something new-ish, and the path back to where it came from gets real fuzzy, real fast.

It’s sort of like music sampling on steroids. Hip hop turned this into an art form, pulling soul and R&B beats from decades prior into modern formats. But it was generally clear which song and artist were being sampled. With AI tools, the source material is borderline infinite. Where is the inspiration coming from? Who’s to say? Unless you ask for an extremely specific degree of attribution, you’re not going to get it.

Author Ted Chiang once described ChatGPT as “a blurry JPEG of the web.” It’s recognizable, even functional, but it’s not high fidelity … at least not yet. It’s not rich with texture, context, or point of view. And yet these are the very things we tend to associate with originality.

There are obvious tells for spotting AI in the wild, especially in writing: they’re generally in the syntax, the punctuation, or the overuse of obnoxious phrases like “in today’s digital age.” But there’s something more subtle. AI writing usually lacks perspective. It feels kind of … soulless.

Don’t get me wrong. Not everything in life needs to be net-new. A good portion of my (day) job as a marketer involves getting to a fast, clear path of communicating an idea in a way that meets the stated objective. That’s it. I’m (generally) not trying to win a Pulitzer Prize when I come up with five variations for an email subject line, although I tend to write good email subject lines, and a Pulitzer would be nice.

A few years ago, I had the once-in-a-lifetime opportunity of writing a parody song for 1980s heartthrob Michael Bolton. Something that arguably could only be produced from the bizarre inner workings of my overstimulated ADHD brain, plus a solid 10+ years of seeing every VH1 Behind the Music ever. At least twice. 

Right now, we’re pre-AGI, and while these tools we’ve become so reliant upon are great at researching, connecting themes, and remixing existing content logically, they still sort of suck as a default starting point. That’s why the best prompts are super micro-managey. Once you get good at prompting, you’ll likely find yourself visibly wincing as you list out a command with 15-20 sub-bullets and nuances to consider. You’re limiting the tool’s creative parameters, but that’s the entire point.

But what if AI always becomes the default starting point? Not an assistant, not a tool you reach for after you’ve figured out what you want to say, but the thing you turn to first. The impulse to skip the messy, uncomfortable, nonlinear process of thinking and go straight to output is already there.

Not to get too 1984 about it, but take this a step further and consider that there are powerful people behind the LLMs we’re using. If you thought the news media could manipulate large swaths of the populace, just wait until the ideologies of a select few become baked into the very tools we rely on to function. YIKES.

That’s where I think we risk losing the real value of original thought. It’s not because AI “steals it,” either. It’s because we simply stop doing the work. Because thinking is hard.

Of course, this matters less for certain tasks and functions. If I want to open my fridge, list out the 11 random ingredients I happen to have at the moment, and tell GPT to give me a quasi-healthy recipe idea, I truly do not care that I’m not cooking up a Michelin-worthy meal. (My Italian mother will tell you that this, sadly, is not in the cards for me.) Right now, it’s about going from point A to point B efficiently. Mission accomplished.

But what about when an amateur chef makes a mistake in the kitchen, and the result turns out to be … kind of amazing? This happened one time when we were kids. For some reason or another, my mom was out of town, and my dad was on dinner duty. He made grilled cheese with provolone, which is a move I had not seen before, nor likely will again. But it was fun and different and VERY messy, and after that, we requested that our grilled cheese be made “Dad’s way” from time to time.

Real ideas are inconvenient. They show up when you don’t have time, or when you're not expecting them. I remember watching a documentary about the Bee Gees, and one of the brothers (Barry, obviously!) shared how the idea for their hit song “Jive Talkin’” came to him as he was driving across a bridge in Miami. The road had evenly spaced gaps that created a rhythmic thumping under the car wheels, and the pulsing sound stuck in his head. That rhythm became the foundation for the song. It wasn’t planned. It emerged from something random and fleeting. 

That’s the kind of spark I worry LLMs can’t replicate, and when our creativity starts with an AI tool, we risk losing the serendipitous thoughts that turn out to be our most creative. For Barry Gibb, it wasn’t just the sound; it was the accidental moment of noticing it, internalizing it, turning it into something else. This is the “body” that Tyler talks about in her article.

These types of ideas are rarely logical. Nobody goes out for a drive purposely looking for a naturally occurring sound effect that will make for the perfect backing percussion to a hit dance track. But there are millions of examples like this across music, art, inventions, and even business. 

So, when you start with a model instead of yourself, you risk losing the imperfect moments that can lead to things that are truly original. Sure, you can back into a decent first draft, maybe even a great one. But you didn’t wrestle it into being. I join others in worrying — at least a little bit — about what happens to our creative muscles when we outsource them entirely.

AI is great. It’s world-changing. It’s time-saving. It makes us more productive, gives us a broader context, and yes, can make us smarter (if we let it). But I think we should be careful not to confuse efficiency with originality. Rules-based isn’t always the right path. Faster isn’t always better. 

Anyone who’s compared a Johnny Cash song to manufactured Nashville bro country can see and feel the difference. Somehow, strangely, I am a fan of both.

Dance! Like a dandelion!

And, as Tyler points out in her piece, remember that AI cannot experience the world the way we do. Even the best prompt engineer in the world will struggle to fully articulate the experience of existing. We’re just not there yet.

AI doesn’t know what “soft” feels like. It doesn’t understand what it means to be wrong. It’s trained to predict, not to notice, after all.

Creativity, for humans, is often a physical act. It happens through trial and error, through material resistance, through gestures and intuition that can’t always be articulated. AI can’t do any of that. It can only simulate it by looking at the patterns created by people who did. Not the same!

That’s why so much AI-generated work feels impressive, but hollow. It’s almost there, kinda, but not quite. You may have the right shapes, but something is missing in the composition. It’s getting harder and harder for us to explain what’s “off,” but you can feel it.

With all this in mind, I think (and hope) originality becomes more valuable. The more designs start to mimic each other ad infinitum and the more the content in our feeds all starts to sound the same, the more we will appreciate stuff that’s unexpected, unplanned, and imperfect.

If you’re someone who still starts with a question, a gut feeling, a glimpse of something strange, and finds excitement in building from that, you’re already doing something that AI cannot. It’s a skill that will be exercised less and less, and I think that means it’s worth protecting.

Daily Rip Live Recap: Cloudflare’s Lowkey Monopoly, the Death of SaaS, and Grading Powell’s Job Performance

Watch us live every Monday through Thursday at 9 AM ET.

Every Monday through Thursday, my co-host Shay Boloor and I cover the biggest market news and events LIVE on Stocktwits’ morning show, The Daily Rip Live. Ryan Detrick, Chief Market Strategist at Carson Group, joined us yesterday as we kicked off another busy week of earnings reports and economic data.

Here’s what we covered on Monday’s show:

⇢ 4:45 Is Cloudflare underdiscussed in the AI convo? They see (host) 20% of all websites on the Internet. $NET ( ▼ 1.68% )

⇢ 10:01 Shay says NET can be the Visa of the new digital economy, where data = currency. They’re helping publishers monetize their scraped content and getting their piece of the pie.

⇢ 13:05 We look at a case study in evolving from software-as-a-service to AI: ServiceNow. My friend Austin Hankwitz and I got to interview their CFO, Gina Mastantuono, last year, and she is a baller. $NOW ( ▲ 0.39% )

⇢ 17:40 UX (short for user experience: the basic ways software looks, feels, and functions) will matter less in SaaS than data orchestration in an agentic AI world. Does this spell trouble ahead for players like HubSpot and Salesforce? $CRM ( ▲ 0.18% )  $HUBS ( ▲ 2.38% )

⇢ 21:05 Tesla strikes a $16.5B chip deal with Samsung. Is $ASML ( ▲ 0.75% ) the secret winner? (They’re effectively the monopoly that makes the machines that make the chips.)

⇢ 25:10 Retail inflows, meme trading, and crypto’s on fire — what will it all mean for Robinhood and SoFi later this week? $HOOD ( ▼ 2.01% )  $SOFI ( ▼ 0.77% )

⇢ 31:45 The S&P is up 8.6% YTD — we ask Ryan, what’s next? Let’s look at some historical context, with a nod to Eisenhower of all people… BTW, Ryan called the market bottom on April 8. Touché, sir!

⇢ 36:06 Ridin’ on vibes? How much of the stock market recovery is attributable to bullish sentiment among retail investors, with institutional investors following not far behind?

⇢ 42:44 Does the popularity of meme stocks necessarily mean there’s frothiness in the market? Ryan challenges investors to compare people ape-ing into Kohl’s stock with new Gallup data showing that a lot of regular Americans are worried about the economy.

⇢ 50:33 I asked Ryan to grade Fed Chair Jerome Powell’s efforts so far. Ryan says C+. “I’m being nice,” he adds. We have another FOMC meeting on deck later this week.

We’re live Monday through Thursday at 9 AM ET. Tune in!

Now Here’s a Chart

Ryan Detrick shared this one with us on the show yesterday. In his words, “The S&P 500 is up 8.6% YTD. This is about average, but with markets, there is no such thing as average. Only 4 out of the past 75 years have stocks gained between 8% and 10%.”

Reading List

Tuesday Thursday Saturday is written by Katie Perry, owner of Ursa Major Media, which provides fractional marketing services and strategy across software, tech, consumer products, professional services, and other industries. She is also the co-host of Stocktwits’ Daily Rip Live show.

Disclaimer: The contents here reflect recaps and summaries of pre-reported or published data, news, and trends. I have cited sources and context for the information provided to the best of my ability. The purpose of the newsletter is to inform and educate on larger trends shaping business and culture — this is NOT investment advice. As an investor, you should always do your own research before making any decisions about your money or your portfolio.