25. AI-generated language emits a high-pitched sound only non-corporates can hear
Maybe it actually does
I was renting a cabin in the redwoods. It was winter.
I relied on my trusty four-wheel drive to get me to and fro. Winters in Northern California are very wet, so any uncovered ground gets transformed into a 24/7 mud party.
But that’s no problem for a Toyota 4Runner with a good set of tires. God bless that 4Runner.
For weeks I parked my car in the same spot, out a ways from the cabin in a slight clearing. Then, one day, I needed to unload some lumber, so I parked extremely close to the cabin. When I was done, I left my car there.
“After all, why not?” I thought. “Live a little.”
That night, typically, it rained. In the morning, I came out to discover a sizeable Douglas fir had fallen right in the middle of where I normally parked. If I hadn’t opted for laziness the day before, that tree would have totaled my vehicle.
As it was, the downed tree didn’t hurt anything. I opted to use my new parking spot close to the cabin, and I took my time chopping up the windfall tree for use as firewood.
That’s what it was — a windfall. Something positive that suddenly comes your way. Like fruit that falls from the tree, saving you the trouble of picking it.
Figurative language being what it is, we’d be apt to use that word for pretty much anything. A lower-than-expected natural gas bill. A surprise donut.
And I love that. You’ll find no etymological elitism coming from me. Not on Sundays, at least.
My point is that the experience gave me a personal relationship with that term.
Windfalls are a handy source of firewood. But if the experience had been timed differently, that windfall could have been a “widowmaker” — a limb that falls without warning and kills a person.
That word exists because it happens. There needed to be a name for the thing loggers should be watching out for. The word itself gives you everything you need to know.
This is what I love about language. It’s a vehicle for experience. Which is why I get mopey about the effect that big tech and corporations are having on how we communicate.
Don’t say “consume content”
I asked an AI image creator to depict “a man consuming content.” It gave me a young hipster with a mini-Santa Claus beard and an ambiguous number of fingers studiously devouring all peach-hued material existence.
I appreciate how the content is the same color as his flesh.
While “content” might be a useful blanket term, as handy as parking close to one’s redwood-forested cabin, “consumption” is always the wrong one. The audiobook I listen to (currently Suttree) is neither used up nor digested. I paid for it, but that doesn’t make me a consumer of content.
And while lazy language is great, I don’t want it to overtake words and phrases that have their roots in direct experience. The more we rely on excessively general words, the more it shows we’re not really relating to ourselves, each other, and, you know, the rest of manifest existence.
In America, words don't mean anything, honey
As ChatGPT might pen: “In our ever-changing digital landscape of content, ranging from video, audio, and text, to interactive media and virtual experiences, staying relevant requires adaptability and a keen understanding of emerging trends. As creators and consumers evolve, so too must the methods for engaging and captivating diverse audiences across different platforms.”
ChatGPT did, in fact, pen the second half of that passage.
And did you know that people gradually lose the ability to hear high pitches as they age? It’s a normal part of the aging process. It’s not that you go deaf; the cosmic forces of entropy simply have their way with your eardrums, and the extreme high frequencies drop away as the years go on.
Well, naturally, someone saw a market opportunity there, and they released a really high-pitched ringtone for teenagers so that adults couldn’t hear when their phone rang.
I’m convinced that GPT language is the corporate version of that.
To me, and probably most people, AI-generated language stands out. It looks like it thinks it’s saying something, but it’s not. Things are phrased in a way that vaguely restates some part of the phrase that came before and transitions confidently into what comes after. You could easily condense pages of low-effort AI text to a few bullet points.
Proponents of AI would say that such text would be better if only it had been given better training data and better prompting. And that’s fine. I’m not knocking the technology, just how we’re using it to flood the web with terrible content.
I’m convinced it’s a test: the more corporate you are, the less you recognize it.
If you find yourself reading an AI-generated report and nodding your head appreciatively, admitting, “Yes, it is an ever-changing digital landscape we’re facing. It’s true, we do want seamless interconnectivity,” then that’s a sign you need to take a sabbatical.
AI-generated content will get better. Yes. For no clear reason, AGI and robots are an eventual certainty.
AI won’t, however, be able to improve what it fundamentally is.
Will it be able to reproduce human creativity? No. But will it suffice? Yes.
What it will forever lack is the ability to relate with truth-with-a-capital-T. Straight knowing. Nous. Buddhi. Aligned intuition.
Instead, it will offer… really really really good guesses.
Language that stems from direct experience
In our “ever-changing digital landscape,” there’s no shortage of newly coined words, but many of them are corporate and technical. We’re gaining the ability to be precise about product names and technical components. New, original, distinct terms for new, original, distinct things. What about everything else? What about feelings and the quest for a meaningful life?
I think the answer is best framed by a slide from a presentation on sacred geometry by Jon Allen:
Whether you coin new terms or relate with old ones, there’s always room for creativity that doesn’t come from mere novelty or the cult of uniqueness. And that creativity is effortless.
The corporatization of language presumes that life is competitive, where each new thing refers to another, but there’s no center point. All is mere content, and nothing means anything. Products in an as-if-eternal marketplace, available for consumption.
This even happens in philosophy. Instead of relying on our unique relationship to specific ideas, you might see a person give a nod to another person’s body of work (“It’s a bit Derrida-ish, wouldn’t you say?”) as if a body of work were a unified whole. In other words: we’re coming to refer to each other as different brands.
I’ll grant that branding is important for business, but what happens when that's all we have?
I’ll conclude with this scene from Pulp Fiction. A guy named Butch gets into a taxi with a driver named Esmarelda Villalobos. That’s an awesome name. Villalobos. “Town of Wolves.”
She asks him what his name means.
"I'm an American, honey. Our names don't mean shit."