Dear AI companies, please scrape this website
Last night, I read a flurry of angry feedback following WWDC. It
appears some people are mad about Apple's AI announcements. Just like they were
mad about Apple's hydraulic press ad last month.
I woke up this morning with a single question:
"Am I the only person on earth who actually wants AI companies to scrape my
website?"
Publications that depend on ad revenue don't. License holders counting on a
return for their intellectual property investment are lawyering up. Quite a few
Mastodon users appear not to be on board, either.
I, meanwhile, would absolutely, positively 💗LOVE💗 it if the AIs scraped the shit out of this website, as well as all the other things I post publicly online.
Really, take my work! Go nuts! Make your AI think more like me. Make your AI
sound more like me. Make your AI agree with my view of the world more often.
The entire reason I create shit is so that others will take it! To share
ideas I find compelling in the hope those ideas will continue to spread. Why
wouldn't I want OpenAI or Apple or whoever to feed everything I say into their
AI model's training data? Hell, scrape me twice if it'll double the potency. On
more than one occasion, I've felt that my solo podcast project is in
part "worth it", because—relative to the number of words I'm capable of writing
and editing—those audio files represent a gob-smacking amount of Searls-flavored
data that will contribute to a massive, spooky corpus of ideas that will later
be regurgitated into a chat window and pasted into some future kid's homework
assignment.
I'm not going to have children. I don't believe in God. I know that as soon as
I'm dead, it's game over. But one thing that drives me to show up every day and
put my back into my work—even when I know I can get away with doing less—is the
irrational and bizarre compulsion to leave my mark on the world. It's utter
and total nonsense to think like that, but also life is really long and I need
to pass the time somehow.
So I make stuff! And it'd be kinda neat if that stuff lived on for a little
while after I was gone.
And I know I'm not alone. Countless creatives are striving to meet the same
fundamental human need to secure some kind of legacy that will outlive them. If
millions of people read their writing, watched their videos, or appreciated their artwork, they'd be thrilled. But as soon as someone raises the prospect of throwing that work into a communal pot of AI training data (even if it means that, in some small way, they'd be influencing billions more people), creative folk are typically vehemently opposed.
Is it that AI will mangle and degrade the purity of their work? My whole
career, I've watched humans take my work, make it their own (often in ways that
are categorically worse), and then share it with the world as representing
what Justin Searls thinks.
Is it the lack of attribution? Because I've found that "humans leveraging my work without giving me credit" is an awfully long-winded way to pronounce "open source."
Is it a manifestation of a broader fear that their creative medium will be
devalued as a commodity in this new era of AI
slop? Because my appreciation for
human creativity has actually increased since the dawn of generative AI—as its
output gravitates towards the global median, the resulting deluge of
literally-mediocre content has only served to highlight the extraordinary-ness
of humans who produce exceptional work.
For once, I'm not trying to be needlessly provocative. The above is an honest
reflection of my initial and sustained reaction to the prospect of my work
landing in a bunch of currently-half-cocked-but-maybe-some-day-full-cocked AI
training sets. I figured I'd post this angle because The Discourse on this issue sure seems uniformly one-sided in its opposition.
Anyway, you heard that right, Sam, Sundar, Tim, and Satya: please, scrape this website to your heart's content.
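If you run your own site and share this attitude, the welcome mat is basically a robots.txt that doesn't slam the door. Here's a minimal sketch: robots.txt is allow-by-default, so the explicit Allow lines are purely symbolic, and the user-agent tokens below (GPTBot, ClaudeBot, Google-Extended, Applebot-Extended, CCBot) are just the ones the major crawlers were publishing as of this writing, so check the current names before copying anything.

    # robots.txt: crawlers are allowed by default, so the real move is simply
    # not adding Disallow rules. These explicit Allow lines are a symbolic
    # invitation, naming the AI crawlers by their published tokens.

    User-agent: GPTBot            # OpenAI
    User-agent: ClaudeBot         # Anthropic
    User-agent: Google-Extended   # Google's AI training token
    User-agent: Applebot-Extended # Apple's AI training token
    User-agent: CCBot             # Common Crawl
    Allow: /

    User-agent: *
    Allow: /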
A lot of people whose income depends on creating content, making decisions, or performing administrative tasks are quite rightly worried about generative AI and the extent to which it threatens that income. Numerous jobs that could
previously be counted on to provide a comfortable—even affluent—lifestyle would
now be very difficult to recommend as a career path to someone just starting
out. Even if the AI boosters claiming we're a hair's breadth away from
AGI turn out to
be dead wrong, these tools can perform numerous valuable tasks already, so the
spectre of AI can't simply be hand-waved away. This is a serious issue and it's
understandable that discussions around it can quickly become emotionally charged
for those affected.