The AI Now Institute has released Artificial Power: 2025 Landscape Report.

I’ve summarized and provided my take on the executive summary of this large study. Follow the links to learn more, of course. But for now, this is my marker. AI has some limited ethical uses, but are the costs worth them? I’m still struggling with that balance, to be honest. AI copyright issues, however, aren’t so thorny.

But I promise this will always be true: what I publish, I write.

The AI revolution isn’t just reshaping technology—it’s the largest intellectual property theft in human history, disguised as innovation. A new report from AI Now exposes how tech giants have systematically pillaged creators’ work to build AI systems that now threaten to replace the very people whose labor made them possible.

People like me.

Building on a stolen foundation

Every AI system powering today’s “revolution” was trained on copyrighted material scraped without permission from authors, artists, journalists, programmers, and countless other creators. These companies didn’t ask—they simply took. Books, articles, code repositories, poems, artwork, music, and films were vacuumed up en masse to train models that now compete directly with their creators’ livelihoods.

OpenAI, despite its name, refuses to disclose its training data. Google’s models were fed copyrighted books through its library digitization project. Meta trained on billions of social media posts containing user-generated content. Meta even seeded and reaped porn videos, according to new reports.

The entire AI industry is built on what would be considered theft in any other context, laundered through claims of “fair use” and “transformative purpose.” This, of course, is absolute bullshit. You bet if I took stuff directly from their marketing materials and replaced OpenAI with OpenAssholes and published it under the transformative purpose bit, their lawyers would bankrupt me.

The displacement economy

The cruel irony is inescapable: creators’ own work is being used to build the machines meant to replace them. Journalists’ articles train AI systems that publishers now use to generate content more cheaply. Authors’ novels and poems feed chatbots that aspiring writers are told to use instead of developing their craft. Artists’ portfolios train image generators that clients increasingly prefer over commissioning original work.

This isn’t creative destruction—it’s extractive parasitism. The AI industry has found a way to monetize human creativity without compensating creators, while simultaneously devaluing the skills and expertise that took lifetimes to develop. They’ve turned our collective cultural heritage into training data for profit-maximizing algorithms. The rich take and take and profit, while those who build are left holding the fucking empty bag.

But the productivity, right?

Tech companies frame AI as boosting “productivity,” but whose productivity are we talking about? When Microsoft’s Copilot helps software engineers write code faster, it’s not making programmers more productive—it’s making them replaceable. The value goes to Microsoft and employers who can extract more output for the same wages, while developers see their specialized knowledge commoditized, downgraded, and disrespected. This applies equally to writers and artists in other contexts.

And by all accounts the code is crap even though it sort of works. Gone is the elegance.

The same pattern repeats across industries. AI doesn’t enhance human creativity; it standardizes and mechanizes it, surgically removing the creative spark and transplanting rote machine learning in its place. Big problems, in any context, are not solved without that moment of creative intuition that AI is incapable of. The “productivity” gains flow upward to shareholders. And this was always the point.

The actual writers and artists face downward pressure on wages and working conditions. The pressure is real. I’m seeing hourly rates that are 25% of what I could command even last year. Which would be all well and good if the end product were good.

But it’s not. It’s exactly what you’d expect from Eliza the chatbot with a larger vocabulary and phrases learned from people who can actually write, but lacking the connections and context.

The haves and have-nots

It always comes back to class, it seems.

The AI industry’s approach to copyright reveals its true nature. It’s a wealth extraction mechanism designed by and for the already powerful. When individual creators sue for copyright infringement, they face armies of lawyers and claims that training AI models constitutes fair use. Meanwhile, these same companies zealously protect their own intellectual property, using patents and trade secrets to maintain their advantages. See above.

The message is clear—your creative work belongs to everyone (meaning everyone with means like tech giants), but their algorithms belong only to them. They’ve created a system where human knowledge and creativity are treated as free raw materials while machine outputs are considered proprietary products worthy of protection.

(Ed. note: I use the em dash on purpose, as a human, despite all the hair pulling and clothes rending on LinkedIn.)

Laundering human labor

The worst part? AI systems obscure their dependence on human labor and creativity. When ChatGPT generates text or DALL-E creates an image, the output appears to emerge from the machine itself. The thousands of writers, artists, and other creators whose work enabled these capabilities become invisible, their contributions unacknowledged and, more crucially, uncompensated.

This erasure isn’t accidental—it’s necessary for their business models. If users understood that AI outputs are sophisticated recombinations of existing human work, the mystique would disappear. The technology would be revealed as an elaborate mash-up scheme, capturing value from creators who can’t afford to sue while selling their transformed labor back to the market.

By whom, exactly?

Traditional media may have had its flaws, but its flaws were human and noted as such with errata. It always maintained concepts of attribution, credit, and compensation. AI systems obliterate these norms entirely. When an AI model generates content based on training data that included your work, you’ll never know. There’s no citation, no royalty, no acknowledgment that your labor contributed to the output.

This represents a fundamental shift from an economy based on creating and sharing ideas to one based on hoarding and recombining them without consent. We’re moving toward a world where the concept of intellectual ownership becomes meaningless for individual creators while remaining absolute for platform owners.

My own work on content strategy has been ripped off in just this way. Parts of some of its descriptions match mine verbatim, despite the years between their generation (this may have changed to picking on one of my colleagues by now, as I’ve been keeping a lower profile socially).

Other costs to society

The environmental cost of training and running AI models represents another form of theft—from future generations. These systems require massive energy consumption and water usage, contributing to climate change while providing dubious benefits to society.

The companies profiting from AI externalize these environmental costs while claiming their technology will somehow solve the climate crisis they’re actively worsening.

So much for technology saving the future. It’s as it always was. All about the profits as long as possible. Roll on the bed of cash until you die and to hell with everyone else.

The regulatory capture

As evidence of AI’s harmful impacts mounts, the industry has deployed sophisticated regulatory strategies. They fund academic research, hire former government officials, and shape policy discussions to focus on hypothetical future risks rather than present harms. Meanwhile, copyright holders struggling to protect their work face an uphill battle against well-funded lobbying efforts.

The “AI arms race” with China provides convenient cover for avoiding accountability. Any attempt to enforce copyright law or require consent for training data is framed as hampering American competitiveness, even as these same companies freely exploit the work of American creators.

Can we regain what is lost?

The current trajectory threatens to destroy the economic foundation for human creativity. Why invest years learning to write, code, or create art if AI can produce “good enough” alternatives instantly? We’re witnessing the devaluation of human skill and the commoditization of culture itself.

Independent media, already struggling against platform monopolization, faces an existential threat from AI-generated content. Publishers can replace expensive human journalists with cheap algorithmic substitutes trained on their own archives. The feedback loop is devastating: human-created content trains AI systems that then compete with and ultimately replace their creators.

We must see AI for what it is: an intellectual property laundering scheme, and subject it to the same laws governing any other use of copyrighted material. This means:

Enforcing Existing Copyright Law:
No amount of “transformation” should exempt AI training from requiring permission to use copyrighted works. Clearly define what “fair use” is in today’s environment.

Mandatory Licensing and Compensation:
AI companies should pay for the content they use, just like any other media company licensing copyrighted material.

Transparency Requirements:
Companies must disclose their training data so creators can identify unauthorized use of their work.

Creator Rights:
Individuals should have the right to opt out of AI training and remove their work from datasets.

Economic Justice:
Revenue sharing mechanisms should ensure creators benefit from AI systems trained on their work.

The AI industry wants us to believe this transformation is inevitable, that resistance is futile. But there’s nothing natural or inevitable about a system that concentrates wealth by stealing from creators. We have laws protecting intellectual property for good reason—it’s time to enforce them against the biggest violators in history.

The future of human creativity hangs in the balance. We can either accept a world where our cultural output becomes free training data for corporate algorithms, or we can fight for an economy that values and compensates human creativity. The choice is ours, but only if we act now.

Post Author: Timothy Truxell