A humorous but stark message about the damage done by intellectual property theft
by Paul Gerbino (Written with the help of my AI friends)
The air crackles with a sense of irony so thick you could choke on it. Software companies, the very bastions that sue over unauthorized duplication of their work, stand idly by as a new frontier of intellectual property theft unfolds – the “Industrialization of Plagiarism,” as Ricky Sutton aptly terms it in his recent Substack article, “A year ago I predicted AI needed publisher partnerships – was I right?” (https://rickysutton.substack.com/p/a-year-ago-i-predicted-ai-needed). Artificial intelligence, built on a foundation of potentially plagiarized content, churns out derivative material at an alarming rate. But the rot, I fear, runs far deeper.
Haven’t we, the everyday consumers, become unwitting accomplices in this grand content heist? We readily copy music without a second thought, our fingers flying across keyboards to paste articles into emails for easy sharing. Rarely do we consider the journalist whose work just lined our inbox, or the musician whose song fueled our morning jog. Every instance of uncompensated content consumption chips away at the foundation of a creative ecosystem – the one that brings laughter, knowledge, and stories to our lives.
So, what is plagiarism? Plagiarism, in this context, refers to the unauthorized use of another’s intellectual property. It encompasses copying someone else’s creative work, like a song or article, without giving them credit. Even paraphrasing ideas or factual information without proper citation can be considered plagiarism. This “industrialization of plagiarism,” as Mr. Sutton terms it, occurs when AI content generators are trained on massive datasets that may contain uncredited or stolen content. This allows them to produce derivative content that feeds into a cycle of devaluation for original creative work. While sharing snippets or using AI tools can be beneficial, it’s crucial to be mindful of proper attribution and ethical consumption of content. (However, Frank Bilotto, our resident licensed attorney, says the AI engine that wrote this is wrong. According to Frank, plagiarism is an ethical and moral construct, passing off someone else’s work as your own, while copyright infringement is a legal construct, the unauthorized use of original works… but when you plagiarize, isn’t that unauthorized use of original works? I am so confused.)
The rationalizations we weave around plagiarism are as flimsy as a house of cards. Thoughts like “everybody does it,” “I am only doing this one time,” “I need a good grade,” “I don’t have time to do this myself,” “no one will ever know,” and “the AI wrote it, so it doesn’t count as plagiarism” are some of the best. But the real lie we tell ourselves, “it’s not hurting anyone,” is the most dangerous, because the truth is far starker. Media companies, the lifeblood of a healthy democracy, are hemorrhaging. Poynter’s 2023 report paints a grim picture – 20,000 media jobs lost in a single year. Lee Enterprises, a media giant, drowns in red ink despite a rise in digital revenue. The ability to “keep their heads above water” is rapidly diminishing.
Some, with a short-sighted view, urge media outlets to “take the money and run,” partnering with AI companies for a quick buck. But this approach is akin to selling the family silver – what happens to media companies two decades down the line? Will AI become our sole source of information, a filter bubble crafted from stolen content, devoid of diverse voices and critical perspectives?
Don’t get me wrong, I’m a cheerleader for AI. The tools it offers are a boon to efficiency, streamlining tasks and freeing up valuable time. In fact, I used AI to help me write this article, feeding it all the points I wanted to make and asking it to create an article in the style of a newspaper columnist. (Do ya think I could write dis good?) However, a gnawing suspicion lingers – is the output from this marvel of technology tainted by the theft of someone else’s hard work? Am I, by using it, perpetuating the problem? The answer, shrouded in the murky, non-transparent world of AI training data, remains frustratingly elusive.
This cycle of rationalization, this self-justification for content theft, has to stop. We, the consumers, must acknowledge our role in this unfolding drama. We can’t claim ignorance when the evidence is writ large – the struggling media landscape, the rise of AI-powered plagiarism factories. Look what is happening in the scholarly journal world, where so-called “paper mills” are producing fraudulent content on steroids.
The solution is multifaceted. Supporting legitimate content creators, be it through subscriptions or micro-payments, is a crucial first step. Holding AI companies accountable for the sources of their training data is another. Creating collective rights organizations, where none yet exist, to represent small and medium-sized publishers and negotiate licensing with AI companies is yet another. Finally, fostering a culture of respect for intellectual property, where creators are valued and compensated, is paramount.
The future of media, the very wellspring of information and entertainment, hangs in the balance. Let’s not sleepwalk into a world where AI dictates the narrative, one built on the stolen bricks and mortar of human creativity. It’s time to break free from the cycle of rationalization and fight for a future where both creators and consumers thrive. (I could not have said that any better if I wrote this myself.)