In a landmark case expected to break new ground on artificial intelligence, the New York Times has filed suit against Microsoft and OpenAI, the creator of ChatGPT. Seeking “billions of dollars in statutory and actual damages,” the paper alleges that Microsoft and OpenAI used its articles without permission to build their AI chatbots, including Microsoft’s AI-powered Copilot.
By training on millions of news articles copyrighted by the NYT, OpenAI was able to develop a large language model whose output is remarkably humanlike. The lawsuit goes on to allege that the companies used the works knowing full well that The Times should share in the billions in profits generated from them.
Before getting to this point, the NYT attempted to reach a settlement with the defendants, who allegedly refused to discuss it. “Defendants insist that their conduct is protected as ‘fair use’ because their unlicensed use of copyrighted content to train GenAI models serves a new, ‘transformative’ purpose. But there is nothing ‘transformative’ about using The Times’s content without payment to create products that substitute for The Times and steal audiences away from it.”
To substantiate its lawsuit, the NYT pointed to a five-part series on predatory lending it published in 2019, which ChatGPT reproduced almost word for word.
The results of this lawsuit could send shockwaves through multiple industries. As it stands, many publications have been using AI for some of their writing assignments, with outlets such as Sports Illustrated accused of publishing articles under fabricated author identities to keep pace with demand and avoid hiring new employees. Aside from eroding the bedrock of an entire industry, unchecked AI also threatens to render the residuals from writers’ hard work worthless.
Allowing AI to run wild the way Microsoft and OpenAI want is incredibly dangerous, and it’s also how the liberals can continue to manipulate and distort realities. If they can use AI to generate quotes and stories, they can ultimately fabricate everything else, even more than they already do.