LA Newscasters Will Literally Be Fake in 2024

Stock-Asso / shutterstock.com

After years of making tongue-in-cheek jokes about everything in Los Angeles being plastic and fake, now it will be true.

LA-based Channel 1 announced shortly before the end of 2023 that its news station will launch in 2024. In a ground-breaking initiative, all of its anchors will be created with artificial intelligence (AI). Built from a mixture of digital doubles of real people, animated faces, and some creative camera angles, the anchors are incredibly lifelike. Yet much like the other AI videos and photos circulating on social media, they still exhibit telltale problems, such as extra-long fingers or hands with more than five of them.

For many who have seen them, the unnatural blinks and emotionless expressions of these figures are hard to ignore. The lack of "life" in these images fails to deliver the warm and fuzzy feeling people have come to expect from the news. Additionally, the flatness of the AI reporters when speaking languages like Greek leaves many feeling that they aren't fit for the assignment.

Scripts for these bots to read will come from three major sources: partnerships with established news outlets, commissioned freelance writers, and AI-generated reports drawn from trusted and confirmed data like public records and government documents. These sources will also likely fail to give everyone the truth. Instead, Channel 1 looks to be just like every other mainstream media source, just without the direct connections to the Soros family.

This is the kind of use and abuse of AI that Americans should fear most. For years, we were told to trust the news to give us the facts. Anchors like Walter Cronkite made things less scary because they seemed to tell the truth. His "and that's the way it is" sign-off left people feeling refreshed at getting their news direct and honest. Now, that's all gone, and in its place is recycled content repeated across mainstream media networks with no variance. Channel 1 looks to be just another example of this failure.