If you've ever wondered what someone with the job title of Global Editor-in-Chief actually does all day, the answer is sit around and write memos like this:
What we have here, thanks to Semafor's Maxwell Tani, is the guy who sits at the top of Insider's editorial hierarchy telling his employees that the publication will begin testing ways to incorporate the AI language model ChatGPT into the production of its journalism. As someone who has received many memos like this from past bosses, I can say that this is one of the funniest examples of the genre I've seen. For one thing, there is no sillier way to begin a staff-wide memo than, "I've spent many hours working with ChatGPT, and I can already tell having access to it is going to make me a better global editor-in-chief for Insider." What are you supposed to do but laugh after coming across the following paragraphs in a memo that is ostensibly about how valuable ChatGPT can be to journalists:
Generative AI can introduce falsehoods into the copy it produces. Research it provided me for this memo was wrong, as I discovered when I fact-checked it. You cannot trust Generative AI as a source of truth. Doing so can lead to journalistic disaster. AI can also introduce bias into text it generates. When it comes to facts, generative AI should be viewed as a resource similar to Wikipedia or a factoid at the top of a Google search-results page: that is, a great starting point that helps you find more reliable sources.

ChatGPT is a language generator that performs calculations to guess at the next best word. It doesn't understand facts or meaning, and it doesn't know whether an assertion is right or wrong, much less fair. It is not a journalist—you are. No matter the tool you use, AI or otherwise, journalists at Insider are ultimately responsible for the accuracy of their stories. Always verify your facts.
Generative AI may lift passages from other people's work and present them as original text. Do not plagiarize! Always verify originality. Best company practices for doing so are likely to evolve, but for now, at minimum, make sure you are running any passages received from ChatGPT through Google search and Grammarly's plagiarism search.
A third, less serious, warning is that text generated by AI can be dull and generic. Take what it gives you as a suggestion: something to rewrite into your own voice and in Insider's style. Make sure you stand by and are proud of what you file.
Nich Carlson
Imagine you are a journalist, and one day you receive an email from your boss announcing an exciting new hire. Everyone please give a hearty welcome to Bob! the message begins. Bob is a serial plagiarist who routinely lies, doesn't understand what a fact is, and has never written an interesting sentence in his life. Please consider him a writing and research resource for all of your stories. Just make sure you are spending a lot of time rigorously fact-checking and rewriting anything he produces for you, because he will seriously end your career if you don't! Anyway, feel free to swing by the kitchen at 3:00 p.m. to grab a slice of cake and welcome Bob to the office. Would you not immediately attempt a citizen's arrest on your boss?
Don't worry, though. Alongside this rundown of all the career-ending problems AI could inflict upon a journalist, Carlson shares an example of how ChatGPT earned him some praise from a subordinate:
I fed it some of the episode titles of one of our most popular video series and asked it for future episode ideas. (I sent them to one of our executive producers and she said, "Holy moly! There are some really great ideas here. Thank you!")
Nich Carlson
How can you argue with that? It's not as if anyone, in any corporate workplace, at any point in history, has ever offered some quick and disingenuous praise to one of their bosses just for the sake of being left alone.
The problem we have here is one of organizational hierarchy. Any company that reaches a certain size ends up with an upper-management layer that doesn't really have much to do all day. So in order to fill time, these managers sit around and stroke their chins and consider "the big picture" until they manufacture a reason to send a very important email to everyone who works below them, thus satisfying their internal need to feel productive and abreast of important innovations. You will find these people in every industry, and the health of any business usually depends on the willingness of the rank-and-file to ignore their directives while smiling politely. If AI ever really does become an integrated part of the journalism industry, it won't be because it actually makes anyone's job easier or the work they produce better; it will be because enough guys like Nich Carlson were looking for a reason to send an email.