Tokenmaxxing technologists vs. AI-cautious journalists
Found this data point recently: “AI-assisted stories accounted for nearly 20% of Fortune’s web traffic in the second half of 2025.” “AI-assisted” is still a pretty vague label, but concrete numbers like this are rare: I can’t say I see a lot of data lying around on the exact impact of AI.
The number comes from WSJ’s recent feature on Fortune’s foray into AI journalism, spearheaded by Nick Lichtenberg. He’d published over 600 stories in about eight months, and most of the pieces are at least partially AI-generated, though always checked and edited by Lichtenberg himself.
The piece also mentioned that “most of the roughly 60 journalists at Fortune aren’t doing their jobs the way Lichtenberg is,” which suggests that the “AI-assisted” stories are produced by a small number of people with significant involvement of one or more LLMs. In which case, 20% is a lot — and shows the potential of AI in journalism when used well.
The WSJ piece is also a rare case of a reasonably big publication showing how AI is used “in production.” I read a lot about both industries, and I’ve noticed a striking difference in how their representatives usually talk about generative AI.
In tech, you’ve got your “tokenmaxxing” — that is, competing on who spends more on AI usage — and executives shouting from the rooftops how much they’re employing AI every step of the way. In media, however, a lot of people tend to be significantly more careful in disclosing and discussing their usage of LLMs.
Harriet Meyer, who trains media people in AI, talked about the same thing in her excellent newsletter today: hallucinations are real, and there are plenty of documented cases where AI screwed up journalistic output. But, as she puts it, “people can’t opt out, even if they want to. AI’s already in the spellchecker, the inbox, the search box, the transcription tool, and half the software their company already pays for.”
Of course, AI tools for journalism need to improve across the board, from hallucination protection to copy and reasoning quality. Yet the tools that exist today can already give a journalist solid support in ideation, editing, and finding the best data.
Lichtenberg told WSJ that he’s seeing a “vibe shift” in journalists’ attitude towards AI, and that he’s been able to win people over by explaining his process and all the checks and balances involved.
If that’s indeed the case, we’re about to see the two paths, the technologists’ and the journalists’, inevitably converge; and it will be the journalists using more AI in their day-to-day work rather than the technologists using less of it.