The arrival of generative artificial intelligence has unsettled journalism in familiar ways. Traffic models look fragile. Copyright is contested. The marginal cost of producing words has collapsed. It is tempting to conclude that news is becoming less valuable just as machines become more fluent. So far for Mongabay, the opposite is happening.

AI systems are adept at rearranging existing information. They summarize, paraphrase, and answer questions at speed. What they cannot do is observe the world directly, make accountable judgments, or create new facts. Those limits are not incidental. They are built in. And they place journalism in a more central position than before.

This has become clearer to me in an unexpected way. At Mongabay, a nonprofit newsroom focused on environmental reporting, we chose not to block generative AI systems from accessing our work. Many publishers have done the opposite, citing copyright concerns, energy use, or fear of disintermediation. Those concerns are understandable, especially for commercial outlets whose business models rely on restricting access. But as a nonprofit focused on impact, our calculus is different. We already allow other outlets to republish our reporting as part of our impact strategy. If AI tools were going to answer questions about forests, fisheries, conservation, or biodiversity, it seemed better that those answers be informed by reported journalism than by thinner sources.

I assumed this would reduce traffic. If people could get what they needed from an AI interface, why would they click through? That is not what happened. According to our…