Yes, AI can create the kind of formulaic crap that is passed off as journalism these days. But AI cannot handle meanings that require context to understand. AI designers have not kept up with neuroscience, and they are working from an incorrect model of intelligence. (This is my field; I am a biosemiotician.) Neurons self-organize through reaction-diffusion processes, which create wave-like patterns that in turn organize and constrain the neurons. (Alan Turing discovered this shortly before he died.) This is what gives humans the ability to use context to make judgements about new information.

What AI does instead is use statistical summation to make generalizations. AI is good at stereotyping. This has been disastrous when AI has been applied, for instance, to help judges pass sentences. ProPublica did a great article on this. They found the risk-assessment software falsely flagged black people as future criminals at nearly twice the rate it did white people with similar histories.

Caitlin, you’ll be glad to know that AI cannot understand or translate jokes, poetry, wit or irony. AI can take a CNN reporter’s job; it cannot take Caitlin’s. There is a big PR push to make you think that AI can replace human judgement with more “objective” outcomes. This is simply not true. Do not believe the hype.