Trends in Innovation at Media Publishers — part 1
Technological development: automation, robots and artificial intelligence
By Nikolaj Christensen, Department of Strategic Development, Danish School of Media and Journalism, DMJX
Like many other industries, the media business has been disrupted in recent years by technological change, shifting user habits and the globalization of the content and advertising market. Big media companies are struggling to reinvent themselves and their business models. New media players are entering the marketplace with new products and skills.
Recently, a group of representatives from media companies, media labs and academia gathered in Copenhagen for the fifth Media Lab Days to share insights about current trends and ways to navigate them. The event was arranged by WAN-IFRA (World Association of News Publishers) and the Danish School of Media and Journalism.
You can read Part 2 of the article here.
In this article we will take a look at some of the trends discussed during the two-day event.
Robots and AI in Japan
Nikkei Inc. is a Japanese holding company whose main activity is its newspaper businesses; among other things, it delivers business news to its own 2.2 million users as well as to the 66 million users of Japan's biggest broadcasting stations. Nikkei has developed a system that uses AI to generate titles automatically. The machine produces 5–10 title patterns in a few seconds. Human operators check the validity of the titles and choose two to three options, which are tried out with the users. Nikkei's experience shows that the CTR (click-through rate, the share of people who click on the title) can be up to three times higher for certain titles than for others.
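The testing loop described above can be sketched in a few lines. This is a hypothetical illustration, not Nikkei's actual system: several AI-generated title variants are shown to users, clicks and impressions are counted, and the variant with the highest CTR wins. The function names and the sample titles are invented for the example.

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: the share of users who clicked the title."""
    return clicks / impressions if impressions else 0.0

def pick_best_title(results: dict[str, tuple[int, int]]) -> str:
    """Return the title variant with the highest CTR.

    `results` maps each candidate title to (clicks, impressions)
    collected during the trial period.
    """
    return max(results, key=lambda title: ctr(*results[title]))

if __name__ == "__main__":
    trial = {
        "Nikkei profits rise": (30, 1000),        # 3.0% CTR
        "Why Nikkei beat the forecasts": (90, 1000),  # 9.0% CTR, 3x higher
    }
    print(pick_best_title(trial))
```

In practice a newsroom would run this as an online A/B test rather than a batch comparison, but the selection criterion is the same.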
AI is also used to generate news items based on companies' financial accounts, increasing output more than tenfold. Through machine learning, which is well suited to image generation, the written news is transformed into short video stories featuring a 2D graphic presenter with a synthesized voice. Up to 120 of these clips can be delivered per hour.
Key focus points for the future
Will automation and robots make journalists obsolete? Not according to Andreas Markmann Andreasen, who, as a fellow at the University of Southern Denmark, has studied this question intensively for six months from both a Danish and an international perspective. According to Andreasen, the widespread use of automation and robots should make us aware of some important areas of competence and organization. He identified four domains to consider in our embrace of new technology.
Augment or automate.
Automation is when a machine partly or fully does what a journalist used to do. Augmentation is using technology to enhance the journalistic product. So instead of letting a robot replace you, it should, as Jeremy Bowers, director of engineering at The Washington Post, puts it, be your exoskeleton, giving you the possibility to do what you do even better. See an example of exoskeleton use here.
An article resulting from a collaborative study between Northwestern University and The Washington Post uses the term Computational News Discovery for the use of algorithms as a tool to orient editorial attention toward potentially newsworthy events or information prior to publication. One concrete tool, Lead Locator, supplements national political reporting with potentially interesting locations, where voter statistics indicate that an interesting story may lie. Read more about the study and its conclusions here.
Team is key
For a successful implementation of new technological possibilities, it's important to have the right people with the right skills on the team: writers, multimedia journalists, coders. Everybody needs to understand algorithms, but only a few need to be able to program them.
Journalists need to understand how the internet works, the logic of organizing data, how to use spreadsheets, what we can learn from tech culture, and what can be automated and what is worth the effort. Not all journalists have to learn to code. Some should, but others might develop and maintain their competencies with a focus on other skills.
Values and ethics
New technologies demand an ethical focus. There is still a need for transparency, awareness of bias and responsibility, values that have been part of journalism for a long time, but they might need to be updated and adapted to new circumstances. For example: should we byline an article written by a robot? And what if only part of the story is written by a robot? Or if a robot organizes and generates the headlines? Who is responsible for the published content? We might have to organize responsibility in a new way. And our systems need the robustness to protect them from being hacked and misused.
Don’t lose the journalistic DNA
But how does the focus on technology, data and algorithms challenge the journalistic profession? Jenny Wiik, from Lindholmen Science Park in Sweden, has been teaching journalists and studying the profession for years. She is now doing a research project on how the increased focus on technology affects newsrooms, and she raises a flag about the importance of not losing the intrinsic values that drive many journalists toward their professional results.
In many newsrooms, people are not so tech savvy, and those who come in gain a lot of influence, either as a result of management decisions to focus on these new skills or because of a generation gap in knowledge and understanding of the technology itself.
“Journalists definitely feel the tension between their investment and the new technology and automation and the new logics that start to be a driving force,” says Jenny Wiik, who continues: “…it’s really important that we approach this from a different take and do not discard old-fashioned journalism because it’s old, because the energy and the personal investments and the ideas of journalists are really important to the sustainability of newsrooms.”
Automation, robots and artificial intelligence are already an integrated part of society as well as of the media business. Automation helps us with day-to-day tasks in our software, like correcting spelling while we write. Robots can read and interpret large amounts of data and turn them into short news items, and with the help of artificial intelligence we have simultaneous translation on Skype as well as digital 2D and 3D presenters delivering newscasts.
Ole Kjeldsen, Head of Technology and Security at Microsoft Denmark, gave an overview of the many ways artificial intelligence (AI) is integrated into society without us paying attention to it. “Take airport security. Lost luggage represents a security issue, which is why loudspeakers constantly tell us not to leave our luggage unattended. But it still happens. Years back it could easily shut down parts of an airport while the luggage was collected and the owner was found. Nowadays, due to the presence of surveillance cameras with face recognition, the owner of an unattended bag will be identified almost immediately and no one else will be disturbed or even take notice.” However, collecting massive amounts of data, as in this case, presents us with a lot of ethical questions about how the data is collected, who has access to it, and how it is stored. There is therefore a need to discuss ethical standards for data collection.
Microsoft has put forward six parameters for an ethical standard that should be applied when AI is used. Here are a few words on each.
· Fairness: Focus on avoiding cultural bias, securing diversity and keeping human review in the process.
· Reliability and safety: Evaluate data and monitor performance, design for unexpected circumstances and keep humans in the loop.
· Privacy and security: Follow general data laws, be transparent about the collection of data, use de-identification techniques.
· Inclusiveness: Design practices that don’t unintentionally exclude people; empower people with disabilities.
· Transparency: Make it easy to understand how decisions were made, give contextual explanations, raise awareness of potential bias.
· Accountability: People are accountable for the systems; norms should be observed both in the design process and on an ongoing basis.
Read more on the use of AI and the data ethical challenges here.