UK AI Council appointed

What happened: The government has announced the membership of its AI Council, made up of academics, public servants, and industry representatives. The Council is intended as an independent committee that convenes experts from a range of sectors to provide leadership, boost growth in the AI sector, and promote the adoption of AI across the British economy.

It has 22 members and is chaired by Tabitha Goldstaub, co-founder of CognitionX, who will advise government on how to work with and encourage businesses and organisations to boost their use of artificial intelligence.

Why this matters: I’m naturally somewhat sceptical of independent committees without any specific powers. However, the fact that the line-up includes Dame Wendy Hall, Professor Adrian Smith, Mustafa Suleyman of DeepMind, and many other individuals who will seriously engage gives me some hope it won’t just be a talking shop and a line for participants to put in their conference bios.

Further, it is encouraging that the council intends to focus specifically on ethics and diversity as two of its first priorities; the latter is in particularly dire straits, as highlighted by the ongoing research of the AI Now Institute.

Controversy over facial recognition trials by police forces across the country

What happened: BBC Click revealed that a man was fined £90 for disorderly behaviour after covering his face during a facial recognition trial by the Met Police. This has raised the profile of the debate around the use of facial recognition systems by the police.

Why it matters: The Biometrics and Forensics Ethics Group raised concerns about consent and the risk of undermining public confidence in the police in an interim report back in February. This incident indicates these warnings weren’t heeded.

In response, Darren Jones, a member of the Science and Technology Select Committee, has indicated that the Committee will be launching an inquiry into this use of facial recognition. Also this week, from today until the 23rd, Ed Bridges will be in court asking that South Wales Police’s use of facial recognition technology be ruled unlawful as a breach of the right to privacy.

With cities like San Francisco prohibiting the use of facial recognition by their police and public services, some have called on London to do the same. It seems likely that this incident, along with other ongoing campaigns, will lead to a reckoning on the use of facial recognition by the state (if it doesn’t get suffocated by the return of Brexit).

See also — Government expands use of facial recognition border gates to visitors from 7 more countries

Defence Secretary reasserts commitment to autonomous weapons for the Royal Navy

What happened: In a speech to the Sea Power Conference 2019, Defence Secretary Penny Mordaunt reemphasised the UK’s commitment to developing autonomous weapons, specifically in the naval arena: “Delivering AI and robotics into every fighting arm [of the Royal Navy] courtesy of our pioneering new NavyX accelerator”

This follows in the footsteps of the previous Defence Secretary, who last December committed to greater use of AI across the military, from reconnaissance and command-and-control coordination on the battlefield to dealing with sub-surface submarine threats.

Why it matters: It highlights the creation of the NavyX accelerator in April, as part of a £75m investment into new naval technology, something I missed at the time. Some of the money will be immediately spent on two new autonomous mine-hunter vessels to enable remote mine-hunting in the Gulf. However, a significant proportion is going to NavyX, which the official press release describes as a “new autonomy and lethality accelerator”. This language suggests that the UK won’t stop at developing utility and defensive autonomous capabilities at sea.

The UK, along with the USA and Russia, was among the leading opponents of controls on the development and use of lethal autonomous weapons at the UN Convention on Certain Conventional Weapons back in March. The tone struck by the new Defence Secretary suggests there is unlikely to be a change of policy in this area.

Government publishes West Midlands Local Industrial Strategy including plans for autonomous vehicles and medical AI

The government has published its first Local Industrial Strategy, for the West Midlands. The Local Industrial Strategy does not include any new spending commitments; its specific commitments are funded from existing local and national departmental budgets. However, it does indicate what the government is prioritising. The two main aims in the strategy that fall under AI relate to autonomous vehicles and medical machine learning for chronic diseases, e.g. cancer or dementia.

Autonomous Vehicles:

There is a particular focus on connected autonomous vehicles (CAVs) and so the roll-out of autonomous vehicles in the West Midlands goes hand-in-hand with the deployment of 5G.

According to the strategy, the government believes that CAVs could form the majority of cars on the roads within 15 years (which may be somewhat ambitious). It also suggests that truly self-driving vehicle trials for the public are due to begin in the UK in 2021. If by ‘truly self-driving’ they mean Level 5 autonomous vehicles, that seems almost unrealistically ambitious.

Between now and 2021, the strategy suggests West Midlands Future Mobility will be building a network of over 50 miles of roads in Coventry, Birmingham and Solihull optimised to constantly monitor CAVs in action, gather data about them, and measure public interaction. This is part of the government’s wider £200m initiative for connected autonomous vehicles, Zenzic (formerly MERIDIAN).

To enable the CAVs, this will happen side-by-side with the UK’s first large-scale, multi-city 5G testbed, with hubs in Birmingham, Coventry and Wolverhampton, through the Urban Connected Communities Project (whose initial plans also include automated surveillance of public transport enabled by live-streaming CCTV with 5G).

This is intended as the next step in the government’s 5G Testbed and Trials Programme. However, just days after the announcement of the strategy, it was reported that lamppost transmitter disputes may delay the rollout of 5G; this adds to fears of further delay should a Huawei ban materialise, either of which could hamper the trials of CAVs.

Objectives:

· Contributing to the government’s ambition to deploy 3 trials of connected autonomous vehicles by 2021, with the West Midlands aiming to deploy the first fully operational connected autonomous vehicles in the region in advance of the 2022 Commonwealth Games.

Funding Commitments:

· £20 million to establish the UK’s first Future Mobility Zone between Birmingham, Solihull and Coventry.

· £50 million to 5G trials in Birmingham, Coventry and Wolverhampton, awarding West Midlands preferred partner status as part of the Urban Connected Communities scheme.

Medical AI:

The deployment of medical AI in the West Midlands is focused on using machine learning in the prevention, early diagnosis and treatment of chronic diseases, such as cancer, diabetes, heart disease and dementia by 2030.

As part of the government’s Life Sciences Industrial Strategy, they will promote the development of a locally led West Midlands ‘Translational Med-Tech Commission’ to accelerate commercialisation from labs to patients in the region.

Transport Minister announces development of an autonomous vehicle cyber-security testing facility

What happened: Jesse Norman, Future of Mobility Minister, announced at the FT Future of the Car Summit that the Department for Transport will, in June, launch a competition to develop an autonomous vehicle cyber-security testing facility.

The facility is intended to allow researchers, start-ups and industry to stress-test their autonomous vehicle software before it hits the road. They will subject self-driving vehicles to a range of cyber-attacks on test-tracks and in virtual environments, ensuring that vehicles’ communication with 5G networks, smart traffic lights and other self-driving vehicles remains completely secure.

Why this matters: Given the government’s ambitious target of having fully autonomous vehicles on the road by 2021 and their focus on connected autonomous vehicles which will necessarily be highly networked, this is a necessary step in achieving that goal. Just one incident of an autonomous vehicle being hacked and hijacked would almost certainly massively erode public trust in the technology and so cybersecurity is paramount in any effective large-scale deployment. The government has already published its cyber security standards for self-driving vehicles.

Yet this announcement feels like it comes far too late. If the competition doesn’t even begin until June and is open for a reasonable period of time, the contract won’t be awarded until close to the end of the year. Then even if the facility can be spun up in 6–8 months (which seems optimistic), that gives researchers at best 4–6 months to test their fully autonomous vehicles before the 2021 deadline. And this all presumes that the technology is ready for testing by then…

Science Minister announces £1m funding for UK business international collaboration on AI development and deployment

What happened: Science Minister Chris Skidmore, speaking at the EUREKA Global Innovation Summit, announced (a strong word here for something that began 6 weeks prior to the announcement) £1m in funding for UK participation in a EUREKA project focused on AI and quantum technology. EUREKA is a programme aimed at getting innovations to market quicker and boosting networking between businesses in different countries.

Why this matters: Announced alongside the publication of the UK’s International Research and Innovation Strategy, both emphasise that the UK is aiming to make international collaborations, across academia and business, central to its AI strategy. This is in contrast to, for example, the ‘national champions’ and now ‘national team’ approach China appears to be taking, as Jeff Ding explains in his latest ChinAI newsletter. Quite possibly this is because, with the exception of DeepMind (now subsumed into Alphabet), the UK was never in any position to field a ‘national champion’ that could stand up to the competition.

The UK places 2nd in Oxford Insights’ Government AI Readiness Index

What happened: Oxford Insights has launched this year’s Government AI Readiness Index, which is intended to measure how well placed UN member governments are to take advantage of the benefits of automation in their own internal operations and delivery of public services.

They do this by looking at the governance of AI (measured through frameworks and national strategies), infrastructure and data availability, skills & education, number of AI start-ups, and the degree of innovation already present in public services.

In 2017, the UK was placed first, just ahead of the USA, though only OECD countries were included in the Index that year. This year the UK has slipped to second behind the newly included Singapore. However, as Oxford Insights themselves note, countries like China place much lower than would be expected due to missing data, so this may not be a completely accurate ranking.

Interesting Upcoming Events

Personal Data and AI

5th June, Newspeak House

Reuben Binns will be presenting work from the Information Commissioner’s Office on AI auditing, while Sophie Stalla-Bourdillon of Immuta will be discussing the promises and challenges she has encountered in promoting the adoption of responsible data management practices.

AI: Decoding IP

18th–19th June, London Stadium

The Intellectual Property Office and World Intellectual Property Organisation are running a two-day conference to explore whether the current IP framework incentivises and supports the use of AI, and discuss what changes may be necessary to accommodate the different ways in which AI is likely to be used, as well as the outputs from such uses.

Speakers include: Lord David Kitchin, Justice of the Supreme Court of the UK; Chris Skidmore MP, Minister of State for Universities, Science, Research and Innovation, London; and Dr Zoë Webster, Director of AI and Data Economy, UKRI.

--

Elliot Jones

Researcher at Demos; views expressed here are entirely my own.