4 Risks in Connected Cars

Black Duck held its inaugural European user conference this month in Amsterdam. Turnout was great, with almost 100 representatives from European businesses attending our training and presentations.

I was privileged to lead a panel discussion on the security implications of open source in the connected car. Gordon Haff, Technology Evangelist at Red Hat, and Simon Gutteridge, Global Information Security Manager at TomTom, joined me to explore the topic.

“Car hacking” is certainly a fun subject to talk about (and even more fun to watch). But it’s also a serious topic as the volume of code in modern automobiles increases. The trend started with the 1977 Oldsmobile Toronado, in which a small amount of code managed electronic spark timing. By some estimates, a high-end car today can include over 100 million lines of code. This software provides convenience (driver assistance), entertainment (infotainment systems), safety (blind spot detection, collision avoidance), and vehicle management benefits.

As Gordon and Simon pointed out, there are a number of security challenges in connected cars. Today, I’ll focus on four.

The Supply Chain Makes Tracking Code Difficult

When we think of building software, we first think of our internal development teams. But the connected car is different, relying on hundreds of independent vendors supplying hardware and software components to Tier 1 and 2 vendors, as well as the OEMs.

The software from each of those vendors is likely a mix of custom code written by the vendor and third-party code (commercial and open source). Black Duck’s 2017 Open Source Security and Risk Analysis found that 96% of the commercial applications analyzed included open source. On average, open source made up over 35% of an application’s code base, spread across 147 individual open source components. Multiply this across hundreds of vendors, and understanding exactly which components are part of a connected car becomes extremely difficult for the OEMs. Add the fact that more than 3,000 vulnerabilities are reported in open source components every year, and the security implications are clear.
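The tracking problem can be sketched in miniature: given a bill of materials aggregated from several suppliers, flag every component that appears on a list of known-vulnerable versions. All supplier names, component names, versions, and advisory IDs below are invented for illustration — in practice, assembling the inventory at all is the hard part.

```python
# Minimal sketch: match a multi-vendor bill of materials against
# known-vulnerable component versions. All names and IDs are hypothetical.

# Component inventory aggregated from each supplier's software
bom = {
    "tier2-infotainment": [("libmedia", "2.1.0"), ("opensslx", "1.0.1")],
    "tier1-telematics": [("opensslx", "1.0.1"), ("netstackd", "3.4.2")],
}

# Known vulnerable (component, version) pairs with an advisory ID
known_vulnerable = {
    ("opensslx", "1.0.1"): "CVE-XXXX-0001",
    ("netstackd", "3.4.2"): "CVE-XXXX-0002",
}

def find_exposures(bom, known_vulnerable):
    """Return (supplier, component, version, advisory) for every match."""
    hits = []
    for supplier, components in bom.items():
        for name, version in components:
            advisory = known_vulnerable.get((name, version))
            if advisory:
                hits.append((supplier, name, version, advisory))
    return hits

for supplier, name, version, advisory in find_exposures(bom, known_vulnerable):
    print(f"{supplier}: {name} {version} affected by {advisory}")
```

Note that the same vulnerable component shows up under two different suppliers — exactly the duplication that makes the OEM’s job so difficult.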

Fixing a Vulnerability Doesn’t Always Result in Fixing a Car

There are several steps that are required to correct a software issue in the connected car. Let’s assume a Tier 2 vendor is using an open source component, and a vulnerability is disclosed.

  • First, the vendor needs to know they are using that specific open source component.
  • Next, they need to monitor vulnerability sources so they have visibility into the newly reported vulnerability.
  • Then they need to update and retest their code to remediate the issue.

Even when all this is done, the software update needs to go to the OEM or Tier 1 vendor, be incorporated into an update of that entity’s component (hardware and software) and, ultimately, be applied to each consumer’s vehicle. The well-publicized 2015 Jeep Cherokee hack was addressed by mailing a USB stick to each affected vehicle owner (how many car owners are comfortable updating their own software?). Alternatively, vehicles can be updated during routine service — but only if that service is provided by an authorized dealer, a prospect that decreases as a vehicle ages. Over-the-air software updates are still the exception rather than the rule, and may require that the vehicle be running but not moving (we don’t want to reboot systems when the vehicle is at highway speed).
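The “running but not moving” constraint amounts to a precondition gate before an update is applied. The sketch below is purely illustrative — the signal names and thresholds are invented, not any OEM’s actual update logic:

```python
# Hypothetical precondition check before applying an over-the-air update.
# Signal names and thresholds are invented for illustration.

def safe_to_update(ignition_on: bool, speed_kph: float,
                   in_gear: bool, battery_pct: float) -> bool:
    """Allow an update only when the vehicle is powered, stationary,
    out of gear, and has enough battery to survive a reboot."""
    return (
        ignition_on
        and speed_kph == 0.0
        and not in_gear
        and battery_pct >= 30.0
    )

# Parked in the driveway: OK to update.
assert safe_to_update(True, 0.0, False, 80.0)
# At highway speed: never reboot the infotainment or ECU stack.
assert not safe_to_update(True, 110.0, True, 80.0)
```

Real update clients layer far more on top of this — cryptographic verification of the image, rollback on failure, staged rollouts — but every one of them starts with a gate like this.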

Product Lifecycles Present Long-term Maintenance Challenges

The product lifecycle of vehicles presents challenges as well. Your cell phone may have a practical life of 2–3 years, but receives regular operating system updates and perhaps hundreds of app updates each year. The laptop I’m using will likely be replaced after 3–5 years, and likewise receives regular updates and patches. This is the typical lifecycle software vendors are used to addressing.

A modern car, however, is in design for years prior to production, and the average vehicle may be on the road for 10–15 years. Supporting software over that period of time requires a different thought process. Vendors (and open source communities) need to be considered in light of the operational risk they present. Questions vendors need to ask include:

  • How sure are you that the components you are using will be supported by the open source community in the future?
  • Are you prepared to provide ongoing support for projects if the community (or vendor) abandons them?
  • What does the release cycle look like?
  • How many vulnerabilities has the component had over the last 3 years compared to the size of the code base? Is the community security-aware?
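That last question lends itself to a crude metric: vulnerabilities disclosed over a trailing window, normalized per thousand lines of code. The project names and figures below are invented; the point is only that raw vulnerability counts mean little until you account for code-base size.

```python
# Crude operational-risk metric: vulnerabilities disclosed over the last
# 3 years per 1,000 lines of code (KLOC).
# All project names and figures are hypothetical.

projects = {
    "libalpha": {"vulns_3yr": 12, "loc": 400_000},
    "libbeta":  {"vulns_3yr": 3,  "loc": 15_000},
}

def vulns_per_kloc(vulns_3yr: int, loc: int) -> float:
    return vulns_3yr / (loc / 1000)

for name, p in projects.items():
    rate = vulns_per_kloc(p["vulns_3yr"], p["loc"])
    print(f"{name}: {rate:.2f} vulnerabilities per KLOC over 3 years")
```

In this invented example, the small project with only 3 disclosures is actually riskier per line of code than the large one with 12 — a pattern worth checking before trusting a component for a 15-year vehicle lifespan.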

Who Owns Your Personal Data?

The connected car may collect more personal data on a driver than any device other than a personal computer. It can record where you drive, how long you stay in a location, how fast or erratically you drive (e.g., frequent lane changes or blind spot alerts, hard braking, auto-assisted braking), who you call, what you search for, and even your musical and news preferences. While not a direct security threat in terms of vehicle safety, gaining control over this data is critical.

The use of this data is largely uncontrolled at this point. Privacy policies may differ between OEM and vendors, as well as between countries where the vehicle is operated.

I look forward to future panels with both Gordon and Simon, as well as other customers and partners. Our U.S. user conference — FLIGHT 2017 — will be in Boston November 7–9. Come check us out.

Mike Pittenger | VP of Security Strategy

Originally published at blog.blackducksoftware.com.