I am excited to announce that Josh Mitchell, the leading cybersecurity researcher on police body camera vulnerabilities, has joined Amber Video as an Advisor to help extend our product to police body cameras and other security recording devices. I first heard of Josh through this Wired article on his work to identify threats to body cameras and to spotlight issues so they can be swiftly addressed.
When we spoke, I quickly realized we aligned on the belief that cameras, and specifically the footage they capture, are an important piece of evidence, if they can be trusted. In engagements with law enforcement, impartial evidence supersedes he-said/she-said, and we need to make sure that the chain of custody of that evidence does not contain vulnerabilities that could cast doubt on its impartiality.
If left unaddressed, these vulnerabilities will be exploited, and video evidence will be altered, manipulated, and delegitimized. Worse yet, this will undermine trust in law enforcement and the checks-and-balances dynamic between citizens and the executive and judicial branches of government, a core tenet of democracy.
Josh, like us, is concerned by the rapid progress of deepfakes, a malicious and sophisticated application of AI that poses a cybersecurity threat: it can create fake video that is imperceptible to humans and has the potential to sow chaos in evidence-based, meritocratic societies.
He also believes this threat can be neutralized with Amber’s technology.
Josh is a Principal Cybersecurity Consultant at Nuix. He works with Nuix’s internal and external application security teams to provide reverse engineering, tool development, secure architecture, and vulnerability assessment services.
Josh has more than a decade’s experience as an information security researcher. He has authored numerous technical documents and presented his findings at conferences, academic discussions, and in the classroom.
Josh is an expert at discovering and exploiting vulnerabilities and writing code to protect operating systems and programs. He holds patents in classifying computer files and executable files as malware or whiteware.
Before joining Nuix, Josh served in the United States Air Force and held numerous defense contracting roles covering electronic signals intelligence exploitation, electronic warfare, malware analysis, exploit development, and reverse engineering. He also provided security services for General Dynamics Advanced Information Systems, Endgame, and Accuvant and assisted multiple computer emergency response teams with investigations vital to national security.
We are happy to have you on board, Josh.
Shamir Allibhai, CEO & Founder of Amber Video
Q+A with Josh Mitchell
1. What important truth do very few people agree with you on:
If something is free, then you are what is being sold. It certainly is an unpopular concept these days. However, from 23andMe to Facebook and Google, companies do not make billions of dollars by providing free services. The price we pay is the cost of our collective privacy. Personally identifiable information (PII) is the currency our digital world is based upon, and we are paying a steep price. Recent legislative measures like the General Data Protection Regulation (GDPR) have attempted to hold these various entities accountable for the devastating consequences of a data breach. Unfortunately, the company responsible for one of the most devastating data breaches in recent history has gone largely unaffected.
2. Give us a bit of your career background and what led you into cybersecurity:
My entry into cybersecurity was a product of preparation meeting opportunity. I always had a hobby of breaking apart software, but I was working in an area entirely unrelated to the field. One day, the need arose to reverse engineer some malicious software. That was the day my hobby became my profession. Twelve years later, I have had the honor of performing many tasks in the cybersecurity field. From developing offensive cyber capabilities to assisting Computer Emergency Response Teams (CERT) with investigations vital to national security, I have seen quite a bit.
3. How and why did you get interested in the security of police body cameras:
Some time ago I began a project to investigate the issues surrounding connected cities. The rapid adoption of Internet of Things (IoT) technology has presented many challenges and opportunities. The decision was made to pursue body cameras due to the emergent nature of the devices. Additionally, recent controversies over the employment of the devices had left me with some nagging questions. These questions could not be answered by material provided by the manufacturers, and the only option was to ascertain the truth for myself.
4. What is an important lesson you have learned from your time as a cybersecurity researcher:
Persistence and dedication are your allies; assumption and hubris are your enemies. Researching a novel topic means you have little to no public information available. The lack of resources and material to rely upon can only be made up for by effort. The real world is your classroom, teacher, and test. Sometimes this gap is just a bridge too far, and that is okay. However, more often than not, the deficiencies can be mitigated with effort. That was a very important lesson I learned from the Anti-Tamper Software Protection Initiative (AT-SPI).
5. Why did you join Amber Video and what are you most excited for:
I believe Amber Video has developed a technology that is foundational to our digital lives. Its adoption will restore faith in the integrity of our legal system and broadcast media. In an information-based conflict, oftentimes the first group with the most compelling narrative wins. Perception is everything. The simple fact of the matter is that anyone with editing tools and time can skew the context of a video by manipulating it. Amber Video reinstates confidence in critical areas of public trust. It is a privilege to help guide the adoption of this technology and remove the ambiguity introduced by malicious individuals.
6. What do most people misunderstand or not get about video evidence, deepfakes, and device security:
Video evidence can easily be manipulated by modifying video or clipping select portions to skew the context. Simply removing audio gives people with malevolent intent the ability to create their own narrative. I have personally analyzed media files where only the date was changed. It may seem like a simple thing, but if left to stand it would have jeopardized an individual’s freedom. Manually combatting even simple manipulations like this is not feasible in our current climate. In thirty days, more video content is uploaded to the internet than was created by the top three television networks in the last thirty years. We simply must have an embedded, automated, confidence-inspiring solution capable of verifying the veracity of footage. We, as a society, cannot allow people to be placed in jail on the basis of video evidence with unfounded provenance.
They are already here, and here to stay. You do not need a supercomputer, a master’s degree from MIT, or Alan Turing’s genius. A quick look on YouTube shows quite a few amateur data scientists placing Nicolas Cage’s face into every movie possible. Snapchat can even do a simplified version of it on your smartphone. The results are usually crude and creepy, but the idea is the same.
Hardware vulnerabilities, and embedded software vulnerabilities, are not simple to fix. The problem starts with the concept of a minimum viable product (MVP). This business term is used to define a product with the smallest feature set that can still satisfy customers. From this point, products are rolled out and features are iteratively added. We get into trouble when MVP concepts are applied to IoT. This model essentially incentivizes companies to produce products with little or no security. Once the products make it to market, it is very difficult, or impossible, to implement secure update mechanisms. The only viable option for manufacturers is to resign themselves to “fix it in the next release.” However, they usually get caught in the MVP cycle again.
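The post does not describe how Amber's verification actually works, but the automated, embedded footage verification called for above can be illustrated with a common general technique: fingerprinting each frame with a cryptographic hash and chaining the digests, so that any later edit, insertion, or deletion changes every subsequent fingerprint. This is only a minimal sketch of the idea, not Amber's implementation; all names and the byte-string "frames" are hypothetical.

```python
import hashlib

def fingerprint_frames(frames):
    """Hash each frame chained with the previous digest, so tampering
    anywhere alters every fingerprint from that point onward."""
    chained = []
    prev = b""
    for frame in frames:  # each frame is raw bytes in this sketch
        digest = hashlib.sha256(prev + frame).hexdigest()
        chained.append(digest)
        prev = digest.encode()
    return chained

# Two captures that differ in a single frame diverge from that frame on.
original = [b"frame-0", b"frame-1", b"frame-2"]
tampered = [b"frame-0", b"frame-X", b"frame-2"]

a = fingerprint_frames(original)
b = fingerprint_frames(tampered)
assert a[0] == b[0]  # untouched prefix still matches
assert a[1] != b[1]  # the edited frame is detected here...
assert a[2] != b[2]  # ...and the mismatch propagates downstream
```

If the fingerprints are recorded at capture time on the device itself, later comparison against the footage reveals exactly where, if anywhere, the recording was altered.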
7. Has the Internet made society more vulnerable and if so, how:
Yes… ish. Sentiment manipulation, desensitization, foreign nation involvement, IoT, social media; all these factors work to make you and our society more vulnerable. Security will always be a tradeoff between ease of use and functionality. You cannot spend $100 to protect $20 worth of stuff. It is a balance. Steps can be taken to protect things that are important, but the level of security needs to be appropriate for the level of risk.
8. What areas are you excited for AI to be applied to:
Assisted AI is really exciting to me, and we can do it right now. Doctors cannot monitor patients 24/7. Drivers fall asleep at the wheel. Fraud and identity theft investigations take too long. Assisted AI can help to provide real, actionable solutions. We can implement these solutions today.
9. What keeps you up at night:
Nothing, I don’t drink coffee after six. In all seriousness, our world is faced with many significant issues. It is our responsibility as decent human beings to do our best to solve them. To that end, solving the issues in the body camera industry and broadcast media is critical for the betterment of our society. We simply cannot have problems in these systems undermine our faith in the core institutions we all rely upon.