GitHub describes its new product Copilot as an “AI pair programmer.” However, pair programming doesn’t usually involve stealing licensed code, does it? Photo by Christina Morillo.

GitHub’s AI Copilot Might Get You Sued If You Use It

Some are even abandoning GitHub because of it

Jacob Bergdahl · Published in Geek Culture · 4 min read · Jul 8, 2021

GitHub just announced its latest, shiny product: an artificial intelligence (AI) called Copilot. It’s a machine-learning-powered tool that can write code by itself, generating quite impressive programming functions. Yet it has people pulling out of GitHub and worrying about lawsuits.

The AI works similarly to other OpenAI-powered code-generating tools. The user writes a comment describing what they want the AI to write, and the AI makes it happen. What makes Copilot unique is that it also takes initiative on its own, suggesting autocompletions on the fly.
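To illustrate the workflow described above: the developer writes only the comment, and the tool proposes the function body. This is a hypothetical sketch, not actual Copilot output; the function name and regex are my own illustration of the kind of completion such a tool might suggest.

```python
import re

# Developer types this comment as the "prompt":
# check whether a string looks like a valid email address

# ...and the AI might suggest a completion along these lines:
def is_valid_email(address: str) -> bool:
    """Return True if the string roughly matches an email pattern."""
    pattern = r"^[\w.+-]+@[\w-]+\.[\w.-]+$"
    return re.match(pattern, address) is not None
```

Whether a suggestion like this was synthesized fresh or memorized from someone’s licensed repository is exactly the question at the heart of the controversy.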

It sounds really cool, doesn’t it? If you know me, you know that I’m often excited about artificial intelligence; I even published a book in which I described technologies similar to Copilot. But plenty of issues surround machine learning, and GitHub ran into these dilemmas on day one. The source of the drama in any machine learning application usually lies in its data, and the outcry surrounding Copilot follows that rule. More specifically, the problem lies in how GitHub went about gathering the data used to build the algorithm.

“Unfortunately, the user has no way of knowing if…
