I have created a website to query OpenAI's GPT-2 model (AskSkynet.com)
A few days ago, OpenAI announced that they had created a very sophisticated AI model called GPT-2. It became somewhat famous because they refused to release the full model due to its potential for dark uses (fake news generation, fake content generation, etc.). But they did release a small version of it, called 117M.
I tried it by launching the TensorFlow environment, and I was astonished by the results (some of them quite curious and funny). So right away I thought about sharing my GPU, together with some cloud GPUs, through a website, to open OpenAI a little bit more :-) The goal is also to show others the current state of the art in Artificial Intelligence, and to raise awareness of the challenges our society will face very soon.
How to query it and how it works
This model was trained on 40GB of data to predict the next word in a sentence, and this data was selected using the most upvoted links on Reddit, so yes… you can expect some weird topics in the output!
You shouldn’t ask this model direct questions like: “Am I going to be rich?” Because it is trained to predict the next word, if you expect a concrete answer you should phrase it more like: “Am I going to be rich? Yes, but to be rich I should start doing”. At least with this reduced 117M model, we need to give it some context so it can produce an answer that makes sense.
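Since GPT-2 only predicts the next word, the trick above amounts to embedding the beginning of the answer inside the prompt itself, so the model continues the answer rather than the question. A minimal sketch of that prompt-seeding idea (the helper name is mine for illustration, not part of AskSkynet):

```python
def seed_prompt(question: str, answer_stub: str) -> str:
    """Append the start of the desired answer to the question, so a
    next-word predictor like GPT-2 continues the answer instead of
    rambling about the question."""
    return f"{question} {answer_stub}"

# A bare question gives the model nothing to continue from:
bare = "Am I going to be rich?"

# Seeding it with the first words of an answer gives it context:
prompt = seed_prompt("Am I going to be rich?",
                     "Yes, but to be rich I should start doing")
print(prompt)
# → Am I going to be rich? Yes, but to be rich I should start doing
```

The model then generates the words that follow the stub, which reads like a direct answer.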
This AI is also pretty good at creating stories, so you can try some of my random examples on AskSkynet.com to generate some crazy stories from scratch.
The setup runs on two servers (a 1080 Ti and an NVIDIA P4), load balanced with Cloudflare. The 1080 Ti computes the inference in only 9 seconds, while the P4 takes around 21 seconds, so the latter acts as a backup server, receiving traffic through the Cloudflare load balancer when the first one is very busy.
The web application was developed using the Python framework Flask; it is a great example of how Flask can be used in a controlled production environment by simply putting Cloudflare in front of it.
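A minimal sketch of what such a Flask front end might look like. The route, parameter names, and the stubbed `generate_text` function are my assumptions for illustration, not AskSkynet's actual code; the stub stands in for the TensorFlow GPT-2 inference call:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def generate_text(prompt: str) -> str:
    # Stub: in the real service this would run GPT-2 117M inference on
    # the GPU (~9 s on the 1080 Ti, ~21 s on the P4). Here we just echo
    # the prompt with a placeholder continuation.
    return prompt + " ..."

@app.route("/ask", methods=["POST"])
def ask():
    # Expect a JSON body such as {"prompt": "Am I going to be rich? Yes,"}
    prompt = request.get_json(force=True).get("prompt", "")
    return jsonify({"completion": generate_text(prompt)})

# Cloudflare sits in front of this server, terminating TLS and
# balancing load between the two GPU machines, so Flask's built-in
# server only ever sees proxied traffic.
```

Because inference takes several seconds per request, a production setup would also need a worker queue or generous proxy timeouts so Cloudflare does not cut the connection mid-generation.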
What are the implications of this for the future?
This topic probably deserves its own article, but to summarize: we are not aware of the short-term challenges we are facing right now. Some people will say, “well, experts have always said that about AI”, but believe me, when things like this are opened up, marvelous things happen.
The apocalyptic approach:
- Fewer jobs
- More fake news
What I want to happen:
- Politicians understanding AI and changing the foundations of society: investing more in education and creating a universal basic income paid for by the AI/robots that are generating the wealth
- A society with more critical thinking, and a justice system prepared for a world where we can’t trust even a video or an audio recording
If you are still reading at this point, you are probably one of the people who need to add their grain of sand to move society toward the second approach. If AI follows the positive path, we can say goodbye to the word “crisis” forever.
Follow me on Twitter: @asierarranz