Microsoft Will Pay You $15,000 If You Get Bing AI to Act Out of Character

Wealth Wisdom
3 min readOct 14, 2023

--

Microsoft Will Pay You $15,000 If You Get Bing AI to Act Out of Character: Why Microsoft Wants You to Mess with Bing’s AI and How to Do It

Microsoft has launched a new AI bug bounty program that rewards security researchers who can find vulnerabilities in its AI-powered Bing services and apps. The rewards range from $2,000 to $15,000, depending on the severity and quality of the bug report.

The program covers the AI-powered Bing experiences on bing.com, Microsoft Edge, Microsoft Start, and Skype Mobile. The program also includes the Bing chatbot and AI integrations, such as Bing Chat for Enterprise and Bing Image Creator.

The program aims to enhance the security and reliability of the AI-powered Bing products, which use natural language processing, computer vision, and machine learning to provide users with enhanced search results, chat interactions, and image creation.

Some of the vulnerabilities that Microsoft is looking for include:

✅ Influencing and changing Bing’s chat behavior across user boundaries, i.e. change the AI in ways that impact all other users.

✅ Modifying Bing’s chat behavior by adjusting client and/or server visible configuration, such as setting debug flags, changing feature flags, etc.

✅ Breaking Bing’s cross-conversation memory protections and history deletion.

✅ Revealing Bing’s internal workings and prompts, decision making processes and confidential information.

✅ Bypassing Bing’s chat mode session limits and/or restrictions/rules.

To be eligible for the bounty program, researchers must submit a detailed report that meets the following criteria:

✅ Identify a previously unknown vulnerability that is either important or critical to security.

✅ Provide a clear and concise description of the vulnerability and its impact.

✅ Provide a step-by-step guide on how to reproduce the vulnerability, including a video or a written proof of concept.

✅ Provide any additional information or evidence that can help Microsoft verify and fix the vulnerability.

Microsoft says it will review each submission within 90 days and notify the researcher of the outcome. If the submission is accepted, Microsoft will pay the researcher via HackerOne, a platform that connects security researchers with companies.

Microsoft says it welcomes researchers from around the world to participate in the program and help improve the AI-powered Bing experience. The company also says it will not take any legal action against researchers who act in good faith and follow the rules of the program.

The AI bug bounty program is part of Microsoft’s broader security initiatives, which include various bug bounty programs for its products and services, such as Windows, Office 365, Azure, Xbox, and more. In the year to June 2023, Microsoft says it paid out over $13 million in bug bounty rewards, including one individual payout of $200,000.

So, if you think you can outsmart Bing’s AI and make it go off the rails, you might want to give it a try and earn some cash. But be warned: Bing’s AI is not easy to fool. It has been trained on billions of data points and has learned from its own mistakes. It might even surprise you with its witty responses and clever insights.

What do you think of Microsoft’s AI bug bounty program? Do you have any tips or tricks on how to find vulnerabilities in Bing’s AI? Let us know in the comments below. And if you liked this article, please share it with your friends and followers. Thank you for reading!

--

--

Wealth Wisdom

I'm Laenny David, an online entrepreneur. I teach people how to make money online and live their dream life. Learn how to turn your passion into profit.