In the News: Roblox and Generative AI, Preventing Stolen Art, and Tesla’s Recall
This week, find out how Roblox is implementing generative AI to let anyone build games! Also, see how one company is fighting back against AI art theft. And finally, check in with Tesla and its problematic self-driving mode. The tech and analytics landscape is always changing, so follow us to stay in the loop!
Create Your Video Game Without Coding
By: Annika Lin
Roblox aims to let anyone build virtual worlds from a text description using generative AI. The company’s creators already reach more than 50 million people worldwide every day. Today, a Roblox experience is built from a combination of 3D objects of various forms, given behavior through Lua scripting and backed by a universal physics engine that provides core behavior on the platform.
As we’ve seen with DALL-E and ChatGPT, generative AI learns the underlying patterns and structures of its training data and can generate new content that has not been seen before. Roblox’s proposed implementation involves creating both virtual materials from a natural-language prompt (like turning the car’s headlights on in the picture above) and code from text inputs. Generative AI will allow creators to develop integrated 3D objects that come with behavior built in. For example, a creator could design a car with a simple statement such as “A red, two seater, convertible sports car with front-wheel drive.” This creation would not only look like a red sports car but also come with the behavior coded into it to be driven through a 3D virtual world. Soon, anyone will be able to use a generative model for all types of content at once: images, code, 3D models, audio, avatars, and more.
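To make the idea concrete, here is a minimal sketch of the general pattern: parsing a natural-language prompt into a structured asset spec that pairs appearance with behavior. Roblox has not published an API for this, so the `generate_spec` helper and all of its field names are hypothetical; a real system would use a trained model rather than keyword matching.

```python
def generate_spec(prompt: str) -> dict:
    """Toy sketch: map a text prompt to a hypothetical 3D-asset spec
    that bundles appearance attributes with built-in behavior."""
    words = prompt.lower()
    spec = {
        "kind": "car" if "car" in words else "unknown",
        "appearance": {},
        "behavior": {},
    }
    # Appearance attributes recovered from the prompt text.
    for color in ("red", "blue", "black"):
        if color in words:
            spec["appearance"]["color"] = color
    if "two seater" in words or "two-seater" in words:
        spec["appearance"]["seats"] = 2
    if "convertible" in words:
        spec["appearance"]["roof"] = "convertible"
    # Behavior comes "built in": the same prompt also configures how
    # the object acts in the world, not just how it looks.
    if "front-wheel drive" in words:
        spec["behavior"]["drivetrain"] = "FWD"
    return spec

spec = generate_spec(
    "A red, two seater, convertible sports car with front-wheel drive"
)
```

The point of the sketch is the coupling: one prompt yields both the visual description and the behavioral configuration, which is what distinguishes Roblox’s proposal from image-only generators.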
Roblox’s vision comes with technical and ethical challenges. Technically, the tools have to suit the kinds of things a user would actually create and fit into an in-experience creation environment. The company must also implement generative AI thoughtfully and ethically to keep the Roblox platform safe and civil.
A Solution to AI Art Theft?
By: Matt Jordan
A team of computer science researchers at the University of Chicago is designing a tool, called Glaze, that protects artists’ individual styles from AI. The tool manipulates images at the pixel level so that they read as other works to a model: the image looks the same to the human eye, but a Stable Diffusion model (the deep learning model behind much of today’s text-to-image generation) would mistake a small artist’s style for that of a famous one, like Jackson Pollock. This would prevent an artist’s name from being used in generative AI prompts to mimic their style. The team plans to release the tool publicly to protect artists from having their artwork added to the rapidly expanding training databases of modern AI. Many tech companies are already facing heavy backlash for training their models on online data gathered without permission. Stability AI, the company behind Stable Diffusion and valued at over $1 billion, was sued by Getty Images for “copying millions of photos without a license.” In some cases, when an artist who signs their work is named in a generation prompt, a ghost signature appears on the generated image, clearly indicating that the training data for that artist was acquired without consent. Glaze could save thousands of artists from the relentless reach of large tech companies.
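The core idea behind this kind of pixel-level protection can be sketched in a few lines. The real Glaze tool computes an optimized “style cloak” against a diffusion model’s feature extractor; in the sketch below, a random linear map stands in for that extractor, and the budget and step sizes are illustrative, not Glaze’s actual parameters. The technique is the same in spirit: nudge pixels within a tiny, imperceptible budget so the image’s feature representation drifts toward a different style.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((8, 8))       # toy grayscale "artwork" in [0, 1]
extract = rng.random((16, 64))   # stand-in feature extractor (linear map)
target_style = rng.random(16)    # features of a different artist's style

epsilon = 0.03                   # max per-pixel change: imperceptible to a human
perturbed = image.copy()
for _ in range(50):
    feats = extract @ perturbed.ravel()
    # Gradient of the squared distance between current and target features.
    grad = extract.T @ (feats - target_style)
    perturbed = perturbed - 0.001 * grad.reshape(8, 8)
    # Project back into the imperceptibility budget around the original,
    # and keep pixel values valid.
    perturbed = np.clip(perturbed, image - epsilon, image + epsilon)
    perturbed = np.clip(perturbed, 0.0, 1.0)

# No pixel moved more than epsilon, yet the feature representation has
# shifted toward the target style.
assert np.max(np.abs(perturbed - image)) <= epsilon + 1e-9
```

A model reading the perturbed features would associate the work with the target style, while a human viewer sees essentially the original image; that asymmetry is what makes the cloaking effective.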
Tesla Recalls 362,000 Vehicles for Troublesome “Full Self-Driving”
By: Maggie Shen
Just two weeks before Tesla’s March 1 investor day, during which Musk is expected to promote the company’s artificial intelligence capabilities and plans, Tesla announced that it will recall 362,000 U.S. vehicles to update their Full Self-Driving (FSD) Beta software. The National Highway Traffic Safety Administration (NHTSA) said that Tesla’s driver-assistance system did not sufficiently adhere to traffic safety laws and could allow the EVs to “exceed speed limits or travel through intersections in an unlawful or unpredictable manner,” which “increases the risk of a crash.” While Tesla said it had received 18 warranty claims related to the issue, it agreed to release a free software update. The recall covers 2016–2023 Model S and Model X, 2017–2023 Model 3, and 2020–2023 Model Y vehicles. Criticized by U.S. senators for overstating its vehicles’ real capabilities, the company now describes FSD as an advanced system “designed to provide more active guidance and assisted driving” under the driver’s active supervision, one that does not make the cars autonomous. Tesla itself has admitted that FSD could sometimes violate local traffic laws while “executing certain driving maneuvers.” The recall of nearly 363,000 vehicles follows last year’s recall of nearly 54,000 U.S. vehicles equipped with FSD Beta software to fix the “rolling stops” issue.