AI System To Suggest Recipes By Just Looking At Food Pictures
The AI did exceptionally well with desserts
MIT scientists have come up with a new AI system that can tell users about recipes just by looking at food pictures
Now an AI can tell you the recipe of a dish if you show it a picture. Isn’t that a dream-come-true technology for food lovers like me? It certainly is.
MIT scientists have come up with a new artificial intelligence system that can predict and suggest recipes just by looking at photos of dishes. The AI system, called Pic2Recipe, could help us learn recipes easily as well as better understand people’s food habits.
“In computer vision, food is mostly neglected because we don’t have the large-scale data sets needed to make predictions. But seemingly useless photos on social media can actually provide valuable insight into health habits and dietary preferences,” said Yusuf Aytar, from Massachusetts Institute of Technology in the US.
Researchers combed through websites like All Recipes and Food.com to build “Recipe1M,” a database of over 1 million recipes annotated with detailed ingredient information for a wide range of dishes. They then used that data to train a neural network to find patterns and make connections between food images and the corresponding ingredients and recipes.
Given a photo of a food item, Pic2Recipe can identify ingredients such as flour, eggs, and butter, and then suggest several recipes from the database whose ingredients it judges similar to those in the pictured dish.
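The retrieval step described above can be sketched as a nearest-neighbor search: the system scores candidate recipes against the photo and returns the closest matches. Below is a minimal illustration in Python with NumPy, using made-up toy vectors in place of the real image and recipe representations a trained network would produce; the recipe names and numbers are purely hypothetical.

```python
import numpy as np

def cosine_similarity(query, candidates):
    # Cosine similarity between a query vector and each row of a matrix.
    query = query / np.linalg.norm(query)
    candidates = candidates / np.linalg.norm(candidates, axis=1, keepdims=True)
    return candidates @ query

# Toy stand-ins for learned recipe representations (hypothetical values):
# each recipe lives in the same vector space as the image representation.
recipe_names = ["chocolate chip cookies", "blueberry muffins", "vegetable sushi roll"]
recipe_embeddings = np.array([
    [0.9, 0.1, 0.0],
    [0.8, 0.3, 0.1],
    [0.1, 0.2, 0.9],
])

# Hypothetical representation of a photo of cookies.
image_embedding = np.array([0.95, 0.05, 0.0])

# Rank recipes by similarity to the photo and print the best matches.
scores = cosine_similarity(image_embedding, recipe_embeddings)
ranking = np.argsort(scores)[::-1]
for i in ranking[:2]:
    print(f"{recipe_names[i]}: {scores[i]:.2f}")
```

With these toy numbers, the cookie photo scores highest against the cookie recipe, which mirrors how the real system surfaces the most similar recipes from its database rather than generating a recipe from scratch.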

The system did particularly well with desserts like cookies or muffins, since those were heavily represented in the database, reported India Today. However, it had difficulty determining ingredients for more ambiguous foods, such as sushi rolls and smoothies. It also often confused similar recipes for the same dish.

The MIT research team expects to improve the system further so that it can understand food items in finer detail. The researchers hope to turn it into a “dinner aide” that could figure out what to cook based on given ingredients and a dietary preference.
“This could potentially help people figure out what’s in their food when they don’t have explicit nutritional information. For example, if you know what ingredients went into a dish but not the amount, you can take a photo, enter the ingredients, and run the model to find a similar recipe with known quantities, and then use that information to approximate your own meal,” explained Nick Hynes, a graduate student at MIT, PTI reported.
