More perfect Cheetos: How PepsiCo is using Microsoft’s Project Bonsai to raise the (snack) bar
I have eaten a lot of Cheetos in my life. So when I open a bag, I know exactly what to expect: a satisfying crunch from the delightfully orange original, a melt-in-your-mouth airiness from the baked puff or the almost-too-spicy (but in a good way) fire from any of the Flamin’ Hot varieties.
What I’ve never given much thought to, though, is how much work goes into creating that just-right bite. It turns out that a number of complex individual inputs and detailed product specifications – from water ratio to cutting speed – interact to create each perfect snack. And that perfection is paramount for PepsiCo, as Cheetos is one of the company’s most beloved billion-dollar brands.
In an effort to increase efficiency while maintaining that consistency and quality, PepsiCo has developed an AI solution with Microsoft’s Project Bonsai. That solution, which uses data from a computer vision system to make recommendations or adjustments any time a product falls out of spec, has proven itself at a pilot plant and will soon be deployed in a production plant.
This is good news for Cheetos lovers like me. And it’s great news for the company, which is now exploring other avenues to use the technology.
“Innovation is a key ingredient in our success at PepsiCo and helps us deliver exciting new products, technology advancements and even new ways of working—whatever it takes to ensure we continue to bring smiles to our consumers every day,” said Denise Lefebvre, Senior Vice President of Global Foods R&D at PepsiCo. “Cheetos, one of our most beloved billion-dollar brands, are produced in 22 countries and come in more than 50 flavors. The Project Bonsai technology helps us ensure each [Cheetos snack] is perfect, and we’re excited about its potential. This is only the beginning.”
‘Innovation on the factory floor’
PepsiCo chose the Cheetos baked puff as its first test product for Project Bonsai. Cheetos puffs are made on a machine called an extruder. Historically, an operator manually selected a few Cheetos coming off the extruder at defined intervals, checking them for qualities like shape and bulk density and adjusting inputs on the extruder if something was off.
The Project Bonsai solution can monitor the product almost continuously, using sensors to track those qualities. It knows immediately if a product strays outside of a defined range, and it can either make recommendations to be approved by an operator or adjust settings itself if working autonomously.
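To make that monitor-and-correct loop concrete, here is a minimal sketch in Python. The quality names, spec ranges, and function shape are all illustrative assumptions – not PepsiCo’s actual values and not the Project Bonsai API:

```python
# Hypothetical sketch of the spec-monitoring loop described above.
# All quality names and tolerance ranges are assumed for illustration.

SPEC = {
    "bulk_density_g_per_l": (60.0, 70.0),  # acceptable range (assumed)
    "length_mm": (30.0, 40.0),
}

def check_sample(sample, autonomous=False):
    """Compare a measured sample against spec and propose corrections.

    If `autonomous` is True the adjustments would be applied directly;
    otherwise they are surfaced for an operator to approve.
    """
    actions = []
    for quality, (low, high) in SPEC.items():
        value = sample.get(quality)
        if value is None:
            continue  # this sensor reading is missing; skip it
        if value < low:
            actions.append((quality, "increase", low - value))
        elif value > high:
            actions.append((quality, "decrease", value - high))
    mode = "apply" if autonomous else "recommend"
    return mode, actions

# Example: a puff that came out slightly too dense.
mode, actions = check_sample({"bulk_density_g_per_l": 72.5, "length_mm": 35.0})
```

In this toy version the sample above would produce a single "decrease" recommendation for bulk density, left for an operator to approve because `autonomous` defaults to `False`.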
Initial results from the pilot also suggest that the Bonsai “brain” can independently adjust the extruder to maintain product quality and consistency despite disturbances, such as cornmeal lot changes.
Dylan Dias, CEO of Neal Analytics, which partnered with PepsiCo on the pilot project, says the effort is a great example of autonomous system design and implementation.
“The project brought together a powerful mix of technology, applied modeling skills and subject matter expertise to create innovation on the factory floor,” Dias says.
The subject matter expertise Dias refers to comes from expert operators and PepsiCo engineers, whose training and experience were used by developers to program the AI solution and create a simulation environment to replicate the extruder.
Once the developers had created that simulation framework, the AI algorithm learned through trial and error, along with feedback from operators – a process called reinforcement learning. In simulation, the AI solution can run through a full day’s production in a mere 30 seconds.
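The heart of that process is simple to sketch: run the simulator, try a small change to a setting, and keep changes that bring the product closer to spec. The toy loop below is a greatly simplified trial-and-error search, not the actual Bonsai training algorithm, and the “ideal water ratio” is an invented number used only to stand in for the simulator:

```python
import random

# Toy stand-in for the extruder simulator. The "true" ideal water ratio
# is hidden from the learner; product deviation grows with distance from it.
IDEAL = 0.42  # assumed value, illustrative only

def simulate_run(water_ratio):
    """Return the product's deviation for one simulated run (lower is better)."""
    return abs(water_ratio - IDEAL)

def train(episodes=500, seed=0):
    """Trial-and-error search over the water-ratio setpoint.

    Each episode nudges the setpoint up or down and keeps the change
    only if the simulated product came out closer to spec.
    """
    rng = random.Random(seed)
    setpoint = 0.30  # deliberately off-spec starting point
    step = 0.01
    best = simulate_run(setpoint)
    for _ in range(episodes):
        candidate = setpoint + rng.choice([-step, step])
        deviation = simulate_run(candidate)
        if deviation < best:  # keep changes that improve the product
            setpoint, best = candidate, deviation
    return setpoint, best
```

Because each simulated “day” is just a function call, thousands of such episodes finish in well under a second – which is what makes the 30-second-per-day figure above so powerful at scale.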
That means the AI solution has easily gone through more simulated runs than an operator could see in many lifetimes. And its computing power means it can come up with the right option far faster. Plus, it learned from the company’s most skilled operators and Cheetos experts, so it’s monitoring the fluctuations in quality and productivity from the highest level of experience.
The AI solution “could encapsulate the knowledge and skill of the best operators, then apply that through other facilities,” says Jayson Stemmler, a technical project manager at Neal Analytics who worked on the PepsiCo pilot project. “This solution reveals interactions and relationships that might not be intuitive to operators but that exist in the data. Without the manual measurement process, PepsiCo’s engineers are able to be more efficient with their time and focus on breakthrough innovation.”
A few bad Cheetos?
After the solution spent some time in its simulation proving ground, it was time to take it to a test plant at PepsiCo’s Plano facility to see how it did with the real thing – which meant testing it with some imperfect Cheetos.
“To develop this technology, we need to be able to make product that’s not good, so the AI can learn to take the product back into spec,” says Sean Eichenlaub, a senior principal engineer at PepsiCo.
Personally, I don’t see how any Cheetos could be “not good,” but I understand PepsiCo is going for perfect.
With the computer vision system continually monitoring and sending data to the Project Bonsai solution, any variance from that ideal can be fixed ASAP.
“With faster corrections, we can avoid the potential issues of going out of spec, such as having to discard product, or problems with packaging and waste,” Eichenlaub says.
I, for one, am all for a bag full of perfect Cheetos. And while the company prepares to use this Project Bonsai solution at a production plant, it’s also looking into using it with other Frito-Lay products, including the even-more-complex tortilla chip.
Leah Culler edits Microsoft’s AI Blog for Business & Technology.