Sustainable waste disposal has been a global pain point for many decades. While the recent push toward comprehensive recycling has eased the pressure a little, there’s still much more to be done if we are to build a sustainable society. Luckily for us, the progression of AI brings new hope for feasible waste disposal, and today’s guest, the CTO and Co-Founder of CleanRobotics, Tanner Cook, is here to tell us how his company is playing its part in improving the disposal of waste, recycling, and compost.

In our conversation, we learn about CleanRobotics and why the company’s work is vital for sustainability; the ins and outs of their smart recycling product, TrashBot, and how it uses machine learning; how CleanRobotics ensures that its technology is always improving and up to date; and the impact of their AI-powered systems on sustainable waste management. Plus, Tanner offers up some noteworthy advice for other leaders of AI-powered start-ups before sharing his vision of the future of CleanRobotics.


Key Points:
  • Introducing the CTO and Co-Founder of CleanRobotics, Tanner Cook.
  • Tanner’s background and how he ended up as a co-founder.
  • What CleanRobotics does and why this work is important for sustainability.
  • Assessing the information that CleanRobotics is able to extract from its TrashBot product.
  • The role of machine learning in TrashBot technology.
  • How they gather and annotate data with TrashBot.
  • The challenges of training machine learning models on imagery.
  • How Tanner and his team improve their technology and ensure that it’s always up to date.
  • The way CleanRobotics measures the impact of its technology.
  • Tanner’s advice to other leaders of AI-powered startups.
  • What he’d like CleanRobotics to achieve over the next five years.

Quotes:

“[I] found myself looking at trash cans very closely with my co-founder, Charles Yhap, and realizing, at the bin level and where people dispose of things, there were a lot of problems going on, and a lot of problems that artificial intelligence and robotics could solve.” — Tanner Cook

“The number of rule sets are very diverse throughout the United States and throughout the world. The rules can easily change for what is and isn't recyclable when you drive 20 minutes outside of your city.” — Tanner Cook

“One of our personal tenets internally at CleanRobotics is sustainability. Putting in those checks and balances to make sure that we're actually doing something good - instead of just greenwashing - is very important to us.” — Tanner Cook


Links:

Tanner Cook on LinkedIn
CleanRobotics
TrashBot


Resources for Computer Vision Teams:

LinkedIn – Connect with Heather.
Computer Vision Insights Newsletter – A biweekly newsletter to help bring the latest machine learning and computer vision research to applications in people and planetary health.
Computer Vision Strategy Session – Not sure how to advance your computer vision project? Get unstuck with a clear set of next steps. Schedule a 1-hour strategy session now to advance your project.


Transcript:

[INTRODUCTION]

[0:00:03] HC: Welcome to Impact AI, brought to you by Pixel Scientia Labs. I’m your host, Heather Couture. On this podcast, I interview innovators and entrepreneurs about building a mission-driven machine learning-powered company. If you like what you hear, please subscribe to my newsletter to be notified about new episodes. Plus, follow the latest research in computer vision for people and planetary health. You can sign up at pixelscientia.com/newsletter.

[INTERVIEW]

[0:00:34] HC: Today, I’m joined by guest Tanner Cook, CTO and Co-Founder of CleanRobotics, to talk about smart recycling. Tanner, welcome to the show.

[0:00:42] TC: Thank you, Heather, for having me. I’m excited to be here.

[0:00:44] HC: Tanner, could you share a bit about your background and how that led you to co-create CleanRobotics?

[0:00:49] TC: Sure. My background is in engineering and sustainable technology. I started off as a nuclear engineer at a nuclear fusion-based startup, went on to found my own company that was utilizing nanotechnology to clean up oil spills, and then eventually found myself looking at trash cans very closely with my co-founder, Charles Yhap, and realizing, at the bin level and where people dispose of things, there were a lot of problems going on, and a lot of problems that artificial intelligence and robotics could solve. It's been an interesting progression of wandering into a bunch of different industries, but with a base lens of sustainability and improving our world and environment.

[0:01:38] HC: What does CleanRobotics do and why is this important for sustainability?

[0:01:41] TC: Right. Our core product is called TrashBot. TrashBot is really made to replace conventional waste stations and trash cans in high-traffic public spaces, and to help these airports and convention centers develop a solid and dynamic waste program. Right now, a lot of these places are struggling heavily, since people will fly in understanding the rules of where they live, but when trying to sort at these airports and stadiums, they don't understand the local rules, or sometimes don't necessarily care.

We’re helping to, one, sort out all this waste. You throw an item away, and it automatically sorts the item to the correct bin, whether that's recycling, compost, landfill, or any of a multitude of other bins if you want more specific sorting. We’re also giving them data and information to help build their sustainability programs. We’re able to help them identify waste practices and education materials that will actually make a direct impact, and then measure those impacts over time.

Fundamentally, at a basic level, we sell smart trash cans. But at a higher level, we’re really looking to provide information and help to build sustainability programs as related to waste and recycling and composting.

[0:03:03] HC: What kind of information are you able to extract from this in order to support those programs?

[0:03:09] TC: Right. A lot of what we do is just item-by-item information. Every item that's thrown away within a TrashBot, we can, with pretty high accuracy, identify what it is, whether it is recyclable or compostable, and whether it's contaminated or not. Whether your water bottle is full or empty does matter to its recyclability in a lot of instances. Then we aggregate all that data into a central location where it's viewable.

Instead of having to pay someone with a clipboard to dig through the trash and weigh every individual item, we're doing it continuously and automatically, which allows people to get direct feedback on what is and isn't working. One thing that we see pretty commonly is water bottles that are full. That's the easiest thing to fix: people can just empty their water bottle, and we can educate them on that. It's incredibly impactful, too, because it does move the needle substantially, particularly in airports where there are a lot of plastic water bottles.

As granular as we can get, we can recognize logos in some instances, but generally we go down to a simple product-type level: aluminum cans, plastic bottles, PET bottles, HDPE bottles, and so on. It is worth mentioning that trash can be anything. While we do get plenty of items through that we recognize, we also get plenty through that we have no clue what they are, or that our system doesn't recognize.

[0:04:39] HC: Well, you’re educating me on this already, because I had no idea that leaving water in your water bottle could have an impact on the downstream processing.

[0:04:48] TC: Yes, yes. That and coffee cups are not recyclable. Those are two big ones. There's a paraffin-like plastic coating on coffee cups that makes them very difficult to recycle, and so most recycling facilities will not take them.

[0:05:05] HC: That’s good to know as well. What role does machine learning play in this technology?

[0:05:09] TC: Primarily, it's computer vision. That's our main application. We've been doing it for a while; we did it back when Bayesian classification was all the rage, and now we've moved to transfer learning models. We have to recognize dozens of different types of items with relatively high accuracy, quickly. We do that with a camera, and we also weigh the items. Based upon the weight and what we bring in with our images through object detection, we're able to make a decision on what the object is and where it should go.
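As a rough illustration of the decision step Tanner describes here, combining a vision label with a weight reading might look like the sketch below. Every class name, weight, and threshold in it is a made-up placeholder for illustration, not CleanRobotics' actual logic:

```python
# Hypothetical sketch: route an item to a bin using a vision label plus a
# scale reading. All classes, weights, and thresholds are illustrative.

EXPECTED_EMPTY_WEIGHT_G = {
    "plastic_bottle": 15,  # a typical empty PET bottle, in grams
    "aluminum_can": 14,    # a typical empty can
}

def choose_bin(label: str, confidence: float, weight_g: float) -> str:
    """Pick a destination bin from a detected label and a measured weight."""
    if confidence < 0.6:
        return "landfill"  # low confidence: don't risk contaminating recycling
    empty_weight = EXPECTED_EMPTY_WEIGHT_G.get(label)
    if empty_weight is not None and weight_g > 3 * empty_weight:
        return "landfill"  # far heavier than empty: likely full/contaminated
    if label in ("plastic_bottle", "aluminum_can"):
        return "recycling"
    if label == "food_scraps":
        return "compost"
    return "landfill"

print(choose_bin("plastic_bottle", 0.92, 16))   # empty bottle -> recycling
print(choose_bin("plastic_bottle", 0.92, 510))  # full bottle -> landfill
```

The weight check is how a secondary signal like "the water bottle is still full" can override an otherwise-recyclable vision label, as described later in the episode.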

[0:05:46] HC: To train these types of models, how do you gather and annotate data? I imagine there’s almost an endless variety of objects that can be placed in a trash can. How do you gather and annotate enough data to train these types of models?

[0:06:00] TC: Well, what's nice is people do it for us, and the robot does it for us. People generate their own garbage and throw it away, and our robotic system automatically gathers the information on those items. There is a level of human oversight. We have what we call heavier models that are cloud-based, in order to separate out all the data at a more refined rate than a simple computer on the edge can when recognizing what the objects are. There is then a level of human oversight where we need to go through and re-tag images and re-box pieces of trash.

It's somewhat automated, in that we're not physically sitting there as an engineering team, throwing things away constantly to gather data. It's the public. But we still have to go through and validate, to make sure that our data set is correct on the back end for training purposes.

[0:06:55] HC: The annotations for what type of object this is, and not just that there’s an object there, but narrowing it down to a specific class, is that semi-automated as well, or is that mostly manual to get the training data?

[0:07:07] TC: It's semi-automated. We have layers of software for boxing the items automatically. Some of the objects where we have very high surety scores, or accuracy and precision-recall scores, will automatically tag into maybe 10 different categories, or classes. Then everything else is done manually. Instead of having someone sit there and draw a box from scratch, we've tried to automate that out; they may just have to shift the parameters of the box to make it a little tighter.

It is still a very manual process in a lot of instances when it comes to recognizing and categorizing what X type of waste is, again, just due to the sheer variety of what you can get. If you look at aluminum cans, they come in dozens of different colors and patterns and shapes. You can take the image of each of these cans at many different angles, or crushed, and so on. You still do need a little bit of a human touch, but the level of human touch that we need is getting better all the time.
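The semi-automated triage described above could be pictured as a simple confidence gate: model proposals for a small set of well-recognized classes are accepted automatically, and everything else is queued for a human to re-tag and tighten the box. The class list, threshold, and record shape below are hypothetical, not CleanRobotics' pipeline:

```python
# Hypothetical sketch of pre-annotation triage. Each proposal is a dict with a
# predicted label, a confidence score, and a bounding box (x1, y1, x2, y2).

AUTO_TAG_CLASSES = {"aluminum_can", "plastic_bottle"}  # illustrative list
AUTO_TAG_THRESHOLD = 0.95  # illustrative cutoff

def triage(proposals):
    """Split model proposals into auto-accepted labels and a manual-review queue."""
    auto, manual = [], []
    for p in proposals:
        if p["label"] in AUTO_TAG_CLASSES and p["score"] >= AUTO_TAG_THRESHOLD:
            auto.append(p)  # trusted class at high confidence: keep as-is
        else:
            manual.append(p)  # human re-tags the label and adjusts the box
    return auto, manual

auto, manual = triage([
    {"label": "aluminum_can", "score": 0.98, "box": (10, 10, 50, 80)},
    {"label": "unknown", "score": 0.40, "box": (60, 20, 90, 70)},
])
print(len(auto), len(manual))  # 1 1
```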

[0:08:11] HC: What kinds of challenges do you encounter in working with this type of imagery and training machine learning models with it?

[0:08:18] TC: A lot of it is figuring out a correct taxonomy, because there's a lot of overlap within categories, or the potential for overlap. Knowing the specificity of the taxonomy, and understanding what level we need to be at, is difficult to figure out. It took us years, actually, of trial and error with several customers, internally, and speaking with policy makers and everyone to really nail down the correct categories. I think the next most difficult thing is the fact that anything can be trash, and the forms of it, again, are changing continuously. What constitutes a paper plate somewhere is entirely different somewhere else. We're able to retrain our system to recognize those things longer term, but it certainly will get things wrong if it hasn't seen them before, and we just hope to improve continuously over time on that.

[0:09:19] HC: How do you go about getting your models to improve over time? Does domain shift affect your models the way it does in a lot of applications? Or are some of the conditions constant enough that it’s more just keeping up with the variety of objects that are being thrown away?

[0:09:34] TC: Right. What's nice is the conditions are very controlled. It tends to be just pure variety. We're always looking at ways to tune and change the taxonomy little by little, too. There is, in a lot of instances, confusion between some categories. Optically, it's very difficult to recognize the difference between a plastic water bottle that's empty and one that's full, just through a computer vision model. It's pretty good, but it's not perfect.

We look to augment that with secondary characteristics, like the weight, right? Understanding that this paper bag is likely full of something, so it's probably contaminated and will go to landfill, is meaningful to us and gives us more context on the situation. It really is just a matter of getting variety, and we do plenty of things to augment our data on the back end to try and get all the different colors and forms and shapes of waste. Again, since anything can be garbage, people are always coming out with new designs for packaging and everything else. It's still a little bit of a continuous uphill battle on that front.

[0:10:44] HC: How do you ensure that the technology your team develops will fit in with current waste disposal practices and provide the most impact for recycling?

[0:10:52] TC: Right. One of the biggest things that we've done is base our taxonomy around the general language used in the rules, whether international or local. What we're able to do on the robotic system is instantaneously change which category of items should go to which category of bin. By that I mean, when the rules change locally, we're able to switch categories within about 15 seconds of learning about the change. We still have to learn about the local rules ourselves, but then we're able to just switch categories.

Say a place didn't used to take aluminum cans, and they always just went to the landfill, because those were the rules of the local municipality. Then one day, they decide to build an aluminum can recycling center, or something like that, and now they take aluminum cans. We're able to dynamically and quickly change which category our bots consider something to be, just because, again, the rule sets are very diverse throughout the United States and throughout the world. The rules can easily change for what is and isn't recyclable when you drive 20 minutes outside of your city.

We planned with that in mind, and recognizing a diverse set of objects has helped with that as well. We recognize, I think, 70 different primary categories of waste, recyclables, and compostables. With those, we're able to nail about 98% or 99% of what we need to sort.
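The rule switching Tanner describes could be pictured as a fixed item taxonomy routed through a per-site rules table, so a local rule change is just a one-entry update rather than a model retrain. The labels and bins below are illustrative placeholders, not CleanRobotics' actual taxonomy:

```python
# Hypothetical sketch: the vision model's category stays fixed; a per-site
# rules table decides which bin that category goes to.

site_rules = {
    "aluminum_can": "landfill",  # this municipality doesn't take cans yet
    "pet_bottle": "recycling",
    "food_scraps": "compost",
}

def route(label: str, rules: dict) -> str:
    """Map an item category to a destination bin under the local rule set."""
    return rules.get(label, "landfill")  # unknown items default to landfill

print(route("aluminum_can", site_rules))  # landfill

# The municipality starts accepting aluminum cans: update one entry,
# and every subsequent item is routed under the new rule.
site_rules["aluminum_can"] = "recycling"
print(route("aluminum_can", site_rules))  # recycling
```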

[0:12:30] HC: How do you measure the impact of your technology?

[0:12:32] TC: What's interesting is we'll generally go through and do what's called a waste audit with a facility that will deploy TrashBot. What that is is a breakdown, by weight, of the recyclables, compostables, and landfill waste that they have before they implement a TrashBot. Then we'll do that with TrashBot continuously. TrashBot provides this audit data over time, and we can look at the differential between the amount they were recycling beforehand and what they are with TrashBot. Based upon that differential, we can pretty easily come up with an impact metric.

I think right now, it's a little less than 5 pounds; around 3.5 pounds of CO2 abated per pound of recyclables diverted. Per extra pound of recyclables that we divert because we're sorting properly, we're able to more or less reduce the carbon impact by three and a half pounds, which is a pretty substantial impact. We found that a single TrashBot, in one of our classic airports or high-traffic public spaces, sorts around 2,000 extra pounds of recyclables per year, which is about the same as taking a fossil fuel-burning car off the road. It's pretty impactful in terms of what just having things go to the right places will do.
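A back-of-the-envelope version of the impact metric described here might look like the sketch below, using the roughly 3.5 lb CO2 per pound of diverted recyclables cited in the episode; the audit numbers themselves are made up for illustration:

```python
# Hypothetical sketch of the waste-audit impact calculation: extra pounds of
# recyclables diverted (vs. the pre-TrashBot baseline audit) times an assumed
# abatement factor of ~3.5 lb CO2 per lb diverted, as cited in the episode.

CO2_LB_PER_LB_DIVERTED = 3.5  # approximate figure from the conversation

def co2_abated_lb(baseline_recycled_lb: float, with_trashbot_lb: float) -> float:
    """CO2 abated from the differential between baseline and TrashBot audits."""
    extra = max(0.0, with_trashbot_lb - baseline_recycled_lb)
    return extra * CO2_LB_PER_LB_DIVERTED

# ~2,000 extra lb/year of recyclables -> ~7,000 lb CO2 abated per year
print(co2_abated_lb(500, 2500))  # 7000.0
```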

[0:14:00] HC: I like that you can quantify it not just to the recycling process, but down to the CO2 emissions reduced. That’s definitely helpful in communicating the importance of the work that you’re doing.

[0:14:11] TC: Absolutely. Yeah. When it comes to the numbers, we don't want to greenwash anything. We even account for the general environmental impact of what it costs to build a TrashBot, right? As well as the amount of energy it takes to continuously run it. One of our personal tenets internally at CleanRobotics is sustainability. Putting in those checks and balances to make sure that we're actually doing something good, instead of just greenwashing, is very important to us.

[0:14:41] HC: Is there any advice you could offer to other leaders of AI-powered startups?

[0:14:45] TC: I would say, for most anyone, particularly from an engineer and entrepreneur standpoint: find the problem, and then make the solution. A lot of the newer technologies seem to be people with solutions looking for problems to solve. That would be my main point of advice, I think. Particularly if you're looking to start a business around AI, find a problem that can be solved with AI. Research the heck out of it. Talk with all the people you can who have that problem, and then create the solution for it.

[0:15:20] HC: That’s very good advice there. Finally, where do you see the impact of CleanRobotics in three to five years?

[0:15:26] TC: Well, I'd love to see our TrashBot products out there in the thousands. I think we're well on our way to doing that. I'd like to see us being a provider of data to policy makers and to people, and really driving a lot of what's going on within the waste industry through education as well. Our TrashBots have an integrated screen, and they provide feedback to people, but they also provide education about recycling and waste.

I'd really like to see what we're doing with our data shape more than what it is right now, which is just advising these facilities on what to do better. Again, policy making. I'd like to see it used in research and beyond. That's where I really want to see our company go: from being a hardware and robotics company to being a sustainability and general data consulting company.

[0:16:25] HC: This has been great, Tanner. Your team at CleanRobotics is doing some really interesting work for recycling. I expect that the insights you’ve shared will be valuable to other AI companies. Where can people find out more about you online?

[0:16:36] TC: Our website is cleanrobotics.com. There should be plenty of information there on the website. Feel free to enter in your information, too, to get more of our white papers on some of our technology, as well as the applications. Or you can feel free to reach out to me on LinkedIn, just at Tanner Cook and I’ll be happy to reply to you there as well.

[0:16:59] HC: Perfect. Thanks for joining me today.

[0:17:01] TC: Okay. Wonderful. Thank you for having me again, Heather.

[0:17:03] HC: All right, everyone. Thanks for listening. I’m Heather Couture. I hope you join me again next time for Impact AI.

[END OF INTERVIEW]

[0:17:14] HC: Thank you for listening to Impact AI. If you enjoyed this episode, please subscribe and share with a friend. If you’d like to learn more about computer vision applications for people and planetary health, you can sign up for my newsletter at pixelscientia.com/newsletter.

[END]