In this episode, I sit down with Konstantinos Kyriakopoulos, CEO of DeepSea, to discuss the transformative world of AI-powered shipping optimization. DeepSea focuses on enhancing vessel performance, fuel efficiency, and overall logistics management in the shipping and logistics industry. Konstantinos has been a key figure in advocating for digitalization in the maritime sector, pushing for technologies to streamline processes, cut costs, and reduce environmental impact.

In our conversation, Konstantinos shares the captivating journey behind DeepSea's inception, revealing how its AI-driven solutions emerged from a desire to revolutionize the shipping industry's efficiency and environmental impact. We explore the intricate use of machine learning to predict fuel consumption, optimize vessel operations, and navigate the shift toward decarbonization.

Gain insights into the intricacies of data architecture, the critical role of scalability, measuring impact, the future vision of the company, and much more. Don't miss out on discovering the cutting-edge applications of AI that are steering the shipping industry toward a more sustainable future with Konstantinos Kyriakopoulos. Tune in now!


Key Points:
  • Background about Konstantinos and DeepSea's inception.
  • How AI is reshaping shipping efficiency and vessel operations.
  • The role of DeepSea in the shipping industry and mitigating climate change.
  • Insights into the challenges and hurdles of an evolving shipping industry.
  • How DeepSea leverages AI, inputs into the model, and the overall aim.
  • Approaches the company implements to ensure the integrity of its products.
  • Why the explainability of machine learning models is critical.
  • He shares DeepSea’s approach to model validation.
  • Measuring impact: CO2 reduction and cost savings for clients.
  • Konstantinos offers valuable advice for leaders of AI-powered startups.
  • What the company has planned for the future.

Quotes:

“If you really want to create impact, it’s not enough to just show people what’s happening and give them analytics, but you also have to, in some way, produce a tangible ROI.” — Konstantinos Kyriakopoulos

“The most important thing is to evaluate performance, so to make sure that the proof of performance is constantly being tested and you have good benchmarks and analytics.” — Konstantinos Kyriakopoulos

“It’s really important to also be able to check internally what is going on but also how the customer wants to see what’s created.” — Konstantinos Kyriakopoulos

“For us, the impact is actually very straightforward. It’s dollars and metric tonnes of CO2.” — Konstantinos Kyriakopoulos

“I think what I always say when people talk to me about starting an AI company is to focus on your data architecture early.” — Konstantinos Kyriakopoulos


Links:

Konstantinos Kyriakopoulos
DeepSea
DeepSea on LinkedIn


Resources for Computer Vision Teams:

LinkedIn – Connect with Heather.
Computer Vision Insights Newsletter – A biweekly newsletter to help bring the latest machine learning and computer vision research to applications in people and planetary health.
Computer Vision Strategy Session – Not sure how to advance your computer vision project? Get unstuck with a clear set of next steps. Schedule a 1 hour strategy session now to advance your project.
Foundation Model Assessment – Foundation models are popping up everywhere – do you need one for your proprietary image dataset? Get a clear perspective on whether you can benefit from a domain-specific foundation model.


Transcript:

[INTRODUCTION]

[00:00:03] HC: Welcome to Impact AI, brought to you by Pixel Scientia Labs. I’m your host, Heather Couture. On this podcast, I interview innovators and entrepreneurs about building a mission-driven, machine-learning-powered company. If you like what you hear, please subscribe to my newsletter to be notified about new episodes. Plus, follow the latest research in computer vision for people and planetary health. You can sign up at pixelscientia.com/newsletter.

[INTERVIEW]

[0:00:33.8] HC: Today, I’m joined by guest Konstantinos Kyriakopoulos, Co-founder and CEO of DeepSea, to talk about shipping. Konstantinos, welcome to the show.

[0:00:43.3] KK: Thanks so much for having me, Heather, it’s a pleasure to be here.

[0:00:45.0] HC: Konstantinos, could you share a bit about your background and how that led you to create DeepSea?

[0:00:49.8] KK: Sure. I’m an information engineer by training, actually, with a Master’s and a Ph.D. I did my Ph.D. on applying deep learning to speech applications, so nothing to do with shipping, but I was part of a lab in Cambridge that was very active, and this is back in 2016, ’17, when deep learning was becoming very hot and it was conquering everything, and I was thinking, “What can I do with it that’s different from what most people are doing?”

Most people were doing things like language models, speech, image data, you know, with a multitude of consumer-facing data, and I was very interested in the application of AI to heavy industry, you know, things that are big and real and engineering, and I felt that there was a lot of disruption that could be done there and no one was really pursuing it at the time. That’s when I had my serendipitous encounter with Roberto, who is the other co-founder of DeepSea and was actually an old family friend.

And in the same way that I had this solution looking for a problem, he had a problem looking for a solution, and he had a deep background in shipping as a mechanical engineer by training. He was working at the time on the problem of estimating the fuel consumption of ships, and shipping companies have a lot of trouble estimating what the fuel consumption of a ship will be at different speeds and in different weather conditions, because the state of the hull of the vessel changes over time as things stick to it and repairs are done.

And this is not tractable to model everywhere, so traditional linear regression and also theoretical physics-based models were just off by 15, 20% in making the prediction, and so we had a hunch that if we used deep learning to take in large amounts of data from lots of ships, we could do a better job at predicting the fuel consumption and also at modeling this state change as a latent variable or using time series-based methods.

And so, we were able to get some data from this company at the time. I ran what became the first DeepSea model on my laptop, and we found very quickly, just by applying standard methods, we could bring this error down from 15% to the range of 5%, and so, you know, we looked to see if anyone else was doing it, no one else was doing it, and so, we thought this would be a good business, and that’s how it started.

[0:03:18.1] HC: So, what else does DeepSea do today and why is this important overall for the shipping industry and mitigating climate change?

[0:03:25.2] KK: Yeah, so we started out with this problem of modeling the vessel. There are actually two parts: modeling the hull of the vessel, how much energy does the vessel need to do whatever it’s doing, and then modeling the engines, right? How much fuel do you need to generate that power? And you’re doing this for every type of ship, in all the different sea conditions, over long time series of many years, taking into account all the things that change as the state of the ship changes over time.

Initially, we did this as just a modeling and monitoring product, like an analytics suite, but we soon realized two things. The first is that you can’t really do any of this unless you have good data. So, we started building also a data gateway and collection product, which is installed onboard the ships to collect the data, and then the next thing we realized is that actually, if you really want to create impact, it’s not enough to just show people what’s happening and give them analytics, but you also have to, in some way, produce a tangible ROI.

So, we decided to go into optimization, using these models to feed the downstream optimization task, to actually give the customer the speeds and the route, which are optimized dynamically over time, and so it becomes more of a reinforcement learning-type problem, or a deep learning problem embedded within an optimization problem. And so we came up with our next product, which is actually used day to day by the operators or the captains.

They put in what voyages they want to do, port to port, they put in all their commercial constraints, the price of the fuel, they put in their nautical charts, all of this, and it gives them a full route and speed plan, which is sent to the vessel and automatically updated, and we now have hundreds of vessels that are using this as the way they decide on their speed.

What we quickly realized was that this isn’t just a cost-saving measure or a reduction of fuel, because when we started this, the shipping industry didn’t really care about climate change. It was actually formally exempted from most climate change regulations because it was viewed that, A, it’s too international to regulate and, B, it’s not possible to have a zero-carbon ship, because, you know, you can’t connect it to the electric grid.

But since then, there’s been a huge shift, and now, shipping is part of the emissions trading regulations and there is a lot of economic, social, and regulatory pressure on them to start decarbonizing, and this means two things. The first thing is, in order to meet the decarbonization targets, and even to stay open, they need to cut emissions quickly, which means they just need to reduce the amount of fuel they consume while maintaining the same commercial results.

But it also means that in the future, they’re going to start experimenting, and they’ve already started experimenting, with some radically different modes of propulsion. So, we have liquefied natural gas-powered engines, we have vessels now that have rotor sails attached to them. These are cylinders that spin and, as the wind hits them, produce a propulsion force to augment the force from the engine. So, now, wind isn’t just an obstacle, it’s also a form of propulsion.

You have ammonia, which is basically an energy storage mechanism. You have a wind turbine somewhere on land that produces ammonia, and ammonia becomes the new fuel. What this means is that, all of a sudden, the vessels are operating in completely different ways than they have been for the last hundred years, when all the physical models were built, and so it means that, actually, neural network-based approaches that treat these things statistically are so much more important.

And optimization becomes so much more important because, without it, the operators of these things are at a loss, and so we quickly found ourselves actually being at the forefront of the decarbonization of shipping as well with this business.

[0:07:17.3] HC: So, you mentioned modeling and optimization in a couple of different places there. Could you elaborate on how you use machine learning, what the inputs to these models are, and what they’re trying to predict?

[0:07:27.6] KK: Yeah, sure. So, the core task is that you have, every minute, the RPM, speed over ground, and various weather conditions, wind speed and angle, current speed and angle, wave height, swell, et cetera, plus the loading of the vessel, and you have to predict the power, right? How much energy does it need to maintain that speed in those conditions?

At first glance, this is just a tabular regression task, but when you take into account the fact that you have acceleration and deceleration, and when you take into account that the state of the hull changes over the scale of months due to marine life attaching itself to the hull of the ships, we’re literally talking about barnacles and things sticking to the hull, which can increase the consumption 10 to 20%, what you realize is that what you’re actually dealing with is a very long time series prediction task.

And so that’s sort of the core task we solve, and we have an ensemble of different models that we use for this, ranging from straightforward neural networks with latent variable components to attention-based and memory network-based techniques for doing the long [Inaudible 0:08:45.0].
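At its simplest, the power-prediction task described above is a tabular regression. The sketch below is purely illustrative and not DeepSea's model: the feature names, the synthetic data, the cubic speed-power relation (the classical propeller-law approximation) used to generate it, and the plain least-squares fit are all stand-ins for the neural network ensemble he describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic voyage data (hypothetical features): speed over ground (knots),
# head-wind speed (m/s), and required shaft power (kW). Power is generated
# with a cubic dependence on speed, per the propeller-law approximation.
n = 500
speed = rng.uniform(8, 16, n)
wind = rng.uniform(0, 12, n)
power = 1.5 * speed**3 + 40.0 * wind + rng.normal(0, 50, n)

# Tabular regression via ordinary least squares on engineered features.
X = np.column_stack([speed**3, wind, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, power, rcond=None)

# Predict power for a new operating point: 12 knots in a 5 m/s head wind.
x_new = np.array([12.0**3, 5.0, 1.0])
pred = x_new @ coef
```

As he notes, the real task is much harder: hull fouling makes the relationship drift over months, which turns this into a long time series problem rather than a static fit.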

And then, we have the main engine modeling, where we are predicting the fuel consumption as the output, and the inputs are the power, the RPM of the engine, and then the temperatures and pressures of the various air, fuel, and cooling water flows that are going into the engine, and that’s more of a tabular regression task. The complexity there comes, in part, from it being a highly non-linear problem.

It also comes from the fact that the features are basically never all there at the same time. There are a lot of missing features that need to be imputed. So, actually, a lot of what we work on there is the imputation task, which means filling in the missing data in an end-to-end way that helps the prediction at the end, and then, across both of these, it is very important to be able to do uncertainty estimation and to have robustness to distributional shift.

That’s one of the things we’ve published papers on, because there is just a huge variance in what you’re going to see. Again, for both of these models, it’s very important to be able to do transfer learning of different kinds, because you’re going to have a scarcity of data when you have a new vessel. So, you want one vessel to be able to learn from other vessels, and then you also have cases in which the different inputs and outputs have different granularities.

So, for instance, there are a lot of vessels where you have your speed and your weather on a per-ten-minute basis but your fuel consumption comes in on a daily basis. So, we’re talking about aggregating models and more transfer learning, and then, once you’ve got all this, it all has to be fed into the downstream optimization. So again, you’ve got two approaches for this. You can do it as an end-to-end reinforcement learning problem, or you can use the regression models to feed a more traditional optimization algorithm, a genetic algorithm or dynamic programming and so on.

In that case, again on the modeling side, you need to focus a lot on getting the smoothness and the robustness to distributional shift right, so that you can actually use neural network gradients and outputs in the optimization cost, which is not straightforward. And at the end of the day, this is all being judged and measured in real time, because it actually has to save the customer fuel, voyage by voyage, and if the model is not able to do that and it costs them more fuel, we’re talking about huge amounts of money lost and large amounts of pollution that are unnecessary.

[0:11:19.9] HC: How do you ensure that your models continue to perform well over time? Maybe these are changes related to the marine life attaching to the hull that you mentioned, or some other type of distribution shift. How do you accommodate these changes?

[0:11:32.1] KK: Yeah, so the most important thing is to evaluate performance, so to make sure that the proof of performance is constantly being tested and you have good benchmarks and analytics beyond just, you know, MAPE, right? Like actually looking at, you know, the shapes of curves, which gives some interpretability, looking at the uncertainty estimation, these things, so we just have that running all the time.

Of course, we retrain the models. We have MLOps teams that are looking at things manually, and then we have, you know, a great team of ML scientists that are just pushing forward the models, and then, you know, we also have back-off techniques, right? We have sanity checks, we have backup simpler models, the theoretical models, and various fail-safes in place to prevent degradation in the quality of the model from leading to problems in the production environment.
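The fail-safe idea described above, backing off to a simpler theoretical model when the learned model's quality degrades, can be sketched as a small wrapper. The class name, rolling-window size, and MAPE threshold are all hypothetical choices, not DeepSea's actual production logic.

```python
from collections import deque

class FallbackPredictor:
    """Serve the ML model while its rolling MAPE stays acceptable;
    otherwise fall back to a simpler physics-based model."""

    def __init__(self, ml_model, physics_model, window=100, mape_limit=0.10):
        self.ml_model = ml_model
        self.physics_model = physics_model
        self.errors = deque(maxlen=window)  # rolling absolute percentage errors
        self.mape_limit = mape_limit

    def record(self, predicted, actual):
        # Log the error of a past ML prediction once ground truth arrives.
        self.errors.append(abs(predicted - actual) / abs(actual))

    def predict(self, features):
        mape = sum(self.errors) / len(self.errors) if self.errors else 0.0
        model = self.physics_model if mape > self.mape_limit else self.ml_model
        return model(features)
```

The same pattern extends to the other sanity checks he mentions: any monitor that can flag degraded quality can gate which model serves predictions.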

[0:12:24.0] HC: Why is the explainability of machine learning models important, and how do you make them explainable in the case of shipping data?

[0:12:30.9] KK: Yes, so it is extremely important. It’s important, first, for the reason I alluded to, which is just debugging what is happening and checking the accuracy, not just as a final number, but actually checking whether what’s being done makes sense. So, you know, is the relationship being learned between speed and power one that, you know, is reasonable? Like more speed, more power, and these sorts of things, and again, that’s important because of the out-of-distribution problem, right?

Like you can have a model that has very high accuracy, but only because it’s overfit to a particular domain, right? So, even if you factor in all of that and have a held-out evaluation set, it may be overfitting to broader trends within the data that don’t generalize, and especially if you are now doing an optimization task, because you are trying to find an optimum operating point, that optimum may well end up being outside the domain.

So, it’s really important to also be able to check internally what is going on but also how the customer wants to see what’s created. So, no one who is operating an asset worth tens or hundreds of millions of dollars is going to trust all of the speeds and the routes and all of the decisions to be given by a complete black-box algorithm. They want to have some view of what it is doing and why.

And so we have spent a lot of time building visualizations that can show kind of slices through the model and can visualize what it’s doing and how it’s thinking. So, this is everything from just, you know, a speed-power curve at different values and in different conditions, keeping everything else equal, to 3D plots of, for example, power versus wind speed, and we look to see if it makes sense, even to share it with the vessel.

When we do optimization, it also incorporates opportunity costs, with having, you know, the model do lots of predictions to draw out kind of economics-style marginal cost and marginal revenue curves, to see if the optimum point is where the two meet. It’s looking at what historically happened before the customer used our product, what they did versus what we would have suggested, and visualizing that against the weather conditions.

And then trying to build interpretability of how changes in the weather lead to changes in the optimum speed, and so on and so forth, but this is something that’s just a constant project. We’re developing more and more and better and better ways to build in interpretability, and there is always need for more.
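The "slices through the model" idea described above can be sketched as a one-dimensional sweep: vary speed while holding the weather fixed, then sanity-check the resulting curve. The model here is a hypothetical stand-in function, not DeepSea's network.

```python
def power_model(speed, wind):
    # Stand-in for a trained power model (hypothetical fitted coefficients).
    return 1.5 * speed**3 + 40.0 * wind

fixed_wind = 5.0
speeds = [8 + 0.5 * i for i in range(17)]  # sweep 8 .. 16 knots
curve = [power_model(s, fixed_wind) for s in speeds]

# "More speed, more power": the slice should be strictly increasing.
monotone = all(a < b for a, b in zip(curve, curve[1:]))
```

With a real neural network, the same sweep surfaces physically implausible behavior (dips, kinks, non-monotone regions) that a single aggregate accuracy number would hide.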

[0:15:01.8] HC: How do you go about validating your models to be sure that they function the way you expect them to in a real-world scenario?

[0:15:08.4] KK: So, as I mentioned, mean absolute percentage error and mean-squared error don’t even get you started. You can get a bit better by doing this on tasks that have much better curated datasets, but we need more. So, we look at using a theoretical model to generate synthetic data and evaluating against that. We look at generating some of these curves that I’ve mentioned before and then validating the model based on kind of common-sense behavior.

So, are you getting higher power at higher speed? Are you getting higher power on the vessels when you think the vessel is more fouled? And so on and so forth, and so we have a test suite for this, and then we also have likelihood-based metrics that are measuring the accuracy of the uncertainty estimation, which is also very important for us. It is about building as many of these tests as possible.

Building scoreboards for them, having them run in an automated way as much as possible, and just running them regularly.
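A common likelihood-based metric of the kind mentioned above is the average Gaussian negative log-likelihood, which penalizes both an inaccurate mean and a badly calibrated uncertainty estimate. The predictions and actuals below are illustrative data, and this particular metric is an assumption, not something the source specifies.

```python
import math

def gaussian_nll(mean, sigma, actual):
    # Negative log-likelihood of `actual` under N(mean, sigma^2).
    return 0.5 * math.log(2 * math.pi * sigma**2) + (actual - mean) ** 2 / (2 * sigma**2)

# Each prediction is a (mean, sigma) pair; actuals are observed values.
predictions = [(1000.0, 50.0), (1200.0, 60.0), (900.0, 40.0)]
actuals = [1010.0, 1190.0, 960.0]

score = sum(gaussian_nll(m, s, a)
            for (m, s), a in zip(predictions, actuals)) / len(actuals)

# The same means with a tiny sigma are overconfident and should score worse.
overconfident = sum(gaussian_nll(m, 1.0, a)
                    for (m, _), a in zip(predictions, actuals)) / len(actuals)
```

Unlike MAPE or MSE, this metric rewards a model for knowing when it does not know, which matters when the outputs feed a downstream optimizer.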

[0:16:12.3] HC: Thinking more broadly about what you are doing at DeepSea, how do you measure the impact of this technology to be sure it’s having the effect that you want it to?

[0:16:20.4] KK: Yeah, so for us, the impact is actually very straightforward. It’s dollars and metric tonnes of CO2, right? So, we can point to how much CO2 we remove or prevent from being emitted. That’s the most important metric for us in terms of impact. We’re now doing about as much as a town of a few tens of thousands of people, as in, the amount of CO2 we prevent from being emitted is about the amount that is being emitted by ten thousand houses.

Yeah, we also look at the money we are saving our customers, and then, of course, we also look at qualitative things, right? Do they like it? Are they using it? Are they adopting it? Is it being used for all their voyages? Are they using the analytics? Is it making their life easier? Is it saving them time? And so on. I think that’s really how any company evaluates its success, right? Are the customers happy? Are they seeing tangible and measurable impact?

[0:17:09.8] HC: Is there any advice you could offer to other leaders of AI-powered startups?

[0:17:14.1] KK: I think what I always say when people talk to me about starting an AI company is to focus on your data architecture early. You know, look at your databases, your microservice architecture, how you’re scaling it, these sorts of things. At least for people who come from my sort of background, where they’re used to just working in Jupyter Notebooks all day, it’s very easy to make bad decisions early on and then end up with something that is not scalable.

Once you’ve got things going and you’ve got lots of customers using your system, it’s very, very difficult to fix the engine while the car is running.

[0:17:46.2] HC: And finally, where do you see the impact of DeepSea in three to five years?

[0:17:49.8] KK: We want to be keeping more and more CO2 out of the atmosphere. We want to be hitting the thousand-ship level, and we want to be used by vessels of every type in every market, and we want to also have started becoming kind of the center of an ecosystem where all the different parties that rely on the vessel being optimal, the owner of the asset, the manager, the charterer, insurers and financiers and so on, are all logging into the same platform and using it as the single source of truth for the performance of the vessel and a single point of optimization for all of their different KPIs.

[0:18:28.8] HC: This has been great. Konstantinos, I appreciate your insights today. Where can people find out more about you online?

[0:18:34.5] KK: The website is deepsea.ai but you can also follow us on LinkedIn.

[0:18:38.4] HC: Perfect. Thanks for joining me today.

[0:18:40.8] KK: Thanks a lot, Heather.

[0:18:42.1] HC: All right everyone, thanks for listening. I’m Heather Couture and I hope you join me again next time for Impact AI.

[END OF INTERVIEW]

[0:18:52.0] HC: Thank you for listening to Impact AI. If you enjoyed this episode, please subscribe and share with a friend. And if you’d like to learn more about computer vision applications for people and planetary health, you can sign up for my newsletter at pixelscientia.com/newsletter.

[END]