What if technology could be the key to averting a biodiversity crisis? Today, I explore this possibility with Mads Fogtmann, Chief Data Officer of FaunaPhotonics, as we discuss their groundbreaking approach to biodiversity monitoring. I talk with Mads about the looming biodiversity crisis, the innovative solutions his team is developing to address the urgent need for scalable biodiversity monitoring, and the central role that humans have to play in all this. Find out how the FaunaPhotonics platform is employing advanced sensing technology and machine learning to protect ecosystems, why insects are such useful proxies for monitoring ecosystem health, and their successful partnerships with other domain experts and researchers. Our conversation also covers the broader implications of biodiversity loss, the role of public awareness in conservation, and the future of biodiversity monitoring. Join us for a comprehensive and insightful discussion on how technology can help safeguard our planet's future and ensure the stability of natural and human systems alike!
Key Points:
- Some background on Mads and his transition from academia to the private sector.
- The FaunaPhotonics platform and how it monitors biodiversity.
- An overview of the biodiversity crisis and the urgent need to address it.
- Understanding our connection to, and dependence on, nature.
- The risks that the biodiversity crisis poses for supply chains.
- FaunaPhotonics’ role in measuring the biodiversity crisis: why this protects ecosystems.
- Why insects are the best available proxy for measuring ecosystem health.
- How sensing technology and machine learning are utilized by FaunaPhotonics.
- Case studies showcasing the impact of FaunaPhotonics' technology.
- Future directions and innovations in biodiversity monitoring.
- Key challenges faced in developing and deploying biodiversity monitoring technology.
- FaunaPhotonics’ collaboration with other domain experts and researchers in the field.
- Why there is such an urgent need for scalable biodiversity monitoring.
- The importance of public awareness and education in addressing the biodiversity crisis.
- Mads’ advice to leaders of other AI-powered startups and the future of FaunaPhotonics.
Quotes:
“The clothes we wear, the food we eat, the water we drink, the material we use to build houses: everything comes from nature. And right now, we are destroying that foundation rapidly.” — Mads Fogtmann
“I think it’s important that we become more aware that we are an integral part of nature.” — Mads Fogtmann
“If you can’t measure it, then how can you protect it? [We offer a] solution that allows [companies] to measure [their impact on biodiversity] so they can protect it. We do this by using insect sensing. The reason we do this is that insects are so fundamental to the ecosystem.” — Mads Fogtmann
“Insects are the best proxy that you can have for actually measuring the health of [an] ecosystem.” — Mads Fogtmann
“There’s a huge need and an interest in ‘how we can actually scale biodiversity monitoring to kind of help us understand what’s going on with nature at the moment.’” — Mads Fogtmann
Links:
Mads Fogtmann on LinkedIn
FaunaPhotonics
FaunaPhotonics on LinkedIn
LinkedIn – Connect with Heather.
Computer Vision Insights Newsletter – A biweekly newsletter to help bring the latest machine learning and computer vision research to applications in people and planetary health.
Computer Vision Strategy Session – Not sure how to advance your computer vision project? Get unstuck with a clear set of next steps. Schedule a 1 hour strategy session now to advance your project.
[INTRODUCTION]
[00:00:03] HC: Welcome to Impact AI, brought to you by Pixel Scientia Labs. I’m your host, Heather Couture. On this podcast, I interview innovators and entrepreneurs about building a mission-driven, machine-learning-powered company. If you like what you hear, please subscribe to my newsletter to be notified about new episodes. Plus, follow the latest research in computer vision for people and planetary health. You can sign up at pixelscientia.com/newsletter.
[INTERVIEW]
[0:00:34.1] HC: Today, I’m joined by guest Mads Fogtmann, chief data officer of FaunaPhotonics, to talk about biodiversity. Mads, welcome to the show.
[0:00:42.5] MF: Thank you so much, Heather.
[0:00:44.1] HC: Mads, could you share a bit about your background and how that led you to FaunaPhotonics?
[0:00:47.9] MF: Absolutely. I’ve basically been working professionally with machine learning for around 15 years, originally out of academia. The last thing I did there was a postdoc on medical MR imaging of fetuses. After my academic career, I went to the private sector: I’ve been through risk management in banking and fingerprint recognition systems on cellular phones, and then we bootstrapped our own company doing consultancy work.
We bootstrapped the development of our own long shots right at the height of the AI hype, so that’s how we got started: going out and trying different kinds of projects with small and big companies, mostly [inaudible 0:01:26.4], also foreign companies. After four years, my partner in the company said he suffered from startup fatigue, and I guess I almost did too.
So, we stopped our company and I went into the more corporate world, first at Nuuday, which is the biggest Danish digital provider, as head of data science, and later I was chief data scientist at Novo Nordisk. After a year at Novo Nordisk, I was headhunted to become the chief data officer of FaunaPhotonics. That’s basically my journey to FaunaPhotonics: I’ve gone from a specialist role through a long transition into management.
I’m still finding my way a little bit as a manager; I still like to get involved. I’ve also gone from core algorithm development to being more concerned with how we actually operate machine learning, how we build supporting infrastructure, and so on. That’s also my role at FaunaPhotonics: to build the foundation for doing machine learning.
[0:02:30.1] HC: So, what does FaunaPhotonics do and why is it important?
[0:02:33.8] MF: So, we offer an online platform for biodiversity monitoring, built on our own sensor technology. The reason it’s really important to monitor biodiversity is that, right now, we have a very large biodiversity crisis. I think we have all heard about the climate crisis and the pollution crisis; on top of those, we now have this overarching crisis of the loss of biodiversity.
I think, since the 70s, we have lost about 40% of all species, and the current extinction rate is maybe around a hundred times higher than during any mass extinction event in history. Also, around 50% of the world’s GDP is at risk from nature loss. Everything comes from nature: the clothes we wear, the food we eat, the water we drink, the material we use to build houses.
And right now, we are destroying that foundation rapidly. We need to understand how much we rely on ecosystem services; we are destroying the balances that exist in ecosystems and thereby also destroying the foundation of human life, basically. I think it’s important that we become more aware that we are an integral part of nature.
When I was young, at least the way I grew up, we lived in the city and we would go to nature: we would go to the forest, we would go canoeing, or something like this. We never considered ourselves a part of nature, and I think it’s really important to understand that humans are, like everything else, a part of nature. Basically, this is why we’re building a platform that allows companies to measure their impact on biodiversity.
It also allows them to manage their risk related to the loss of biodiversity, which means that companies with a [inaudible 0:04:21.1], companies with a large supply chain based on nature, would have easier access to capital if they showed they can protect their supply chain from breakdowns. I think a lot of companies, since corona, have become aware of the effect a supply chain breakdown can have on their business.
So, I think most companies are becoming aware that loss of biodiversity actually poses a big risk to their supply chain. Also, there are a lot of reporting frameworks coming in that, either voluntarily or by law, require companies to report on their impact on biodiversity. We have the Corporate Sustainability Reporting Directive (CSRD) from the EU, which has parts that are also about biodiversity.
We also want our platform to enable customers, or the customers of businesses, to more easily report into these frameworks, which the 50,000 or so biggest companies in the European Union have to report into under the CSRD. So, this is really what we’re trying to build, and I think it’s also important to understand: if you can’t measure it, then how can you protect it?
So, this is where it’s important for us: we’re the solution that allows them to measure this impact so they can protect it, and we do this by using insect sensing. The reason we do this is that insects are so fundamental to the ecosystem. They clean water, they pollinate the flowers, they break down organic material, and they’re a major food source. So, this is why we believe that insects are the best proxy you can have for actually measuring the health of an ecosystem.
This is why we developed an insect sensing technique as the foundation of how we measure impact on biodiversity. But in the future, we also plan to integrate other sources of data into our platform, so it’s not just data that comes from our sensors but also other data layers. It can be other sensing technologies; it can also be soil conditions and things like this.
[0:06:25.1] HC: And what role does machine learning play in this technology to monitor biodiversity?
[0:06:30.2] MF: Well, as with any sensing technology, when you analyze the signals it produces, machine learning is obviously the tool you use. So of course, we use ML to analyze the signals from our sensors, and in the future, we also plan to use machine learning to produce what we call actionable insights, or recommendations.
So, a customer might want to know, “How can I improve the biodiversity of some site I have?” If we collect enough data about what customers are doing when they use our solution and what the surrounding area looks like, and take these variables into our platform, then in the future we can use them to predict: under less-than-ideal circumstances, we can recommend doing this and this in order to protect or improve the biodiversity of your site, basically.
[0:07:25.7] HC: Do you maybe have a couple of specific examples of the types of models that you train, you know, for example, the inputs and the outputs, how you set up that problem using machine learning?
[0:07:34.7] MF: Yeah. Our signal looks very much like an audio signal, so basically, from these one-dimensional signals, we try to detect events that look like insects, and occasionally we also try to classify what kind of insect we are observing from these signals. Then we take these events and aggregate them into statistics that the customer can use to understand their impact on biodiversity.
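The pipeline Mads describes, detecting insect-like events in a one-dimensional signal and then aggregating them into statistics, can be sketched roughly as follows. This is an illustrative toy, not FaunaPhotonics' actual detector: the windowed-RMS threshold, window size, and synthetic test signal are all assumptions.

```python
import numpy as np

def detect_insect_events(signal, fs, win_s=0.01, threshold=5.0):
    """Flag windows whose RMS energy exceeds `threshold` times the median
    window RMS, then group consecutive active windows into (start_s, end_s)
    events. A stand-in for a learned event detector."""
    win = int(win_s * fs)
    n = len(signal) // win
    rms = np.sqrt(np.mean(signal[: n * win].reshape(n, win) ** 2, axis=1))
    active = rms > threshold * np.median(rms)
    events, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            events.append((start * win / fs, i * win / fs))
            start = None
    if start is not None:
        events.append((start * win / fs, n * win / fs))
    return events

# Synthetic example: quiet sensor noise with two louder insect-like bursts.
rng = np.random.default_rng(0)
fs = 5000
sig = 0.01 * rng.standard_normal(fs * 2)
t = np.arange(500) / fs  # 0.1 s burst duration
sig[2000:2500] += 0.5 * np.sin(2 * np.pi * 120 * t)  # burst at 0.4 s
sig[7000:7500] += 0.5 * np.sin(2 * np.pi * 200 * t)  # burst at 1.4 s
events = detect_insect_events(sig, fs)
print(len(events), "events:", events)
# Aggregation step: a simple per-recording statistic for the customer.
events_per_minute = len(events) / (len(sig) / fs / 60)
```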
[0:08:00.9] HC: How do you go about gathering data to train these models and do you need to annotate it in the process as well?
[0:08:06.6] MF: Yeah, it depends a little on what the objective is, for example, whether you’re annotating for pollinator classification or species classification, which we used to do. You have to understand that we recently did a pivot: beforehand, we were very much a pest management company, trying to detect pests in fields in agriculture.
So, we had built a lot of species detection models, and that requires knowing what species is in front of your sensor, which is very difficult out in the field. It’s hard for humans to look at our signals and say, “This is a bumblebee” or “This is an aphid” or whatever it might be. So, what we have traditionally done is put insects of a known species into a cage or a greenhouse, so we know what kind of insect could be in front of the sensor.
Then, when we see an insect in front of our sensor, we know the label, the species, of that observed insect. This is a very difficult and very tedious thing to do, essentially because insects don’t behave the same way in a cage as they do in their natural environment. So, it takes a lot of work from specially trained expert biologists or entomologists to motivate those insects to fly when they’re in the cage.
We have also built a prototype of something we call an annotation system, which augments our insect sensors with cameras: when we see an insect, the sensor triggers four cameras that take high-resolution images covering our detection volume, and then you can go to the images and annotate what species of insect it is.
Today, we focus less on species, so we can go directly into the signals we get and annotate, “Here is the start and end of the insect signal.” It’s possible to annotate, “This is an insect; this is not an insect.” So, this is what we’re mostly doing these days: annotating when we see an insect and what, for example, the wing beat frequency of that insect is.
We can easily look at our signals and annotate this. So, this is how we actually gather data, and we definitely need annotation.
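Annotating the wing beat frequency of a detected event typically comes down to finding the dominant spectral peak in the event's signal. A minimal sketch, where the frequency band, sampling rate, and the synthetic "bumblebee-like" event are illustrative assumptions rather than FaunaPhotonics' method:

```python
import numpy as np

def wingbeat_frequency(event_signal, fs, fmin=20.0, fmax=1000.0):
    """Estimate the dominant wing beat frequency of a detected insect event
    as the largest spectral peak inside a plausible insect frequency band."""
    windowed = event_signal * np.hanning(len(event_signal))  # reduce leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(event_signal), d=1.0 / fs)
    band = (freqs >= fmin) & (freqs <= fmax)
    return freqs[band][np.argmax(spectrum[band])]

fs = 5000
t = np.arange(int(0.2 * fs)) / fs
# Synthetic event: 180 Hz fundamental, one harmonic, a little sensor noise.
event = np.sin(2 * np.pi * 180 * t) + 0.4 * np.sin(2 * np.pi * 360 * t)
event += 0.05 * np.random.default_rng(1).standard_normal(t.size)
f_est = wingbeat_frequency(event, fs)
print(round(f_est, 1))  # → 180.0
```

The band limits matter: without them, low-frequency drift or mains hum can dominate the spectrum and the argmax lands on interference rather than the wing beat.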
[0:10:24.1] HC: What kinds of challenges do you encounter in working with this data and trying to predict the insect types and their characteristics?
[0:10:31.4] MF: It depends a little bit; we have two sensor technologies in use. One is an infrared technique where we emit infrared light, the light is reflected off the insect, and we capture it. That gives a very nice signal with a very good signal-to-noise ratio, because it is a modulated signal: we modulate the light we send out, so we can kind of control the source of light and filter other sources out.
But that technique has a high power consumption, because it’s active: we need to emit a lot of light, and insects are very small, so a lot of the emitted power is never used for anything because it isn’t hitting insects. That means we have had to look into different sensor technologies, and we recently developed a new insect sensor technology based on electrostatic field sensing.
We are, of course, surrounded all the time by these electrostatic fields, and we use a sensor to measure them. When an insect flies into the field, it changes the electrostatic field: either it carries a static electric charge, or it just interferes with the electrostatic field that is there, and we capture that as a difference. But of course, you get interference from a lot of different sources when you are measuring the electrostatic field, especially in urban areas.
There are a lot of electronic devices that introduce and modulate the electrostatic field. So, there are a lot of other types of noise that have to be handled when we work with this new sensor technology. The advantages of the technology we have now are very low power consumption and a very large detection volume, but we also need to handle the different kinds of noise that come especially from other electric devices, including the device itself, basically.
So, our biggest challenge in working with this data is the interference from other sources, and we need to build that in when we are building a machine learning model: we need to show it all the kinds of noise, or interfering signals, it can encounter, so that it can learn to distinguish between insect signals and signals coming from, for example, electronic devices.
[0:12:40.2] HC: Are you able to remove any of that noise beforehand or is it really just a matter of you need a larger diversity of data covering the different types of noise to present to the model during training?
[0:12:50.1] MF: Some of the noise is easy to remove; for example, the common noise coming from 50 hertz mains power, or 60 hertz in the US, is easy to remove. But the problem is that a lot of this noise comes from electronic devices: they introduce very [inaudible 0:13:09.0] modulated signals that have very little pattern to them, and therefore it’s a little bit harder.
So, mainly what we do is try to add as many types of noise as we can and augment our signals as much as we can, and then train the algorithm to distinguish between insect signals and signals coming from other things, basically.
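The two ideas in this answer, subtracting easy-to-model mains hum and augmenting training data with recorded interference, could look roughly like this. Both functions are hypothetical sketches: a least-squares sinusoid fit is one simple way to remove 50/60 Hz interference, and the SNR-based mixing is a generic augmentation pattern, not FaunaPhotonics' implementation.

```python
import numpy as np

def remove_mains_hum(signal, fs, mains_hz=50.0, harmonics=3):
    """Subtract a least-squares fit of the mains frequency (50 Hz in Europe,
    60 Hz in the US) and its first few harmonics from the raw sensor signal."""
    t = np.arange(len(signal)) / fs
    cols = []
    for k in range(1, harmonics + 1):
        cols.append(np.sin(2 * np.pi * k * mains_hz * t))
        cols.append(np.cos(2 * np.pi * k * mains_hz * t))
    basis = np.stack(cols, axis=1)
    coeffs, *_ = np.linalg.lstsq(basis, signal, rcond=None)
    return signal - basis @ coeffs

def augment_with_noise(signal, noise_bank, rng, snr_db=10.0):
    """Mix a randomly chosen recorded interference clip into a clean training
    example at a given signal-to-noise ratio."""
    noise = noise_bank[rng.integers(len(noise_bank))]
    gain = np.sqrt(np.mean(signal**2) / (np.mean(noise**2) * 10 ** (snr_db / 10)))
    return signal + gain * noise

fs = 5000
t = np.arange(fs) / fs
insect = np.sin(2 * np.pi * 220 * t)            # clean wing-beat-like tone
hum = 0.8 * np.sin(2 * np.pi * 50 * t + 0.3)    # mains interference
cleaned = remove_mains_hum(insect + hum, fs)
print(round(float(np.max(np.abs(cleaned - insect))), 3))  # → 0.0

# Augmentation: mix one recorded "device noise" clip into the clean example.
noise_bank = [0.1 * np.random.default_rng(2).standard_normal(fs)]
noisy_example = augment_with_noise(insect, noise_bank, np.random.default_rng(3))
```

The hum fit works because stationary mains interference is well described by a few fixed sinusoids; the broadband, weakly patterned device noise Mads mentions has no such compact model, which is why it is handled by augmentation instead.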
[0:13:31.6] HC: It sounds like there is a lot of knowledge related to what is captured in the data, what the modalities of noise are, what are the characteristics of different insects. How do your machine learning developers collaborate with other domain experts in order to build the knowledge they need into your models to make it work?
[0:13:49.6] MF: We have roughly a one-to-one relation: we have as many biologists and entomologists hired as data scientists, basically. They help us both with planning development and with gathering training sets, and with explanation. If we see something in the data we cannot explain, having a domain expert often helps a lot to explain what is going on, especially when there are peaks.
For example, we might see peaks in insect activity and not understand why; sometimes there is a natural biological explanation for it, and this is where a biologist or an entomologist can help us a lot. They also do a lot of the validation when we validate our models, which is a little bit hard, especially when we are doing species or functional classification, for example, pollinators.
It’s difficult for us to validate these models, because we can’t simply look at the signals and say, “Hey, this is a bumblebee.” We need other ways of validating them, and this is where it helps a lot to work with domain experts, the biologists and entomologists on our team.
[0:14:57.7] HC: So, how do you go about validating models?
[0:14:59.9] MF: One of the biggest challenges is actually convincing other domain experts, the trained entomologists and biologists who work with biodiversity. Over many years, they have of course developed their own techniques to measure biodiversity, and these are normally based on traditional trapping methods, where you trap insects and then identify them at the family or species level.
You then build your biodiversity indexes on this manual ground truthing of trapping data. So, in order to convince these experts that our method makes sense, we do a lot of correlation studies against traditional trapping techniques. But of course, those techniques are biased in many ways, as ours can also be biased, because we’re only looking at flying insects, for example.
So, it’s sometimes challenging to just look at a correlation: if the correlation is 0.7, is that good or is it bad? Validating this further is basically still a work in progress. Of course, for insect classification, where we have annotations, we have ground truth data of a sort, so we can do accuracy estimates on real data.
We also look very carefully for any kind of bias. For example, are we more likely to detect insects with a high signal, or with high wing beat frequencies rather than low wing beat frequencies? That would affect our measures of biodiversity: if we have a bias toward a certain type of insect, then a site ends up looking less diverse than it actually is.
So yeah, we study in a lot of detail whether there is any bias in our detections, toward certain types of insects, that we need to be aware of. I hope that answers the question around validation.
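The bias check Mads describes, whether the detector favors insects with certain wing beat frequencies, can be sketched as a per-bin detection-rate comparison against annotated ground truth. The bin edges and the tiny ground-truth sample below are invented for illustration:

```python
import numpy as np

def detection_bias_report(true_freqs, detected_mask, bins=(0, 100, 200, 400, 1000)):
    """Compute the detection rate within each wing-beat-frequency bin.
    A large spread across bins suggests the detector is biased toward certain
    insect groups, which would make a site look less diverse than it is."""
    true_freqs = np.asarray(true_freqs, dtype=float)
    detected_mask = np.asarray(detected_mask, dtype=bool)
    report = {}
    for lo, hi in zip(bins[:-1], bins[1:]):
        in_bin = (true_freqs >= lo) & (true_freqs < hi)
        if in_bin.any():
            report[f"{lo}-{hi} Hz"] = float(detected_mask[in_bin].mean())
    return report

# Hypothetical annotations: wing beat frequency of each ground-truth insect
# and whether the detector found it.
freqs =    [50, 80, 150, 180, 250, 300, 500, 700]
detected = [ 0,  0,   1,   1,   1,   1,   1,   0]
report = detection_bias_report(freqs, detected)
print(report)
```

Here the slow-flapping insects in the 0-100 Hz bin are never detected, exactly the kind of systematic miss that would have to be corrected before the event counts feed into a biodiversity statistic.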
[0:16:51.7] HC: Why is now the right time to build this technology?
[0:16:54.4] MF: Our technology has been built over a decade, along with the learnings we have. As I said earlier, we recently did a complete pivot into biodiversity; traditionally, we were a pest management company, and now we’re seeing that there’s a huge need and interest in how we can actually scale biodiversity monitoring to help us understand what’s going on with nature at the moment.
Understanding what impact you have and how you can reduce it. So, it’s not really the technology that makes now the right time; it’s the application that is ready right now. There’s a need for techniques that can scale when it comes to biodiversity monitoring, and this is what we can provide, as one of the only companies.
I think we are one of the only companies in the world right now that can actually deliver a credible, sensor-based, and scalable technology for measuring impact on biodiversity. So, it’s not the technology; it’s more that it’s the right time for the application.
[0:17:54.6] HC: Well, that’s definitely a good time to be developing it then if it’s that – so needed in the market. Is there any advice you could offer to other leaders of AI-powered startups?
[0:18:03.4] MF: I think there’s a lot of focus on the potential of AI and a lot of hype around it, and often we forget about the value we create. We talk a lot about potential but don’t really talk to the customers, and I have the feeling a lot of people don’t really understand what the customers’ needs are. I would definitely focus on what we are actually solving with AI. There’s a tendency to focus on something just because it’s AI, and not really on what we actually solve by using AI and what kind of application we have for it.
So, I think it’s important that startups using AI spend more time talking to customers about how they can solve their problems and what value proposition they offer if customers use their product, rather than just focusing on AI and the potential of AI. Focus on how you really help the customers. I think that would be my advice.
[0:18:53.0] HC: Sounds like pretty solid advice to me, and finally, where do you see the impact of FaunaPhotonics in three to five years?
[0:19:00.0] MF: Hopefully, we are one of the accepted standards for evaluating the health of an ecosystem, measuring the impact on, or the evolution of, an ecosystem, and that allows us, again, to support businesses in protecting nature and, essentially, their own business. So, hopefully, in three to five years, we are an accepted standard for measuring impact on biodiversity.
[0:19:26.9] HC: Mads, I appreciate your insights today. I think this has been valuable to many listeners. Where can people find out more about you online?
[0:19:34.2] MF: I think people should go to FaunaPhotonics.com and check it out, or you can also follow us on LinkedIn. We are always very active on LinkedIn, posting new developments and collaborations, so either find us on LinkedIn or just go to FaunaPhotonics.com.
[0:19:51.9] HC: Perfect, I’ll link to both of those in the show notes. Thanks for joining me today.
[0:19:55.9] MF: Thank you, Heather. Thank you for having me.
[0:19:57.7] HC: All right, everyone, thanks for listening. I’m Heather Couture and I hope you join me again next time for Impact AI.
[END OF INTERVIEW]
[0:20:07.4] HC: Thank you for listening to Impact AI. If you enjoyed this episode, please subscribe and share with a friend, and if you’d like to learn more about computer vision applications for people and planetary health, you can sign up for my newsletter at pixelscientia.com/newsletter.
[END]