Measuring plant health is essential for various applications, such as agriculture and conservation biology. Being able to measure plant health effectively enhances resource use, mitigation measures, and sustainable use of ecosystems in a rapidly changing world. But how is this done?

In this episode, I am joined by Kevin Lang, the CEO and president of Agerpoint, a company that builds tools for measuring and monitoring plants to gather accurate data on forests and crops. Their spatial intelligence platform is designed to unlock valuable insights for sustainable food systems and climate solutions.

In our conversation, Kevin explains how Agerpoint harnesses the power of AI, machine learning, and 3D modeling to enhance crop management, reduce resource inputs, and promote regenerative farming practices. Learn about the innovative ways Agerpoint is leveraging existing technology, such as smartphones, to make their products more accessible and affordable. Kevin also delves into the types of data Agerpoint uses, the validation process, the challenges of analyzing plant health, and much more. Tune in as we explore the fusion of technology and nature and how it's helping shape a more sustainable and efficient future with Kevin Lang from Agerpoint!

Key Points:
  • Kevin’s professional background and the road to Agerpoint.
  • Details about Agerpoint and what the company specializes in.
  • How machine learning forms the core of Agerpoint’s technology.
  • The range of data modalities Agerpoint uses for its technology.
  • Data challenges and insights into the validation process.
  • Bridging the gap between technology and biology.
  • Agerpoint's approach to recruiting top talent.
  • Measuring the impact of Agerpoint’s technology.
  • Essential advice for AI startups: align with your investors, board, and team.
  • What to expect from Agerpoint in the future.


“The combination of the point cloud and the machine learning and automation and then putting this all together in a cloud-based system, where we can fuse these data layers together, is unique.” — Kevin Lang

“Machine learning really plays a critical role across multiple processes and products in our business, and it’s really the core of the Agerpoint platform.” — Kevin Lang

“Validation is just as much of a scientific challenge as it is a change management and communication challenge with your clients.” — Kevin Lang

“The impact [of our product] is about access and affordability.” — Kevin Lang

“We are building a company and a capability that we believe represents the next wave of digital agriculture and forestry.” — Kevin Lang


Kevin Lang on LinkedIn
Kevin Lang on X
Agerpoint Capture iOS App
Know Your Carbon

Resources for Computer Vision Teams:

LinkedIn – Connect with Heather.
Computer Vision Insights Newsletter – A biweekly newsletter to help bring the latest machine learning and computer vision research to applications in people and planetary health.
Computer Vision Strategy Session – Not sure how to advance your computer vision project? Get unstuck with a clear set of next steps. Schedule a 1 hour strategy session now to advance your project.
Foundation Model Assessment – Foundation models are popping up everywhere – do you need one for your proprietary image dataset? Get a clear perspective on whether you can benefit from a domain-specific foundation model.



[00:00:03] HC: Welcome to Impact AI, brought to you by Pixel Scientia Labs. I’m your host, Heather Couture. On this podcast, I interview innovators and entrepreneurs about building a mission-driven, machine-learning-powered company. If you like what you hear, please subscribe to my newsletter to be notified about new episodes. Plus, follow the latest research in computer vision for people and planetary health. You can sign up at


[0:00:32.6] HC: Today, I’m joined by guest Kevin Lang, CEO and President of Agerpoint, to talk about measuring and monitoring plants. Kevin, welcome to the show.

[0:00:43.0] KL: Thanks, Heather.

[0:00:43.5] HC: Kevin, could you share a bit about your background and how that led you to Agerpoint?

[0:00:47.9] KL: Sure. So, I come from a multi-generation farming family with roots in the Southern US. I studied mechanical engineering in college and spent the early part of my career in product development roles at John Deere. So, I designed green and yellow tractors, spent time in product design, testing, and reliability, and did a lot of work on current production products but also on some frontier tech initiatives, and while I was there, I got a business degree part-time.

I left Deere and entered the management consulting world at Deloitte Consulting, in what was at the time their strategy and operations group (not sure what they’ve rebranded that part to after the reorganization). During my consulting career, I had a chance to work with an NGO on a secondment, and my role with this NGO focused on creating opportunities for tech collaboration and dialogue between startups and larger enterprises, academia, and governments. I really became inspired by all the entrepreneurs that were solving these grand challenges related to food security, climate change, and urbanization. In particular, I saw opportunities with digital technologies like remote sensing, drones, satellite imagery, IoT sensors, and AI, and thought about how we could solve some of the same data, information, and implementation challenges I experienced earlier in my career at Deloitte and John Deere.

Soon after, I began my foray into the startup world, working with venture-backed companies, and started to lead the agriculture business for a commercial drone analytics and services company. That was really when the commercial drone market was in its infancy, when the FAA had just approved Part 107 certification and drones were starting to become more prevalent in the sky for commercial applications.

It was a really exciting time to be in the remote sensing space, but at the same point, I started to see all these challenges with UAVs, and specifically how ground information was necessary in order to validate these models that we were building with drones, right? That’s really what led me to discover Agerpoint and become interested in what the company was building.

[0:03:07.6] HC: So, what does Agerpoint do? What kinds of products and services do you provide?

[0:03:11.2] KL: Yeah, so, Agerpoint develops technology to measure and monitor plants. That includes anything from agricultural crops, root crops, and permanent crops, to forests, and combinations of the two where you have ecosystems like agroforestry. We achieve this through a combination of different technologies. I mentioned the challenges with ground information. Agerpoint focuses on terrestrial data from LiDAR sensors and cameras that you can get in the field.

These can be sensors on the back of a tractor, an ATV, or a field vehicle, but now even sensors like your smartphone, as the technology has evolved there. This ground component gives us accurate plant-level details that tell us a lot about a crop or a tree, and then we combine that with spaceborne data from satellites and other aerial data to look at the broad acreage picture. The combination of these two really creates a comprehensive understanding of what’s happening, and Agerpoint fuses these datasets together in a cloud platform. So, I would say our technology and our differentiators are really around this 3D model, the point cloud, that we can generate from images or from LiDAR data. Why this is important is understanding how much of a disease or a specific problem you’ll see in the field, or if we’re trying to understand better yields or harvest information, how much of that actually exists within a plant or within an area.

So, being able to quantify this in three dimensions, how we would really measure it in the real world, helps us to make better decisions and fuse that with other broad-scale information. On the 3D component and the point clouds, Agerpoint has spent about a decade developing, I would say pioneering, what’s possible in agriculture and nature from LiDAR and point clouds, and the company has 10 patents on those processes as well. So, we’re starting to see that become more popular.

The second piece of Agerpoint’s technology stack is really built around machine learning and artificial intelligence, and I would love to dive more into what we’re doing and how we’re using machine learning. The third is really about automation. Heather, I mentioned, you know, we use LiDAR data, and it’s becoming more prevalent in the nature scene, in quantifying biomass in forestry, for example.

But what Agerpoint has built is a way to automatically extract insights from these point clouds and from LiDAR data. So, we’re not relying on a data scientist or a human to extract these insights, like heights and diameters and biomass and these important plant metrics. So, the combination of the point cloud and the machine learning and automation and then putting this all together in a cloud-based system, where we can fuse these data layers together, is unique.

And the last thing I’ll mention is that about a year ago, we launched our first smartphone app, currently on the iOS App Store. We’re using the LiDAR sensor that is actually embedded in the iPhone or iPad to bring the same technology that we built for very expensive research-grade sensors to anyone with a smartphone. So, you can download the app, it’s called Agerpoint Capture, from the App Store.

You can take captures in the field, we can create three-dimensional models on your phone, and then you can upload those to the cloud, get subsequent analytics from these captures, and fuse them with other datasets. So, the smartphone piece has really fundamentally changed how we can provide access to this technology to really anybody in the world.

[0:06:43.7] HC: So, you mentioned machine learning there. What role does machine learning play in your technology to measure and monitor plants and in the 3D point cloud technology?

[0:06:53.3] KL: I would say machine learning really plays a critical role across multiple processes and products in our business, and it’s really the core of the Agerpoint platform. One of the primary uses is just object detection. Imagine you’re using the smartphone product, you’re in the field capturing data, and it feels like you’re recording a video, but we’re essentially taking multiple frames per second. All these different frames are then geo-tagged and geo-located, and we can run object detection algorithms on a specific image or frame.

And we’re looking for, one, the presence of vegetation, a specific species, the presence of a disease, or non-crop material, and we can theoretically look for as many classes as we have models built for on that single image. We can take what we classify from the images and frames and then carry that into the point cloud itself. A lot of the work we do for customers, let’s say in agriculture, is, well, I’m going to look at a citrus tree.

I’m going to take a capture using my smartphone, and Agerpoint is going to tell me, you know, what percentage of that scene is canopy, what percentage of the scene is fruit, what percentage of that fruit has disease, and then what’s the trunk diameter and the specific species. We can see all of that from the visual data that we capture using a smartphone or any other type of sensor that you have access to with a camera.

That’s the primary use, Heather, finding objects within a 2D image. But then, once it’s in three dimensions as well, we’ve got similar types of algorithms that can extract the morphological dimensions of, say, plant height and plant canopy density, and then derivative measurements like biomass that are so critical for carbon markets and how much carbon a tree or a plant can actually sequester.
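To make the scene-composition idea concrete, here is a minimal sketch of turning a per-pixel classification mask into the kinds of percentages Kevin describes. The class IDs and names are invented for illustration; they are not Agerpoint’s actual label taxonomy.

```python
import numpy as np

# Hypothetical class IDs for a segmentation mask (illustrative only).
CANOPY, FRUIT, DISEASED_FRUIT, TRUNK, BACKGROUND = 0, 1, 2, 3, 4

def scene_composition(mask: np.ndarray) -> dict:
    """Summarize a per-pixel class mask as scene percentages."""
    fruit_px = int(np.sum(mask == FRUIT) + np.sum(mask == DISEASED_FRUIT))
    diseased_px = int(np.sum(mask == DISEASED_FRUIT))
    return {
        # Share of all pixels that are canopy or fruit.
        "canopy_pct": float(np.mean(mask == CANOPY)) * 100.0,
        "fruit_pct": 100.0 * fruit_px / mask.size,
        # Share of fruit pixels that show disease (guard against no fruit).
        "diseased_fruit_pct": 100.0 * diseased_px / fruit_px if fruit_px else 0.0,
    }
```

A real pipeline would aggregate these statistics across many geo-tagged frames of a capture rather than a single mask, but the per-frame arithmetic is this simple.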

I would say we also use machine learning processes to help train our models. We create synthetic data that we use to accelerate model training and validation, and we’re using this for predictive analytics as well. So, let’s say we’ve got a rich set of ground information: we have Agerpoint analytics, we have customer-fused data from physical observations, and we have soil sensor data outputs that are in the Agerpoint cloud.

We want to understand, “Well, which one of these measurements actually correlates to, you know, some type of outcome, like yield or water stress?” And so, we can use predictive analytics and unsupervised training methods that actually find these trends and correlations across these different datasets. Machine learning is extremely important for us.
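The kind of correlation screening Kevin mentions can be sketched in a few lines. The feature names below are invented for illustration, and a production system would use cross-validation and more robust statistics rather than raw Pearson correlation.

```python
import numpy as np

def rank_correlates(features: dict, outcome: np.ndarray) -> list:
    """Rank ground measurements by absolute Pearson correlation with an outcome.

    `features` maps a measurement name to a 1-D array of values, one per
    field or plot; `outcome` is the matching array of, e.g., yields.
    A zero-variance feature would yield NaN, so constant columns should be
    filtered out beforehand.
    """
    scored = []
    for name, values in features.items():
        r = np.corrcoef(values, outcome)[0, 1]
        scored.append((name, float(r)))
    # Strongest (positive or negative) correlations first.
    return sorted(scored, key=lambda t: abs(t[1]), reverse=True)
```

Calling this on a table of fused ground measurements against observed yield surfaces the candidate drivers worth modeling further.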

[0:09:27.5] HC: So, you’re working with a few different types of data there. You mentioned smartphones, photos, and point cloud data; were there other modalities of data in there?

[0:09:35.7] KL: This is the challenge for any good data science or remote sensing analytics company. There are multiple different sensor inputs and bands of data that you theoretically can use to find an outcome. The balance and the tension is, you know, which one of these bands can we “scalably” use to acquire data. So, for example, hyperspectral sensors, or near-infrared and red-edge bands.

They’re really useful for water stress, but the number of sensors that are available, especially on the ground, is limited. So, Agerpoint has performed work and has projects where we’ve used these other bands for clients, but we’re really starting to focus more on the visible spectrum plus LiDAR, so RGB plus LiDAR. What can we see from these bands that can take the stress off of a human eye in making decisions, and also make multiple derivative decisions off the same dataset?

So, we’re not only looking for how many plants have emerged in the scene and what the species of the plant is, but also tough indicators of disease. A lot of the work we do is looking at green on green, so green fruit on green vegetation, and helping to predict harvest several months in advance because of what we can see off the smartphone. But you know, Heather, there are all sorts of other geospatial data streams that we can work with partners on to fuse this information together and get this comprehensive view of a field.

[0:10:55.8] HC: So, with those most common modalities that you use, the visible plus LiDAR combination, what kinds of challenges do you encounter in working with them and in training machine learning models based on them?

[0:11:06.9] KL: We use point cloud data, and it’s tough to find the right balance between point cloud density, how rich and dense the point cloud is, versus the processing speeds that we want to work with, and also how we visualize this in a cloud-based environment that allows a user to actually interact with that data.

Really optimizing the parameters on resolution and point cloud density and size, to get the right exchange of information from the clients, from the phone, and from other data sources into the cloud, and then being able to actually, you know, do something with that. That’s one challenge that we’ve worked on, you know, over a decade to really optimize.
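One common way to trade point cloud density for processing speed, as a rough illustration of the optimization Kevin describes (not Agerpoint’s actual pipeline), is voxel-grid downsampling: all points falling in the same cubic cell are replaced by their centroid.

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Thin an (n, d) point cloud by keeping one centroid per occupied voxel."""
    # Assign each point to an integer voxel index.
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)  # guard against NumPy-version shape quirks
    n_voxels = int(inverse.max()) + 1
    # Accumulate per-voxel sums and counts, then average.
    sums = np.zeros((n_voxels, points.shape[1]))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]
```

A larger `voxel_size` means a smaller, faster-to-process and faster-to-render cloud, at the cost of fine geometric detail, which is exactly the resolution-versus-speed tension described above.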

And then some of the other questions you have to ask and tradeoffs you have to make are, “What type of processing and computation do you want to use?” So, for example, there’s photogrammetry, which is, you know, a well-known and mature field for being able to generate point clouds, right?

And that’s, you know, taking one image that overlaps with another image, finding where I see intersecting points, and I get an extremely accurate view of a 3D model, but the tradeoff, the downside, is you need a lot of data, right? You have to make sure you have extreme overlap. There are new technologies available in the space that are using a bit of artificial intelligence.

For example, NeRFs, neural radiance fields, and other similar types of approaches that are making educated predictions on what should be seen based on less data, right? And so, the challenge that we have to address is, first and foremost, we need to provide accurate measurements and insights back to our customers, but we want to make sure that we provide the best representation of what’s there at the lowest cost, so the customers can actually access our technology.

But you know, we have really made some significant strides in the last 12 months in being able to use some of these alternative technologies. They’re providing us with just as accurate outcomes and are moving at such a pace that they present an exciting future for using point clouds in a non-photogrammetry environment.

I would say the last challenge is that we collect datasets at different times during a calendar year, during an agricultural growing season, or during the life cycle of a forest from planting to, you know, survivability. It’s making sure that we align these different datasets in the same spatial context, to know that the plant I’m looking at in April is the same plant I want to evaluate in August.

So there’s some work that we’ve done on the platform to make sure that our customers can track the same natural asset at different times.
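A toy version of that temporal co-registration is to match each plant from an April capture to its nearest geolocated neighbor in an August capture, within a distance threshold. This is illustrative only; a real pipeline would also correct for GPS drift and use the point cloud geometry itself.

```python
import numpy as np

def match_plants(april_xy: np.ndarray, august_xy: np.ndarray,
                 max_dist: float) -> dict:
    """Pair each April plant with its nearest August neighbor within max_dist.

    Both inputs are (n, 2) arrays of local planar coordinates (e.g., meters).
    Returns {april_index: august_index} for plants with a match.
    """
    matches = {}
    for i, p in enumerate(april_xy):
        d = np.linalg.norm(august_xy - p, axis=1)  # distances to all candidates
        j = int(np.argmin(d))
        if d[j] <= max_dist:  # unmatched plants are simply dropped
            matches[i] = j
    return matches
```

With the index map in hand, a platform can track the same natural asset across capture dates, which is what makes season-over-season comparisons meaningful.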

[0:13:46.5] HC: How do you go about validating your machine-learning models? For the object detectors working off of 2D images, I imagine that’s relatively straightforward, but for some of the other models and data types you’re working with, you know, I can imagine it’s much more complicated.

[0:14:01.0] KL: Validation is just as much of a scientific challenge as it is a change management and communication challenge with your clients. You can have the best model and you can trust it internally, you can know it’s 100% accurate but if it differs from a customer’s expectation of what that model is telling them, then the adoption is going to be low or difficult.

So I think, backing up, the most important exercise you can commit to when you’re building a model, or when you want to actually implement a model with a customer, is just to align on success prior to whatever you’re engaging on, whether it’s a proof of concept or they’re looking to adopt the platform. So it’s aligning on what is truth, right? Is truth the client’s annotation of what’s happening in a scene?

Is it this percentage of disease, is it this percent of defoliation on a leaf, or is it aligning on some new digital method to define truth? Is it looking frame by frame alongside your client, alongside the digital agronomists that sit within the Agerpoint walls? So, it’s really, Heather, some combination of our clients helping us to understand what exists in that scene, and then us validating it and becoming educated on that internally.

And then, you know, slowly and steadily, taking some of the onus off of our clients to help train our models. But I would say, you know, we are in a fortunate position at Agerpoint to be able to work with a lot of the top enterprises in the agrochemical space and the food and beverage space, where we have access to some of the leading PhDs and researchers in the field who know exactly what they’re looking for and can help train and validate models alongside us that then get deployed in their accounts and platforms at scale.

But yeah, communication alignment upfront is really important when we’re validating machine learning models.
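One standard way to quantify the model-versus-annotator agreement discussed here is Cohen’s kappa, which corrects raw agreement for chance. This sketch is a generic implementation, not Agerpoint’s validation procedure.

```python
def cohens_kappa(model_labels: list, client_labels: list) -> float:
    """Chance-corrected agreement between model output and client annotations.

    Returns 1.0 for perfect agreement and 0.0 when agreement is no better
    than what the two label distributions would produce by chance.
    """
    assert len(model_labels) == len(client_labels)
    n = len(model_labels)
    classes = sorted(set(model_labels) | set(client_labels))
    # Observed agreement: fraction of items where both sides chose the same class.
    observed = sum(m == c for m, c in zip(model_labels, client_labels)) / n
    # Expected agreement under independence of the two label distributions.
    expected = sum(
        (model_labels.count(k) / n) * (client_labels.count(k) / n)
        for k in classes
    )
    if expected == 1.0:  # degenerate case: both sides used a single class
        return 1.0
    return (observed - expected) / (1 - expected)
```

Reporting a chance-corrected score alongside raw accuracy helps with the communication challenge Kevin describes, since it separates genuine model skill from agreement that follows from class imbalance.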

[0:15:52.6] HC: Yeah, so it sounds like getting the problem definition right from the start is part of how you tackle it there.

[0:15:58.7] KL: Absolutely.

[0:15:59.4] HC: So, how do your machine learning developers collaborate with agronomists or other domain experts throughout the project to ensure that they build the most effective models and get the knowledge they need in order to do that?

[0:16:11.5] KL: Yeah, I think some of that was addressed a bit in our last response there. It’s communication and then when we work with enterprises, just making sure that we have these clear expectations where they help to train and validate what we’re seeing in an image or in a point cloud or in a scene but then it’s also making sure that we provide a very clear process and easy tools for that communication.

Like, there are some ad hoc ways you can communicate with the client; you can get images or use spreadsheets to build training datasets, but we’re building automated tools into the platform itself, where a customer can log in, they can see captures, they can see the visualization of that point cloud, and they can make selections on the image or on the point cloud itself, so there’s no way to misinterpret where and what is being seen. So it’s a combination of, you know, getting the right people, providing new tools where there are no opportunities, or limited opportunities, for error, and then having someone on your team at the intersection of the industry that you’re playing in who has the right technical skills. So, at Agerpoint, we have PhDs in digital agronomy who bridge that gap between our clients and the biological nature of what manifests itself visually in pixels, and why that actually occurs.

And, you know, at times we’ll also be in the field with our clients. So, we’ve had our field team and our digital solutions team in root crop fields all over North America this year looking at root crop emergence and disease. We’ve had teams in agroforestry applications and scenes in Africa, and we’ve been in greenhouse facilities where we’re testing new disease- and drought-resistant seed genetics.

So as a startup, and as someone that, you know, maybe majors in technology but then kind of minors in the application of that technology, I can’t stress enough how important it is to have those types of resources on your team who can interpret both languages.

[0:18:09.4] HC: And for the machine learning developers themselves, hiring for this skill set can be very challenging lately due to the high demand for professionals in this field. What approaches to recruiting and onboarding have been most successful for your team?

[0:18:22.2] KL: Yeah, I think recruiting top talent is always challenging, regardless of whether it’s machine learning or any other industry; you always want to get top talent in your doors that is really aligned with your culture and vision. I think, from a recruiting standpoint, what we found successful is to just find ways to tell your story. I believe we have a great story at Agerpoint; we are engaging in some impactful ways to use technology to solve problems in agriculture and nature.

So, hiring a good person to lead marketing, create videos, tell stories alongside your clients, and, you know, having a megaphone to be able to talk about what you’re doing is number one, right? And then, once you actually have recruits that you’re working through the talent acquisition pipeline, it’s important to just be transparent about your goals as a company, your mission, what’s expected from your employees day to day, you know, what life at Agerpoint is like. We found that it’s effective to use trials or case studies, to bring someone on temporarily or in a contract role, or just to have an experience where the candidate is talking with the team, right? So, work with the team that you’re going to be a part of, get feedback internally from our team, and then make sure the candidate feels like this is the right fit.

Just this week, we’re having our all-hands in person, which we aim to do a few times a year. We’re inviting some of the recruits in our pipeline to, you know, sit down with our team, hear about the vision for the company, and hear about the pain points. It’s a similar process to managing a college athletics program, right? You have recruits, you want to get them on campus, and you want to make sure they’re the right fit for both parties.

You know, as a startup, you’ve got the flexibility to be able to, you know, employ some of those practices. You’ve also got to spend dedicated time on it, right? It’s so important; the lifeblood of your company is talent. So, just making sure that you’re prioritizing and bringing in really talented people, and that you take care of those people once they’re in the walls.

[0:20:16.4] HC: Thinking more broadly about your goals at Agerpoint, how do you measure the impact of your technology?

[0:20:21.6] KL: There are a lot of ways we can measure it across the different industries that we address. In agriculture, it’s really about how we reduce the amount of inputs required to produce an output, to produce a crop to feed the world, and alongside of that, you know, how we contribute to regenerative farming practices as well. How do we leave the soil better for the next generation, right?

How are we providing an education feedback loop as well to make this a reality? What’s critical to solving some of these problems is the affordability of and access to our tools, and that is what has largely driven Agerpoint, besides wanting to create a good business, right? It has driven the need for Agerpoint to use smartphones, to use the phone as a sensor, just because of the affordability.

And the access it provides to, say, a smallholder farmer who, you know, is often most affected by some of the trends in climate change but at least has the ability to affordably benefit from the technology, right? So, when we talk as a team and when we’re designing new products, we are trying to innovate to reduce the cost to extract data. So, get the right amount of data, optimize processing, and work with our processing partners to be sure that we’re optimizing how much it actually costs to store data, process data, create point clouds, and run machine learning models.

And then there’s the vision of deploying a lot of our analytics on the device in the field, right? That’s the utopian state of Agerpoint: the instant answers that you can get from deploying our technology, using machine learning, using machine vision, in the field. The impact is about access and affordability.

Whether that’s in agriculture, as mentioned, to help reduce inputs and to optimally schedule labor to the right part of the field at the right time to reduce spoilage, and all the water and resources that went into fruit that then spoils because we can’t get labor to the right spot, those are the types of impacts that we like to measure. And then on the nature side, you know, we are doing a lot of work now in the carbon markets.

And having a critical data layer for digital monitoring, reporting, and verification, this digital MRV product that Agerpoint has created called Know Your Carbon, is really helping to ensure that carbon projects, and conservation, restoration, and reforestation projects, are selecting the right projects by providing information that’s actually affordable and accessible, and then effectively baselining new projects and monitoring those projects throughout the project life cycle.

In general, you know, these new digital MRV tools, with better and more accurate data, are holding the enterprises that are most responsible for greenhouse gas emissions more accountable to their net-zero claims and their nature-based offset commitments, and, you know, we’re looking to continually find ways to quantify that impact for agriculture and nature.

[0:23:14.1] HC: Is there any advice you could offer to other leaders of AI-powered startups?

[0:23:18.5] KL: You know, I would say the number one piece of advice I would have is alignment with your investors, with your board, and with your team. We’re in a really outstanding situation at Agerpoint where we’re aligned across the organization, right? This works for any company, large or small, but especially with startups: making sure that your investors are in full communication with your plans.

You utilize your board and your team as a sounding board to test new opportunities and pursuits. And the other piece, specifically for AI companies, is, you know, don’t necessarily fixate on the model itself. You know, the product is not just the model but how the model is delivered, how that machine learning model, that AI model or AI tool, is delivered and used by your customers. So, put yourself in your customer’s shoes, in the position where they have to go potentially acquire data or, you know, run an algorithm, and ask what the output actually tells them, and then what they’re actually going to do with that information.

So at Agerpoint, we really want to make the process of capturing data, analyzing data, and receiving that output [inaudible 0:24:23.1] and allow customers to bring their AI models on our platform, you know, if they have a plant-specific model or a disease-specific model, and to just say, “Wow, Agerpoint really makes this process easy for me.” It’s not about the model itself; it’s the experience of using the model.

And the last piece of advice is always just to define success upfront when you’re building, right? So, what does success look like and what does failure look like, so you can quickly, you know, move away from initiatives and projects that aren’t really providing the ROI and the outcome that you want.

[0:24:56.8] HC: And finally, where do you see the impact of Agerpoint in three to five years?

[0:25:00.5] KL: Yeah. I mean, as a startup, you prepare for multiple future scenarios. You balance impact and values with exit opportunities and the views of your stakeholders, but in general, we are building a company and a capability that we believe represents the next wave of digital agriculture and forestry. We’re building pathways for any sensor that touches a field or forest to incorporate Agerpoint’s analytics engine, so that we can provide more accessible crop and tree data to better manage and nurture our planet’s natural assets.

And you know, that’s a vision and impact that all of our employees are behind and you know, we’re really excited to continue building over the next few years.

[0:25:39.1] HC: This has been great. Kevin, your team at Agerpoint is doing some really interesting work for agriculture. I expect the insights you’ve shared will be valuable to other AI companies. Where can people find out more about you online?

[0:25:50.3] KL: Thanks, Heather. You can find us on our website at We’re also on LinkedIn and Twitter/X.

[0:25:58.1] HC: Perfect. Thanks for joining me today.

[0:26:00.0] KL: Thanks, Heather, it’s been a pleasure.

[0:26:01.1] HC: All right everyone, thanks for listening. I’m Heather Couture and I hope you join me again next time for Impact AI.


[0:26:11.4] HC: Thank you for listening to Impact AI. If you enjoyed this episode, please subscribe and share with a friend, and if you’d like to learn more about computer vision applications for people and planetary health, you can sign up for my newsletter at