How AI Happens

AI Drones in Agriculture with Precision AI's Heather Clair

Episode Summary

Today we are joined by Product Manager Heather Clair to discuss how Precision AI is disrupting the agricultural industry by taking a dramatically different approach to the traditional land-based, high-clearance crop-spraying model. Instead, precision.ai uses AI-trained drones to target individual plants.

Episode Notes

In this episode, Heather shares her background in both farming and commerce, and explains how her in-field experience and insights aid both her and the AI team in the development cycle. We learn about the advantages of drone-based precision spraying, the function of the herbicides that Precision AI’s drones spray onto crops, and the various challenges of creating AI models that can recognize plant variations. 

Key Points From This Episode:

Tweetables:

“Up until now, everybody just went, ‘How do we get more efficient [with] fewer passes?’ But nobody questioned, ‘Are we doing the passes with the right equipment?’” — Heather Clair [0:07:07]

“[precision.ai is] moving from land-based high-clearance sprayers to drone-based precision spraying.” — Heather Clair [0:07:24]

“I never thought when I was a little farm kid that I would be playing with drones, but it is one of my favorite things to do.” — Heather Clair [0:07:45]

“Trying to create these AI models that can work on any stage of plant can be a challenge.” — Heather Clair [0:21:15]

“It's incredible how working with my AI team has opened up my eyes to being able to look at these plants from a very logical standpoint.” — Heather Clair [0:25:34]

Links Mentioned in Today’s Episode:

Heather Clair on LinkedIn

precision.ai

Sama

Episode Transcription

EPISODE 52

[INTRODUCTION]

[00:00:04] RS: Welcome to How AI Happens, a podcast where experts explain their work at the cutting edge of artificial intelligence. You'll hear from AI researchers, data scientists, and machine learning engineers, as they get technical about the most exciting developments in their field and the challenges they're facing along the way. I'm your host, Rob Stevenson, and we're about to learn How AI Happens.

[EPISODE]

[00:00:32] RS: Here with me today on How AI Happens is a product manager over at Precision AI, Heather Clair. Heather, welcome to the podcast. How are you today?

[00:00:39] HC: I'm well. Thank you. Very happy to be here.

[00:00:42] RS: Yeah, I'm really pleased to have you, because Precision AI is such an interesting company. Just the use case in general, I think, is really fascinating. We'll definitely get into that. But I wanted to point out that before, when we met, you were like, "I love plants" and I was like, "Yeah. Me too. Look at all the plants around me." You corrected me to be like, "No, no. I like the plants that grow outside for a real function," as opposed to the ones I love, which are just for my own aesthetic and vanity. We share a love of plants, but different sorts of applications.

[00:01:08] HC: Don't get me wrong, houseplants are wonderful. They're beautiful. I love how much they spruce up your home, and they bring a certain flair. But no, I like the plants that grow outside, that we get to turn into food. Those are my favorite.

[00:01:26] RS: You probably get enough of those plants that you are like, "Yeah, I don't know if I need to bring them indoors when you're around them all day."

[00:01:31] HC: Well, my big thing is that, I'm out there looking at those plants, and interacting in the field, and creating tests and such. I find myself away too much to keep the house plants alive, so I just don't put them through that.

[00:01:47] RS: Yeah. After a day of doing that, it's probably the last thing you want to do is come indoors and deal with a fussy pilea or something like that, right?

[00:01:55] HC: Basically, yes, that and I've killed a lot of violets in my life.

[00:02:00] RS: Yeah, me too. When people see my houseplants, they're like, "I always kill my plants." I'm like, "Yeah, you know. I've killed more plants than most people have probably ever owned." It just feels like it's part of the process; it happens. We shouldn't feel too bad about it. But you could maybe join me on my spin-off horticulture podcast to continue this conversation. At some point in here, we're going to get into AI and machine learning and all the techno stuff. I would love first, Heather, to just learn a little bit more about you. Would you mind sharing about your background and how you came to Precision AI?

[00:02:28] HC: Absolutely. I probably have a little bit different background than a lot of the people that you get to talk to. I actually came up through the industry, so I have very, very little tech background. I grew up on a farm in Northwest Saskatchewan, Canada. We are very much a farming area, and we are very proud of the crops that we produce. It's beautiful soil. I had parents that farmed, I had grandparents that farmed, and they very much instilled the love of the land, if you will. That being said, when I got into high school, I was like, "That's it, I want out of here." I wanted to do anything except farming, so I went and actually got a commerce degree. Some areas call that a business degree. I have a major in marketing from the University of Saskatchewan, but it became blatantly apparent when I got there that as much as I loved business, I really gravitated towards the business side of farming.

Rather than growing the crops and such, I found myself drawn towards the inputs that go into growing that crop. I did some summer jobs and figured out exactly what I liked. So then I spent my first five years in the workforce working for two different companies where I sold seed, chemicals, inoculants, the things that go into producing a crop. Then on the back side, I bought the crops back from the farmers and helped them with their marketing, trying to make them as much money on the open market as I could. After five years doing that, I really needed a change, or I needed to keep growing. I went into wholesale, and I worked for a local Saskatchewan company called Federated Co-op Limited. It's headquartered in Saskatoon, which gave me the opportunity to move back, because I had moved a couple of times at that point. I really love Saskatoon; it's a beautiful city.

I did, I think it was three or four different jobs at Federated, but I worked in the crop inputs department, so I was in seed and chemical. I was doing logistics off the cuff and taking care of about $40 million worth of chemical in our different warehouses across Western Canada. I was doing promotions, and margins, and recalls, and price adjustments and that sort of thing. I got into margins, trend analysis, all sorts of projections, trying to get the company into a better position to make better purchases, if you will. The better Federated did, the more of their profits they passed along to the co-ops that bought from them. It was very much a symbiotic relationship.

I transitioned into their business-to-business sales department, and was selling to the co-ops and helping them with their training, their margins, understanding the back end, all of that. It was a very fun experience; I very much enjoyed it. I traveled across a lot of Saskatchewan, got to see basically three-quarters of it, and worked really closely with a lot of very good people in rural Saskatchewan. After seven years there, I transitioned to a somewhat new company called Farmers Edge. It was an agronomy firm that had previously specialized in variable-rate fertility products and was moving towards data products. It was really the first time that anybody I knew of was looking at utilizing on-farm data for anything for the farmer.

Most of the data that gets collected on farms is for the big companies to be able to forward-project their maintenance. Most of the data that had been used previously was mostly to help with stocking: "When do we need to have replacement products available?" was really the only question the big companies were asking. We started asking, "Well, how can we use the data that's generated on the farm to aid the farmer?" We were creating a lot of really interesting products. I worked for them for, I believe, four or four and a half years, somewhere in there. Unfortunately, the pandemic hit, and things shifted, and I ended up back in the pool of people looking for something new.

When I saw the opportunity at Precision AI, I jumped on it. This was a cutting-edge company that was doing something dramatically different, just looking at the industry from a completely different angle. Up until now, everybody just went, "Well, how do we get bigger? How do we get more efficient with fewer passes?" But nobody questioned, "Are we doing the passes with the right equipment?" That's one of the big things that Precision AI is doing: we're moving from land-based, high-clearance sprayers to drone-based precision spraying. It's a completely different way of looking at things. Instead of everything getting coated, how do we target something with a high degree of accuracy? It's been an absolutely bizarre journey for me. I never thought when I was a little farm kid that I would be playing with drones, but it is one of my favorite things to do.

[00:07:51] RS: I love it. It's so clear to me that this approach of using drones is way cooler. But could you explain why it's more advantageous than previous methods?

[00:08:04] HC: Well, there are a couple of really basic components that I like from an agronomic standpoint, let alone a tech standpoint. Number one, it's far more fuel efficient. We're flying a very small engine over the land, and we're getting about two hours out of a tank of gas, as opposed to pulling 80 feet with a 550-horsepower tractor and all of the diesel that goes into that. So just the fuel efficiency to start with. The second thing is compaction. Every time you take a piece of equipment across your land, you're actually compacting it, you're tightening it up, you're packing it down. That makes it a little bit harder every time for those little seeds to break through the surface. There are tons of data on this. If you look at fields from an aerial view, even go to Google Maps and zoom in on farmland, you can see thinned-out areas in the field; that's where the approach is. Every time you drive in and out, and in and out, and in and out, you're compacting that approach. Over time, it just kind of stops growing as well. Compaction is something that we really do have to deal with in agriculture.

Then the third component I look at from an agronomic standpoint is what we call trample. As the crop progresses, once it gets to a certain stage, it doesn't actually bounce back after you drive over it. Think about it in terms of something you're familiar with: your grass. When your grass is small, you can walk across it, and you might have footprints for a day, but then it bounces back. But if you let the grass go all the way to seed and then go walking through it, well, now you have a trail and you know exactly where you went. It's no different in fields. Now, instead of walking across it, drive across it. Most sprayers are about 120 feet from tip to tip. The wheels will trample 3% of every pass. As the crop gets older, every time you drive across it, you're going to actually lose 3% of your yield on the back end, because those tire tracks won't bounce back and they won't mature. If they do bounce back, they'll be immature compared to the rest of the crop. On top of that, those seeds are still viable. So wherever you drove will likely produce weeds the following year in the next crop, so you'll have to treat it. If we can cut back on trample, we've saved ourselves 3% on our yield, and we've also saved ourselves some headache on next year's crop's weed budget, if you will.
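The trample arithmetic Heather describes can be sketched in a few lines. The 120-foot boom is from the episode; the 3.6 feet of total wheel track is an assumed figure, chosen only so the trampled share matches the ~3% she quotes, not a number from precision.ai:

```python
# Rough trample math for a land-based sprayer, per Heather's example.
# Assumed: ~3.6 ft of total wheel track per pass on a 120-ft boom,
# which works out to the ~3% trampled share quoted in the episode.

def trampled_fraction(boom_width_ft: float, track_width_ft: float) -> float:
    """Share of each pass flattened by the sprayer's wheels."""
    return track_width_ft / boom_width_ft

def yield_after_passes(passes: int, boom_width_ft: float = 120.0,
                       track_width_ft: float = 3.6) -> float:
    """Fraction of yield surviving after late-season passes,
    assuming trampled tracks never bounce back."""
    loss_per_pass = trampled_fraction(boom_width_ft, track_width_ft)
    return (1.0 - loss_per_pass) ** passes

print(trampled_fraction(120.0, 3.6))  # 0.03, the ~3% per pass
print(yield_after_passes(2))          # two late passes compound the loss
```

A drone pass, with no wheels on the ground, would leave the `trampled_fraction` at zero.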

[00:10:43] RS: Trample gets you coming and going it sounds like.

[00:10:45] HC: It really does.

[00:10:46] RS: These drones, are they deploying pesticides? Are they watering the crops? What are they spraying on the actual crop?

[00:10:52] HC: Pesticide is a term that is kind of all-encompassing, if you will, which is one of the reasons I'm not a huge fan of it. We're currently working with herbicides, which are sprays targeted at only killing the noxious weeds, the invasive plant life, if you will. So yes, herbicides target invasive plants, ones that are not supposed to be there. Does that make sense?

[00:11:20] RS: Yeah, definitely. That's what the drones are spraying?

[00:11:22] HC: Yes. Our drones are specifically spraying herbicides currently. There are other categories of pesticides that we could be spraying in the future that include insecticides and fungicides. But for right now, we're only targeting herbicides. We're starting with pre-seed, and then in crop, and then post-harvest. Those are our three application windows of herbicide where we're targeting the weeds.

[00:11:49] RS: Got it. Then, is it computer vision? Are they sort of just, based on a video feed or image processing, identifying where the individual plant is and where to distribute the chemicals?

[00:12:02] HC: Basically, yes. For the pre-seed and the post-harvest, those applications are strictly: the camera system is looking out the front, the AI is computing on board, and then it's spraying out the back in one simultaneous function. The pre-seed and post-harvest passes are specifically just targeting green; it's looking for live plant matter. Because at those times of year, if there's something green, it shouldn't be there, because nothing has purposefully been seeded. See green, spray green, off we go. In crop, that's where the AI function has to kick in; then we have to teach the drone to know what is harmful and what is helpful. We've created these models, and that's where our partnership with Sama comes in: they're helping us with the annotation of those images. We come in and apply the model to the field. It knows that in this field, canola is good, and in the next field, canola is bad, because that field is actually a wheat crop. We've taught it what is good, and it sprays what is not.
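The "see green, spray green" pass Heather describes can be approximated as a per-pixel threshold. This is an illustrative sketch using an excess-green index (a common baseline in plant segmentation), not precision.ai's onboard pipeline, and the threshold value is a made-up guess:

```python
# Hypothetical "see green, spray green" check using an excess-green
# index (ExG = 2G - R - B). Threshold is illustrative, not tuned.

def is_live_plant(r: int, g: int, b: int, threshold: int = 40) -> bool:
    """True if an RGB pixel looks like live green plant matter."""
    excess_green = 2 * g - r - b
    return excess_green > threshold

# Brownish bare soil should not trigger a spray decision:
print(is_live_plant(120, 100, 80))   # False
# A healthy green shoot should:
print(is_live_plant(60, 160, 50))    # True
```

The in-crop pass she describes next is much harder than this, since it has to distinguish green crop from green weed rather than green from soil.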

[00:13:11] RS: It's just a matter of annotating data, right? Like, the same way a CAPTCHA says, "click on the images with stairs in them," it's, "Okay, we'll identify the images that have noxious weeds versus ones that are the crop we want." Or what does the actual annotation look like?

[00:13:26] HC: It's somewhat like the CAPTCHA, if you will, but they're individual pictures. The photo swath is actually quite large. In that picture, we've taught it that all of these plants are good: the canola, the wheat, the barley, the oats, peas, any of them. We've taught it individually that these are the plants that are supposed to be there, and anything that is not supposed to be there gets a treatment.

[00:13:55] RS: Got it. Now, you have this tremendous domain knowledge. Then Precision AI also presumably has your standard machine learning engineers and AI experts. What is the relationship between the two of you? Because it strikes me that they are kind of putting together these models and learners, and then you would be the one being like, "Okay. Here's how it would actually help us."

[00:14:15] HC: I am incredibly blessed to work with some just amazingly brilliant people. My founder, Dan McCann, has done an amazing job of collecting incredibly intelligent people within their own domains and then letting us lead those areas. We've also got a really phenomenal culture of respect here at Precision AI. I lean on my AI department entirely; I have learned so much from them. My drone engineers, mechanical and electrical, are brilliant. I couldn't fathom doing what they do. When it comes to those areas, I trust them to be the experts. In the same regard, when it comes to the actual marketplace, to farms, to the practical nature of what we're creating, they trust me to try and direct them down the road.

We've had instances where both of the different teams have come to me with, "Oh, here are these wonderful plans that we have." I've sat there and gone, "No, I'm really sorry, but that's just not going to work in the real world." Part of that is simply because they haven't been out there. They haven't seen it. But I lived and breathed it my whole life. In the same regard, when I come to them and say, "Oh, I want to teach it to do this," they're like, "No. No, I'm sorry. That's not how AI works." We really lean on each other entirely. Having that level of trust within a company is absolutely crucial, especially when you're in this small of a company.

[00:15:52] RS: Yeah. What is kind of your involvement in the development cycle then? Because it strikes me that in those examples where they put together some grand plan, and you have to give them a dose of reality of how it would or would not work. Ideally, that would be as early in the development process as possible, right, so they don't spend their time working on something that has no end, or something that's a dead end. When are you looped in and what do those conversations typically sound like?

[00:16:15] HC: I would say each one of them is a little bit different, if you will. With the engineering team, on the practical design side of the product that we're creating, I get looped in basically right from the start. We will have large engineering meetings, where we come together every couple of months and lay out what we want things to look like going forward. I typically will sit back and listen. To be honest, I usually have Google open, because there are so many acronyms getting thrown out that I need to buy a vowel. I do my best to not interrupt the flow of the meeting until I get to one where I'm like, "Okay, what does IoU mean in this context? Because I know what most people think it means." The guys are fantastic, and I shouldn't say guys, because we do have a lot of women on the team. I do apologize for that. We have a very mixed culture. I love it. It's fantastic to see.
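For listeners puzzling over the same acronym: in computer vision, IoU usually stands for intersection over union, a standard score for how well a predicted bounding box overlaps a labeled one. (That reading is an inference from context; the transcript doesn't spell it out.) A minimal sketch:

```python
# Intersection over Union (IoU) for two axis-aligned boxes,
# each given as (x_min, y_min, x_max, y_max).

def iou(a, b):
    # Corners of the overlapping rectangle, if any.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7: modest overlap
print(iou((0, 0, 1, 1), (2, 2, 3, 3)))  # 0.0: no overlap
```

An IoU of 1.0 means the prediction matches the label exactly; detection benchmarks often count a prediction correct above some cutoff like 0.5.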

Unfortunately, in the industry that I came up in, when I started, everybody was guys. If you took offense to it, you kind of got singled out, so I just roll with being called one of the guys; I'm fine with it. My apologies. But the men and women on our team all sit around and really have an open dialogue about what it is that we want to accomplish and what the best designs for it are. Then usually, I'll chime in as we're getting towards the end if there's something that I want them to look at from a different angle. On the AI team, we started off by sitting down, identifying the top crops that we wanted to work on, and then ranking them. We kind of created a roadmap that way. When we first started, I knew how the process worked, and they would come to me at the end and say, "We really need your help creating a data set that is representative of all the different kinds of scenarios that we can expect to see in terms of background differences, plant differences, lighting differences." In that situation, I did my best to help them assess the data sets that they were looking at and suggest improvements, like what areas I thought they were a little light in, in terms of the different nuances of each picture.

That's basically how the first two models went. Then we started to get into a groove. We really started to understand what each side was looking for, and now we can come in with longer-term discussions. The team now has a very good understanding that we need all the different plants in all the different sizes, different backgrounds, different lighting. They've come a very long way in being able to assess that themselves. They understand the different lighting conditions. Obviously, when you're out in a field, you have everything from bright sunlight to partially cloudy to, "Oh, it's the tail end of the day," and you can kind of see the sun setting through the plants. It's actually kind of pretty.

The team has done a really good job of creating their own system to pre-select, and then the images go to Sama. When they come back annotated and labeled, my contractor, who is a professional agronomist, and I go through them and double-check the work. Sama has been phenomenal. They've really adapted as we've thrown different scenarios at them. I would say that their accuracy rate is over 95%, which is incredible. We just do little tiny tweaks here and there. To be honest, their accuracy is probably closer to 99%. They've been just fantastic partners to work with.

In terms of when I get brought in, it's kind of as a double check on the AI side. Lots of times, if there's a new method that they're trying to develop, they'll sit down as a team, decide, "Okay, this is something we want to look at," and then present it to me. They've put together fantastic visual explainers as to why they feel this is the way it should go. Unfortunately, I think every time but one, I've had to tell them, "No." They're wonderful people, but they don't necessarily understand how a plant grows and how a field is not uniform. Nature is not perfect. If you see a straight line or a perfect curve in nature, no, that's manmade. You will never see an entire field grow in perfect sync, especially if you have dry conditions. If a regular plant life cycle goes from emergence, so its first leaves, through seven leaves, you will have everything from two through six available at any given time in a dry year, because each of those plants is germinating with whatever water it has available, whenever it gets the water. You will see a lot of variation. So trying to create these AI models that can work on any stage of plant can be a challenge.

[00:21:21] RS: So you mentioned earlier how there's a stage where if the drone sees a green shoot, it can only be a weed, because it's not at the crop stage. So if it sees green, it has to be sprayed. Seems pretty straightforward, but say there's a green John Deere tractor parked out there; now it sees green and it's going to go spray that too. You have to build in more and more rules, and complexity increases. It's a cartoonish example, but what I guess I'm asking is, what's an example of other things that got more complicated once you, in AI parlance, shipped to production, or in your case, once these drones got out there and started spraying?

[00:21:59] HC: I have two examples that I can think of right off the hop, so maybe I'll walk you through both and hopefully one of them makes a little bit of sense. What most people don't realize is, dirt looks very different no matter where you are. Dirt in Northern Saskatchewan is a very nice, dark, lush black. Dirt down in southern, and specifically Southwest, Saskatchewan is a very light brown, almost bordering on red undertones. When you go down into Arizona, it's absolutely red; it could be Mars for all you'd know, comparing that dirt to Northern Saskatchewan dirt. The AI team has had to deal with the difference in the color histograms, if you will. That's really impacting the AI models.
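The soil-color difference Heather mentions is easy to see as a histogram gap. The following is a toy illustration with invented pixel values standing in for "dark northern" versus "red southern" soil; it is not how precision.ai compares regions, just a sketch of why the same model can struggle across soil zones:

```python
# Toy comparison of two soil-color histograms. Pixel values are
# invented stand-ins; real imagery would come from the drone camera.

def red_channel_histogram(pixels, bins=4):
    """Counts of red-channel values (0-255) bucketed into `bins`."""
    hist = [0] * bins
    for r, g, b in pixels:
        hist[min(r * bins // 256, bins - 1)] += 1
    return hist

def histogram_distance(h1, h2):
    """L1 distance between two count histograms."""
    return sum(abs(a - b) for a, b in zip(h1, h2))

dark_soil = [(40, 35, 30)] * 10    # dark, lush black
red_soil = [(180, 90, 60)] * 10    # reddish southern soil
print(histogram_distance(red_channel_histogram(dark_soil),
                         red_channel_histogram(red_soil)))  # 20: no overlap
```

When the two regions' pixel mass lands in entirely different buckets like this, a model trained only on one background distribution has little reason to generalize to the other, which motivates the regional models she describes next.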

Trying to teach the models to work on all soil zones is probably not the right approach. Teaching it to be able to work on any plant maturity level is somewhat consistent. We could leave that as our constant and then make the variation within the model the different regional components: the soil color, the soil texture. There's also a phenomenon in horticulture called phenotypic plasticity. That means that the exact same plant will grow differently depending on where it's grown. So you could take that exact same violet and plant it in the dirt in Northern Saskatchewan, and it would look like a very different plant than if you planted it down in Brazil. It's all about the different nutrients available, the moisture levels, the environment in general. There's another component in the soil called organic matter. Organic matter is previous crops that are being broken down by the soil microbes.

All of those factor into how a plant grows. What might be a big, tall plant with giant paddle-like leaves in Northern Saskatchewan, because it's had the time to grow, is a much smaller plant with thinner leaves down in Southern Saskatchewan, because it's hotter and there's not enough moisture; it's just trying to conserve all of its nutrients and moisture to set seed as opposed to grow a plant. All of those are challenges that our AI team is having to face. We're having to go back to our partners at Sama to give them additional examples to annotate so that we can tackle all of these issues and create regional models. Another component that is very regional is the weeds themselves that we're targeting. There are a lot of weeds that actually look somewhat similar to the plants that we're trying to protect. We're having to give the models lots of examples of the weeds, and it's created some very interesting conversations between myself and our AI team, because I can see the nuances of the two different plants. To me, it's blatant. The one on the left has a nice, big, rounded, paddle-shaped leaf, while the one on the right is much more like a spade. My guys are going, "Yeah, looks like the same plant." I'm like, "Okay. Look at the shades of blue versus green. How about that? Let's go at it from a color standpoint."

It's incredible how working with my AI team has opened my eyes to being able to look at these plants from a very logical standpoint. I now need to be able to describe the nuances as opposed to saying, "Well, it's lamb's quarters." I don't have to explain that to anybody in agriculture. I just walk out and say, "Well, you've got lamb's quarters in your canola." Everyone's like, "Okay, cool." My AI team is like, "What is lamb's quarters and why is it bad?"

[00:26:00] RS: Yeah, I have 40 follow-up questions on lamb's quarters.

[00:26:04] HC: They are a wonderful source of food for some of your cattle and pheasants, actually. Pheasants love them.

[00:26:10] RS: There you go. But again, not something that your typical AI practitioner would know. The reason I wanted to ask you is because this increasing complexity, I think, is the case wherever one is deploying AI. There's only so much you can glean from a test environment, and at a certain point, I guess you just have to take the leap and understand where your model fell short. But I'm curious, in your experience, just from a product development standpoint, how can you work to do as much of that beforehand, so that you have as few adjustments to make and something can be as efficient as possible right off the bat?

[00:26:46] HC: Not to toot my own horn, but I believe that having a product manager who comes from the industry and understands the environment that the product is launching in is pretty important at this point. If we had just promoted somebody from within who had no idea what the practical applications of this AI would be, this product would look very different, and it wouldn't work. Plain and simple. It just wouldn't work, because they wouldn't have understood the differences that they were looking at. Having someone with that practical knowledge ahead of time, I'm cutting a lot of the corners for them, if you will, trying to get ahead of it. I'm doing my best to create user stories and write down all the different ways that I can see things going wrong. But to be honest, in this industry, as I'm sure in a lot of them, there are so many different ways that this can go wrong. I don't know how to anticipate everything.

When I'm creating a user story of an errant Canada goose attacking the drone, it sounds like a cartoon, but it's something serious. How do you anticipate everything when nature is unique and likes to throw us curveballs? But having the practical in-field experience, being able to look at that picture and know what I'm looking at, I know that has leapt us forward years compared to what an internal product manager would have done. They would have just been researching, and they wouldn't necessarily understand that country roads are 14 feet across, and that you can't leave the doors down on the trailer for the drone to land on, because a semi is going to drive by and knock the door off. Because they didn't live it, they don't know what the shared space looks like, because they didn't have to try and pass a school bus every morning. That's some of what I know I bring to the table that not a lot of people would have been able to, had they not grown up with or had the background that I had.

[00:28:49] RS: Yeah, it's so crucial. I think in lots of cases, particularly in younger, smaller companies, the AI practitioners themselves are sort of cast as the subject matter expert and they're doing all this just in time learning to try and understand the field they're trying to disrupt in. But it's a huge advantage to have an expert in the field guiding your decision-making and development process. That makes all the sense in the world.

Heather, this has been a ton of fun chatting with you. I have one more question before I let you go and it's a selfish one. If you were to give one hot tip to the house plant enthusiasts among us, what do you think people are doing wrong that they could fix to get happier, healthier houseplants?

[00:29:26] HC: You're probably overwatering. Plants are not used to getting what they want all the time. Actually, giving a plant a little bit of stress will push the roots to grow deeper and make the plant more robust. Maybe don't be as nice to it as you have been. Stick it in the sunlight for a few too many days and let it really dry out, then give it a drink, and all of a sudden, it'll push itself. That being said, as I noted earlier, I can't keep a houseplant alive. And if the cucumbers in your garden are looking somewhat yellow, you should have picked them two weeks earlier.

[00:29:59] RS: Got it. That is great advice. Plants and people grow when they face adversity. A little bit of holistic advice for all of the living organic material around you. Heather, this has been a delight. Thank you so much for being on the podcast. I've loved chatting with you today.

[00:30:14] HC: My pleasure. Thanks so much.

[OUTRO]

[00:30:17] RS: How AI Happens is brought to you by Sama. Sama provides accurate data for ambitious AI, specializing in image, video, and sensor data annotation and validation for machine learning algorithms in industries such as transportation, retail, e-commerce, media, MedTech, robotics, and agriculture. For more information, head to sama.com.

[END]