How AI Happens

Data Relish Founder Jennifer Stirrup

Episode Summary

Healthy data culture and practices are essential for modern businesses. They promote informed decision-making, efficiency, innovation, and accountability while helping organizations stay competitive and adaptable in a data-driven world. In this episode, we tackle the age-old struggle many companies face: how to unleash the potential of data while overcoming the hurdles that stand in the way of building a truly healthy data culture. To help us navigate this topic is Jennifer Stirrup, a distinguished expert in artificial intelligence, business intelligence, big data, and data visualization solutions.

Episode Notes

Jennifer is the founder of Data Relish, a boutique consultancy firm dedicated to providing strategic guidance and executing data technology solutions that generate tangible business benefits for organizations of diverse scales across the globe. In our conversation, we unpack why a data platform is not the same as a database, working as a freelancer in the industry, common problems companies face, the cultural aspect of her work, and starting with the end in mind. We also delve into her approach to helping companies in crisis, why ‘small’ data is just as important as ‘big’ data, building companies for the future, the idea of a ‘data dictionary’, good and bad examples of data culture, and the importance of identifying an executive sponsor.


Quotes:

“Something that is important in AI is having an executive sponsor, someone who can really unblock any obstacles for you.” — @jenstirrup [0:08:50]

“Probably the biggest [challenge companies face] is access to the right data and having a really good data platform.” — @jenstirrup [0:10:50]

“If the crisis is not being handled by an executive sponsor, then there is nothing I can do.” — @jenstirrup [0:20:55]

“I want people to understand the value that [data] can have because when your data is good it can change lives.” — @jenstirrup [0:32:50]

Links Mentioned in Today’s Episode:

Jennifer Stirrup

Jennifer Stirrup on LinkedIn

Jennifer Stirrup on X

Data Relish

How AI Happens

Sama

Episode Transcription

Jennifer Stirrup  00:00

And I think I made a sort of comment along those lines of: it's not the size, it's the quality, the nature, the format. You know, you can generate synthetic data and produce a huge, whopping database, but it's not going to be used for anything, which is the point.

 

Rob Stevenson  00:17

Welcome to How AI Happens, a podcast where experts explain their work at the cutting edge of artificial intelligence. You'll hear from AI researchers, data scientists, and machine learning engineers as they get technical about the most exciting developments in their field and the challenges they're facing along the way. I'm your host, Rob Stevenson, and we're about to learn how AI happens. Here with me today on How AI Happens is a guest who has had a very interesting background in our space, in data, algorithm selection, and all sorts of other areas. So much so that I kind of don't want to hamstring your intro by trying to do it poorly myself; I'll just bring you in. Jennifer Stirrup, founder of Data Relish. Welcome to the podcast. How are you today?

 

Jennifer Stirrup  01:08

I'm great, thank you so much. And thank you for having me on How AI Happens. It's my pleasure to be here.

 

Rob Stevenson  01:14

I am thrilled to have you, and I swear I'm not being lazy by not introducing you; I just really think that it's better, you know, coming directly from the source. So since I wanted to leave it up to you, Jen, maybe here at the top I'll just ask you to share a little bit about your background, how you wound up in the space, and what you're working on right now over at Data Relish.

 

Jennifer Stirrup  01:33

I've always been a geek around data and technology. I learned to code when I was eight years old, so a very long time ago. What did I learn to code on? I learned to code in BASIC on a Sinclair ZX81, and it had 16 kB of memory. So it was a very long time ago; we had to load the computer with tapes. But I think I was so inspired by older members of my family. Some of them were spies during the Second World War; they were using the Enigma and Purple machines to crack codes. And one of my great uncles worked with Alan Turing. So I had such a great interest in AI and in maths, and I think he inspired me. I was very lucky to go to university to study AI. Again, I did that before the year 2000, so that was a long time ago. I'm very glad to see that AI is becoming easier and easier all the time. And I'm working on some really great projects. I do some mentoring for startups, and I like doing that because it's great to see the success that they can have, you know, and it's such a great, energetic environment to work in. I also deliver projects for much larger customers as well, and I tend to get involved to try and help them to onboard data science and AI. And that's not just from a technology standpoint, but from a people and a cultural standpoint as well.

 

Rob Stevenson  03:05

Did your uncle have any Alan Turing stories that he shared over the years? 

 

Jennifer Stirrup  03:10

He just said that Alan Turing was almost like a god, you know; people had such respect for him, and that was really his takeaway. People at Bletchley seemed to think that he was a real demigod. You'd see him walking around Bletchley Park, and you'd get a sense of the energy coming from him. So I think it was inspiring for me to feel that there was a personal connection with AI from a very early age.

 

Rob Stevenson  03:41

What do you think Alan Turing's legacy is in terms of AI? Do you think this is like the logical progression from some of the early work he did in computing?

 

Jennifer Stirrup  03:48

Yes, I think it is. I get a sense, when I've read his materials, his papers, that this is something he explained very well. He seemed very keen that people would understand the concepts, and that's something that comes across very well when you read his papers. So I think he probably would have appreciated how popular AI has become.

 

Rob Stevenson  04:11

Yeah, I should hope so. Jennifer, you have an interesting background, in that lots of the people I speak with on the show have been in academia for a long time, or they've had roles as machine learning engineers or software engineers. And it seems like you've sort of been able to march to the beat of your own drum for most of your career. Is that fair to say?

 

Jennifer Stirrup  04:33

Yes, it is. I'm a terrible employee, so perhaps that's the reason why I run my own business. When I started my career after leaving university, I was very lucky to be working mainly in the finance industry, getting involved in first-generation AI algorithms. I was using those to help answer email. It may seem strange now, but at the time, in the 1990s, email was a big thing, and companies sensed that they were not going to be able to handle the email, and they needed to automate some of that. And I think it's probably something we still struggle with today. I don't know about you, Rob, but I get far too many emails, and I do think I would love to automate those. So I feel like the industry is coming full circle now, that we are starting to automate more and more as time goes on.

 

Rob Stevenson  05:32

How did you sort of forge a career in this space where you didn't need to be an employee, having noticed you were a terrible one and perhaps not even enjoying it? How have you forged this ability to go into large companies, and startups as well, and sort of poke around in their day-to-day operations?

 

Jennifer Stirrup  05:48

I think what's helped me is to have a very customer-focused perspective. You have to consider that you are only as good as the last project you did. So even though I've been working in the industry for 25 years, I still treat every project as if it's important, and I really focus on the business value of whatever it is we're trying to do. It could be that I'm trying to save costs, or improve customer service, or perhaps it's a small research and design project, as an example. So I think when you deliver each project to the best of your ability, with good outcomes, word starts to spread. Because when you work with business leaders, and then they leave and go and start to do something else, they take people with them. So I've been really lucky in that respect, that I've had a really good network. And I think it's just that ethos of working really, really hard and making sure that you deliver, that you over-deliver every single time and beat people's expectations.

 

Rob Stevenson  06:58

Yeah, I've noticed that too, that leaders leave places, but if you have delivered for them, you make them look good, right? So they go into their new company, and you execute this awesome project. You get compensated, but they're the one who logs the organizational cachet, right? They get the credit, in a way, for bringing you in. So if you deliver for people, they don't ever forget it.

 

Jennifer Stirrup  07:22

Yeah, that's true. I think sometimes I get asked to do canary indicator projects. And what I mean by that is the canary: if it dies, people don't seem to care. So I sometimes get involved in really risky projects, because as an external consultant, I can be expendable. But I think the secret is to show small, iterative successes and keep proving yourself, because people will align themselves with a success, and they walk away from a failure. If you keep pushing and getting those small steps of success, people start to help you.

 

Rob Stevenson  07:57

Yes, yes, it's just building momentum slowly over time. Here in the colonies, we say canary in a coal mine, but it's the same idea. It's the harbinger: if there's a leak, then the canary dies first, and then everyone else knows to get out. But yeah, that's an interesting approach for you, because maybe internal employees might not want to touch a really sticky problem, because they're afraid of, I don't know, what are they afraid of? It not going well, or failing, or making things worse. Whereas you, like you say, don't work there, so it's easy to fire you if it goes wrong.

 

Jennifer Stirrup  08:29

That's right. I think also I've got a fresh pair of eyes. Sometimes I'll go into organizations because they have failed to do this by themselves, so when I come in, they don't always like it. I can, for example, have problems with the IT departments, because they've tried to do this and then they failed. And sometimes I find it useful to think about which people are the laggards, and who is going to be the leaders and the visionaries in the organization, and try and single them out. I think it's also very important in AI to have an executive sponsor, someone who can really unblock any obstacles for you, because that is a really important thing with any IT project, but particularly with AI, because people are worried it's going to fail, or that it's going to threaten their jobs in some way. And because of that, they're thinking to themselves, how can I not be involved with this? And one way to do that is to say, 'Jennifer, no, the answer's no,' before I've even asked the question, sometimes.

 

Rob Stevenson  09:34

Who is that advocate? Do they tend to be like a Chief Technology Officer or a VP of Engineering? Who do you kind of look to to fill that role?

 

Jennifer Stirrup  09:42

The CTO, usually, or the CDO. So I usually report either to the board or to the C-suite, and that's because some of these projects are so necessary, it has to have that sort of visibility on it. What tends to happen, in boards particularly, is there's usually one visionary and some laggards, and that's at the board level. So that sort of disconnect within the leadership can really translate into technology projects, because the rest of the board might not be on board with the project itself. So you are under, usually, a lot of pressure to try and get things to work, to show success early. But I'm very lucky in that, I think, having a good network helps; I can start to bring along partners to help deliver everything that needs to happen.

 

Rob Stevenson  10:36

Part of the reason I was excited to speak with you, Jen, is because you get to see lots of different companies. You know, when I speak with people who are installed internally somewhere, they understand that company's problems very well, but you get to see more of a landscape of things that people are struggling with in the space. So I'm curious what is common out there. When you're brought in to places, what are the kinds of common challenges you see companies facing with regards to their data and AI operations?

 

Jennifer Stirrup  11:03

I think probably the biggest thing is access to the right data, and having a really good data platform. Sometimes organizations seem to think that a data platform is simply a database. But what actually happens is they've got parts of information all over the place: they can't match it up very well, they can't access the data, they don't know what the data means. So then they bring me in, and I have to ask, well, how well do you know your data? And they will say, well, you know, we tried to build a data dictionary, and that didn't work, and we're too busy for all that. And I have to say, but you need a good foundation, and I know it's not a popular message. I try, in that case, to start small: right, we start with this data set, we're going to mix it with that data set, and then our first few sprints will start to show some good business intelligence, because without good BI, you can't have good AI. And I see that in organizations large and small. People seem to lose the understanding of the data very quickly, and then the business moves so fast. It's the old Eisenhower matrix of urgent versus important: they focus on what's urgent, showing something on a dashboard, for example, but not what's important, which is how do they get there?

 

Rob Stevenson  12:25

So when you come into these teams and you start poking around with a flashlight, what are you looking for? What are the telltale signs of how they've been approaching their data platform, or not, as the case may be?

 

Jennifer Stirrup  12:37

I ask them if they've got any Excel spreadsheets they'd like to show me. Now, I know that sounds like a really odd place to start. But if all of the data is sitting in Excel spreadsheets and they're not using the enterprise data warehouse, for example, that tells me the data warehouse is not fit for purpose. It tells me they're in a state of data anarchy: they've got data all over the place and no one's looking after it. They're probably doing lots in those Excel spreadsheets that is perhaps good from a user point of view, but from a testing and robustness perspective, probably not what it really should be. I remember having one customer once who swore blind they had no spreadsheets at all. And then I met a business analyst, when I'd been there, back and forth, about three months, and she said, I've got spreadsheets to show you. And I sat with her; by the end of the hour my head was in my hands. She had so many spreadsheets, she had a spreadsheet to help keep track of all the spreadsheets. And this was the organization that swore blind they had no Excel anywhere at all. That told me a lot about the business, because I then went back to the IT department and to the CTO, and I said, you said you had no Excel. No, we don't, they said. I said, you do; I have uncovered a whole lot of it on someone's laptop. And worse, it wasn't really on their laptop; it was in the email. You know, Microsoft Exchange is not a document storage mechanism. They had no idea what was the latest version of the spreadsheet; it was all in an email. It was awful. And the whole time they didn't really tackle the problem of the little data. And it's the little data I'm looking for. We talk a lot about big data as the hashtag, but actually, often what's really important is the business's little data that's sitting on people's hard drives, the files the business users access. They're using that to make decisions, or maybe to make bridging or mapping tables, for example: this is equal to that. So I'm really looking for that. I'm looking for the data that's fallen between all the cracks, and people's attitudes to that data.
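Those bridging or mapping tables are often just two-column lookups living in someone's personal spreadsheet. As a minimal sketch of what folding one into a checked pipeline might look like (all column names and values here are hypothetical), a pandas left join can also surface the rows the mapping doesn't cover:

```python
import pandas as pd

# Hypothetical "little data": a mapping table someone has been keeping
# in a personal spreadsheet to reconcile codes between two systems
# ("this is equal to that").
mapping = pd.DataFrame({
    "erp_code": ["A-100", "A-101", "B-200"],
    "crm_code": ["1001", "1002", "2005"],
})

# Hypothetical orders exported from the ERP system.
orders = pd.DataFrame({
    "erp_code": ["A-100", "B-200", "C-300"],
    "amount": [250.0, 120.5, 99.9],
})

# A left join keeps every order and flags the ones the mapping table
# doesn't cover: exactly the data that falls between the cracks.
merged = orders.merge(mapping, on="erp_code", how="left")
unmapped = merged[merged["crm_code"].isna()]

print(merged)
print("Orders with no CRM mapping:", unmapped["erp_code"].tolist())
```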

 

Rob Stevenson  14:51

The Excel thing is so interesting, because it sounds like it's a signal that people at the company are not getting what they need from the existing data structure, whatever is believed to be the source of truth. They're not getting it, so they're doing it on their own. Is that kind of what it's telling you about the people who are keeping their own spreadsheets?

 

Jennifer Stirrup  15:11

Yes, absolutely. They're not getting what they need. And there's a culture of people empire-building using data, building little empires. And what happens is, sometimes the culture of the organization is poor, and this is perhaps their way of keeping their job. It's the old adage: knowledge and information is power, and power is control. So people start to control, and they build up these little data silos that they don't share. That tells me a little bit about the organization, because if the culture is not good, trying to do an innovative AI project is probably going to fail, because you need people on board. And that's why it's so important to have that executive sponsor. When I leave an organization, a brave CTO will sometimes say, tell me 10 things that you don't want to tell me about my business, 10 things that you feel awkward telling me; I'll give you a day to think about it. One man said that, and in that case I came up with 20 things; I could have kept going if I'd had more time on some of the things. One was not doing backups properly of the data. And I'd mentioned this before, many times, as I'd gone along. But it's very difficult in an organization when you've got someone who's not doing the job properly like that. Because, you know, you can encourage them to do backups, for example, but there's a point at which you have to decide: do I escalate this, or do I just walk away from it? And I would have told them anyway, as I was leaving, that there were no backups being done. But I was actually really pleased that he proactively asked me about things that made me uncomfortable. It turns out, in that case, I was right, and I knew I was right, and the gentleman in question didn't last the day. So sometimes you have a really big impact when you go into an organization. I was able to prove everything. And it was a contractor; he was charging them a lot of money and not doing his job. So it was bad for the contractor, but good for the organization. And it was quite a hard thing to see, because I had spotted this and no one else seemed to have seen it, and everyone seemed to like this individual. And it was quite difficult, I think, because you need to make friends enough that you can get your job done and the projects pushed through. But at the same time, it's really tough, because you're being parachuted into a culture which is maybe not as open as it needs to be to be doing an innovative AI project. And if people are not doing their jobs properly and the culture is not fit for purpose, that's really tough to change, I think.

 

Rob Stevenson  17:57

It's interesting that you mentioned early on that you wind up tangled in the people and cultural aspect of things. I was surprised to hear that, because you're such an empirical person, because you have this background in data science and in coding, etc. What does it look like when a company is doing a good job with their data culture? I guess if they were doing a good job, maybe they wouldn't be calling you. But let's maybe start with what a good example would be, and then we'll do the opposite, I guess.

 

Jennifer Stirrup  18:23

Yes, I like to think that I start with the end in mind. So you're right, they've called me because there's a problem. Sometimes I make the decision to leave because my job is done and I've helped to turn the ship around. When that happens, I notice I'm doing less of the hands-on work, because I've mentored other people to do the hands-on work, and they need me less and less as time goes on. And that's good, because I don't have the attention span to do the same job for 20 or 30 years. I like the idea of lifting up everyone to make the organization better. So when I see that they are self-sufficient, they're able to run their own projects, they're pretty much handling the code themselves, or if not, they're at least good enough to troubleshoot by themselves, and they've got a good support structure in place. I'm trying to build a rinse-and-repeat formula for them. So I want them doing things like DevOps, for example; they need an example of how to build applications and how to roll them out, and they need to think about that from a data perspective as well. And once they see what good looks like and they are able to do it themselves, that's when I start to back off, because I think I've achieved it now: they're self-sufficient, they don't need me. But also, I'd basically make myself bored then, because, you know, the challenge is gone.

 

Rob Stevenson  19:53

Yeah, yeah, of course. So what do you think, if you put yourself in the mind of the chief data officer or the chief technology officer: what is starting to break in the organization where they might think, we need to bring someone like Jennifer in to sort this out?

 

Jennifer Stirrup  20:08

I think what happens is they usually get pressure from elsewhere, because the business teams are not getting the data that they want and not getting the results that they want. There's usually quite a divorce between IT and the business. And I think sometimes IT departments can become very egotistical, and they don't have the right to be, because if they were doing a good job, the business wouldn't be complaining so much. So that's an internal thing. I think the external pressures come from the fact that they see their competitors doing interesting things with data and they feel left behind. They may get a request for proposal and they can't fulfil it, because they don't have the skill set and they don't have a good data platform. So I usually come in when there are two things in place. One is a crisis, and the second is an executive sponsor who needs to clean up that crisis. If the crisis is not being handled by an executive sponsor, then there's nothing I can do, because they'll never get it pushed through; they've got themselves into a bad state. And I have to watch that I don't become the blame hound, you know, that I don't get blamed for that. So I really need someone who's going to help me to unblock things, to make them successful. And those things can be quite tough. I think a lot of issues in business really come down to the people, to the culture, to the concerns about their jobs. Maybe they don't like change; people don't like change, and they don't like to be changed. And when something like AI comes along, they're looking at it thinking, this is a whole lot of change. So again, it's coming back to that gently, gently, incremental approach: show small successes. I think the days of big bang projects are over, where they'd run you into the thousands and people would take maybe two or three years to build a data warehouse. Those days are long gone; we need to think about delivering in a more agile fashion.

 

Rob Stevenson  22:13

What is the difference between a two-to-three-year data warehouse and what you think is necessary now?

 

Jennifer Stirrup  22:19

I think a two-to-three-year data warehouse is set up in a waterfall fashion. So whatever is put in place, whatever the business needed two or three years ago, is not where they're going to be two or three years down the line. I remember working for a consulting firm, and I was delivering very small BI projects using Microsoft technology at the time. And in this consulting arm there was another team; they were delivering very extensive data warehouses with another type of technology, and they were taking two or three years to do projects. And the projects that they did actually failed, not because they failed technically, it was technically very complex; they just no longer fitted the needs of the business. And for me, the real success criterion is: does the business adopt it, do the users adopt it? User adoption is hard enough anyway when you're trying to roll out software programs. But when you're dealing with AI, business intelligence, data warehousing, if they don't like it, they won't use it; they'll tip back into Excel, and the business has spent a lot of money for nothing. Whereas small and agile means that the business gets continuity of success, and they also have the ability to have lots of input. And that stickiness helps you to make sure that you will eventually have that user adoption and meet that success criterion. Customers sometimes ask me for all sorts of things, and they usually have a wish list, because when they get excited, they get really excited, and I become this fairy godmother that's going to solve all their problems. So it's very difficult, because I have to say, well, in the last sprint you asked me to deliver X, Y, Zed, and now, halfway through the sprint, you are asking me to change everything and go off in a different direction. And no is not what I say. What I say is, I don't say no, I just say not yet. Because very often they're so enthusiastic that if you say no to them, they're going to be really upset. And again, it's a little bit people and culture. So instead I say, we'll add it to the backlog as work tasks, and then we will prioritize according to your commitments at the next sprint planning meeting. Because I know the next day I'm going to have someone else coming down the corridor with another wish list, who's also going to ask for 10 different things. So 'not yet' is what you say when they ask you for lots and lots of different things, because they will. But that's good.

 

Rob Stevenson  25:06

Yeah, treat them like a toddler. When they ask for something, say: maybe for your birthday, maybe for Christmas, not yet.

 

Jennifer Stirrup  25:13

If you're really, really good, and you ask really nicely, and I'm Santa...

 

Rob Stevenson  25:18

And you're not naughty for the next few months, then maybe we'll get you this straightaway.

 

Jennifer Stirrup  25:23

I do think sometimes businesses are like children, actually. They want things that are not good for them, they want them yesterday, they want them now, and they stamp their feet till they get them. And that passion can be quite difficult to deal with, because I'm not like that; I tend to be, maybe because I'm a data person, a bit more plodding. I'm trying to harness that passion, if I can, which can be very tough, because they want their AI thing done so they can go and tell people that they're doing AI. So you're just trying to bring them along the journey. It can be difficult to educate them as well. I do expect them to sit and do the maths behind an algorithm, just to try and understand a bit better what some of the problems actually are.

 

Rob Stevenson  26:05

Yeah, of course. What are some of the data management practices people could be putting into place to perhaps stop things from getting so dire that they need to call you? And I'm not asking you to advise on how to get yourself out of a job here, but what should people be doing so that it doesn't become so bad? Besides, of course, staying out of Microsoft Exchange.

 

Jennifer Stirrup  26:28

Yeah, I think I'd like to see more organizations having a data amnesty, and being very honest about what data sources they are using, where they are, and which data sources they are not using. Every organization hangs on to data without really thinking about what the important data is, where the vital data is. I think this is where a data dictionary starts to become very useful, because people often don't know where to find the data they're looking for, and they don't know what it means. And very often I get dragged into trying to figure out what the data means. It's not really what I'm there for, but I find I'm having to take that step back before I can move forward.
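Jennifer doesn't prescribe a format for the data dictionary, but even a very lightweight one answers her two questions: where to find the data and what it means. A minimal sketch, with entirely hypothetical fields and entries:

```python
from dataclasses import dataclass

@dataclass
class DictionaryEntry:
    """One row of a lightweight data dictionary (hypothetical schema)."""
    name: str          # business name of the field
    source: str        # where to find it: system, table, column
    meaning: str       # what it means, in business language
    owner: str         # who to ask about it
    quality_note: str  # known caveats

entries = [
    DictionaryEntry(
        name="customer_id",
        source="warehouse.sales.orders.customer_id",
        meaning="Unique customer identifier assigned at first purchase",
        owner="Sales Operations",
        quality_note="Reliable from 2019 onwards; earlier rows may be null",
    ),
]

for e in entries:
    print(f"{e.name}: {e.meaning} (find it at {e.source}; ask {e.owner})")
```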

 

Rob Stevenson  27:15

Yeah, that makes sense. What are the questions you can ask to determine what data you should maybe let go of?

 

Jennifer Stirrup  27:22

In terms of the data to let go of, I like to try and run reports on the system to see who's accessing what data, when, and how often. I know that probably sounds like I'm spying on them. But I have had cases where the customer will say, I want these 22 things in this report, and it has to be done by Friday, and I have to go and scramble around to find someone to do it. Then you deliver it, and then nobody will look at it. So what I do is go in with my armour on, and I say, well, the last few reports you requested at the last minute, I've noticed that you haven't actually used them. So what is it that you're trying to achieve? Because what you're asking for and what you're getting seem to be two different things. Let's take a bit of a step back and try and work out: what are you trying to achieve? And often, when I ask that, it turns out there's something else in the organization that might give them that, or give them something very close to that. I think sometimes people are not very good at asking for what they need. Trust, but verify. I think it's good to prototype as well. When you prototype, you can understand what they want to see in the dashboard, and then you can start to understand what data sources they're really not using.
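The access reports she describes would come from whatever audit or usage logs the platform keeps; the details vary by system. A sketch of just the aggregation step, assuming a hypothetical log extract with user, report, and timestamp columns:

```python
import pandas as pd

# Hypothetical extract from a BI tool's usage log.
access_log = pd.DataFrame({
    "user": ["alice", "alice", "bob", "carol", "carol"],
    "report": ["sales_q3", "sales_q3", "inventory", "ad_hoc_22", "ad_hoc_22"],
    "viewed_at": pd.to_datetime([
        "2024-01-02", "2024-01-09", "2024-01-03", "2024-01-05", "2024-01-05",
    ]),
})

# Who is accessing what, when, and how often? A report that was
# requested urgently but never opened again shows up here with a
# low view count and a stale last-viewed date.
usage = (
    access_log.groupby(["report", "user"])
    .agg(views=("viewed_at", "count"), last_viewed=("viewed_at", "max"))
    .reset_index()
    .sort_values("views")
)
print(usage)
```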

 

Rob Stevenson  28:38

Yeah, that's interesting, because, like you said before, everyone says big data is a hashtag, and I think for a long time the perspective was the more data the better: let's just get as much as possible, and we'll sort it out later. It sounds like you're prescribing a leaner approach.

 

Jennifer Stirrup  28:55

Yeah, I think you need to understand the nature, the reliability, the format, and the quality of the data that you're looking at. I remember giving a presentation once, and at the end someone asked, what's the largest size of database you've ever created? And I just thought, well, I don't know. I was really thinking about that, and I thought, what a silly question, but I was going to move on. But he said, my biggest database is 150 gigs. And I thought, I know I've produced way more than that, but I don't know what to say to this person, because it could be 150 gigs' worth of rubbish. You just don't know. And I think I made a sort of comment along those lines of: it's not the size, it's the quality, the nature, the format. You know, you can generate synthetic data and produce a huge, whopping database, but it's not going to be used for anything, which is the point. We need to think about the volume versus the quality. So people do want more data. But sometimes I think what they're looking for is: if they have every data source possible, then some of it might be good quality, and the more we have, the more chance we've got of finding data that's better quality. I think it's a bit of a strange mindset, but I see where they're coming from. I remember going to work with a finance department and asking them, tell me about your reports, just like an introductory workshop, you know, show me what you're using. And they showed me this very simple report with 10 columns on it, and I said, all right, how much of this is right? This is a finance department. And she said, well, we know six of the columns are right. And I said, okay, so the other four are wrong. And she said, yes. And I said, well, that means 40% of the report is wrong; is that kind of where you're sitting with your data? And she said, yes, but we know which columns are wrong. And I thought, that's not really the point. The point is not that you know which ones are the wrong columns. Have you thought about fixing them? was obviously my next question. And that's when I get batted away, because people are empire-building again.
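The arithmetic in that finance story is worth making explicit: six correct columns out of ten leaves four wrong, or 40% of the report. A trivial sketch of tracking that as a quality score, with hypothetical column names and flags:

```python
# Hypothetical per-column correctness flags for the ten-column report.
columns_ok = {
    "revenue": True, "cost": True, "margin": True, "region": True,
    "quarter": True, "owner": True,
    "forecast": False, "variance": False, "headcount": False, "fx_rate": False,
}

wrong = sum(1 for ok in columns_ok.values() if not ok)
total = len(columns_ok)

# 4 wrong out of 10 = 40%: as Jennifer says, 60/40 is too close
# to a coin flip for comfort.
print(f"{wrong}/{total} columns wrong = {wrong / total:.0%} of the report")
```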

 

Rob Stevenson  31:11

Yeah, well, you're asking them to do work, to go clean things up, which is, you know, not fun, maybe, but it is necessary.

 

Jennifer Stirrup  31:17

That's it; and when it's 60/40, it's too close to 50/50 for my liking.

 

Rob Stevenson  31:22

And yeah, it's a coin flip. Why do you even bother looking at the data? Just flip a coin and see what happens. So, maybe a pedestrian question, Jennifer, but I'm curious, as we wrap here: why the name Data Relish?

 

Jennifer Stirrup  31:36

I wanted people to love and relish the data. And I know that larger data firms talk about 'love your data'; it's a bit of a hashtag, and I can't use that or I'll get sued. So I was thinking, what could I do that would show that people can get excited about the data? Because I see lots of problems where people don't want to look at it, and I really want to get people to the point where they've got a successful solution that they're proud of, that they're happy they worked on, and that they're really relishing. One of the best data science teams I ever worked with were not data scientists: one had a history degree, one had a language and literature degree, and one was an accountant. And they just came at the data science with such gusto that I really liked that. And I taught them a little bit about the algorithms, I taught them to write SQL and R, and I taught them about data storage. And it was a great project, because they just really did relish the opportunity. And they've all gone on to do great things now. I regard that as one of my achievements, to be honest.

 

Rob Stevenson  32:42

I love that. I love this idea that it's not something to be afraid of, not something to be a chore. If you enjoy it, if you have gusto and you're thrilled about it, then your data will sing for you, right?

 

Jennifer Stirrup  32:54

Yeah, absolutely. I want people to understand the value that data can have, because when your data is good, it can change lives, when people are able to automate things. I remember, years ago, building a really small package to transfer data and join two tables together, something super simple. And I showed this woman, and it didn't go very well at first. I showed her what I'd done, and she started to cry. And I thought, what have I done, have I upset her? Because people get very sensitive about the data. And she said, I copy these two datasets together in Excel every night, and I have to stay behind late to do it, and it takes me away from my family. I want to go home and spend time with my children, but I'm stuck at work, copying and pasting. And what you've done is you've given me more time to spend with my children. And it was really powerful; I wanted to cry as well, to be honest, when she said that. But I think if you can automate things and see people as actual human beings that you're there to help, getting the data right helps individuals. And if we can send people like that home a bit early to spend time with their kids, that's a huge win for me.
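She doesn't say which tool that 'really small package' used; as a hedged sketch of the same idea, here is the nightly copy-paste replaced by one pandas join over two hypothetical exports keyed on an order ID:

```python
import pandas as pd

# Hypothetical stand-ins for the two files the analyst was
# copy-pasting together in Excel every night.
orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer": ["Acme", "Globex", "Initech"],
})
shipments = pd.DataFrame({
    "order_id": [1, 2],
    "shipped_on": pd.to_datetime(["2024-01-02", "2024-01-03"]),
})

# One join replaces the manual merge; scheduling this script
# (cron, Task Scheduler, an orchestrator) sends her home on time.
combined = orders.merge(shipments, on="order_id", how="left")
combined.to_csv("nightly_combined.csv", index=False)
print(combined)
```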

 

Rob Stevenson  34:09

It absolutely is, in addition to all of the bottom-line business things and the ROI and the money, money, money; you can also change someone's life and have them spend their time on what they really want to spend it on. So, Jen, I've loved chatting with you about all this stuff today, and I think this episode has been full of advice for people: things they can look at in their own data operations, perhaps shine the flashlight like you do, and make things more efficient. So at this point I'll just say thank you so much for being here, Jen. I've loved chatting with you today.

 

Jennifer Stirrup  34:37

Thank you so much, Rob, thank you very much.

 

Rob Stevenson  34:39

How AI Happens is brought to you by Sama. Sama provides accurate data for ambitious AI, specializing in image, video, and sensor data annotation and validation for machine learning algorithms in industries such as transportation, retail, e-commerce, media, medtech, robotics, and agriculture. For more information, head to sama.com.