How AI Happens

Microsoft's Priyanka Roy on Design Thinking & Data Governance

Episode Summary

Data & AI Solution Specialist for Microsoft, Priyanka Roy, explains how Design Thinking is a crucial approach in the development of any AI technology, and how its proper utilization results in better products and more effective teams. Priyanka also outlines the key pillars of a successful data governance approach, and the utility of "thick" data.

Episode Notes

Data & AI Solution Specialist for Microsoft, Priyanka Roy, explains how Design Thinking is a crucial approach in the development of any AI technology, and how its proper utilization results in better products and more effective teams. Priyanka also outlines the key pillars of a successful data governance approach, and the utility of "thick" data.

Episode Transcription

0:00:00.0 Priyanka Roy: When we talk about something, are we talking about the same thing or are we all having these little bubbles in our heads saying, "Oh, it looks like they might be talking about this."

[music]

0:00:10.5 Rob Stevenson: Welcome to How AI Happens, a podcast where experts explain their work at the cutting edge of artificial intelligence. You'll hear from AI researchers, data scientists and machine learning engineers as they get technical about the most exciting developments in their field and the challenges they're facing along the way. I'm your host, Rob Stevenson, and we are about to learn how AI happens.

0:00:40.9 RS: Several installments of this podcast have touched on the multifaceted role of the AI practitioner, how it's not enough for them to be a talented coder or an accomplished academic with loads of papers to their name; rather, the modern AI practitioner needs to also display a deep understanding of the individual their technology seeks to assist. Priyanka Roy, a Data and AI Solution Specialist with Microsoft, is someone who understands this need and cultivates it within the AI-building teams she consults. Priyanka refers to it as design thinking, and as she explains, it is a crucial approach in developing any meaningful AI technology. Priyanka joined me to discuss the importance of design thinking and how it applies to rigorous data governance. Priyanka, welcome to the podcast. How are you today?

0:01:31.1 PR: Hey, thanks Rob. Thanks for having me.

0:01:33.5 RS: I'm so pleased to have you. You are on the exact opposite side of the globe, I don't think we could be further away from each other if we tried.

0:01:41.8 PR: That's right. [chuckle]

0:01:42.8 RS: Thank you for tuning in from all the way in New Zealand.

0:01:44.8 PR: Thank you. That's the world we live in, right, Rob? It's so flat. You probably found me somewhere on the inter-web, and here we are, exchanging notes.

0:01:55.7 RS: It is figuratively flat, there's a whole sub-genre of YouTube dedicated to the world being actually flat. Well, this is not what we're gonna be covering here. [chuckle] But I am so excited to have you on, Priyanka, I have so many questions for you. I guess before we get too deep in the weeds, would you mind sharing a little bit about your background and then your current role at Microsoft?

0:02:15.3 PR: Sure. Hi everyone. I live in Wellington, New Zealand, and I currently work at Microsoft, about nine months into my role as a Data and AI Solution Specialist. And what that means is, I basically have a set of customers in the finance, banking, and insurance industries, and also in the construction and healthcare industries, where I'm responsible for getting in touch with them and understanding what sort of problems they are currently dealing with. And these problems don't have to be data-related problems, it could be any industry challenges, any business challenges that they're currently facing, and then coming together with my team and our Microsoft partners to help come up with a solution that'll help the customer move forward. And a bit about my background. Before Microsoft, I've got about 16 years' experience in the data and AI industry. It wasn't called AI when I started, it was data warehousing when I started back in 2005. I've worked mostly in consulting companies, the likes of Wipro, Capgemini, and Deloitte, so that kind of gives me a good understanding of various business sectors and makes me right for the current role that I do.

0:03:32.1 RS: It's interesting to me that you get this access into a lot of companies who are sort of mulling over, humming and hawing how they might apply AI into their businesses. What are some of the common challenges that these folks are facing and where do you come in to be like, "Hey, maybe you could be looking at these sort of solutions?"

0:03:52.5 PR: In the past few years, and even in the pandemic times that we live in, a lot of the focus that most of these companies are giving is on customer experience. So how do they ensure that their customers are retained, or customers come back to where they left off pre-pandemic? An example from banking: how do they make their customer touchpoints current to these times that we live in? How do we ensure that we know a customer much better than we knew before? So making customer experience the best and top-notch is one of the things or trends that I come across quite often. Also in the banking space, there is a great focus on fraud, and fraud analytics is a big focus. So this is just one or two of them that I've mentioned, but in the manufacturing industry, for example, it's all about how do we prevent machine outages from happening? So for example, an airline, every time an airline has to stop operating ad hoc, there's so much loss to the company and to the brand. So how do you ensure that that can be monitored proactively and the brand is maintained, customer experience is maintained, because you're on time, wherever you have to be.

0:05:21.2 RS: At the outset of this project or campaign that these customers would end up going on, it strikes me that they would need to be very deliberate about what exactly the outcome is they want. And that's something that I'm hearing from folks more and more, is you can have the most fantastically complex algorithm on the planet, but if you're not asking the right question, then you're not getting the right response or useful response. Is part of your role to help, be consultative with these companies to make sure that they are approaching these problems in the correct way, and asking the right questions of the technology?

0:06:00.0 PR: Yeah, absolutely. And this took me a while to understand. The reason I'm saying this is, years ago, when I would meet customers, they already came to us with a problem and they also had a solution in mind: "Look, this is the problem, and I think this is the solution that we want, and can you just help us build the solution out, implement it, and maybe even train us to use it." But the problems of today are quite unknown. The idea here is for people like me to actually challenge the customer and ask further questions, be curious about why exactly they're thinking what they're thinking, why they think that this is the problem. Has someone actually just asked them to do this, or have they spent time with their customers understanding if this is a real challenge?

0:06:51.3 PR: And this is the toughest thing to do, just because when you're meeting a customer for the first time, you always wanna have a good relationship, and you just don't wanna go and challenge them by saying, "Hey, your thinking is wrong." So it takes time to first build trust with the customer for them to even let you ask such questions. It's almost how human relationships play out. You don't just take advice from a stranger. You wanna understand who that person is and why they're saying what they're saying, and that they have good intentions in mind. So that's the challenge. It's about discovering and uncovering the problem rather than jumping straight to the solution.

0:07:34.1 RS: You probably have a sense of what are some of these probing questions that you can ask that will reveal someone's approach to you. I'm curious, in the cases where maybe someone hasn't been as thoughtful, their approach isn't as examined, what do their answers to your questions typically look like? What's sort of like the giveaway that maybe you haven't thought as deeply about whether this technology is right for you?

0:08:00.9 PR: Yes, so the first giveaway is usually when it's just one person who is speaking to you, and maybe that person is coming from their own business area, and problems of today don't just lie in one business area. These problems reside in different areas of the business. And that's when I try and bring multiple people into the room, people from different business units who think there is a common problem, and I'll get to some of the techniques that we use to get to this, what we call "design thinking techniques." Now, you probably know of design thinking, but for our listeners, it's basically a way of discovering or uncovering the problem rather than jumping to the solution, and it's a very, I'd say, human-centered approach.

0:08:53.0 PR: Let's say it's a student organisation, it's a university, and we're trying to design a solution for them where it's easier for students to apply to the institution, and then maybe even do some blended learning on it, both online and on campus, and then finally take them through their whole student journey until the point where they graduate. Imagine just sitting here in a meeting room with a fancy whiteboard and just building a solution for them without even speaking to the students about what they want. Do they even have a problem with what currently exists? Have you gone and looked at what a student of today looks like? Who are they, what do they aspire to do, what are their behaviors, what are they thinking, feeling, doing whenever they're interacting with any institution? Do they need certain sorts of help, motivation? Things like that.

0:09:52.8 PR: So that's when you create these personas of what a prospective student could look like when they are interacting with the university. And then you go and interview them and ask them questions and get into their thought process, and out of that comes a few personas, and that's when you start actually thinking about them from a human perspective. And then you come back to your group of people, maybe even bring a few of them into the room when you're brainstorming ideas about how you could solve their problems. You'll notice that what I'm trying to do here is getting to know the end user rather than just throwing a solution at them, so that's where design thinking comes into play.

0:10:44.7 RS: It's so important because it's not enough for the AI practitioner to be exceptionally good at building algorithms or tweaking a model. The more I learn about this field, the more it feels as though individuals building these tools need to always be thinking about what it's gonna look like in the hands of the user. It's not enough to be like, "Oh, that's not my department." Maybe individuals who are in a more academic area or in a more research-based approach to AI can get away with that, but if you're building a product, then you really do need to have an intimate understanding of how this will impact people. Is that kinda how you view the typical AI practitioner? How can design thinking really be helpful to them?

0:11:26.7 PR: Yes, so for a typical AI practitioner, and I call myself a data and AI practitioner because I feel there is no AI without the right data. So for a data and AI practitioner, building good models and algorithms and having the right ML code is important because that's the real thing that enables good experiences, but getting to know the end user or the end customer who's going to benefit from such experiences is really important. Yes, so getting into the mindset of the end user. And often what happens is the team that designs the solution and the team that builds the solution are quite different. So someone just gets the specs and they start building, and they never speak to the person who thought about this solution and why the solution was needed. So it's really important for these teams to come together and think creatively and collaboratively about the problem.

0:12:26.4 PR: And I see this happening a lot more in the past few years. I've noticed that design thinking techniques have seeped well into various areas of business. So I live in Wellington here, so we have a lot of public sector organisations here, and I've noticed that most of the public sector companies here, so let's say if we have to apply for a passport, these companies are actually looking at the end users. So where is this person coming from? Who is the citizen? Who is the customer? Where do they come from? What sort of challenges could they face if they are applying for a passport? Are they from New Zealand? Do they understand English or are they someone who immigrated here? Do they need any help when they apply for things? What are they using to apply? Are they doing it online, or are they going to a passport office to do this? There's so many things. And they gather data about all this, and that's when and only after that, do they come together to solve or build a solution. So I'm seeing a lot of that happening, and that's quite refreshing.

0:13:32.7 RS: It's also very common amongst AI practitioners to think of their work in terms of impact and scale. With software, you can impact potentially the whole world, anyone with Internet access. And if that's your goal, then you simply can't afford not to employ this approach when you are thinking about the people you want to impact and how these products are going to wind up in their hands, right?

0:14:00.8 PR: Exactly, yeah.

0:14:01.8 RS: I like how you said a moment ago that there is no AI without data, and the importance of data is well-tilled AI blog soil and AI podcast soil, but that's because it's so important, and you hear individuals talk about, "Oh well, it's not just data, you need quality data, data that's clean, and good data hygiene." And I was really interested to learn about your approach to all of this, because you kind of put an umbrella over all of it and call it data governance, and so I would just love to hear your more far-reaching approach to data with a capital D. [chuckle] As it is applied to building AI technologies, not just in the acquisition of that data, not just in the utilisation of it, but in every facet of making it work for you?

0:14:52.0 PR: Correct. Okay, yes, of course data is an important part of AI. I hear this often in New Zealand, BI before AI, meaning business intelligence before artificial intelligence. The reason people say that is, most organisations over here at least are struggling with quality data, it's usually an afterthought. So now coming to data governance, data governance in a nutshell would mean: how does an organisation get its data and keep its data right so that it can be used in the right fashion across the organisation? For an organisation's employees to make decisions, it's important for them to trust their data, and how exactly are they going to trust their data? That can only happen if the data quality is maintained. If you go and speak within an organisation, the language is different in different parts of the business. Let's say, in a transport organisation, you speak to one part of the business, and their definition of a passenger would be different from what you hear if you go and speak to various other areas of the business. So having a common glossary of terms within the organisation, what we call the data dictionary, that's really important.

0:16:07.2 PR: So when we talk about something, are we talking about the same thing, or are we all having these little bubbles in our heads saying, "Oh, it looks like they might be talking about this"? Getting the glossary right, getting the lingo right, that's really important. If you wanna get your data governance right, you need to start thinking about people, process, and platform, all in unison. You wanna ensure that people understand data, they trust it, they're data literate, and they reach for facts when there is a need to answer a question. So that's the people aspect. As for the process: within a business, there are various systems that store data, so you have product data, you've got supplier data, you've got location data and various other things, and all these systems are managed by different areas of the business.

0:17:00.6 PR: So we need to find custodians within the business who look after this. Whenever something new comes in, they need to be the authority saying, "Okay, I manage the supplier data and I can fully guarantee that we've got good processes and checks in place that ensure that the quality of this data is complete and you can trust it." So imagine if every area of the business does that, how good the quality and decision-making of the business will become. And finally, you need a platform, meaning technology where all of this can be stored, and in my case, because I'm with Microsoft, I'm talking about Microsoft Azure, where you have a common data platform where you're storing all these records, and then people are building data warehouses and even machine learning on top of it to report on it. So yeah, you can see how a good foundation can help.

0:18:02.3 RS: Yeah, yeah, absolutely. What do you think are some of the pitfalls of not approaching your data strategy in this way? What are some of the things that can go wrong if you aren't deliberate about your data governance strategy?

0:18:16.9 PR: Yeah, so what'll happen is, it's like building a house. You've probably got a vision of a beautiful house, but imagine if your foundation is not right. What may happen is that, let's say there's this turmoil, there's a cyclone or some natural calamity, and the house might just topple down, just because your foundation wasn't right. So initially, it might all look okay. In an organisation, you might start a project on AI, let's say, and you do a little pilot, a proof of concept, and you get what you're after, and off you go, you're making decisions. But for the organisation to change completely, you want this technology to be adopted, this particular AI technology to be adopted, and how do you ensure that everyone in the organisation is on board? That'll happen only when you take them all together in your little spacecraft. You want to ensure that their data systems are also nicely curated, they are taken care of, the health is maintained, and only then can everyone start exploring and building good AI models and maybe even projects on top of that. Yeah, think of it as a house and the fact that you'd want to lay a solid foundation so that you can keep building.

0:19:46.9 RS: What part does thick data have to play within a well-ordered data governance strategy?

0:19:53.0 PR: Yeah, so thick data. So this was a term I just came across maybe a few years ago. I knew of big data and I knew of small data, and then there was thick data. So basically, thick data is data that adds context and meaning to numbers. When you think about big data, it's basically any data that comes at a high volume, at good velocity, there's variety in it, it's unstructured and structured or semi-structured, but it's just numbers. You don't really know what those numbers mean. What are those numbers about? Are they about people? Where are they coming from? So only when you add context and meaning to those numbers does it become holistic. You can't just rely on numbers, you need to understand what's behind them. So if it's about a person, what was the background of the person? Where is the data coming from? What was the person doing when this data was captured? Otherwise, it's very skewed. And I think I read an article once about how Netflix is able to understand what people like to watch.

0:21:11.7 PR: So for example, let's say people are watching a particular title and they're going through it. They also wanna understand, who are they watching it with? What are they having while watching the movie? Are they having popcorn or are they having a wine? And things like that, just to make the whole experience a lot richer. It's not like they wanna spy and look at what you're doing, but imagine if they gathered that data by asking us: when you were watching this particular title, who were you watching it with, and what do you prefer to eat? So maybe they're wanting to build a richer customer experience, maybe by offering such things in the future. So yes, think of thick data as just context, more language rather than mathematics. But it's something that, again, feeds into the concept of design thinking. It's again going back to the end user and trying to understand them a bit more, not just treating them as a number, but more as a human with needs and wants, and then building things around that.

0:22:27.3 RS: Yeah, it's so interesting because my first thought was, "Oh, well this is qualitative data, this is maybe more anecdotal, it's more contextual," but the whole point of building these fantastically complicated models is that they can sometimes see patterns where we don't. And so, whether you're watching a television show while having a Sprite versus a glass of red wine, I might not think that matters. But it may, it may matter, and we would never know unless you append that data onto your viewing data and start to run it through a much more powerful algorithm.

0:23:13.5 PR: Exactly, yeah, you never know. Someone must have already done something and we probably will experience the benefits of it. So who knows? [chuckle]

0:23:23.7 RS: Yeah, yeah, exactly. Who knows? Well, Priyanka, this has been such a delight chatting with you today. I've learned so much from you, and your background is just so rich with expertise, and I loved learning all about data governance and your own view of design thinking. It's been fascinating chatting with you today, so thank you so much for being a part of the podcast.

0:23:42.5 PR: Thank you Rob, and thanks for finding me somewhere on the Internet.

0:23:47.0 RS: In the big flat world we live in.

0:23:49.0 PR: Yeah, exactly. [laughter]

[music]

0:24:02.0 RS: How AI Happens is brought to you by Sama. Sama provides accurate data for ambitious AI, specialising in image, video and sensor data annotation and validation for machine learning algorithms in industries such as transportation, retail, E-commerce, media, MedTech, robotics and agriculture. For more information, head to sama.com.

[music]
