Traditional LiDAR systems require moving parts to operate, making them less cost-effective, robust, and safe. Cibby Pulikkaseril is the Founder and CTO of Baraja, a company that has reinvented LiDAR for self-driving vehicles by using a color-changing laser routed by a prism. After his Ph.D. in lasers and fiber optic communications, Cibby got a job at a telecom equipment company, and that is when he discovered that a laser used in DWDM networks could be used to reinvent LiDAR. By joining this conversation, you’ll hear exactly how Baraja’s LiDAR technology works and what this means for the future of autonomous vehicles. Cibby also talks about some of the upcoming challenges we will face in the world of self-driving cars and the solutions his innovation offers. Furthermore, Cibby explains what spectrum scan LiDAR can offer the field of robotics more broadly.
Tweetables:
“We started to think, what else could we do with it? The insight was that if we could get the laser light out of the fiber and into free space, then we could start doing LiDAR.” — Cibby Pulikkaseril [0:01:23]
“We were excited by this idea that there was going to be a change in the future of mobility and we can be a part of that wave.” — Cibby Pulikkaseril [0:02:13]
“We are the inventors of what we call spectrum scan LiDAR that is harnessing the natural phenomenon of the color of light to be able to steer a beam without any moving parts.” — Cibby Pulikkaseril [0:03:37]
“We had this insight which is that if you can change the color of light very rapidly, by coupling that into prism-like optics, this can route the wavelengths based on the color and so you can steer a beam without any moving parts.” — Cibby Pulikkaseril [0:03:57]
Links Mentioned in Today’s Episode:
Cibby Pulikkaseril on LinkedIn
EPISODE 24
[INTRODUCTION]
[00:00:00] CP: And we had this insight, which is if you can change the color of light very rapidly, by coupling that into prism-like optics, this can route the wavelengths based on the color, and so you can steer a beam without any moving parts.
[00:00:14] RS: Welcome to how AI happens, a podcast where experts explain their work at the cutting edge of artificial intelligence. You'll hear from AI researchers, data scientists, and machine learning engineers, as they get technical about the most exciting developments in their field and the challenges they're facing along the way. I'm your host, Rob Stevenson, and we're about to learn How AI Happens.
[INTERVIEW]
[00:00:45] RS: How do you innovate in a field that is already cutting edge? If you're bringing products to market that have never been seen before, the technical specifications of which are far beyond the average consumer, aren't you necessarily pushing the envelope just by having your hands anywhere near the envelope? How can a practitioner stand out as an innovator when, by definition, all of their peers are innovators? This question led me to today's guest, Cibby Pulikkaseril. Cibby is the co-founder and CTO of Baraja, a company bringing exciting strides to LiDAR technology.
When I pitch guests to be on the show, I often tell them the show is meant to be technical. I give them the example that we're not going to pause to define sensor fusion or LiDAR for our audience. We're going to give them more credit than that. Today, we're not going to define LiDAR, but we are going to meet with someone who is redefining it.
[00:01:51] CP: I'm originally Canadian, and so I came to Australia to do my graduate studies. I did a PhD in lasers and fiber optic communications. And then I was really lucky: I got a job here in Sydney with a company called Finisar, which makes telecom equipment for long-haul networks all over the world. And I got to work on a special laser technology. It's a laser that's used in DWDM networks, and it can change color. In telecom, that was used to carry more data in a single fiber.
But my co-founder and I started getting really excited that this technology was not very well known and not used in many applications outside of telecom. And we started to think, what else could we do with it? And then I think we had the insight, which is if we could get the laser light out of the fiber into free space, well, then we could start doing LiDAR. And at that moment, the Google X cars were going around California, and you were starting to see a lot of momentum in autonomous vehicles. And they were all using really large, expensive LiDAR systems, and we thought, yeah, we've got something special here, we can definitely displace that.
[00:02:55] RS: It's so interesting that you landed on autonomous vehicles as the application for this technology, because, by your own admission, you're not particularly a gearhead, right? You don't, like, collect classic cars. So, I'm curious, why exactly did you decide that AV was, for now, the best application of this tech?
[00:03:12] CP: Yeah, look, I guess there are two interesting facets there. One is, I mean, you need some application that is really mission driven. So, I think, we got really excited by this idea that there's going to be a change in the future of mobility, and we can be part of that wave. And that is really motivating for us as founders, but also for that early team, it's super exciting for them.
For me, personally, actually, I have an admission, which is, I mean, I don't like cars very much at all. To me, they are a great way to get around; I've never enjoyed owning them. And for me, this idea that the future of car ownership could be completely different, that you might not own a car at all, that you might be able to summon autonomous vehicles of different forms, that you could take your kids in an autonomous vehicle, that you could go on trips. That, to me, is really exciting and really motivating for me personally, to be part of this.
[00:04:04] RS: Perhaps in a world where it wasn't a human driver, you might like cars more.
[00:04:09] CP: That's right.
[00:04:09] RS: I'm really interested in getting more into LiDAR, specifically, because you have some really interesting technology that you're developing. I guess I would love to know a little bit more about the early days of Baraja, the founding of Baraja, and kind of where you are now, to set some context for the technology that you're ultimately deploying to market.
[00:04:27] CP: So, we are the inventors of what we call spectrum scan LiDAR. And that is harnessing the natural phenomenon of the color of light to be able to steer a beam without any moving parts. When you look at traditional LiDAR, a lot of it involves something spinning or oscillating, or, you know, some mechanical motion that's used to steer the beam. And we had this insight, which is if you can change the color of light very rapidly, by coupling that into prism-like optics, this can route the wavelengths based on the color, and so you can steer a beam without any moving parts. And I think it makes sense to most people that if you're in a vehicle, the thing that is doing the scanning of the environment, you want that to be as robust as possible, and by removing the moving parts, we have made it smaller, cheaper, and more reliable.
[00:05:13] RS: Just less complicated, right? Fewer moving parts means it less frequently needs to be repaired, maintained, all of that, correct?
[00:05:19] CP: Exactly. That's right. We have seen that same transition in telecom, where it's gone from moving parts into solid-state types of switching. And so, for us, it feels like a natural evolution that it happens in LiDAR as well. Velodyne was the first company to come out with something that could be used by autonomous vehicles, and they did an incredible thing. They really set the stage that you could have high-resolution LiDAR and make that available for these types of autonomy applications. And the autonomy industry has actually grown significantly with that initial type of LiDAR.
Since then, what you've seen is actually a huge explosion in kind of copycats of that style. So, there are a lot of other LiDAR suppliers that mimic what Velodyne is doing. And now we're in the second phase, where the newer types of LiDARs aren't spinning anymore. And so, they look like they don't have moving parts; it's a sealed box. But inside, they're typically using either large mirrors or very small mirrors called MEMS, which is fine; it's a very obvious way to want to steer a beam. You can buy these steerable mirrors, and you can shoot a laser at one and steer it around. But it's not very imaginative, and it's not very elegant, is what I would say. When we started the company, what I set out to do was ask, what is the most elegant way that I know to make this LiDAR function? And that was to get rid of the moving parts, use wavelength, and use components that we knew would be able to scale to volume.
[00:06:40] RS: Can we get into the technical description a little bit of how this works? The beam is being directed by light passing through a prism. Can you kind of walk me through that, like, explain like I'm five, how this is able to function?
[00:06:53] CP: Sure. So, you might imagine, or you might have seen even as a child, that if you have a glass prism and you hold it up to sunlight, the output, what you'll see, is the rainbow spread across wherever the light is falling. So, if you kind of think that through, that's because white light has all the colors mixed together. So, if you put in one color at a time, when they exit the prism, they all go to different locations. So, the prism itself is actually steering the beam based on color. We call this dispersion, and we use the same effect, but obviously hyper-optimized for a LiDAR application to get the maximum performance, but it's very similar. So, we change the color of light, the dispersive optics then select a unique angle for each wavelength, and that allows us to scan the beam, and nothing had to move. All we had to do was change the color of light at the laser.
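To make the prism analogy concrete, here is a minimal Python sketch of the wavelength-to-angle mapping, assuming a telecom-band laser near 1550 nm and an invented linear dispersion coefficient; these figures are illustrative, not Baraja's specifications.

```python
# A minimal sketch of wavelength-to-angle steering through a dispersive
# element. The center wavelength and dispersion coefficient are illustrative
# assumptions, not Baraja's actual specifications.

def steering_angle_deg(wavelength_nm: float,
                       center_nm: float = 1550.0,
                       dispersion_deg_per_nm: float = 0.5) -> float:
    """Map an optical wavelength to an output angle, linearized around the
    center wavelength (a small-angle approximation of prism/grating
    dispersion)."""
    return (wavelength_nm - center_nm) * dispersion_deg_per_nm

# Sweeping the laser's color sweeps the beam; nothing moves.
for wl_nm in (1530.0, 1550.0, 1570.0):
    print(f"{wl_nm:.1f} nm -> {steering_angle_deg(wl_nm):+.1f} deg")
```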
[00:07:40] RS: So, based on the direction that the beam goes, it will be a certain color, because that's how light is refracted. And then based on that color, now you know exactly where that beam was.
[00:07:51] CP: We know where it is, exactly.
[00:07:52] RS: And you have LiDAR. Wow, it's fascinating. So, I mean, how many colors are there? Is there an infinite number of directions in which this beam can be measured?
[00:07:59] CP: It's a really perceptive question, Rob. I mean, it's hard to actually define. Specifically, I would say these types of lasers can do thousands of different wavelengths. I have experience where we've done 100,000 different wavelengths, and the number of wavelengths that we can shoot out determines the resolution that's achievable. So, this is what we talk about when we say this is the path towards infinite resolution, because you have such fine control over the wavelength of the laser.
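As a rough, hedged illustration of how the wavelength count sets angular resolution (the field of view below is an assumed figure, not a Baraja parameter):

```python
# Illustrative only: angular resolution scales with the number of
# addressable wavelengths spread across a fixed field of view.
field_of_view_deg = 20.0  # assumed scan field of view
for n_wavelengths in (1_000, 10_000, 100_000):
    spacing = field_of_view_deg / n_wavelengths
    print(f"{n_wavelengths:>7,} wavelengths -> {spacing:.5f} deg per step")
```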
[00:08:24] RS: It's just a matter of how precise you want to measure the color, I suppose.
[00:08:28] CP: Exactly. And so, that's a lot of our internal secret sauce: how we calibrate and measure these wavelengths.
[00:08:34] RS: I'm glad you said that was a perceptive question, because it came out of my mouth and I felt very silly asking a CTO how many colors there are. Like, we were a Crayola 196 household, personally. But I guess the follow-up there is, how many do you need to be accurate? If there is infinite resolution, do you need to put a ceiling on it or not have a ceiling? Or where do you start to see a leveling off in returns, I suppose, as you add more resolution?
[00:09:01] CP: So, there are a couple of things that go into that. One is the overall number of points you can shoot every second, and that's limited by the speed of light. So, there are some limitations on how much resolution you can have instantaneously. What I think is more exciting is, because we have this infinite resolution, we can say, “Well, that resolution is available to the customer.” So, we have a feature here at Baraja, which we call horizon tracking, which is, if you identify where the horizon is, and you tell the Baraja LiDAR that's where the horizon is, we can automatically scan there with much more resolution than is possible with any other LiDAR, just by software configuration.
This way, you're making the most efficient use of the data, instead of having 10 million points of everything in the scene, including things you don't need. The future for spectrum scan LiDAR is that the AI is going to say, “Hey, this is a pedestrian and I want resolution there,” and you get really high-density resolution on the pedestrian and small objects in the field, and then just regular or coarser resolution everywhere else, where it's less important.
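A hedged sketch of this kind of software-defined scanning, combining the speed-of-light point budget with a denser region of interest; the range, frame rate, field of view, and region bounds are invented for illustration and are not Baraja parameters.

```python
# Hedged sketch of "horizon tracking"-style foveation: take the point budget
# implied by the pulse round trip, then concentrate scan lines in a region of
# interest. Range, frame rate, FOV, and ROI bounds are illustrative, not
# Baraja parameters.
C_M_PER_S = 299_792_458.0
max_range_m = 250.0                           # assumed unambiguous range
points_per_s = C_M_PER_S / (2 * max_range_m)  # one pulse in flight at a time
frame_rate_hz = 20.0
budget = int(points_per_s / frame_rate_hz)    # points available per frame

fov_deg = (-10.0, 10.0)  # assumed vertical field of view
roi_deg = (-1.0, 1.0)    # horizon band flagged by software or a perception net
roi_share = 0.6          # fraction of the budget spent on the ROI

roi_points = int(budget * roi_share)
bg_points = budget - roi_points
roi_span = roi_deg[1] - roi_deg[0]
bg_span = (fov_deg[1] - fov_deg[0]) - roi_span
print(f"{budget} points/frame")
print(f"ROI spacing {roi_span / roi_points:.5f} deg, "
      f"background {bg_span / bg_points:.5f} deg")
```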
[00:10:03] RS: That is an interesting question, though: how much resolution do you need? In the case of a pedestrian, probably enough to predict their movement. You classify them as a pedestrian, but whether they're wearing glasses or not, whether their shirt is blue or green, probably not so important, right? Do you kind of have these thresholds, based on what the LiDAR encounters, for the kind of resolution it needs to apply?
[00:10:23] CP: So, I think the first use case for LiDAR is really in terms of hazard detection. And this is really fundamentally identifying, is something in the path of the vehicle, and what is it? In these cases, you want to be sure there's something there and then stop the vehicle safely. When we get to higher levels of autonomy, I think there's a really rich and underexplored area. Already with cameras, they do things like this, which is understanding the intent of pedestrians and vehicles based on cues from the camera feed. I think once you add LiDAR to this, it's going to be really exciting as well.
[00:10:54] RS: I can see the obvious advantage in there being fewer moving parts and making it smaller and more affordable. What are some of the other advantages of this technology in terms of accuracy, processing speed, that sort of thing?
[00:11:07] CP: So, I think the Baraja spectrum scan LiDAR has one hugely advantageous feature, which is that because we steer the beam by wavelength, this also ensures that only our wavelengths can couple back in. What this means is that for any other sources of interference, for example, sunlight, other LiDARs, even other Baraja LiDARs, the ability for that light to get into the receiver and create an interference effect is extremely minimized. This is actually something that's going to be a difficult problem in the future: as people deploy hundreds or thousands of vehicles onto roadways, we are going to see a proliferation of laser pulses everywhere in the environment. So, this is a hazard waiting to happen.
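A toy model of the wavelength-matching idea follows; the receiver passband width is an assumption, and the real filtering mechanism is not described here.

```python
# Toy model of wavelength-matched reception: a return couples into the
# receiver only if it sits within a narrow band around the wavelength being
# transmitted at that instant. The passband width is an assumption.

def accepts(return_nm: float, tx_nm: float, passband_nm: float = 0.1) -> bool:
    """True if the incoming light matches the instantaneous transmit color."""
    return abs(return_nm - tx_nm) <= passband_nm / 2

tx_nm = 1550.40                 # the color we are transmitting right now
print(accepts(1550.40, tx_nm))  # True: our own echo
print(accepts(1550.90, tx_nm))  # False: another LiDAR on a nearby channel
print(accepts(905.00, tx_nm))   # False: a pulsed LiDAR in a different band
```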
There's some really interesting work going on in the research community where they look at these interference effects, because these can be adversarial. I mean, they can be deliberate attacks, or they can be unintended. And some of the results they've shown are that you might have the ability to create false scenes, like create a false hazard in the road and try to get the vehicle to stop really suddenly. Unintentional effects can be that you get copies of things that you didn't expect. If you have two LiDARs from the same manufacturer close to each other, just from these pulses reflecting off the same objects, you might get things like ghost buildings or ghost vehicles.
So, you can imagine these are unacceptable. This is now a function of how many LiDARs there are in the street; you can cause really strange, unexpected results, which are a hazard for everyone.
[00:12:38] RS: Interference is a great example, I think, of challenges unforeseen by some, certainly by the consumer, in there being a lot of autonomous vehicles on the road. Are there other examples of that that you think about, in terms of what challenges this widespread technology would pose?
[00:12:54] CP: Yeah, I think the thing that I think about, because it's very much in the present, is the type of artificial intelligence that people are using for passenger vehicles. I mean, I think it's very light. There are a lot of driver-assist systems that do automated braking and lane assist that use cameras, and they'll adopt LiDAR, which will be really exciting, because they'll have richer features. But when we get to things that are going to drive themselves, it's going to be a real challenge to understand what the brittleness of these artificial intelligences is and how they can fail. There's been tremendous work by all the self-driving companies to get there.
But there is really interesting research from different groups, where they show that if you use just a single sensor modality, for example, just cameras or just LiDAR, you can really affect the performance of neural nets in quite simple ways. If you have a billboard that has pictures of people on it, the cameras can't really tell; they'll identify those people. And with LiDAR, it's interesting, too, that you can do things to disturb the LiDAR neural nets just by having unexpected objects next to things like vehicles. And this can cause unexpected events.
[00:14:07] RS: For that reason, do you predict one type of sensor will win out? Or will it always be a balance of multiple types?
[00:14:13] CP: So, I think the trend, and what you're seeing, I've seen some interesting presentations from the group that was at Lyft, where they talk about what I think they call autonomy 2.0, which is more about fusing the sensor inputs really early, before doing a lot of machine learning, and this is to avoid some of these cases, I think.
[00:14:32] RS: So, you think we would have to creep closer to sensor fusion?
[00:14:35] CP: Yeah, just more sophistication, more resilience, and really good, high quality sensor data. And I think for that to happen, it has to be completely immune to interference.
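A very small sketch of what that early fusion could look like, under the assumption that it means concatenating per-sensor features into one representation before a single learned model; the feature sizes below are invented.

```python
# Sketch of "early" sensor fusion as described: merge camera and LiDAR
# features into one representation before the learned model, instead of
# combining two per-sensor detections afterward. Feature sizes are invented.
import numpy as np

rng = np.random.default_rng(0)
camera_feat = rng.random(64)  # e.g. a pooled image embedding (assumed)
lidar_feat = rng.random(32)   # e.g. a pooled point-cloud embedding (assumed)

early_fused = np.concatenate([camera_feat, lidar_feat])
print(early_fused.shape)      # (96,): one joint input for a single model
```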
[00:14:45] RS: What are some of the applications of solid-state LiDAR, and just advances in LiDAR outside of AV, perhaps going into robotics or other industries? I'm sure you've thought about, you know, other ways Baraja might bring this technology to market. What are some applications you think you could realize down the road?
[00:15:02] CP: Yeah, so the first application for Baraja spectrum scan is in automotive. And to be clear, I'm extremely excited by that. It's wonderful to be a part of the automotive journey, because it is such an advanced industry that knows how to deliver really safe products to customers. But automotive gives you the scale and the quality and the performance that you can then have the confidence to deploy into other fields. And so, it's going to give us, I think, the breadth and the capability to advance robotics in many other fields. And something that I've always been really excited by is that there's a lot of research here in Australia on agricultural robots. You can imagine, in Australia, water is always a scarce resource. And so some of these robots that are autonomous in fields, for example, can go up and down the field by themselves, and harvest weeds, or direct water right at the plant, instead of having to spray the entire field with pesticides and water. And so, you can imagine, from an efficiency and conservation point of view, this is a huge increase in productivity for a farmer.
[00:16:03] RS: I want to ask about you a little bit, because you're a really interesting guy, and you have this really technical background. Just from, like, a career perspective, as the CTO, do you still get to scratch that itch of being really technical, of getting in the weeds and kind of wrenching on something? I think of, like, the VP of engineering who gets so good at engineering that they stop writing code, whose job is to oversee people. But when you think of your own career trajectory, how do you characterize your current role?
[00:16:36] CP: So, well, I'm lucky in that my role is extremely technically focused. At Baraja, we are a very technical company, so we are mostly engineering staff. We have a lot of advanced research here. The R&D team reports directly to me, and that's the stuff that's really fascinating, which is, how do you shrink spectrum scan LiDAR into tiny photonic integrated circuits? And so, we have a team working on this, and that technology is exciting, because it's not only about LiDAR, but it's really about the proliferation of optics everywhere, which is, I think, an exciting place to be, and so many future applications are going to be using photonics and optics in the coming decade.
Myself, I don't get a lot of chances to build products or be in the lab like I used to. But actually, what I do is direct a lot of the research topics that we work on, and I like to work closely with the engineers on those. And those produce, to me, some important technical outcomes that we use when we talk to customers: being able to say, “Hey, mathematically, this is how we express this,” or, “Here are simulations that prove that we're on the right track, and we're going to experimentally validate them.” So, this, for me, is really fascinating. But I have to admit that I hire people that are much smarter than me to do the actual technical work that gives us a product.
[00:17:51] RS: Yeah, well, that's the sign of a good leader, I should think. Could you share, before I let you go, some of these areas of research that you're directing folks toward, the kind of stuff that gets you excited, the kind of stuff that you read after work and that you would be interested in as a hobbyist, or what have you, even if you weren't running this company?
[00:18:04] CP: So, some of the stuff that we research here is, like, understanding, when we move to the integrated photonics platform, what kind of performance we can expect with our specific type of LiDAR. So, if you look in the academic literature, there's a lot of work that's happened on pulsed time-of-flight LiDAR, which is the traditional method of ranging. So, for all the spinning LiDARs, for example, there's a lot of work on that. And there is much less comparative work on the type that we're working on, which is called homodyne detection LiDAR. It's a technical term, I won't go into it. But I think there are a lot of really interesting things to explore there, because we are one of the few people playing in this space. And so, there are some scientific outputs from this that I think will be interesting; we'll present at some conferences and share with customers. But this is the type of thing we work on.
[00:18:51] RS: Can you give a little teaser? I wouldn't mind it if you got a little technical with the nomenclature and what homodyne means.
[00:18:57] CP: So, homodyne refers to a method of optical detection. Normally, what happens in most of the traditional LiDARs out there is they shoot a laser out into the environment, and the light that comes back, they detect with a photodetector, what they call an avalanche photodiode. And this is just a semiconductor that turns those received photons into electricity.
Now, that's fine. And that's a simple method and a cheap method of detection. Homodyne is much more complicated. It means that when the light comes back and I detect it, I also mix it with a copy of what I sent. There's, like, a reference laser that we put onto the photodiode. And what this causes is optical mixing, and it has the effect that the current we produce is now a function of these two things: the received photons as well as the reference that we sent. And when you do this, actually, what you can do is not only reconstruct how much power you got back, but you can also reconstruct the optical field, so you can understand the amplitude and the phase of the light that came back. What does this mean for the consumer? It means, well, one thing you can measure is the speed of the target that you're measuring. And so, this is typically called Doppler LiDAR, and it's an extremely advanced functionality. It's only available if you can do homodyne LiDAR. So, we are one of few companies out there that are going to offer this.
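Here is a hedged numerical sketch of homodyne detection and Doppler recovery, with made-up signal levels and an assumed target speed; it illustrates the mixing principle described above, not any particular product's implementation.

```python
# Hedged sketch of homodyne detection: mix the returned field with a
# local-oscillator copy of the transmitted light, then recover the Doppler
# shift from the beat frequency. Signal levels and speed are made up.
import numpy as np

wavelength_m = 1550e-9                   # telecom-band wavelength (assumed)
v_target = 12.0                          # target speed toward sensor, m/s
f_doppler = 2 * v_target / wavelength_m  # Doppler shift for a moving target

t = np.arange(0, 2e-6, 1e-9)             # 2 us observation at 1 ns steps
lo = np.ones_like(t, dtype=complex)      # reference field, phase zero
sig = 0.05 * np.exp(2j * np.pi * f_doppler * t)  # weak, shifted return

# A square-law photodiode sees |E_lo + E_sig|^2; the cross term is the beat
# between the reference and the return, carrying amplitude and phase.
current = np.abs(lo + sig) ** 2
beat = current - current.mean()

# Estimate the beat frequency, then invert the relation v = f * lambda / 2.
spectrum = np.abs(np.fft.rfft(beat * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, d=1e-9)
f_est = freqs[spectrum[1:].argmax() + 1]  # skip the DC bin
print(f"true speed {v_target} m/s, estimated {f_est * wavelength_m / 2:.1f} m/s")
```

With these assumed numbers, the beat sits near 15.5 MHz, and inverting the Doppler relation recovers roughly the 12 m/s that was simulated.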
[00:20:19] RS: Is the reference data that is sent out at the same time just the immediately previous data? Or is there some kind of expected control group it's sending out? Where does that first bit come from?
[00:20:29] CP: It's a copy of the light that we sent out, taken prior to us sending it. And because it's a reference, it allows us to understand what the phase of the light is when it comes back. So, you can think of this first copy, the reference, as having phase zero. And when what you get back is changing phase, that equates to the target moving at the time that you measured it, and that's what we call Doppler.
[00:20:54] RS: Okay, got it. So, you're detecting the change between the two instantaneously?
[00:20:59] CP: Exactly.
[00:21:00] RS: It's almost simulating computer vision, to have two different images and then detect the difference between the images, except with what's happening in the light's phase, instantaneously, correct?
[00:21:08] CP: Right. I mean, this is an old technique. It's been used for decades in many other fields. And, I think really excitingly, the gravitational wave detectors, like LIGO, use this type of detection. It's ultra-sensitive, and you can measure the phase of light.
[00:21:23] RS: Oh, that's fascinating. This is so cool. I would love to hear from you, before I let you go: for the folks out there who are working in autonomous vehicles and robotics, in these fields, what would your advice be to them to make sure that they are pushing the envelope, continuing to innovate and be creative in a field that is fundamentally already very creative? Because you still have to go that extra mile to set yourself apart in this field.
[00:21:47] CP: The thing that's true is, no one's an expert in all the different types of LiDAR. Like you said, it's a new field; it's cutting edge. Myself, all I can make is inferences about other types of LiDAR. And we're lucky that for the older types, there's a lot of scientific literature out there, so you can go and read about them. But really, nobody's an expert in all these things; unless you try to build it and make a product, you don't know what it's capable of.
So, I think if you're a user of these things, the best thing is to be really open-minded. And understand that, I think very early on, three years ago, people drew drastic conclusions: that LiDAR was going to be LiDAR on a chip for $100, and it was going to happen in 2021. And those of us in the LiDAR space, I think a lot of us, were very skeptical and thought, “Well, that doesn't seem likely.” And I think that skepticism has come to be true. And so, I think both being skeptical, but also being open to the idea that you can have huge innovation when something unique comes along, is the right attitude to take.
[00:22:46] RS: I love it. Cibby, this has been fascinating chatting with you. Thank you so much for being on the show and sharing your expertise with us today.
[00:22:52] CP: Thanks, Rob. Thanks for the questions, very insightful.
[OUTRO]
[00:22:59] RS: How AI Happens is brought to you by Sama. Sama provides accurate data for ambitious AI, specializing in image, video, and sensor data annotation and validation for machine learning algorithms in industries such as transportation, retail, ecommerce, media, medtech, robotics, and agriculture. For more information, head to sama.com.
[END]