1. Fetch from startup to exit, with Melonee Wise

2021-08-24 · 1:10:50

In this episode, Audrow Nash speaks with Melonee Wise, former CEO of Fetch Robotics and current VP of Robotics Automation at Zebra Technologies. Melonee speaks about the origin of Fetch Robotics, her experience at Willow Garage, her experience being acquired by Zebra Technologies, challenges in the warehouse setting, autonomous cars, and the future of robotics.


Transcript

The transcript is for informational purposes and is not guaranteed to be correct.

(0:00:04) Audrow Nash

Hi, everyone, welcome to the Sense Think Act Podcast. I'm your host, Audrow Nash. Today I speak with Melonee Wise, former CEO of Fetch Robotics and now VP of Robotics Automation at Zebra Technologies. We talked about her experience with Fetch Robotics, the differences and similarities between mobile robots in a warehouse or factory setting and autonomous cars, and her outlook on the future of robotics. I really enjoyed this interview. First, it was really nice getting to know Melonee, and I also found it very interesting to hear about the different stages that Fetch Robotics went through before being acquired. Specifically, they started as a hardware company building the robots, which only lasted five months or so, and then they were a software company. And now Melonee describes them as a data company, which is very interesting. With that, I hope you enjoy. Hi, Melonee, would you introduce yourself?

(0:01:09) Melonee Wise

Yeah, I'm Melonee Wise. I'm the former CEO of Fetch Robotics, and now the VP of Robotics Automation at Zebra.

(0:01:20) Audrow Nash

So that's really exciting. Would you tell me a bit about how Fetch and Zebra are related?

(0:01:25) Melonee Wise

Yeah, so I started Fetch Robotics about seven years ago, and this week we actually closed on Zebra acquiring Fetch Robotics. Zebra Technologies acquired Fetch to create a new robotics automation division, and Fetch makes autonomous mobile robots for the logistics and manufacturing industries.

(0:01:50) Audrow Nash

So going into Fetch a bit more — congrats on being acquired, that's so exciting. And you said it was ahead of your timeline: your vision was 10 years, and it was actually acquired in seven. Would you tell me a bit more about Fetch? It makes warehouse-related robots, but, I guess, some background information on the company.

(0:02:18) Melonee Wise

Yeah. And to be clear, when I said I had a 10-year vision, what I meant is that when we first started talking about Fetch Robotics to investors, I made sure that they were in it for the long haul, because robotics is such a hard problem. So I was like, hey, you realize that this is a 10-year problem — it's not like you're going to get an acquisition like you would in a software company and, you know, pump a couple dollars in and flip into an IPO or an acquisition in a couple years. So it wasn't really like, oh, I'm going to sell the company in 10 years; it was more like, when you go and try to convince people that robotics is something interesting to invest in, you need to help temper their expectations. Because, you know, if you look at the car industry, we've been saying since, I don't know, 2007 that we're going to have autonomous cars next year. And we've been saying that every year for the last, you know, 14 years.

(0:03:18) Audrow Nash

That's crazy.

(0:03:19) Melonee Wise

So seven years ago, I started Fetch with three other co-founders — Derek King, Eric Dyer, and Michael Ferguson — and we all worked at Willow Garage together. We were part of the robot development team at Willow Garage. It's a team that, for about a two-year period, built a lot of different robotic prototypes to look at different industries that were of interest to Willow Garage. About two years before it ended, the founder who started Willow kind of said, well, what's next? Congratulations, you did open source — although that's kind of a funny statement to make — but he kind of said, congratulations, you did some open source, now what? And Steve Cousins at the time said, hey, we need to go look at, you know, the home, and then industrial applications. And it's really funny when you look at the people that worked at Willow Garage and the companies that were then formed by its former employees: they all kind of reflect the different areas that we investigated in those last two years. We investigated everything from grocery retail to manufacturing to hotels. For example, Steve went and started Savioke and did hotels, and the guys and I that started Fetch Robotics looked at the manufacturing and logistics verticals, and we saw it as an interesting problem to solve. The reason it was interesting to us is we thought it was an attractive, tractable problem — most warehouses and most manufacturing facilities are what you might call semi-structured. But there's also a large dynamic happening in the industry, where finding new employees is very hard. Even before the pandemic, there was a large labor shortage.

(0:05:34) Audrow Nash

You mean for factories?

(0:05:35) Melonee Wise

Yeah, yeah.

(0:05:37) Audrow Nash

Okay.

(0:05:38) Melonee Wise

Yeah. For factories and warehouses. Before the pandemic, there was about a 600,000-person shortage. So there were 600,000 open jobs.

(0:05:48) Audrow Nash

How many people in the industry overall?

(0:05:51) Melonee Wise

I don't know that statistic off the top of my head, but I do know that before the pandemic there were about 6 million jobs available, and the job openings in the manufacturing and logistics industry represented about 10% of all open jobs in the United States. Yeah, crazy. And for a long time, the United States was running at a deficit for labor, right? We only had about five or six million people unemployed, and we had about five or six million jobs available. So it was a close thing. We were actually running what you might call negative unemployment, where there just weren't enough people for all the jobs.

(0:06:46) Audrow Nash

So that's interesting. I mean, you hear about robots taking jobs and things like this; in this case, it's the complete opposite. There aren't enough people for the work, and companies need more people to fill the roles so that they can be more efficient. That's very interesting.

(0:07:05) Melonee Wise

And I think that's one of the key reasons why, over the last couple of decades, there's been this strong push for automation, and why in a lot of industrialized nations you're seeing a push to get more and more automation — so that they can stay competitive, so that they can grow their GDP, etc. And so that led us to be interested in creating Fetch.

(0:07:33) Audrow Nash

So that was the case just before COVID. Fetch is seven years old or so — was that the case back then, too? And how has it changed over time?

(0:07:45) Melonee Wise

Yeah, that trend was very much true before, when we started Fetch — there was just a huge labor gap in the industry. What's even worse is it's coupled with a turnover rate of about 40%. So, super—

(0:08:00) Audrow Nash

high, basically. What kinds of jobs are these? What does a person in these jobs do during their day at work?

(0:08:10) Melonee Wise

Yeah, so they can do anything from packing out a box to walking around a 110-degree warehouse picking boxes off a shelf.

(0:08:20) Audrow Nash

Hence the turnover. So, yeah—

(0:08:24) Melonee Wise

They can be assembling small parts or machinery. I mean, our robots are doing the delivery task of someone's job. And that's the other thing that conflates some of the problems: when companies look at automation to offset labor, it's not typically like this robot is going to replace this person 100%. There really isn't the possibility to do that in most cases today. And why, exactly? Because there are a lot of tasks that make up a job. So if you look at, say, a picker's job — if we look at what a picker does—

(0:09:15) Audrow Nash

Is that a person who picks things out of bins and then puts them into boxes, like at Amazon?

(0:09:21) Melonee Wise

Yeah, that's the overall picture — they fulfill orders from customers. And there are a lot of parts to their job. There's the part where they first prepare all the boxes and put them on a cart. Then they push that cart to a location in the warehouse where there's a shelf, pull something out of a box, and put it into a box on that cart. They're scanning each of those locations to make sure the thing they're picking is correct, and so on. They do that through the entire warehouse, and then they come back and typically offload the boxes from that cart onto an area where either a separate person or that same person packs it out. So they take it from those boxes, confirm that all the items in it are correct, and then get it ready for shipping — that means putting paperwork in it and basically shoving it down a line so someone can put a label on it and tape it. But if you look at it, the picking task is very hard. It's not like there are a lot of robots today that can just wander up to a shelf and pull an item out of a box — especially when, let's say, they're T-shirts, or, I don't know, baby onesies, and they're packed in a box that was probably only meant to hold 50 but has 500 of them in there. So it's a pretty complicated task from a robotic standpoint, whether it's manipulation or vision. Today we can't do all of the tasks. So there are some things we replaced completely — we don't have them walk all that distance; they'll meet the robot, or something like that — or we change the roles. Where each person used to configure their own cart and put all the boxes on it, maybe now one person is responsible for doing that for every robot, as opposed to each person individually being responsible for their cart.
And so the jobs change, what the roles are changes, and really it's about taking out the part of their job — the set of tasks they do — that can be done by a robot. And I think there are good analogues in the technical world. If you look at, say, writers or journalists: they previously had researchers, and they had secretaries, and they had all these other resources available to them. Now, for many journalists, their entire job has been reduced down to doing all those things themselves — but the tool they have, instead of a person, is Google or some search tool. So I think there are plenty of analogues for this transition from people doing tasks, or parts of our jobs, for us, to us now using automated tools.

(0:12:34) Audrow Nash

Yep. So if I understand correctly, the shipping process has a whole bunch of different components that must get done along the way. And what you do is say, oh, a robot might be very good at this one spot. So by adding robots into the system, you free up people from that one task. But it's not like saying a robot replaces a person, because there are other things the people can focus more on when the robot frees up that one area. That's why it's apples to oranges to say one robot in is one human out, kind of.

(0:13:15) Melonee Wise

It typically isn't. I mean—

(0:13:18) Audrow Nash

And with a gigantic shortage, you can probably just shift the people elsewhere.

(0:13:23) Melonee Wise

Yeah, that's the other thing: when you have such a labor shortage, you're not displacing anyone — you're just trying to fill in the gaps.

(0:13:31) Audrow Nash

Mm hmm. Yeah. And you make the few people who are there as efficient as possible, so you can get as much work done as possible. So how did you guys find the— you mentioned the detailed process of what occurs in picking products out and fulfilling orders. How did you find the exact part of that where you thought a robot would be a good fit?

(0:14:04) Melonee Wise

Yeah, with some of the stuff that we did at Willow, it became clear that when you look at manufacturing and logistics — whether it's picking or assembling something — there's always a process in which material has to come from some obnoxiously faraway part of the facility to another part. And it became very clear that there was a high transportation cost. There have been plenty of labor-analysis studies looking at different jobs, like the fulfillment job I just described, or a manufacturing job, and typically walking or transport time represents upwards of 50% of the person's total time. That's a very compelling statistic when you're asking: should I make a mobile transport robot?

(0:15:00) Audrow Nash

That seems to me like almost a simple way of identifying low-hanging fruit that can be solved with robotics. I'm thinking of how nurses spend a large portion of their time commuting from patient to patient; and the company Halodi Robotics, where I did an internship previously — they were saying how they'd have physical therapists or caretakers come to people's homes in Norway, and those people spent like 60% of their time commuting from home to home. These seemed like really good robotics applications to me, if you can just put a robot in there and do away with the commuting in some way. Okay, so you looked at this, and you saw this number — you said 50% of their time walking — which is crazy. So you started there, and—

(0:15:56) Melonee Wise

Then we started with building mobile robots. And then it came down to: now you have a mobile robot, you have a base — what do you have to put on it to make it productive for the application? If you look at Fetch today, we offer three major platforms with different payload capacities: 100 kilograms, 500 kilograms, and 1,500 kilograms. And then we put tops on them to basically attack the different problems in the warehouse. So for our small and big robots we have CartConnect, which allows us to move carts around; we have the RollerTop accessory, which allows us to interface with conveyors; we have standard shelving — it's kind of boring, but, you know, the shelf with the touchscreen on it. Stuff like that. It really comes down to the accessories for the hardware. And if you look at the vision of what we were trying to build from day one: we started as a hardware company, and, as all robotics companies go, you quickly become a software company. I mean, we spent like a hot five seconds as a hardware company. And you work at Open Robotics — their whole business is software. So we spent a large part of our middle years as a software company, and over the last couple of years we've been transitioning more towards being a data company as well.

(0:17:29) Audrow Nash

So, wow, that's the next level. Yeah.

(0:17:33) Melonee Wise

And so building out our data platforms, building out our data capabilities, things like that. But that's still an ongoing process, because, you know, you have to build a pretty large data lake to do anything.

(0:17:49) Audrow Nash

So I imagine a lot of interest in cloud robotics at the moment — is that what you mean with the data?

(0:17:55) Melonee Wise

Also. So in the middle years, when we were becoming a software company, that's when we built our cloud robotics platform. All of our robots are actually controlled from the cloud. They're autonomous units, but if you want to tell them to, like, go from point—

(0:18:13) Audrow Nash

A to B? So the decision making — the lower-level control — is done on the robot?

(0:18:18) Melonee Wise

The safety and navigation stuff is done on the robot, but the task-level commands — like, yeah, go here, do this — come from the cloud. We did that because it makes getting set up on site a lot easier. Many of these facilities are kind of out in the boonies; they don't have good IT support, stuff like that. And if you come along with a very complex thing like a robot and you're like, oh, by the way, where do we install the server? — it kind of adds a lot. So we went off to the cloud. But one of the advantages of that is we also save the robot data, and we're building capabilities on top of that data. So, forklift detection: we can tell when forklifts are speeding, we can tell when they get too close to people, things like that. Those are some of the capabilities that we're starting to build on top of our data platform. But the important thing is you need a lot of training data — a lot of real-world data — to do anything. And I think that's one of the reasons why, when people talk about autonomous cars, I don't think they understand the scale and the scope of what we're talking about when they say, oh yeah, it'll just machine-learn it. It's like, yeah. Or they'll say the car "has the AI" — if I hear "AI" one more time right now...

(0:19:49) Audrow Nash

I know, it's true. So I want to go back and talk about the hardware company, but just tying in the data company and this challenge for autonomous cars: you mentioned warehouses are semi-structured environments, so your data is probably a lot nicer than what they get. Would you talk a bit about the difference between the data you're getting and the type of data needed for, like, an autonomous car, and how difficult that will be?

(0:20:20) Melonee Wise

Probably. So I think the thing is, I wouldn't say that the problem we're solving is any less difficult than that of autonomous cars. I would say they have unique idiosyncrasies.

(0:20:32) Audrow Nash

What is the problem you're solving?

(0:20:35) Melonee Wise

We're basically building a ton of indoor cars for the warehouse. The difference is we don't have weather; they have weather. But their sensor package typically costs about $200,000. Our sensor package costs about $1,800. So there are orders of magnitude difference in the cost and capability of the sensors that we use.

(0:21:01) Audrow Nash

So that becomes a massive difference. Your sensors — is it like a LIDAR and a few cameras, or—

(0:21:06) Melonee Wise

It's a LIDAR and two cameras.

(0:21:09) Audrow Nash

Okay, and that's $1,800? One LIDAR and two—

(0:21:13) Melonee Wise

Well, the bigger robot has two LIDARs and eight cameras. But yeah, I mean, it's still not a $70,000 Velodyne, you know. So there's that, for one: the kind of sensing tools we have are very different. And although they have somewhat of a harder problem to solve, from weather and some other aspects, they have better budgets, sensing tools, and fidelity to deal with those environments — we don't. The other thing that's very unique about our environment, that's not true for cars, is that our environments are far more dynamic in some ways. While roads can be very dynamic and things like that, they have lots of rules of the road, and pedestrians don't just end up in major highways on a regular basis. Whereas if you look at warehouses, there are, like, main highways, but people walk into them all the time. The other thing you have to understand is that our vehicles typically go around 1.5 to two meters per second — so we're talking like three to four miles an hour. But some of the vehicles we have to operate around, like forklifts, can go up to eight or nine miles an hour, or even faster. And so it's a very difficult problem when you're in a scenario in which vehicles around you are going much faster than you. And then there's the forklift problem.
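The speed comparison above can be sanity-checked with a quick conversion; this is just a minimal sketch, with the robot and forklift speeds taken as quoted in the conversation.

```python
# Convert the quoted speeds from meters per second to miles per hour.
MPS_TO_MPH = 2.23694  # 1 meter per second in miles per hour

def mps_to_mph(mps: float) -> float:
    """Convert meters per second to miles per hour."""
    return mps * MPS_TO_MPH

# AMR cruising speeds: roughly 1.5-2.0 m/s, i.e. about 3.4-4.5 mph,
# which matches the "three to four miles an hour" figure above.
print(round(mps_to_mph(1.5), 1))  # 3.4
print(round(mps_to_mph(2.0), 1))  # 4.5
```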

(0:22:52) Audrow Nash

That sounds ominous.

(0:22:54) Melonee Wise

Cars don't have to deal with this notion of a piece of equipment being around them that has tines that are very thin, very—

(0:23:04) Audrow Nash

hard to detect — like the prongs at the front of—

(0:23:07) Melonee Wise

The forks — the tines. And they can be at any height at any time.

(0:23:12) Audrow Nash

So they're probably super hard to sense, too?

(0:23:15) Melonee Wise

Yeah, they're super hard to sense. They can be at any height. And there are a lot of things — cars don't have to deal with a lot of overhanging things. Roads are built such that nothing is going to be hanging in the middle of the road. It's not like that in manufacturing.

(0:23:32) Audrow Nash

Unless you're under a plane or whatever. So that makes it super difficult to do a lot with computer vision, probably. And that's challenging, because you have to recognize all these different circumstances.

(0:23:43) Melonee Wise

Yeah. And so I'm not saying that the autonomous car problem isn't hard — I'm saying it's very hard. But I think sometimes people tend to look at the problem we're working on and think, well, it's got to be easier than autonomous cars. And it's like, well, it's not — it's different. There are things that constrain our problem solving, and things like that. And the other thing that is very true is that our industry is heavily regulated from a safety perspective, and autonomous cars are not yet. I mean, there's some regulation, but there's no "you have to do this test to prove that your autonomous car is safe," etc. Sorry, I actually need to stop and take this call really quick. Okay.

(0:24:39) Audrow Nash

So we were talking about the differences between the data, and basically the problem of autonomous cars versus manufacturing and logistics. Can you tell me a bit more about the data differences from these perspectives?

(0:25:01) Melonee Wise

Yeah, I will say that from a data perspective, autonomous cars are using a lot more data. That's what makes us a little bit more able to port and move all of our data to the cloud, and run algorithms and some of these things in the cloud, which a car may not have the luxury of doing. I think one of the big things that's going to happen in the next decade or two is we'll realize what it's going to take to make the practical deployment of autonomous cars real: the sheer amount of data. Just the map of the local area that you drive on a daily basis is massive, and what data networks are going to support just that capability? Everyone's going to have to have fiber to their house, practically, to be able to, like, overnight-download the local-area updates and things like that. And then, if you want to go on a long trip — say, from San Jose to Tahoe — I can only imagine what the data strategy is going to have to be for a four-and-a-half-hour drive. The nice thing about warehousing and our robots is that it's a million square feet; that's about as big as the problem gets. But we're definitely running into the multiple-source problem. I think right now, in the autonomous car space, they're still struggling with the single-source problem of how you get data out of just one car, and we're dealing with: okay, the data we have to move isn't ginormous, but we have 30, 50, 100, 500 robots that we're transmitting data from. And that's a lot of data. So it's a pipe problem.

(0:27:05) Audrow Nash

Yep. So with autonomous cars, they're just now thinking about how to do it locally — how can I make this one car drive from one place to another autonomously — and you're going: we have these robots driving around this entire environment, all doing some sort of mapping of it. Maybe an obstacle has appeared, and they all coordinate based on this; they go, oh, that one path is blocked, I'm going to have to go around, and then it reroutes.

(0:27:36) Melonee Wise

Yes — and as part of our coordination, we also use peer-to-peer Bluetooth. But I was speaking more about the sheer amount of, like, offloading the data and then post-processing it. If you look at it today, I think most companies — Google, Waymo, all those guys — use, quote unquote, "sneakernet," which is putting a hard drive in a mailbox and sending it in the mail.

(0:27:59) Audrow Nash

Oh, yes. That's crazy. The hard drive? What do you mean exactly — where are they doing that?

(0:28:07) Melonee Wise

So, say a company has 30 autonomous cars that they're field testing—

(0:28:13) Audrow Nash

Oh — rather than actually putting it on the cloud and uploading it, they physically mail the device because of bandwidth constraints.

(0:28:21) Melonee Wise

They take out the hard drives and mail them. Sneakernet.

(0:28:26) Audrow Nash

Yeah — walking it over.

(0:28:28) Melonee Wise

Yeah. I heard some companies are uploading more than 40 or 50 terabytes a week. Wow. Yeah. So, a lot of data.
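A back-of-the-envelope calculation shows why sneakernet can beat the network at that volume. The 50 TB/week figure is from the conversation; everything else here is just arithmetic.

```python
# Sustained upload rate needed to move a weekly data volume in real time.
TB = 10**12  # bytes (decimal terabyte)
SECONDS_PER_WEEK = 7 * 24 * 3600

def sustained_mbps(bytes_per_week: float) -> float:
    """Average upload rate, in megabits per second, needed to keep pace."""
    return bytes_per_week * 8 / SECONDS_PER_WEEK / 10**6

# 50 TB/week works out to a continuous uplink of roughly 660 Mbit/s --
# well beyond a typical office connection, hence mailing hard drives.
print(round(sustained_mbps(50 * TB)))
```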

(0:28:40) Audrow Nash

So, um, it sounds to me like you have a good bit of faith in these machine learning approaches in the long game — like 10, 20 years from now. Is that what you think? Because, I mean, data is not terribly important if you're going to use classical approaches, but you're finding in the warehouse setting that it is important to have a lot of data. Can you talk a little bit about this?

(0:29:09) Melonee Wise

Yeah, I think there are some types of semantic behaviors that we care about that we need models for. So we need to know what the thing is. And the problem is that there's a lot of variation in the world. If you tried to build a classically trained model for every object in your, I guess, traversable space, it would be very hard.

(0:29:39) Audrow Nash

And by "classically trained" — would you talk about what that means?

(0:29:44) Melonee Wise

You have 1,000 images and data examples of, say, a Volkswagen Bug from 1976. You put that all into an algorithm, it spits out a model, and that one detector is really good at detecting Volkswagen Bugs. So, to take what you might call the most asinine way of approaching the problem, you could say: I'm going to build a specific model for every car on the road, every year — because they change year to year — and I'm then going to spin up parallel detectors for all of these things. Maybe you want to know the car model, maybe you don't, but maybe there's some material reason why you want to know it. Right now, I think they reduce it down to car, truck, or other. But say you want to do this thing — the problem is you're probably going to have some set of detectors that all say they found the thing. So if you have 1,000 detectors and 15 of them say "I found the thing," then you need to put it through another set of detectors and a voting system to decide which one of the detectors that says it found the thing is right.
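The "one detector per class, then vote" idea described here can be sketched in a few lines. Everything below is invented for illustration — the detector functions and their confidence scores are toy stand-ins, where a real system would wrap trained models.

```python
def run_detector_bank(image, detectors, threshold=0.5):
    """Run every per-class detector; keep the ones that claim a hit."""
    hits = {}
    for label, detect in detectors.items():
        score = detect(image)  # confidence in [0, 1] that this class is present
        if score >= threshold:
            hits[label] = score
    return hits

def vote(hits):
    """Naive voting stage: trust the most confident detector that fired."""
    if not hits:
        return None
    return max(hits, key=hits.get)

# Toy stand-ins for trained per-class models (hypothetical, not real detectors):
detectors = {
    "vw_bug_1976": lambda img: 0.91,
    "pickup_truck": lambda img: 0.62,
    "sedan": lambda img: 0.30,  # below threshold: never reaches the vote
}
print(vote(run_detector_bank("frame.png", detectors)))  # vw_bug_1976
```

Real voting stages are more elaborate than "take the max," which is part of why this per-class approach needs so much data and tuning.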

(0:31:11) Audrow Nash

Or some approach to figure out exactly which one you listen to, or what's most likely. Okay.

(0:31:18) Melonee Wise

And so the challenge with that is it takes massive amounts of data. And that's why people are very—

(0:31:26) Audrow Nash

Why, like, 1,000 detectors or something like this? Is it because it's that one problem, but there's a lot more and you have to make distinctions between them? Or is it the first model, but just a bunch of times? Because you could probably do that too, and then just have a confidence and weight the confidences?

(0:31:41) Melonee Wise

Yeah, those models aren't as good, because—

(0:31:44) Audrow Nash

So you want it all in one model, basically — yeah, so it learns the discrimination between—

(0:31:49) Melonee Wise

I think the problem is — a good way to think about the way we've been doing learning for a long time is this: for a very long time, we built models and systems where we said, this is the thing you care about, this is the thing you care about, and we kept showing it the thing it cared about from all angles. But if you look at the way people learn, it's a little bit different. We learn "this is the thing you care about," but we also see, on a regular basis, thousands of examples of "this is not the thing you care about." Hmm, interesting. Yeah. But there's a problem with trying to prove the negative when the space of things you're trying to prove the negative on is infinite, right? And so this is one of the problems with machine learning in general. And even if you want to show "this is the thing you care about" — say we wanted to show a phone, right? This is the thing you care about, and specifically this phone: it's blue, it has some yellow buttons. A person probably only has the tolerance to take 50 pictures, 100 pictures of this phone, ad nauseam. And one of the reasons convolutional neural nets are so interesting is that they allow us to take the 100 images and make them into, like, 10,000 images. Because that's what it is — the purpose is that it basically allows us to take the pieces of all these things, split them up, put them back together, and use them to train systems. So it's a way of artificially creating more data. And it's been found that the artificial data we create typically results in highly confidently trained models. Now, okay, there's a lot of tuning and a lot of mumbo jumbo. But a lot of those systems, again, are only trained on positive examples. Only positive.
And this is, in general, a bigger problem, theoretically, with all of robotics, because the negative examples can have very bad outcomes. Like, how do you teach a robot not to hit someone, when hitting them could kill a person, right? So whether or not training with only positive reinforcement is good — sometimes you can't produce negative-reinforcement data. Although I have seen more and more work coming out on how you build simulation models to create negative-reinforcement data, for example. But—

(0:34:34) Audrow Nash

That probably won't get us all the way there, though, because you can overfit the simulator pretty heavily.

(0:34:38) Melonee Wise

Yeah, yeah. And so that's the problem — there are all sorts of problems: overfitting, weird edge cases, you know, things like—

(0:34:49) Audrow Nash

And so, how this relates — if I understand correctly, in the autonomous car problem, or in the warehouse problem: as you get all of this data, you train different models on it, and you use that effectively for perception, to help the robot understand the world and what's going on around it?

(0:35:17) Melonee Wise

So it can, you know, say that a person is a person. And then the robot can exhibit different behaviors appropriate for that person. So, for example, the thing that's the most mature in our navigation stack is robot-to-robot detection. We know when something is another robot, and that allows us to implement certain behaviors within the robots that make them more effective. They will, one, give priority to other robots based on what those robots are doing. They will, you know,

(0:35:54) Audrow Nash

Really cool. So it's going, oh, it's a robot, but it's also going, oh, I can understand from the context of what I can sense what this robot is doing, and that's a higher priority than me, kind of thing. That's interesting.

(0:36:08) Melonee Wise

And then that extends into things like a forklift, one of the things we were talking about earlier. Forklifts are very difficult. Well, one of the more difficult scenarios with a forklift is that the tines can be all the way up in the air; they can be 15 feet up in the air, well above the robot, and the robot can't see them. Of course it can't; they're very high up. Now, if you know something is a forklift, you can know not to drive into that area, even if you can't see what's there. And so that's one of the examples in which, if you can know what things are, if you can semantically know that this is a person

(0:36:46) Audrow Nash

you can inform your policies: don't drive right in the spot where the tines may come down, this kind of thing.

(0:36:53) Melonee Wise

Yeah. And that is what the majority of applications of machine learning in autonomous cars are also for: semantic understanding. This is a ramp, this is a person, this is a crosswalk, this is a stop sign.
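A toy sketch of how a semantic class label can feed navigation behavior, in the spirit of the robot and forklift examples above. The classes, radii, and helper names are all illustrative assumptions, not Fetch's actual code:

```python
# Hypothetical keep-out radii (meters); values are illustrative tuning,
# not Fetch's real numbers.
KEEP_OUT_RADIUS = {
    "person":   1.0,
    "robot":    0.5,
    "forklift": 3.0,   # tines may extend well beyond the visible body
    "unknown":  0.3,
}

def keep_out_radius(detected_class: str) -> float:
    """Radius the planner should avoid around a detection, even where
    the lidar sees free space (e.g., raised forklift tines)."""
    return KEEP_OUT_RADIUS.get(detected_class, KEEP_OUT_RADIUS["unknown"])

def is_pose_allowed(px, py, detections):
    """Reject a candidate robot pose inside any semantic keep-out zone.
    detections: list of (class_name, x, y) tuples."""
    for cls, dx, dy in detections:
        if (px - dx) ** 2 + (py - dy) ** 2 < keep_out_radius(cls) ** 2:
            return False
    return True

detections = [("forklift", 5.0, 0.0)]
print(is_pose_allowed(3.0, 0.0, detections))  # False: within 3 m of the forklift
print(is_pose_allowed(0.0, 0.0, detections))  # True: outside the keep-out zone
```

The point is that the class label carries information the range sensor alone cannot: the forklift's zone is large precisely because part of it is invisible.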

(0:37:08) Audrow Nash

Now, there's one thing that I've seen which is interesting to me: ontologies. An ontology is like a map that shows how all of the different parts that you might see in an environment relate to each other. So if I was an autonomous car, it could be like: this is a road, this is a sidewalk, the sidewalk might have a person or a dog on it, or something like this. So then it'd be like, road has a sidewalk, sidewalk has a person, this kind of thing; it's a relationship between all of these. And this is kind of a manual way to specify all the different components. Do you guys use ontologies? Is that interesting?

(0:37:50) Melonee Wise

If you look at the ontologies in our environment, they're very flat, and this is one of the problems I was describing. Like, there's a shelf, there's an aisle way next to it, but people and everything else: a box can be in it, a trash can can be in it, a person can be in it. So it's a very flat ontology. This is a shelf, this is a floor. Yeah, so I think I agree that ontologies can be very, very helpful for creating kind of semantic maps and relationships, but they don't always apply well in our environments.
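The contrast between a hierarchical road-scene ontology and the flat warehouse one can be shown with toy dictionaries (all labels invented for illustration):

```python
# Road scenes nest: road -> sidewalk -> person.
road_ontology = {
    "road":     ["car", "crosswalk", "sidewalk"],
    "sidewalk": ["person", "dog"],
}

# In a warehouse, nearly everything sits directly in the aisle.
warehouse_ontology = {
    "aisle": ["person", "box", "trash", "cart", "robot", "forklift"],
    "shelf": [],
    "floor": [],
}

def depth(ontology, node, d=0):
    """Longest parent-child chain starting at node."""
    children = ontology.get(node, [])
    if not children:
        return d
    return max(depth(ontology, c, d + 1) for c in children)

print(depth(road_ontology, "road"))        # 2: road -> sidewalk -> person
print(depth(warehouse_ontology, "aisle"))  # 1: everything is one hop from the aisle
```

A flat hierarchy gives the relationships little predictive power, which is roughly why, as Melonee says, ontologies help less in warehouses than on roads.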

(0:38:27) Audrow Nash

It's very complex, unfortunately. Yeah, yeah. Gotcha. Okay, going back a little bit. No, it's cool, this is all very interesting. Going back to being a hardware company. Yeah. Can you tell me a bit about when you guys started, when you were a hardware company?

(0:38:44) Melonee Wise

Oh, yeah. Yeah. I mean, we were a hardware company for about five months.

(0:38:52) Audrow Nash

Out of the seven years that you've been around? Yes. Crazy. Because

(0:38:56) Melonee Wise

That's when we didn't have any robots, so we had to make the robots. Yeah, we had to make the robots. The good thing is, we had a really stellar team of co-founders who all knew a lot about hardware. And it was really funny. I think one of the more interesting experiences for me, and I think for a lot of the guys I started the company with, is that while we were at Willow Garage, we probably built five or six different types of robots together. And, you know, we learned a lot of lessons. Sometimes they didn't power on; sometimes they had problems.

(0:39:34) Audrow Nash

All the things.

(0:39:35) Melonee Wise

Yeah, yeah, all the things. But when we started Fetch, we had so much experience that when we built the hardware, we designed the hardware, we sent it to the shop, it came back from the shop, we put it together, turned it on, and it worked fine. And what was really funny is there were some employees who had been at other robotics companies, who had been at other hardware companies, and they were like, what? This timeline is insane. All of this assumes that the hardware is going to turn on and work and not catch on fire. And we're like, well, first, it will never catch on fire; we're not that bad. But second, we'll be fine. And it was an interesting thing, because I think one of the things that I had developed is what you might call robotics amnesia, where I kind of started forgetting that hardware could ever really be kind of shitty.

(0:40:31) Audrow Nash

It's a nice position to be in when you're like, oh, I'm surprised that this didn't work. Yes, everything has just been working.

(0:40:39) Melonee Wise

Yeah. And so it was definitely an interesting juxtaposition, as the four co-founders were kind of like, oh, you know, we got this, this isn't going to be a problem, and then some of the new employees that we hired early on were kind of like, we've been to this rodeo before, and it's never this smooth. So it was cool. We were a hardware company for a very short amount of time early on, and then eventually we cycled back when we built the bigger robots, and we became a hardware company briefly again. But only part of the company became a hardware company; it wasn't like everyone there was doing hardware full time.

(0:41:22) Audrow Nash

Yeah. So then, how did you go about the manufacturing process and everything? Once you worked out the designs, how did you go about actually making lots of the robots? I assume that that's quite a big thing.

(0:41:42) Melonee Wise

It's actually relatively simple, because it's not like we're bending metal; someone else does all of that stuff. They build the sheet metal. So you do all the CAD designs, you do all the drawings, you send it out to a shop, and they fabricate it. And really, what you're doing at a robotics company, if you're doing assembly, is assembly and test. Not to say that we're not manufacturing, but we're not bending metal. The parts come in, we put them together, test them, and make sure that they turn on. And so it's actually relatively easy to build, or assemble, let's call it, robot hardware, especially in the case of the smaller robots. The larger robots get a lot more complicated because of the safety systems and things like that that are required. But for the smaller robots, almost all the parts are custom, and so we have a lot of control over it; it comes in, we put it together, and it works.

(0:42:52) Audrow Nash

Gotcha. That's interesting. Some companies that I've talked to have had a lot of difficulties, like finding suppliers and working with them. And I've heard things like the CEO, or some person in the company, will fly every three months to, say, the Shenzhen area of China, and work with them on the manufacturing. Did you guys have to do these kinds of things?

(0:43:18) Melonee Wise

No, we predominantly made everything in the United States. And anything that wasn't made in the United States, we had US partners that handled all

(0:43:26) Audrow Nash

that. Interesting. Can you speak a bit about the manufacturing in the US? Because I'm honestly not terribly aware of many robotics things being made in the US. Yeah, sure. This is new info to me.

(0:43:43) Melonee Wise

Yeah. So if you look at the Freight100, our 100-kilogram platform, it has about 300 total parts, about 100 unique parts. And the things that are made not in the United States are things like the laser scanner; that's made in Germany. The skins are injection molded in China, but they're designed in the United States. Actually, the molds are designed in the United States and then built in China. And our wheel motors are based off of a hub motor like the ones you see in scooters and things like that, which was clever, yeah. We do the engineering design in the United States with a company that makes custom motors, and they own parts of factories in China. So in terms of the componentry that's fabricated or made not in the United States, outside of the computer, let's call it, it's all made in the United States. We designed our own PCBs; they're fabricated at a PCB fabrication place in California, actually. All the sheet metal is bent locally. I mean, the Bay Area has a lot of manufacturing capability, and one of the sheet metal vendors is literally a block from our office; they kind of scooted down the road. And the paint, you know, the powder coating, stuff like that, it's all done locally. Now, unfortunately, US commerce defines "Made in the USA" by cost of goods. So because we have a laser scanner that's made in Germany, we can't call the robot "Made in the USA." We can say "Designed in the United States." But if you look at it on a per

(0:46:02) Audrow Nash

Like, because the laser scanner is such a large component of the whole robot's cost. Yes. Yeah. Can you tell how much one of the robots costs? No. Okay.

(0:46:14) Melonee Wise

Sorry, that's a closely guarded secret. Okay. Okay. Yeah. But if you look at it from a total part count sense, the majority of the parts are made in the United States, and so most of the labor, and the jobs, are being created in the United States. But there's this technical way of judging what

(0:46:39) Audrow Nash

you can call it, what you can call it. That makes sense. Although it'd be nice if it was by item

(0:46:46) Melonee Wise

would be, it'd be really nice.
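The cost-of-goods versus part-count distinction is easy to see with invented numbers: one expensive imported component can dominate the cost share even when nearly all parts are domestic. All figures below are made up, since the real costs are secret:

```python
def domestic_share_by_cost(parts):
    """parts: list of (cost, is_domestic). Fraction of cost that is domestic."""
    total = sum(c for c, _ in parts)
    return sum(c for c, dom in parts if dom) / total

def domestic_share_by_count(parts):
    """Fraction of parts (by count) that are domestic."""
    return sum(1 for _, dom in parts if dom) / len(parts)

# Illustrative bill of materials, roughly matching the ~300-part figure.
parts = [(5000.0, False)]        # imported laser scanner dominates the cost
parts += [(20.0, True)] * 290    # ~290 domestic parts
parts += [(50.0, False)] * 9     # a few other imported components

print(f"{domestic_share_by_cost(parts):.0%}")   # 52%
print(f"{domestic_share_by_count(parts):.0%}")  # 97%
```

With these toy numbers the robot is 97% American by part count but only about half American by cost, which is why the cost-of-goods rule blocks the "Made in the USA" label.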

(0:46:49) Audrow Nash

Because honestly, before talking to you about this just now, I didn't realize robotics manufacturing was done in the US to any real extent.

(0:46:59) Melonee Wise

There are lots of companies out there doing fabrication and assembly in the United States. You'd be surprised.

(0:47:05) Audrow Nash

I am. Interesting. Okay, I want to learn more about that. Yeah. So now, moving on to the next stage: after five months, or whatever it was, you were a software company?

(0:47:17) Melonee Wise

Yes, we were. Tell me a bit about that. Yeah. So that's building all this software, for the robot side and the server side. And so when we first started, we, you know, kind of heavily modified the ROS stack to support our needs. Eventually, we ended up basically drawing the line on where we were going to use the ROS stack. So we use it, basically, for message passing, and all the libraries and visualization. But we no longer base any of our navigation and SLAM on top of the ROS stack.

(0:48:02) Audrow Nash

Would that change with ROS 2, out of curiosity? No, no, no,

(0:48:08) Melonee Wise

no. I think that, you know, we definitely are going to move to ROS 2 in the future; we're trying to figure out the right time to do it. But the ROS navigation stack is a research stack that really wasn't meant to scale for the type of applications that we're deploying it in today. And the same is true for kind of the SLAM algorithms. That doesn't say anything negative about the ROS navigation capabilities or anything like that. It's just not well suited for our application.

(0:48:41) Audrow Nash

Yeah, you said million-square-foot warehouses; I'm not terribly surprised that it doesn't scale to that. Yeah.

(0:48:50) Melonee Wise

And then, you know, we also built the server-side componentry. So this is stuff that I think you're now starting to see more attention to in ROS, with things like ROS web tools. But previously, there wasn't as much focus on, you know, user interfaces, tool chains for editing and modifying the map, and things like that. And so Fetch built all that software stack for basically allowing users who know nothing about robots to use robots.

(0:49:22) Audrow Nash

Yep. So they can have a nice interface, and the interface kind of helps them use the application. Yeah. I think that's an important thing to add into ROS.

(0:49:32) Melonee Wise

Yeah, the ROS ecosystem. The ROS ecosystem is missing this entire kind of dev tools and deployment tool chain for robots.

(0:49:43) Audrow Nash

What do you mean, the robot tool chain?

(0:49:47) Melonee Wise

So, like, there are a lot of deployment tools for robots that are missing. There are no good open source packages for deploying software to lots of robots. There are no good open source tools for monitoring lots of robots concurrently. There

(0:50:03) Audrow Nash

Have you looked at, I don't know, I haven't used it, have you tried Amazon Greengrass? Is it open sourced? No? You said no. Not open source. Yeah. No

(0:50:15) Melonee Wise

Not open source, yes. Yeah. I mean, there are plenty of people who are willing to charge you money for these services. But for me, the thing that sparked all of the explosion in companies for robotics, and in using ROS, was that there was no cost burden to get started. And so those are some of the tools that are missing from the community.
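A minimal sketch of the kind of open fleet-monitoring tool Melonee says is missing: poll many robots concurrently and flag the ones that need attention. The poller and status fields here are simulated stand-ins, not a real robot API:

```python
import concurrent.futures

def poll_robot(robot_id):
    # Placeholder for an HTTP or ROS status call; here we fabricate
    # a response so the sketch is self-contained.
    return {"id": robot_id, "battery": 80 - robot_id, "state": "idle"}

def poll_fleet(robot_ids, max_workers=8):
    """Query every robot concurrently and collect the responses."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as ex:
        return list(ex.map(poll_robot, robot_ids))

statuses = poll_fleet(range(20))
low_battery = [s["id"] for s in statuses if s["battery"] < 65]
print(low_battery)  # [16, 17, 18, 19]
```

A real tool would add authentication, retries, and dashboards, which is exactly the unglamorous work nobody has open sourced.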

(0:50:43) Audrow Nash

What would you say, explicitly? I mean, I'm the ROS boss for the Humble release in a year, and so I'm interested to see, like, we're still building the roadmap, basically.

(0:50:58) Melonee Wise

I mean, if you ask my team today what they want, they just want documentation.

(0:51:03) Audrow Nash

That's the biggest priority so far for Humble. I totally agree with you on that. Yeah. And

(0:51:09) Melonee Wise

you're talking to the person who used to be the ROS wiki czar, so

(0:51:14) Audrow Nash

back in the day Yeah.

(0:51:16) Melonee Wise

Well, all the tutorials I wrote are still the tutorials for ROS.

(0:51:19) Audrow Nash

That's crazy. So this is the Willow Garage days, the early ROS days. Yeah, milestone

(0:51:28) Melonee Wise

three of ROS at Willow Garage was documentation. And during that time, I wrote about 200 ROS tutorials. So all the big tutorials. It's still all your work. Anything you see with, like, turtlesim, I wrote.

(0:51:47) Audrow Nash

That's so funny. And now in ROS 2, it's the same thing; they're just derived from the old ones. Yeah. So you're probably like, I remember writing that. Yeah, Marya.

(0:51:57) Melonee Wise

Actually, when she was converting them, she asked me questions about them. She was like, your name's at the bottom of all these pages? And I was like, yeah, because I wrote them. Uh huh.

(0:52:07) Audrow Nash

And Marya, she's no longer at Open Robotics, but she was the documentation lead for the ROS 2 initiative, as far as I understand.

(0:52:17) Melonee Wise

Yeah. And she, I think she ported a lot of the ROS 1 tutorials to ROS 2.

(0:52:23) Audrow Nash

Definitely. Okay, so that's interesting. So you built your own nav stack, and then the server side, so you can control the robots. Those are two things, and really not the nav stack, but the server side, the building of user interfaces, the ROS web tools, these kinds of things, these would be good to see in ROS, do you think, right?

(0:52:45) Melonee Wise

Yeah, I think from an open source perspective, yes. I think the biggest challenge

(0:52:49) Audrow Nash

is deployment.

(0:52:51) Melonee Wise

Yeah, deployment, robot monitoring, all of that. I mean, those are huge undertakings, and whether anyone's willing to fund it or support it is another question. I mean, because many companies like Amazon are building closed-sourced tool chains like Greengrass; you know, they want to get you hooked on their platforms.

(0:53:13) Audrow Nash

So did you guys build your own? Or did you use one of these? Oh, yeah.

(0:53:17) Melonee Wise

I mean, it was early, too. It was really early. I mean, InOrbit and Formant and whatever the other six other guys doing those kinds of things are, they all started two or three years after Fetch started as a company.

(0:53:35) Audrow Nash

So you had to?

(0:53:37) Melonee Wise

Yeah, I mean, it's why, you know, it's the same thing as when a lot of people ask why ROS didn't base its communication layer off of protobuf. It didn't exist. Yeah, it didn't exist when ROS started, you know? And DDS wasn't as mature a long time ago. So there are just lots of questions like that: why not this? Yeah, but what do you want? It didn't exist.

(0:54:08) Audrow Nash

It didn't exist, and we've been building on top of what we have this whole time. Yeah. Okay. And then, what were some of the hardest challenges you had as a software company, before becoming a data company? Yeah, I mean, a million-square-foot warehouse sounds really tough. Yeah, that

(0:54:32) Melonee Wise

was hard. Um, I would actually say that many of the challenges we ran into were with forklifts. I hate forklifts. Getting safety right, that's really hard. And I think, in general, helping customers understand what is possible and what's not possible.

(0:55:04) Audrow Nash

Just like what you were saying at the beginning, with tempering expectations, kind of thing. Yeah.

(0:55:08) Melonee Wise

Yeah, I mean, you know a fair bit about robotics, and you probably know that the robot footprint really matters, right? As the robot travels through the world, it needs to know how big it is. One of the most common asks from any customer is "Can't we just..." That's how it always starts. Can't we just overhang the footprint by an inch, or three inches, or 15 inches? And it's like, no, the robot doesn't know how big that is; the robot can't see those things. But that's one of the hardest things: people's understanding, or lack of understanding, of robotics, and also their expectations as they see it from really killer videos, like Boston Dynamics. I mean, those are amazing. Yeah, yeah, they're super amazing. But, you know, I would love to see the 40,000 or however many retakes before they got the backflip to work that one time.
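A one-dimensional toy of why "just overhang the footprint by an inch" fails: the planner only collision-checks the footprint it is told about, so an unsensed overhanging load can still hit things. Dimensions are illustrative, not any real robot's:

```python
ROBOT_HALF_WIDTH = 0.30   # meters, the footprint the planner checks
CART_OVERHANG = 0.025     # the extra "inch" the customer wants to hang over

def planner_says_safe(robot_center, obstacle_x, half_width=ROBOT_HALF_WIDTH):
    """The planner clears a pose if the obstacle is outside the
    declared footprint; it cannot reason about what it isn't told."""
    return abs(obstacle_x - robot_center) > half_width

obstacle = 0.31  # just outside the declared footprint
print(planner_says_safe(0.0, obstacle))                  # True: planner is happy
print(abs(obstacle) > ROBOT_HALF_WIDTH + CART_OVERHANG)  # False: the overhang still collides
```

The only safe answer is to enlarge the declared footprint, which is exactly the "no, we made it as big as we possibly can" conversation described below.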

(0:56:20) Audrow Nash

I couldn't believe that when that occurred. Watching that video was like, oh my god, what is that? That's crazy. I was in a legged locomotion lab at the time, and it was super crazy. We were just getting nothing to walk, and they're doing backflips.

(0:56:33) Melonee Wise

Yeah. But you know, you and I both know, as roboticists, that there's a blooper reel, and it's really huge, right? Yeah. And so one of the bigger things about software, as you know, is that lots of things are possible, but the time to implement those things can be very long. And there are hidden things; they can also contain very weird edge cases and things like that. If you were doing legged locomotion, then you definitely know how bad edge cases are.

(0:57:04) Audrow Nash

Oh, it's very spiky, the optimization space. It would work great, work great, then fall down. It was very, very tough to tune parameters. But yeah, yeah.

(0:57:14) Melonee Wise

And so those, I think, are some of the bigger things that we ran into over the time of building software: just figuring out how to support some level of adaptability of the system, or customization, without basically writing one-off code for every customer that ever wants anything.

(0:57:38) Audrow Nash

Yes, totally. Yeah, that's hard, because it's tough to make the software flexible in a good way. I mean, it's good software design, but it's hard to predict. And it'll be like, oh, they want this little thing that's slightly different, but it requires a major assumption that the whole thing was built upon to change. That can be painful and difficult. Yeah, "we just want this other little thing." Like, no.

(0:58:03) Melonee Wise

Okay. We went through this whole exercise for our CartConnect. We basically made the biggest cart we could possibly move around with this robot. And in the field, everyone's loving it, and then about six months in, we start getting the request: can't we just make it an inch bigger? The answer's no. We made it as big as we possibly can. You know, there's no more.

(0:58:31) Audrow Nash

Oh, funny, one inch. "Can't we just..." How funny. You're like, oh no. Oh, no. Every time

(0:58:38) Melonee Wise

I hear it, it's "can't we just" or "would it be so hard if." Or

(0:58:45) Audrow Nash

these? Yeah. Always. Yeah. So let's see. Okay, we talked a little bit about the data before, and since we're running out of time, sorry about that. No, it's all been very interesting. But I'd love to hear a little bit about the acquisition process, though I know you're still going through it.

(0:59:09) Melonee Wise

It just closed, like, 30 minutes ago. I mean, we announced it was complete a couple days ago, but the money came into the bank today.

(0:59:20) Audrow Nash

Oh, what a breaking interview. So, can you talk a bit about the experience? I've heard horror stories with acquisitions. If I understand correctly, they had, say, a 5% or something stake in your company, and then they just bought the rest out?

(0:59:41) Melonee Wise

Yeah. I mean, on the acquisition process: they had been a previous investor. We went through an acquisition process, and we engaged with other potential acquirers, but it never really seemed like the best fit for us. And one of the things that really got me excited about the Zebra acquisition is that in January, they actually hired Jim Lawton. He's been in the robotics industry for a while; he worked at Rethink Robotics, he worked at Universal Robots, and he's done a variety of different startups, things like that, and he has a good reputation. It signaled to me that they were very serious about robotics and robotics automation.

(1:00:26) Audrow Nash

They've done their homework, they made a good pick, they brought someone in

(1:00:29) Melonee Wise

who knows the industry. Yeah. And, you know, when you look at it, as you said, there's always this fear of the acquisition going terribly: someone buying your company and then saying, you know what, we want to do home robots or something like that. I think that would have been soul-crushing to me. Like, forget all this stuff you've been working on for the last half decade or so, why don't we just go work on, you know, vacuums. And so, you know, the Zebra acquisition was really, for me, about finding a good fit. I mean, obviously money is a factor, but making sure that the visions are aligned, the outcomes are well aligned. And I felt like Zebra was a good fit for us. If you look at some of the things that were happening before that: after they invested in us, we started partnering on some things, and one of the cool things that we did about a year and a half ago, so, they make hand scanners, so you can scan a barcode and it tells you what a thing is, or it goes to a system and does one thing. We, a year and a half ago, made it easy to integrate their hand scanners with our robots. So someone could scan a barcode and a robot would automatically appear. And that was a super cool integration, because if you look in manufacturing, some guy is building something, right? He's assembling something, and maybe he runs out of screws, or maybe he runs out of something, and he can just scan a barcode and a robot with a cart shows up and says, here's your thing. And so that was very powerful for some of our customers, and they really liked it. And we've been slowly expanding our platform to integrate with all the other industrial IoT devices out there; we actually created another partnership for this kind of application as well.
And so, you know, that was really cool. We had experience working with them, we knew it wasn't the worst thing in the world, it was easy to work with them and get things done, and we got things done. Because that's the other thing that you can run into: you get acquired and nothing happens, and you do the quintessential "rest and vest." "Rest and vest" is a term that people throw around in startups. After you get acquired, if you don't like what the company's doing, you know, you just stick around and you just rest and vest.

(1:03:12) Audrow Nash

Rest while you're

(1:03:14) Melonee Wise

vesting the rest of your options or shares. Yeah, I mean, I

(1:03:18) Audrow Nash

Typically, yeah, you're not invested?

(1:03:21) Melonee Wise

No. But that was one of my goals: I didn't want to end up at a company where that was my main bag, you know?

(1:03:28) Audrow Nash

Oh, definitely. Yeah. Okay. And then, it's just interesting to me, finding a good fit; it probably means that there were other companies that looked into buying you guys.

(1:03:43) Melonee Wise

Yeah, I can't speak to the whole process in that regard. But yeah. I like

(1:03:47) Audrow Nash

that you found a good fit, for this kind of thing. Like, we found a good fit. Awesome. So, with just a few minutes left, and I was hoping for more time for this, because you have an interesting perspective: if you're projecting five or ten years down the road, where do you think we're going to be with robotics? And you can speak to warehouses or anything else.

(1:04:14) Melonee Wise

Not much further than we are today. Yeah. You know, a couple years ago, actually, at a ROSCon, the CEO of Ubuntu, of Canonical, got up and he said, you know, the thing about success is that it takes ten years to have an overnight success. And robotics is a slow-moving train. I think a lot of people probably know Fetch, and, you know, me, but don't or may not have realized or remembered that we're a seven-year-old company, right? It's taken seven years to get to this success point, and we have done a lot in those seven years. And we were

(1:05:02) Audrow Nash

involved in Willow and everything before that,

(1:05:05) Melonee Wise

Yeah. And so if you look at it, from the time I came out here in 2007, it's been a long time, right? And that's not saying that incremental advancement won't happen over these ten years. I just don't see us having this big light-switch moment in robotics. I mean, everything will kind of just incrementally trickle in. But if you were to ask me about autonomous cars, I think we're 20 to 40 years out on autonomy. Oh, yeah. Like, everyone saying that Teslas are going to be driving themselves, that's BS. I mean, anyone who says that needs to go, yeah, go do it for a couple years and figure out how hard the goddamn problem is.

(1:05:51) Audrow Nash

For sure.

(1:05:52) Melonee Wise

Um, and so, I don't know. I mean, I think the one thing that I could see being game-changing for indoor robotics in the next ten years is the new wireless standards that are coming out for localization, to kind of solve this "where's my robot" problem. I think that could be really game-changing if that technology takes off: ultra-wideband localization, or more of the Wi-Fi 6 standard that includes location information as part of the Wi-Fi packet. That's something that could be super game-changing; I think it could have a wide-sweeping effect in robotics, because then the problem really comes down to how good your navigation can be, not where the hell you are all the time. I think that we'll see good improvement in legged, like bipedal, locomotion robots, but I don't think we'll get anything really practical out there in the world. I mean, you worked on it; you know how spiky it is, let's call it. True. I think one thing we might see, though, from the autonomous car space, which I'm kind of interested in seeing, is players like Nuro and those smaller local-distribution autonomous vehicles, more like little smart cars with delivery baskets in them, on the roadways. That could be game-changing. But it's not full autonomy, and it's, you know, side streets, not highway driving, for delivery of groceries and things like that. I think there's promise there, and I think that might be something we can deliver on inside of ten years, because the complexity of the autonomy is localized and limited.

(1:07:58) Audrow Nash

Do you think we get further with, like, human-robot hybrid models, where a human takes control if the robot gets into a situation it doesn't have high confidence about navigating?

(1:08:09) Melonee Wise

Absolutely. We're already doing that today. The question is how far we can push the safety boundaries of that. So far, it's not clear that we have much of a good, clear idea of what that means. But, like, if you have an autonomous vehicle or a truck that's being driven remotely by a person, and the Wi-Fi cuts out, it's

(1:08:31) Audrow Nash

terrifying.

(1:08:32) Melonee Wise

Yeah, terrifying. Keyword: terrifying. Now, if you look at Fetch, and Plus One, and lots of other companies, we all have remote monitoring in some way to assist the robots, but we greatly rely on autonomy. It's like 99.5% autonomy, 0.5% human. I don't know if that will start going, spectrum-wise, more towards 90%. But it's hard to provide return on investment with a lot of human-in-the-loop autonomy, because of just the time expense, the time to accomplish it, like the time to recover.
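The return-on-investment point about human-in-the-loop time can be made with back-of-envelope arithmetic. All numbers below are invented for illustration, not Fetch data:

```python
def effective_task_time(base_s, intervention_rate, recovery_s):
    """Average task time once remote-human recoveries are amortized in."""
    return base_s + intervention_rate * recovery_s

base = 60.0       # seconds per task when fully autonomous (invented)
recovery = 300.0  # a remote operator takes minutes to notice and recover (invented)

for rate in (0.005, 0.10):
    t = effective_task_time(base, rate, recovery)
    print(f"{rate:.1%} interventions -> {t:.1f} s/task "
          f"({t / base:.2f}x slower)")
# 0.5% interventions -> 61.5 s/task (1.02x slower)
# 10.0% interventions -> 90.0 s/task (1.50x slower)
```

At a 0.5% intervention rate the human cost is nearly invisible; at 10% it eats half again the task time, which is roughly why heavy human-in-the-loop autonomy struggles to pay for itself.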

(1:09:17) Audrow Nash

Oh, I see. Gotcha. Okay. So, let's see, wrapping up: do you have any public social media, or company blog, or anything that you'd like to share with our listeners if they'd like to learn more? Sure.

(1:09:33) Melonee Wise

I mean, if people want to learn more, the biggest thing you need to know is how to spell my name: it's M-E-L-O-N-E-E. My name is very unique, and I have great SEO. I have a Twitter that's just Melonee Wise. Everything is just Melonee Wise. So if you Google "Melonee Wise," you'll find my Instagram, my Twitter, my LinkedIn, all of that very easily. I will say some of my channels are pretty fragmented; some of them are only personal stuff. So if you go to my Instagram, there are no robots there. It really is just beer and travel. But, you know,

(1:10:12) Audrow Nash

You're a human too. Yeah. Okay, is that it for those? Okay. Well, I've enjoyed speaking with you. Thank you. It's been

(1:10:22) Melonee Wise

great. Thank you very much. I appreciate your time.

(1:10:28) Audrow Nash

That's all we have for you today. If you enjoyed this interview, consider subscribing. To comment on this interview or to give us general feedback, especially while we're just getting started, go to sensethinkact.com, or you can find us on the ROS Discourse. We'll be back in two weeks' time. Goodbye, everyone.