
Intel’s Jack Weast shares new lessons about robo cars. For one: they’ll fail if they don’t tell us clearly and consistently what they are up to, and why.

Self-Driving Cars Must Earn Our Trust, Says Intel’s Autonomous Chief

[Photo: Dyaa Eldin Moustafa]

By Sean Captain | 7 minute read

Intel may have missed out on the smartphone boom, but it’s determined to get a foothold in smart cars. In January, the company unveiled a line of in-car computer systems called Intel GO; in March, it announced the purchase of Mobileye, an Israeli company that makes sensors and cameras for driverless vehicles. Now, Intel is collaborating with BMW on an autonomous car platform, which the companies hope to eventually license to other carmakers—an automotive version of the “Wintel” (Windows and Intel) duo that dominated the PC business.

Of course, Intel has competition. Nvidia offers its own car computer, called Drive PX, which has won over automakers Audi, Mercedes-Benz, Tesla, Toyota, and Volvo. (Intel says it has more partners that it can’t name yet.) Chinese tech giant Baidu just announced its free, open-source Apollo software, which is like the Linux of self-driving cars.

Jack Weast, the chief systems architect for Intel’s new automated driving division. [Photo: Boone Rodriguez]
Intel’s strategy is to revive it’s old game plan from the PC industry. As it helped set standards for technologies like USB and Wi-Fi, Intel aims to do the same for cars, says Jack Weast, the chief systems architect for the company’s new automated driving division. That goes beyond technical standards to include a uniform way of for people to interact with cars.

Intel recently conducted a usability study, placing people in autonomous cars and gathering feedback on what they did and didn’t like, and what could be improved. The company plans to publish results in August, but Weast gave Fast Company a sneak peek when we caught up with him at the recent Automated Vehicles Symposium in San Francisco. (This conversation has been edited for length and readability.)

Fast Company: Why is the interface for a self-driving car important?

Jack Weast: How do you know it’s your car when it pulls up? How do you authenticate [it’s] you getting into your car versus somebody else? What happens if you want to change your destination mid-ride? Do you do that on your phone? Or do you do that via voice? There are important technology implications for us to make sure our products support whatever it might be–gesture recognition, natural language processing. How do you put all that together? We think this is a really important area that hasn’t been talked about enough.

FC: What were some of the things people wanted? Did they want the car to tell them what it was doing at all times?

JW: We [had] a scenario where a pedestrian darts out in front of the car. So one of the things we had the car do is both verbally, audibly, as well as on the information display, alert the passenger that, hey pedestrian detected, slowing down or stopping. It seems simple, but if [the car] wasn’t telling you what it was doing, [you’d wonder], why did it just change lanes there? What’s going on? Can I trust this thing?

It comes down to human psychology, really. One of the things that’s really crucial is bi-directional, open communication of what somebody’s doing and why. So if I brush a bug off your shoulder, I’ll tell you that before I do it. Otherwise you go, why the hell did he put his arm out and touch me?

Maybe for some things, we as an industry need to agree on a common interaction. Think about cruise control. You can get into any rental car, any brand around the world, and you see Set, Resume, On, Off.

BMW engineer André Mueller works on an automated test car. The final computers will be able to fit under a driver’s seat. [Photo: Intel]
FC: Are there other things you remember from the study that people may have had a weird reaction to?

JW: One of the concepts we were exploring was a personalized display so you would know which was your car. But [what] if you’re wearing sunglasses, it’s really sunny, or the person’s blind?

It’s also interesting to think about cultural differences. How do people from different countries [react]? Do some cultures favor voice? Others maybe favor the ability to read text. Is there a geographic element to [the user interface], that these vehicles can’t just be the same worldwide? We just wanted to get it going and provide some thought leadership, because we really didn’t hear people talking about this. We’re going to publish those results…and make sure [carmakers] have products that work great technically but also that are going to be embraced by folks who don’t understand the technology.




FC: What do you think you might do with those results?

JW: So the results, we’ll publish, share with our customers and partners as well as the industry. But any development, particularly when you talk about in-vehicle experiences, that’s obviously the role of our customers. BMW has experience designers, so a BMW feels and looks like a BMW. And so we’re focused on, what are the things that really transcend the specific designs and need to be there from a human-machine interaction standpoint.

FC: You say there should be interface standards. Would Intel be interested in playing a role to corral that?


JW: Yeah I think that if you look at our company history, we very often see opportunities for standardization to help the industry. There are parts of [product] development that are differentiating and add value to a brand and experience, and there’s other parts [that are just] reinventing the same wheel that everybody else is doing.

We would work within organizations like [engineering association] SAE, with our customers and partners, if we all decided, hey let’s standardize the interaction model for car-to-pedestrian communication.

FC: Does anything else need to be standardized?

JW: Right now, each sensor vendor has its own proprietary [data formats]. So we sit back and say, that’s a clear opportunity to have a standard interoperability [to] make it easier to assemble these things. Certainly, cameras are different from lidar [the laser equivalent of radar]. But even then is there a standard data stream from a camera? Is there a standard way we can manage these devices, whether it’s from a power standpoint, or is there a standard way to get failure information from sensors?

Typically when you [standardize], costs go down, the barrier to entry for new suppliers is reduced. We’ve done this in many other industries. You used to have proprietary interconnects, and there wasn’t a standard way to connect an external hard drive or whatever it might be.

Intel’s in-car computer. [Photo: Intel]
FC: Coming from the traditional tech world, I just assume that everything works together.

JW: In this case it doesn’t. It takes many months of hand coding and manual configuration to get your sensors set up. So why not shave six months off your development time or provide the opportunity to swap out different sensors and try things out more easily? Maybe an automaker wants to try out different suppliers. With a standardized mechanism to interface and discover and operate those [sensors], you could do that. Today you can’t.
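To make the idea concrete, here is a minimal sketch of what such a common sensor interface could look like. This is purely illustrative: the class and vendor names are hypothetical, and no such standard exists today; the point is that a vehicle stack coding against one contract could swap suppliers freely, as Weast describes.

```python
# Hypothetical sketch of a standardized sensor interface.
# All names (SensorInterface, VendorALidar, VendorBCamera) are illustrative.

from abc import ABC, abstractmethod


class SensorInterface(ABC):
    """A uniform contract that every sensor supplier would implement."""

    @abstractmethod
    def read_frame(self) -> dict:
        """Return one frame of data in a standard envelope."""

    @abstractmethod
    def health(self) -> str:
        """Return standardized status/failure information."""


class VendorALidar(SensorInterface):
    def read_frame(self) -> dict:
        # Vendor-specific capture would happen here; stubbed for illustration.
        return {"type": "lidar", "points": [], "timestamp_ns": 0}

    def health(self) -> str:
        return "OK"


class VendorBCamera(SensorInterface):
    def read_frame(self) -> dict:
        return {"type": "camera", "pixels": [], "timestamp_ns": 0}

    def health(self) -> str:
        return "OK"


def poll_sensors(sensors):
    """The vehicle stack talks only to the interface, not to any one vendor,
    so suppliers can be swapped without months of hand configuration."""
    return [(s.health(), s.read_frame()["type"]) for s in sensors]


print(poll_sensors([VendorALidar(), VendorBCamera()]))
```

Swapping `VendorALidar` for a competitor's lidar would require no change to `poll_sensors`, which is the development-time saving Weast points to.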

FC: What do you think about the Baidu initiative to do open-source autonomous car software?

JW: There’s a common bit of functionality that needs to exist in these systems. I think there’s value in having an open reference implementation for the industry to use instead of automakers inventing their own. It’s very similar to what we have with the Intel-BMW-Mobileye partnership.

FC: So BMW, Mobileye, and Intel are also going for an interoperable standard?

JW: It’s a solution that will be licensable by any other automaker in the world who wants to use it. Both Delphi and Continental [automotive tech suppliers] have said to the world that they will offer an industrialized version of that Intel-BMW solution. So if some [carmaker] comes along, they can call up Delphi and say, “hey, we want that Intel-BMW-Mobileye solution for our car that’s going to come out in 2021,” and Delphi and Continental are ready to deliver it to them. OEMs that partner with us and BMW and Mobileye and license that solution will save a huge amount of money in development costs, and [will] be able to get it done faster than rolling their own, but still have opportunities to differentiate their own brand experience.

FC: Might we ever see home-brew autonomous cars? You can buy the hardware. At least the Baidu software is open source and free. If the sensors are all standardized, it would be easy to program.

JW: It would be interesting to see. The regulatory environment will have to catch up here at some point. That’s a fun thought. I hadn’t considered that.



ABOUT THE AUTHOR

Sean Captain is a business, technology, and science journalist based in North Carolina. Follow him on Twitter.

