By Tim Hornyak
On the 11th floor of a building in Tokyo’s Shibuya neighborhood, engineers watch a virtual car cruise through a virtual city. When the simulation begins, the car can barely drive in a straight line. As it veers to the left and hits a building, the simulation resets. But within minutes, its artificial intelligence system has learned to negotiate several corners without crashing.
Japanese start-up Ascent Robotics wants to create software for self-driving cars and sell it to vehicle-makers and equipment manufacturers. The company, which was founded in 2016, says it raised $18 million in funding this year and aims to complete a fully functional AI vehicle system by late 2020 — when it hopes to hold an initial public offering.
The company is catching the attention of Japan’s tech community because it counts Ken Kutaragi, the creator of Sony’s PlayStation game console, as one of its board members.
On the software side, the company is trying to distinguish itself in the crowded field of AI start-ups with its simulator platform, called Atlas. It incorporates PlayStation steering wheels so human drivers can inject unexpected elements, such as reckless drivers, into a training simulation. It also features sensor data from vehicles, predictive modeling and reinforcement-learning algorithms to help determine which behaviors are most useful for driving.
According to Ascent, the result is an AI training environment that can quickly coach the program to drive cars safely. The open question is whether training on simulation data can produce better self-driving cars than miles logged in the real world.
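Ascent has not published its algorithms, but the training loop the article describes — an agent that crashes, resets and gradually learns to stay on the road — is the basic shape of reinforcement learning. The toy sketch below shows the idea with tabular Q-learning on a five-position "lane keeping" task; the environment, rewards and parameters are all illustrative and have nothing to do with the actual Atlas platform.

```python
import random

random.seed(0)

# Toy environment: the car occupies one of 5 lateral positions;
# position 2 is the lane center. Each step the agent steers left,
# holds, or steers right, and a small random drift is added.
# Leaving the road ends the episode with a crash penalty,
# mirroring the simulation reset described in the article.
N_POSITIONS = 5
ACTIONS = [-1, 0, 1]   # steer left, hold, steer right
CENTER = 2

def step(pos, action):
    """Apply steering plus random drift; return (new_pos, reward, done)."""
    drift = random.choice([-1, 0, 1])
    new_pos = pos + action + drift
    if new_pos < 0 or new_pos >= N_POSITIONS:
        return pos, -10.0, True          # crashed off the road
    reward = 1.0 if new_pos == CENTER else 0.0
    return new_pos, reward, False

# Tabular Q-learning: Q[position][action_index]
Q = [[0.0] * len(ACTIONS) for _ in range(N_POSITIONS)]
alpha, gamma, epsilon = 0.2, 0.9, 0.1    # learning rate, discount, exploration

def choose_action(pos):
    """Epsilon-greedy: mostly exploit, occasionally explore."""
    if random.random() < epsilon:
        return random.randrange(len(ACTIONS))
    return max(range(len(ACTIONS)), key=lambda a: Q[pos][a])

for episode in range(2000):
    pos = CENTER
    for _ in range(50):
        a = choose_action(pos)
        new_pos, r, done = step(pos, ACTIONS[a])
        # Standard Q-learning update; terminal states bootstrap nothing.
        target = r if done else r + gamma * max(Q[new_pos])
        Q[pos][a] += alpha * (target - Q[pos][a])
        pos = new_pos
        if done:
            break  # "reset the simulation" and start a new episode
```

After enough episodes, the learned values steer the car back toward the center: at the road's left edge, steering right scores higher than steering left, and vice versa at the right edge — the same trial-and-error improvement the engineers watch in the virtual city, at vastly smaller scale.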
“The difference between us and Waymo is simply that we are building machines that think like us, and Waymo is building a machine that follows rules,” said Fred Almeida, Ascent co-founder and chief architect. “In 2016, what became feasible was the application of new algorithms based on theoretical neuroscience to allow the machine to plan and navigate through a process of reasoning.”
Vehicles using Ascent’s AI system will navigate with low-definition maps and measure distance to objects with sensors such as sonar, radar, cameras and lidar — a light detection and ranging remote sensing method using lasers. The company said the vehicles will recognize their surroundings, make judgments about them and will not require high-definition or three-dimensional maps like systems being developed in the U.S. That theoretically allows them to drive anywhere.
“We are not only using AI in perception to make cars drive smarter, but our car uses AI end-to-end, from understanding the environment to planning decisions,” said CEO Masayuki (Davey) Ishizaki. “We’re using neuroscience and understanding of how the brain works translated into an algorithm which we are using as the foundation of our software. That’s going to be the key to make cars drive as though humans were driving them.”
Read the rest of the article at CNBC.