In the last few months, Apple’s self-driving cars have been spotted many times on California roads. Apple is testing its autonomous driving systems using Lexus vehicles rather than its own cars. A new research paper reveals a new method that could help the Apple self-driving car detect cyclists and pedestrians more accurately. The paper was published last week in the online journal arXiv.
VoxelNet outperforms existing LiDAR-based detection, claim Apple scientists
The paper offers a rare sneak peek into Apple’s autonomous car-related research. The Cupertino company is known for its corporate secrecy, especially when it comes to future products. Apple’s secretive behavior was seen as a drawback by many AI and machine learning researchers, who typically share their work with peers at other organizations. Apple tried to address their concerns by launching the Apple Machine Learning Journal earlier this year.
However, the latest research paper was published in an independent online journal rather than Apple’s own journal. In the paper, Apple scientists Yin Zhou and Oncel Tuzel describe a new technique for detecting 3D objects. The Apple self-driving car will need to detect objects near and far with speed and precision to be able to drive without any assistance from human operators.
Currently, most self-driving cars use two-dimensional cameras and depth-sensing LiDAR technology to detect the objects around them. LiDAR works by shining laser light onto a surface and then measuring how long it takes for the light to return after bouncing off that surface. This lets the system determine an object’s shape and its distance. However, LiDAR cannot always detect small objects that are far away.
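The time-of-flight principle described above can be sketched in a few lines. This is an illustrative calculation, not code from any actual LiDAR system: the pulse travels to the surface and back, so the one-way distance is half the round trip.

```python
# Illustrative sketch of the LiDAR time-of-flight principle.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def lidar_distance_m(round_trip_time_s: float) -> float:
    """Estimate distance to a surface from a laser pulse's round-trip time.

    The pulse covers the distance twice (out and back), hence the division by 2.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A pulse that returns after 200 nanoseconds hit a surface roughly 30 m away.
print(round(lidar_distance_m(200e-9), 2))  # ~29.98 m
```

In practice a scanner fires millions of such pulses per second in many directions, producing the 3D "point cloud" that detection methods like VoxelNet consume.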
Apple scientists noted that the 2D camera and LiDAR combination is not the best solution in many situations. Also, it could be “sensitive to sensor failure.” Yin Zhou and Oncel Tuzel proposed a new software-based approach called VoxelNet to help detect complex 3D objects. It uses computer vision and machine learning to detect even small, faraway objects.
The VoxelNet framework uses voxel feature encoding (VFE) layers to characterize 3-dimensional shapes. The scientists said their methodology outperformed existing LiDAR-based 3D detection methods “by a large margin.” However, the method was tested in computer simulations rather than real-life road tests. They trained VoxelNet to detect cars, pedestrians, and cyclists, with tests conducted on the KITTI 3D object detection benchmark.
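VoxelNet’s first stage partitions the raw point cloud into a grid of 3D cells (voxels) before the learned VFE layers encode each cell. The sketch below shows only that grouping step, with illustrative voxel dimensions and point caps; it is not Apple’s implementation, and the VFE networks themselves are not shown.

```python
from collections import defaultdict

def voxelize(points, voxel_size=(0.2, 0.2, 0.4), max_points_per_voxel=35):
    """Group raw (x, y, z) points into a sparse voxel grid.

    Each occupied voxel is keyed by its integer grid coordinates. In VoxelNet,
    the learned VFE layers would then turn each voxel's points into a feature
    vector; here we simply collect the points. Sizes and the per-voxel point
    cap are illustrative values, not the paper's exact settings.
    """
    voxels = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel_size[0]),
               int(y // voxel_size[1]),
               int(z // voxel_size[2]))
        # Cap points per voxel to bound memory, as voxel-based methods do.
        if len(voxels[key]) < max_points_per_voxel:
            voxels[key].append((x, y, z))
    return dict(voxels)

# Three points: the first two fall into the same voxel, the third into another.
pts = [(0.05, 0.05, 0.1), (0.15, 0.1, 0.2), (1.0, 1.0, 1.0)]
grid = voxelize(pts)
print(len(grid))  # 2 occupied voxels
```

Operating on voxels rather than individual points gives the network a regular structure that 3D convolutions can process efficiently, while the sparse keying avoids storing the many empty cells in a road scene.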
Though Apple scientists focused mainly on autonomous car navigation, the technology could have many other applications, says Apple Insider. For instance, it could be used in augmented reality systems that rely on depth mapping to detect real-world objects. A VoxelNet-like technology could also appear in future smartphones to deliver enhanced augmented reality experiences.
Apple testing its self-driving systems on public roads
Apple has been investing heavily in autonomous driving systems for years. Tim Cook recently described autonomous driving as the “mother of all AI projects.” The company is approaching self-driving from a “core technology point of view.” Cook’s comments indicate that the tech giant is focusing on building autonomous driving systems rather than developing an actual car.
However, building only one part of a product is not in Apple’s DNA. The company usually controls all aspects of its products. Take iPhones, for example. Apple designs the hardware, fully controls the software, and builds its own custom processors, though it outsources manufacturing to Foxconn and other vendors. Experts believe that Apple will eventually build its own car or join hands with an established automaker to make one.
In April this year, the California Department of Motor Vehicles (DMV) granted Apple permission to test its self-driving systems on public roads using Lexus SUVs. The company is also reportedly using the technology in autonomous shuttles to transport its employees from one building to another.
Apple self-driving car spotted in California
Last month, Voyage co-founder MacCallister Higgins spotted an Apple self-driving car in California. The vehicle was outfitted with six Velodyne LiDAR sensors, multiple cameras, and several radar units encased in white plastic. The entire setup was located on the vehicle’s roof. Higgins said the computer stack was placed on the roof along with the sensors. That’s surprising because most other self-driving car operators place the computer stacks or GPUs in the cars’ trunks.
Going to need more than 140 characters to go over 🍎's Project Titan. I call it "The Thing" pic.twitter.com/sLDJd7iYSa
— MacCallister Higgins (@macjshiggins) October 17, 2017
In July, a report coming out of China claimed that Apple was working with Contemporary Amperex Technology Ltd (CATL) on developing batteries for electric vehicles. Improving battery efficiency is one of the biggest challenges facing EV makers. CATL makes batteries for electric cars, electric trucks, electric buses, and stationary energy storage solutions. The agreement with CATL is a clear indication that Apple’s automobile ambitions go far beyond just developing self-driving systems.