THE NEW VALUE FRONTIER

KYOCERA @ CES 2020

Sensing Technology

LIDAR technology is considered key to autonomous driving, and highly accurate LIDAR sensors will be essential for the future of mobility. Kyocera’s “Camera-LIDAR Fusion Sensor” reduces distortion and parallax error by integrating LIDAR distance measurement into the camera’s image sensor.

Camera-LIDAR Fusion Sensor

Advanced LIDAR sensor-integrated camera performs far beyond the limits of human vision

  • 01 Features

    • Integrated LIDAR sensor and camera
    • Calibration and syncing not required
    • Original optical system enables compact size and simple design
    • LIDAR sensor measures the distance to objects recognized by the camera
    • Recognizes visual information imperceptible to the human eye
    • Motion sensing/recognition
    • Multidimensional information processing
    • Customizable for various applications
  • 02 Applications

    • Safety and security systems
    • Robot vision (AR/VR)
    • Drones
    • Industrial robots
    • Automotive/mobility
  • 03 Function

  • Interview

    Kyocera Article - LIDAR

    Kyocera’s Camera-LIDAR Fusion Sensor Brings Clearer Vision to Autonomous Driving

    We all know that you should watch the road when you drive. But soon, your car will be watching with you, finding potential hazards and navigating them before they become dangers. Kyocera’s exclusive Camera-LIDAR Fusion Sensor will give automobiles the ability to locate objects and determine their size through LIDAR, while the camera enables them to identify what they see. Hiroyuki Minagawa, Senior Manager of Kyocera’s Future Technology Research Laboratory, describes his team’s work: “By bringing camera data and LIDAR data together, we are able to create extremely high-resolution 3-D images. This is exclusive to Kyocera.”

    LIDAR (Laser Imaging Detection and Ranging) systems scan an area with a laser, measuring the time the beam takes to bounce back in order to determine the distance and size of surrounding objects. LIDAR has been around since the 1960s, typically used in aircraft and satellites to create highly detailed maps. Recently, many teams around the world have begun investigating LIDAR as a tool for autonomous driving systems, as it delivers very accurate measurements at relatively long distances and is especially useful for spotting objects that have fallen on the road.
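    The time-of-flight principle described above reduces to simple arithmetic. As a hedged illustration (not a description of Kyocera’s implementation), the one-way distance is half the laser pulse’s round-trip path:

```python
# Time-of-flight distance: a laser pulse travels to the target and back,
# so the one-way distance is d = c * t / 2.
C = 299_792_458.0  # speed of light in m/s

def lidar_distance(round_trip_seconds: float) -> float:
    """Distance to a target from a laser pulse's round-trip time."""
    return C * round_trip_seconds / 2.0

# A return after about 667 nanoseconds corresponds to roughly 100 m.
print(f"{lidar_distance(667e-9):.1f} m")
```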

    Senior Manager of Research Hiroyuki Minagawa shows the Fusion LIDAR system, developed by his team.

    The Parallax Challenge

    “LIDAR can measure distance, but it is not able to precisely recognize what it sees,” explains Minagawa. Camera systems, on the other hand, are very good at distinguishing shapes and colors, but can mistake shadows or textures for 3D objects. By using the two together, however, a composite image can be created that shows where everything is located and can identify what each object is.
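    The composite described above can be pictured as pairing each camera-classified region with the LIDAR depth at the same position. The labels and distances below are purely hypothetical, a minimal sketch of the fusion idea rather than Kyocera’s actual pipeline:

```python
# Toy fusion: the camera says WHAT each cell is, LIDAR says HOW FAR it is.
labels = [["road", "car"],
          ["road", "sign"]]          # camera classification per cell (made up)
depth_m = [[12.0, 8.5],
           [3.0, 15.0]]              # LIDAR distance per cell in meters (made up)

# Pair the two grids into a single "what + how far" composite.
fused = [[(labels[r][c], depth_m[r][c]) for c in range(len(labels[r]))]
         for r in range(len(labels))]

print(fused[0][1])  # the cell holding a car, 8.5 m away
```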

    But this technique raises a new challenge: because the LIDAR and camera are separate units, their views will always differ slightly. This creates a form of distortion known as parallax. Our eyes and brains have evolved to handle parallax instinctively (it’s how we see in three dimensions), “but for machines, fully correcting parallax and merging the LIDAR and camera data would require heavy computation. It might be possible with a cloud-based system, but that would be much too slow for driving,” Minagawa explains. “Other manufacturers have been working on this problem, but because of space, heat, and other restrictions, it was impossible for them to bring the LIDAR and camera close enough to eliminate parallax.” The difficulty of handling parallax limited how accurate LIDAR systems could be, creating a serious obstacle to achieving vehicles capable of autonomous driving.
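    The size of the parallax mismatch can be estimated with a simple pinhole-camera model. This is a hedged, illustrative calculation with made-up numbers, not Kyocera’s analysis: the pixel offset between two sensors grows with the baseline separating them and shrinks with distance, so nearby objects are the hardest to align.

```python
def parallax_offset_px(focal_px: float, baseline_m: float, distance_m: float) -> float:
    """Pixel offset (disparity) between two sensors mounted `baseline_m` apart,
    for a pinhole model with focal length `focal_px` (in pixels):
    offset = focal * baseline / distance."""
    return focal_px * baseline_m / distance_m

# Hypothetical setup: 1000 px focal length, sensors mounted 10 cm apart.
for d in (5.0, 20.0, 100.0):
    print(f"object at {d:5.1f} m -> {parallax_offset_px(1000.0, 0.10, d):4.1f} px offset")
```

    Close objects shift by many pixels, which is why two separate units cannot simply overlay their images. Placing both sensors behind one lens, as Kyocera did, drives the baseline, and with it the offset, to zero.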

    Kyocera Future Research Laboratory Senior Manager Hiroyuki Minagawa discusses the development of the Fusion LIDAR System.
    The Camera-LIDAR Fusion Sensor is able to combine LIDAR and camera data into a single image of the road that clearly shows how far away other objects are.

    Kyocera’s Solution: One Box, One Lens, One Vision

    Kyocera, however, found an answer. “We were able to bring the camera and LIDAR into a single unit with one lens,” says Minagawa. “This was a Kyocera-exclusive discovery.” Because Kyocera’s Camera-LIDAR Fusion Sensor contains both devices in a single unit, they see through the same lens and share the same viewpoint, with no parallax. This makes combining the camera and LIDAR data a much simpler process that can be done at driving speeds. Only Kyocera has been able to fuse LIDAR with a camera in this way, making its system far more powerful than any other manufacturer’s.

    Greater Durability, Thanks to Kyocera’s Advanced Ceramics Know-How

    A second issue is durability. “There is a big problem with many of these LIDAR systems,” Minagawa says. “They use mechanical motors to rotate their scanning mirrors, and they can’t withstand the shaking and vibration that happens normally in a car. I don’t think they would last more than a couple of years.” To overcome this limitation, Kyocera developed another exclusive solution: a MEMS (Micro-Electro-Mechanical System) mirror housed inside Kyocera’s exclusive ceramic packaging technology. Ceramics are in Kyocera’s DNA, right down to its name, and breakthroughs in ceramic technology are an integral part of Kyocera’s history. This advanced ceramic technology makes Kyocera’s Camera-LIDAR Fusion Sensor far more durable in real-world driving conditions, making it the superior choice for autonomous driving systems.

    Kyocera is planning to introduce its Camera-LIDAR Fusion Sensor on upcoming generations of vehicles, giving autonomous driving systems a huge boost in seeing the road around them and reacting properly to potential hazards. In addition, the company sees applications in many other fields, including safety systems for heavy machinery, navigation and environmental sensing for robotics, and even security systems that recognize people and objects. In these ways, Kyocera’s small device can make a big contribution to traffic, safety, and society as a whole.

    With Kyocera’s exclusive Camera-LIDAR Fusion Sensor, a single compact device combines the distance and size detection of LIDAR with the imaging capability of a camera.
    Hiroyuki Minagawa and his team members discuss the development of the Camera-LIDAR Fusion Sensor