
    360-degree head-up display view could warn drivers of road obstacles in real time

    Current head-up display systems are limited to two-dimensional projections onto a vehicle’s windscreen, but researchers from the University of Cambridge, the University of Oxford and University College London (UCL) have developed a system that uses 3D laser scanning and LiDAR data to create a fully three-dimensional representation of London streets.



    LiDAR data (left), holographic result (right). Photo: Jana Skirnewskaja/University of Cambridge


    The system they developed can effectively ‘see through’ objects to project holographic representations of road obstacles that are hidden from the driver’s field of view, aligned with the real object in size and distance. For example, a road sign blocked from view by a large truck would appear as a 3D hologram so that the driver knows exactly where the sign is and what information it displays.

    The 3D holographic projection technology keeps the driver’s focus on the road instead of the windscreen, and could improve road safety by projecting road obstacles and potential hazards in real time from any angle.

    “The idea behind a head-up display is that it keeps the driver’s eyes up, because even a fraction of a second not looking at the road is enough time for a crash to happen,” said Jana Skirnewskaja from Cambridge’s Department of Engineering, the study’s first author. “However, because these are two-dimensional images, projected onto a small area of the windscreen, the driver can be looking at the image, and not at the road ahead of them.”

    For several years, Skirnewskaja and her colleagues have been working to develop alternatives to head-up displays (HUDs) that could improve road safety by providing more accurate information to drivers while keeping their eyes on the road.

    “We want to project information anywhere in the driver’s field of view, but in a way that isn’t overwhelming or distracting,” said Skirnewskaja. “We don’t want to provide any information that isn’t directly related to the driving task at hand.”

    The team developed an augmented reality holographic point-cloud video projection system that displays virtual objects within the driver’s field of view, aligned with their real-life counterparts in size and distance.

    The system combines data from a 3D holographic setup with LiDAR (light detection and ranging) data. LiDAR uses a pulsed light source to illuminate an object and the reflected light pulses are then measured to calculate how far the object is from the light source.
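
    As a minimal sketch of that time-of-flight principle (illustrative only, not the researchers’ code), the distance to an object follows from half the round-trip time of a light pulse multiplied by the speed of light:

        # Minimal illustration of the LiDAR time-of-flight principle.
        C = 299_792_458.0  # speed of light in metres per second

        def distance_from_round_trip(t_seconds: float) -> float:
            """Distance in metres for a measured pulse round-trip time."""
            return C * t_seconds / 2.0  # halved: the pulse travels out and back

        # A pulse returning after 200 nanoseconds places the object about 30 m away.
        print(round(distance_from_round_trip(200e-9), 1))  # -> 30.0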

    The researchers tested the system by scanning Malet Street on the UCL campus in central London. Information from the LiDAR point cloud was transformed into layered 3D holograms, consisting of as many as 400,000 data points.
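
    As a rough sketch of how a point cloud can be split into depth layers of this kind (an illustrative Python example; the layer count and the random test data are assumptions, not the researchers’ parameters):

        import numpy as np

        def slice_into_depth_layers(points: np.ndarray, n_layers: int = 16):
            """Group (x, y, depth) points into equal-depth bins, one bin per hologram layer."""
            depth = points[:, 2]
            edges = np.linspace(depth.min(), depth.max(), n_layers + 1)
            layer = np.clip(np.digitize(depth, edges) - 1, 0, n_layers - 1)
            return [points[layer == i] for i in range(n_layers)]

        # Random points standing in for a 400,000-point street scan.
        cloud = np.random.rand(400_000, 3) * [50.0, 10.0, 100.0]
        layers = slice_into_depth_layers(cloud)
        print(len(layers), sum(len(l) for l in layers))  # 16 400000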

    The 360° obstacle assessment projected for drivers depends on careful processing of this data, so that the depth of each object remains clearly visible.

    The researchers sped up the scanning process so that the holograms were generated and projected in real time. Importantly, the scans can provide dynamic information, since busy streets change from one moment to the next.

    “The data we collected can be shared and stored in the cloud, so that any drivers passing by would have access to it – it’s like a more sophisticated version of the navigation apps we use every day to provide real-time traffic information,” said Skirnewskaja.

    “This way, the system is dynamic and can adapt to changing conditions, as hazards or obstacles move on or off the street.”

    While collecting more data from diverse locations improves accuracy, the researchers say the unique contribution of their study is the 360° view it enables: by selecting data points from single scans of specific objects, such as trucks or buildings, the system can provide a comprehensive assessment of road hazards.

    “We can scan up to 400,000 data points for a single object, but that is quite data-heavy and makes it more challenging to scan, extract and project data about that object in real time,” said Skirnewskaja.

    “With as little as 100 data points, we can know what the object is and how big it is. We need to get just enough information so that the driver knows what’s around them.”
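
    As an illustration of that kind of reduction, here is a simple voxel-grid downsample written from scratch in Python (the voxel size, object dimensions and resulting point counts are assumptions for the example, not the team’s method):

        import numpy as np

        def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
            """Keep one representative point (the centroid) per occupied voxel."""
            idx = np.floor(points / voxel_size).astype(np.int64)
            idx -= idx.min(axis=0)                      # shift so indices start at 0
            dims = idx.max(axis=0) + 1                  # grid extent along x, y, z
            flat = idx[:, 0] + dims[0] * (idx[:, 1] + dims[1] * idx[:, 2])
            _, inverse = np.unique(flat, return_inverse=True)
            counts = np.bincount(inverse).astype(float)
            out = np.empty((len(counts), 3))
            for dim in range(3):
                out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
            return out

        # A dense scan of a truck-sized object reduced to a coarse outline.
        truck = np.random.rand(400_000, 3) * [10.0, 2.5, 3.5]   # metres
        coarse = voxel_downsample(truck, voxel_size=0.8)
        print(len(truck), "->", len(coarse))                    # a few hundred points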

    Earlier this year, Skirnewskaja and her colleagues demonstrated the system virtually at the Science Museum in London, using virtual reality headsets loaded with its LiDAR data.

    User feedback from the sessions helped the researchers improve the system to make the design more inclusive and user-friendly. For example, they have fine-tuned the system to reduce eye strain, and have accounted for visual impairments.

    JANUARY 5, 2024


