
    Researchers develop speedier network analysis for a range of computer hardware

    Graphs -- data structures that show relationships among objects -- are highly versatile. It's easy to imagine a graph depicting a social media network's web of connections.



    [Image caption: Researchers develop an advance that could boost recommendation algorithms and internet searches. Photo: MIT]


    But graphs are also used in programs as diverse as content recommendation and navigation. As Massachusetts Institute of Technology researcher Ajay Brahmakshatriya summarizes, "graphs are basically everywhere."
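
    To make that concrete, here is a minimal sketch, in Python rather than GraphIt, of a graph stored as an adjacency list. The users and the mutual_follows query are invented for illustration; they simply show the kind of relationship data and relationship queries graph analytics deals with.

        # A tiny social graph stored as an adjacency list:
        # each user maps to the set of users they follow.
        follows = {
            "ana":   {"ben", "carla"},
            "ben":   {"carla"},
            "carla": {"ana"},
        }

        def mutual_follows(a: str, b: str) -> set[str]:
            """Accounts followed by both a and b -- a simple
            relationship query of the kind graph analytics answers."""
            return follows[a] & follows[b]

        print(mutual_follows("ana", "ben"))  # {'carla'}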

    Brahmakshatriya has developed software to run graph applications more efficiently on a wide range of computer hardware.

    The software extends GraphIt, a state-of-the-art graph programming language, to run on graphics processing units (GPUs), hardware that processes many data streams in parallel.

    The advance could accelerate graph analysis, especially for applications that benefit from a GPU's parallelism, such as recommendation algorithms.
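
    The appeal of a GPU for this kind of workload is that the same small operation can be applied to many vertices or edges at once. As a rough illustration, the sketch below uses vectorized NumPy as a stand-in for GPU parallelism; the feature vectors and the dot-product scoring rule are invented for the example.

        import numpy as np

        # Illustrative: score every candidate item for a user in one
        # data-parallel pass, rather than looping item by item.
        rng = np.random.default_rng(0)
        user_vec = rng.random(64)              # one user's feature vector
        item_vecs = rng.random((100_000, 64))  # feature vectors for all items

        scores = item_vecs @ user_vec          # 100,000 dot products at once
        top5 = np.argsort(scores)[-5:][::-1]   # best-scoring recommendations
        print(top5)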

    Brahmakshatriya will present the work, partially funded by the U.S. National Science Foundation, at the International Symposium on Code Generation and Optimization.

    When programmers write code, they don't talk directly to the computer hardware. The hardware itself operates in "binary" -- zeros and ones -- while the coder writes in a structured, "high-level" language made up of words and symbols.

    Translating that high-level language into hardware-readable binary requires programs called compilers. "A compiler converts the code to a format that can run on the hardware," says Brahmakshatriya. GraphIt is one such compiler, specially designed for graph analysis.
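
    One quick way to see high-level code becoming a lower-level format is Python's built-in dis module, which prints the bytecode the CPython compiler produces for a function. Bytecode is not raw binary, but the translation step is the same idea.

        import dis

        def degree(adj, v):
            # High-level code: count a vertex's neighbors.
            return len(adj[v])

        # CPython's compiler has already translated the function into
        # lower-level bytecode instructions; dis prints that translation.
        dis.dis(degree)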

    Researchers developed GraphIt to optimize the performance of graph-based algorithms regardless of the size and shape of the graph. GraphIt lets the user not only specify an algorithm but also schedule how that algorithm runs on the hardware.
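
    GraphIt's actual syntax is its own; the sketch below is a hypothetical Python mini-API (the run function, its parallel flag, and the chunking strategy are all invented) meant only to show the separation between what an algorithm computes and how it is scheduled onto hardware.

        from concurrent.futures import ThreadPoolExecutor

        def out_degrees(adj):
            """The algorithm: what to compute (one value per vertex)."""
            return {v: len(nbrs) for v, nbrs in adj.items()}

        def run(algorithm, adj, parallel=False, workers=4):
            """The 'schedule': how to execute it -- serially or split
            across workers -- without touching the algorithm itself."""
            if not parallel:
                return algorithm(adj)
            verts = list(adj)
            chunks = [{v: adj[v] for v in verts[i::workers]}
                      for i in range(workers)]
            out = {}
            with ThreadPoolExecutor(workers) as pool:
                for part in pool.map(algorithm, chunks):
                    out.update(part)
            return out

        adj = {0: [1, 2], 1: [2], 2: [0]}
        print(run(out_degrees, adj))                 # serial schedule
        print(run(out_degrees, adj, parallel=True))  # parallel schedule

    Both calls compute the same result; only the execution strategy changes, which is the design idea behind keeping the algorithm and the schedule separate.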

    "Data analytics, including graph processing, that take advantage of massively modern parallel hardware in real-time can benefit many applications that directly affect quality of life and enable informed decision-making based on quickly changing data inputs," says Almadena Chtchelkanova, a program director in NSF's Directorate for Computer and Information Science and Engineering. (National Science Foundation)

    MARCH 19, 2021



