Max Welling (University of Amsterdam, Qualcomm), Marian Verhelst (KU Leuven), Daniel Soudry (Technion – Israel Institute of Technology)
According to recent research from the University of Massachusetts Amherst, training a popular transformer neural network with 213 million parameters can produce roughly 626,000 pounds of carbon dioxide emissions. That’s nearly five times the lifetime emissions of the average U.S. car, including its manufacturing. Motivated by these costs, we turn our attention to research that successfully quantizes neural networks. We sit down virtually to discuss how AI models can be made smaller and more power- and energy-efficient. Our guests will discuss trends and opportunities in this field, and then we’ll open the floor for questions and conversation with the audience.