ML@GT to Present Nine Papers at Competitive Machine Learning Conference

The International Conference on Machine Learning (ICML) received nearly 5,000 submissions for its 2020 conference and accepted 1,088 papers. Researchers from the Machine Learning Center at Georgia Tech (ML@GT) authored nine of the accepted papers, which explore topics such as privacy, semantics in predictive agents, data science, and artificial intelligence. One paper, Boosting Frank-Wolfe by Chasing Gradients, proposes a new state-of-the-art algorithm for constrained…

Snapshots of ICML 2019

The 36th International Conference on Machine Learning (ICML) is by all accounts a premier conference in the machine learning world: thousands of papers are submitted, and thousands of people from around the world travel to attend the weeklong conference. This year was no different, with over 6,000 attendees and 2,473 submitted papers. Only 621 papers…

Mixing Frank-Wolfe and Gradient Descent

By Sebastian Pokutta, associate director of ML@GT. TL;DR: This is an informal summary of our recent paper Blended Conditional Gradients (with Gábor Braun, Dan Tu, and Stephen Wright), showing how mixing Frank-Wolfe and gradient descent gives a new, very fast, projection-free algorithm for constrained smooth convex minimization. What is the paper about, and why you might care: Frank-Wolfe methods [FW]…
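For readers unfamiliar with the starting point the paper builds on, here is a minimal sketch of the classic Frank-Wolfe (conditional gradient) method, not the blended algorithm from the paper itself. It is projection-free: each iteration calls only a linear minimization oracle over the feasible set and then takes a convex-combination step, so the iterate never leaves the set. The example below assumes the feasible set is the probability simplex, where the oracle simply returns the best vertex; the function names and objective are illustrative, not from the paper.

```python
import numpy as np

def frank_wolfe_simplex(grad_f, x0, steps=200):
    """Classic Frank-Wolfe over the probability simplex (illustrative sketch).

    grad_f: gradient of the smooth convex objective f.
    The linear minimization oracle over the simplex returns the
    vertex e_i with i = argmin_i grad_f(x)_i, so no projection is needed.
    """
    x = x0.copy()
    for t in range(steps):
        g = grad_f(x)
        i = int(np.argmin(g))              # LMO: pick the best simplex vertex
        v = np.zeros_like(x)
        v[i] = 1.0
        gamma = 2.0 / (t + 2.0)            # standard step-size schedule
        x = (1.0 - gamma) * x + gamma * v  # convex combination stays feasible
    return x

# Toy example: minimize f(x) = ||x - b||^2 over the simplex,
# with b chosen inside the simplex so the minimizer is b itself.
b = np.array([0.1, 0.7, 0.2])
x_star = frank_wolfe_simplex(lambda x: 2.0 * (x - b), np.ones(3) / 3.0)
```

Because the update is a convex combination of feasible points, feasibility is maintained for free; the trade-off, as the paper's blurb hints, is that plain Frank-Wolfe converges slowly near the optimum, which is what blending in gradient-descent-style steps aims to fix.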

ICML 2017 Accepted Papers and ML@GT

The list of accepted papers at ICML 2017 was released yesterday, and Andrej Karpathy has published a very nice post breaking down acceptances by institution. Out of 1,701 submissions, 433 papers were accepted (roughly 25.46%) from 420 different institutions. I am excited to see a very strong representation of Machine Learning @ Georgia Tech…