Making Artificial Intelligence Work in a Changing Environment

By Adrian Rivera Cardoso and He Wang
Machine learning (ML) is changing our lives. We can instantly translate from one language to another, search entire libraries in a matter of seconds, and even prevent credit card fraud. ML’s success is mostly due to the power of artificial neural networks — a machine learning model inspired …

Explaining Nonparametric Regression on Low Dimensional Manifolds using Deep Neural Networks

By Minshuo Chen
Background and Motivation: Deep learning has made significant breakthroughs in various real-world applications, such as computer vision, natural language processing, healthcare, and robotics. In image classification, the winner of the 2017 ImageNet challenge achieved a top-5 error rate of 2.25% [1], while the data set consists of about 1.2 million labeled high-resolution …

Overcoming Large-scale Annotation Requirements for Understanding Videos in the Wild

By Min-Hung Chen, Zsolt Kira, and Ghassan AlRegib
Videos have become an increasingly important type of media from which we obtain valuable information and knowledge, motivating the need for video analysis techniques. These techniques could, for example, provide recommendations or support discovery for different objectives. Given the recent …

Snapshots of ICML 2019

The 36th International Conference on Machine Learning (ICML) is by all accounts a premier conference in the machine learning world. Thousands of papers are submitted and thousands of people from around the world travel to attend the weeklong conference. This year was no different, with over 6,000 attendees and 2,473 submitted papers. Only 621 papers …

Mixing Frank-Wolfe and Gradient Descent

By Sebastian Pokutta, associate director of ML@GT
TL;DR: This is an informal summary of our recent paper Blended Conditional Gradients, with Gábor Braun, Dan Tu, and Stephen Wright, showing how mixing Frank-Wolfe and Gradient Descent gives a new, very fast, projection-free algorithm for constrained smooth convex minimization.
What is the paper about and why you might care: Frank-Wolfe methods [FW] …
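To make "projection-free" concrete, here is a minimal sketch of a plain classical Frank-Wolfe (conditional gradient) loop, not the blended algorithm from the paper; the least-squares objective, the probability-simplex feasible set, and the function names are illustrative assumptions.

    import numpy as np

    def frank_wolfe_simplex(grad, x0, num_iters=200):
        """Classical Frank-Wolfe over the probability simplex (illustrative sketch)."""
        x = x0.copy()
        for t in range(num_iters):
            g = grad(x)
            # Linear minimization oracle over the simplex: the minimizer of a linear
            # function over the simplex is the vertex with the smallest gradient
            # coordinate, so no projection step is ever needed.
            s = np.zeros_like(x)
            s[np.argmin(g)] = 1.0
            gamma = 2.0 / (t + 2.0)            # standard diminishing step size
            x = (1.0 - gamma) * x + gamma * s  # convex combination stays feasible
        return x

    # Illustrative use: minimize ||Ax - b||^2 over the probability simplex.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 10))
    b = rng.standard_normal(30)
    grad = lambda x: 2.0 * A.T @ (A @ x - b)
    x = frank_wolfe_simplex(grad, x0=np.full(10, 0.1))
    print(x.sum(), (x >= 0).all())  # iterate stays on the simplex

The blended algorithm in the paper mixes steps of this kind with gradient-descent-style steps while remaining projection-free, which is where its speed over plain Frank-Wolfe comes from.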

Machine Learning Meets Interactive Stories

By Mark Riedl, Associate Professor at Georgia Tech
On Friday, December 28th, Netflix aired a special episode of their Black Mirror series called Bandersnatch. What made Bandersnatch interesting is that it was an interactive story: viewers could use a remote control to choose options at various points in the story to influence character choices and plot progression. (This post does not contain …

How Not to Rock the Semantic Boat

By Yuval Pinter
Imagine you’re building a boat, starting from a heap of parts. With each new board or screw, you make sure that it fits the adjacent parts, and that the material type is suitable for the section of the boat it’s in. But there are also bigger concerns to consider: is the …