Snapshots of ICML 2019

The 36th International Conference on Machine Learning (ICML) is by all accounts a premier conference in the machine learning world. Thousands of papers are submitted and thousands of people from around the world travel to attend the week-long conference. This year was no different, with over 6,000 attendees and 2,473 submitted papers. Only 621 papers … Continue reading Snapshots of ICML 2019

Mixing Frank-Wolfe and Gradient Descent

By Sebastian Pokutta, associate director of ML@GT. TL;DR: This is an informal summary of our recent paper Blended Conditional Gradients, with Gábor Braun, Dan Tu, and Stephen Wright, showing how mixing Frank-Wolfe and Gradient Descent gives a new, very fast, projection-free algorithm for constrained smooth convex minimization. What is the paper about and why you might care: Frank-Wolfe methods [FW] … Continue reading Mixing Frank-Wolfe and Gradient Descent
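The excerpt cuts off before the details of Blended Conditional Gradients, but the key ingredient it mixes with gradient descent is the classic Frank-Wolfe (conditional gradient) step: replace projection with a linear minimization over the feasible set. Below is a minimal sketch of vanilla Frank-Wolfe over the probability simplex; the function name `frank_wolfe_simplex` and the quadratic objective are illustrative assumptions, not code from the paper.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=200):
    """Vanilla Frank-Wolfe over the probability simplex.

    Each iteration minimizes the linear approximation <grad f(x_t), v>
    over the simplex (the minimizer is a vertex, i.e. a coordinate basis
    vector) and then takes a convex combination -- no projection needed.
    """
    x = x0.copy()
    for t in range(n_iters):
        g = grad(x)
        # Linear minimization oracle: the best simplex vertex.
        v = np.zeros_like(x)
        v[np.argmin(g)] = 1.0
        gamma = 2.0 / (t + 2.0)            # standard step-size schedule
        x = (1.0 - gamma) * x + gamma * v  # stays feasible by convexity
    return x

# Toy example: minimize f(x) = 0.5 * ||A x - b||^2 over the simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad = lambda x: A.T @ (A @ x - b)
x_star = frank_wolfe_simplex(grad, np.ones(5) / 5)
print(x_star, x_star.sum())  # nonnegative and sums to 1
```

The paper's contribution, as the summary states, is blending such projection-free steps with gradient-descent-style steps to get much faster convergence in practice; the sketch above only shows the baseline ingredient.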

Machine Learning Meets Interactive Stories

By Mark Riedl, Associate Professor at Georgia Tech. On Friday, December 28th, Netflix aired a special episode of their Black Mirror series called Bandersnatch. What made Bandersnatch interesting is that it was an interactive story: viewers could use a remote control to choose options at various points in the story to influence the characters' choices and the plot's progression. (This post does not contain … Continue reading Machine Learning Meets Interactive Stories

How Not to Rock the Semantic Boat

By Yuval Pinter. Imagine you're building a boat, starting from a heap of parts. With each new board or screw, you make sure that it fits the adjacent parts, and that the material type is suitable for the section of the boat it's in. But there are also bigger concerns to consider: is the … Continue reading How Not to Rock the Semantic Boat

What Makes a New Word Stick?

By Ian Stewart. The language that people use to communicate online is in constant flux. People may once have written "haha" to indicate laughter but over time have adopted "lol" instead. Entire dictionaries and websites such as UrbanDictionary.com are dedicated to tracking the ebb and flow of the latest slang (i.e., nonstandard) words that propagate … Continue reading What Makes a New Word Stick?

Learning Rigidity and Scene Flow Estimation

By Zhaoyang Lv. We live in a dynamic, three-dimensional (3D) world. Being able to perceive high-resolution 3D motion is a fundamental ability of our perception system, one that enables us to perform a wide variety of tasks. In an age when we are building intelligent robots, autonomous vehicles, and augmented reality toolkits, how can we also enable … Continue reading Learning Rigidity and Scene Flow Estimation

Choose Your Neuron: Incorporating Domain Knowledge through Neuron-Importance

By Prithvijit Chattopadhyay and Ramprasaath R. Selvaraju (paper authors include Ramprasaath R. Selvaraju, Prithvijit Chattopadhyay, Mohamed Elhoseiny, Tilak Sharma, Dhruv Batra, Devi Parikh, and Stefan Lee). Deep Neural Networks have pushed the boundaries of standard image-classification tasks in the past few years, with performance on many challenging benchmarks reaching near-human-level accuracy. One of the … Continue reading Choose Your Neuron: Incorporating Domain Knowledge through Neuron-Importance