Abstract - Machine Learning
Beyond Conditionals: Structured Prediction for Interacting Processes

University of Illinois at Chicago

 

Abstract:
The principle of maximum entropy provides a powerful framework for estimating joint, conditional, and marginal probability distributions. Markov random fields and conditional random fields can be viewed as the maximum entropy approach in action. However, beyond joint and conditional distributions, there are many other important distributions with elements of interaction and feedback where its applicability has not been established. In this talk, I will present the principle of maximum causal entropy, an approach grounded in directed information theory for estimating an unknown process from its interactions with a known process.
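
As background (not stated in the abstract itself): in the directed information literature, the causally conditioned entropy of an unknown process Y given a known process X over T steps is typically written as

\[
H(Y^T \,\|\, X^T) \;=\; \sum_{t=1}^{T} H\!\left(Y_t \mid Y^{t-1}, X^{t}\right),
\]

where Y^{t-1} = (Y_1, ..., Y_{t-1}) and X^{t} = (X_1, ..., X_t), so each Y_t may depend on past and present values of X but not on future ones. A maximum causal entropy estimate can then be sketched as

\[
\max_{\{P(Y_t \mid Y^{t-1},\, X^{t})\}_{t=1}^{T}} \; H(Y^T \,\|\, X^T) \quad \text{subject to constraints on } P,
\]

where the constraints are commonly taken to match empirical feature statistics; that particular constraint set is an illustrative assumption here rather than something specified in the abstract.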

Bio:
Brian Ziebart is an Assistant Professor in the Department of Computer Science at the University of Illinois at Chicago. He received his PhD in Machine Learning from Carnegie Mellon University in 2010, where he also held a postdoctoral fellowship. He holds a B.S. in Computer Engineering (highest honors) from the University of Illinois at Urbana-Champaign. His research interests include machine learning, decision theory, game theory, robotics, and assistive technologies.