Courses in winter term 2015 / Lecture Modern Optimization Techniques

Optimization techniques are at the heart of the solutions to many real-world problems. Numerous optimization techniques have been developed over the years, each suited to problems with particular characteristics. This course concentrates on recognizing common convex optimization problems that arise in real-world applications and their key characteristics, as well as on different approaches to solving them. Several optimization problems will be formally described and illustrated with examples. We will study approaches for unconstrained and equality-constrained optimization (stochastic gradient descent, Newton's method, and coordinate descent), interior-point methods for solving inequality-constrained problems, as well as extensions and improvements of classical optimization methods such as quasi-Newton, conjugate gradient, and cutting-plane methods. All of these methods will be illustrated with practical applications, mainly in the area of machine learning.
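As a flavour of the unconstrained methods listed above, the following is a minimal, illustrative sketch (not course material) of gradient descent applied to a least-squares problem; the data, function names, and step-size rule are assumptions chosen for the example.

    # Minimal sketch: gradient descent for min_w 0.5 * ||Xw - y||^2
    # (an unconstrained convex problem). All names here are illustrative.
    import numpy as np

    def gradient_descent(X, y, n_iters=500):
        """Minimize f(w) = 0.5 * ||Xw - y||^2 with fixed step 1/L,
        where L = ||X||_2^2 is the Lipschitz constant of the gradient."""
        L = np.linalg.norm(X, 2) ** 2      # largest singular value squared
        w = np.zeros(X.shape[1])
        for _ in range(n_iters):
            grad = X.T @ (X @ w - y)       # gradient of f at w
            w -= grad / L
        return w

    # Usage on synthetic data
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)
    print(gradient_descent(X, y))          # approximately recovers [1, -2, 0.5]

The lecture discusses, among other things, how such step-size choices and convergence guarantees are derived for convex objectives.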

Textbooks:

  1. Stephen Boyd and Lieven Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
  2. Suvrit Sra, Sebastian Nowozin and Stephen J. Wright. Optimization for Machine Learning. MIT Press, 2011.
  3. Igor Griva, Stephen G. Nash, and Ariela Sofer. Linear and Nonlinear Optimization. Society for Industrial and Applied Mathematics, 2009.

Lecturer: Lucas Drumond
Trainer: Lydia Voß
 
Lecture:
Time: Tue 10-12
Location: B 26
Begin: 20.10.2015
Assignment: MSc WI & IMIT
 
Tutorial:
Time: Fri 10-12
Location: B 26
Begin: 30.10.2015
 
Exam:
Date:
Location:
Begin:
Length:
 
More:
Moodle: Moodle
LSF: LSF
Module handbook (Modulhandbuch): MHB
Last Lecture: here