 
Courses in Winter Semester 2016 / Lecture Modern Optimization Techniques

Optimization techniques are at the heart of the solution to many real-world problems. Numerous optimization techniques have been developed over the years, and each finds its application according to the characteristics of the problem. This course concentrates on recognizing common convex optimization problems that arise in real-world applications and their key characteristics, as well as on different approaches to solving them. Several optimization problems will be formally described and illustrated with examples. We will study approaches for unconstrained and equality-constrained optimization (stochastic gradient descent, Newton's method, and coordinate descent), interior-point methods for solving inequality-constrained problems, as well as extensions and improvements of classical optimization methods such as quasi-Newton, conjugate gradient, and cutting-plane methods. All of these methods will be illustrated with practical applications, mainly in the area of machine learning.
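As a first taste of the methods listed above, the following is a minimal gradient descent sketch on a convex quadratic. It is not part of the official course material; the objective, step size, and stopping rule are illustrative assumptions.

```python
import numpy as np

def gradient_descent(grad, x0, step_size=0.1, max_iter=1000, tol=1e-8):
    """Minimize a differentiable convex function, given its gradient."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop once the gradient is (nearly) zero
            break
        x = x - step_size * g         # step against the gradient direction
    return x

# Illustrative example: minimize f(x) = 1/2 x^T A x - b^T x (a convex quadratic),
# whose minimizer solves the linear system A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x_star = gradient_descent(lambda x: A @ x - b, x0=np.zeros(2))
print(x_star)
```

For a fixed step size, convergence on this example requires the step to be smaller than 2 divided by the largest eigenvalue of A; the course treats such step-size conditions and line-search alternatives in detail.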

Textbooks:

  1. Stephen Boyd and Lieven Vandenberghe. Convex Optimization. Cambridge Univ Press, 2004.
  2. Suvrit Sra, Sebastian Nowozin and Stephen J. Wright. Optimization for Machine Learning. MIT Press, 2011.
  3. Igor Griva. Linear and Nonlinear Optimization. Society for Industrial and Applied Mathematics, 2009.

Lecturer: Prof. Dr. Dr. Lars Schmidt-Thieme
Tutorial instructor: Lydia Voß
 
Lecture:
Time: Tue 10-12
Room: A 102
Start: 18.10.2016
Programs: MSc WI & IMIT
 
Tutorial:
Time: Fri 12-14
Room: A 102
Start: 21.10.2016
 
Exam:
Date:
Room:
Start:
Duration:
 
More:
Moodle: Moodle
LSF: LSF
Module Handbook: MHB
Previous iteration: here