Fast Algorithms for Support Vector Machines
April 5, 2005
- Date: Tuesday, April 5, 2005
- Time: 11:00 a.m.
- Place: Woodward 149
Dr. Don Hush (dhush@lanl.gov)
Los Alamos National Laboratory
This talk provides a brief introduction to support vector machines (SVMs) for machine learning and then describes recent advances in fast algorithms for solving the quadratic programming (QP) problem that arises in SVM training. Since the primal QP problem can be prohibitively large, a smaller dual QP problem is solved and its solution mapped back to a primal solution. Decomposition algorithms are arguably the most common type of algorithm used to solve the dual in practice. We describe a new class of decomposition algorithms (introduced by Simon) that produce an approximate solution to the dual in O(n² ln(1/ε)) time, where n is the number of data samples and ε is the accuracy of the approximation. This is a substantial improvement over the previous best result of O(n⁵ ln(n)/ε) for an algorithm we developed in 2003. We also describe a new O(n ln(n)) algorithm for mapping an approximate dual solution to an approximate primal solution. Finally, we compare these new algorithms with algorithms currently used in practice.
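For readers unfamiliar with decomposition methods, the sketch below shows one two-variable (SMO-style) working-set update for the SVM dual QP: all dual variables are held fixed except a chosen pair, which is re-optimized in closed form subject to the box and equality constraints. This is a generic illustration under assumed names (smo_step, a precomputed kernel matrix K, labels y in {-1, +1}); it is not the algorithm of Simon analyzed in the talk.

```python
# A minimal sketch of one two-variable (SMO-style) decomposition step for the
# SVM dual QP. It illustrates the general decomposition idea in the abstract,
# not the specific algorithm discussed in the talk; the function name and the
# fixed working-set choice (i, j) are assumptions for this example.
import numpy as np

def smo_step(alpha, b, y, K, C, i, j):
    """Analytically re-optimize (alpha[i], alpha[j]) with all other alphas fixed."""
    if i == j:
        return alpha, b
    # Errors of the current dual iterate at the two chosen samples.
    Ei = (alpha * y) @ K[:, i] + b - y[i]
    Ej = (alpha * y) @ K[:, j] + b - y[j]
    # Box bounds that keep 0 <= alpha <= C and sum(alpha * y) = 0 feasible.
    if y[i] != y[j]:
        L, H = max(0.0, alpha[j] - alpha[i]), min(C, C + alpha[j] - alpha[i])
    else:
        L, H = max(0.0, alpha[i] + alpha[j] - C), min(C, alpha[i] + alpha[j])
    eta = K[i, i] + K[j, j] - 2.0 * K[i, j]  # curvature along the update direction
    if L >= H or eta <= 0.0:
        return alpha, b                      # degenerate working set: skip
    alpha = alpha.copy()
    ai_old, aj_old = alpha[i], alpha[j]
    alpha[j] = float(np.clip(aj_old + y[j] * (Ei - Ej) / eta, L, H))
    alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
    # Re-estimate the bias so the KKT conditions hold at the updated points.
    b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] - y[j] * (alpha[j] - aj_old) * K[i, j]
    b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] - y[j] * (alpha[j] - aj_old) * K[j, j]
    if 0.0 < alpha[i] < C:
        b = b1
    elif 0.0 < alpha[j] < C:
        b = b2
    else:
        b = 0.5 * (b1 + b2)
    return alpha, b
```

In the linear-kernel special case, an approximate dual solution α maps to a primal weight vector simply as w = Xᵀ(α ∘ y) with bias b; the O(n ln(n)) mapping mentioned in the abstract addresses this dual-to-primal conversion with accuracy guarantees in the general setting.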