Robust and Distributionally Robust Optimization Models for Support Vector Machine

Faccini, Daniel; Maggioni, Francesca; Potra, Florian
2022

Abstract

In this paper we present novel data-driven optimization models for Support Vector Machines (SVM), with the aim of linearly separating two sets of points that have non-disjoint convex closures. Traditional classification algorithms assume that the training data points are always known exactly. However, real-life data are often subject to noise. To handle such uncertainty, we formulate robust models with uncertainty sets in the form of hyperrectangles or hyperellipsoids, and propose a moment-based distributionally robust optimization model enforcing limits on first-order deviations along principal directions. All the formulations reduce to convex programs. The efficiency of the new classifiers is evaluated on real-world databases. Experiments show that robust classifiers are especially beneficial for data sets with a small number of observations. As the dimension of the data sets increases, the behavior of the features is gradually learned and higher levels of out-of-sample accuracy can be achieved via the considered distributionally robust optimization method. Overall, relative to deterministic approaches, the proposed formulations allow a trade-off to be found between improving average accuracy and protecting against uncertainty.
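To illustrate the idea of the robust counterparts mentioned in the abstract, the following is a minimal sketch, not the paper's actual formulations (which are solved as convex programs). It relies on the standard result that a worst-case perturbation of each point within an ℓ∞ ball (hyperrectangle) of radius ρ tightens the margin requirement by ρ‖w‖₁, and within an ℓ₂ ball (hyperellipsoid) by ρ‖w‖₂; the resulting robust hinge loss is then minimized by plain subgradient descent. All function names and hyperparameters here are illustrative.

```python
import numpy as np

def robust_svm_subgradient(X, y, rho=0.1, uncertainty="box",
                           lam=1e-3, lr=0.05, epochs=2000):
    """Train a linear soft-margin SVM hardened against per-point
    perturbations of radius rho.

    uncertainty="box" (hyperrectangle, l-inf ball) tightens each margin
    by rho * ||w||_1; uncertainty="ellipsoid" (l-2 ball) by rho * ||w||_2.
    Labels y must be in {-1, +1}.
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for t in range(1, epochs + 1):
        if uncertainty == "box":
            pen, gpen = rho * np.abs(w).sum(), rho * np.sign(w)
        else:  # "ellipsoid"
            nw = np.linalg.norm(w)
            pen = rho * nw
            gpen = rho * w / nw if nw > 0 else np.zeros(d)
        # Robust hinge loss: mean_i max(0, 1 + pen - y_i (w . x_i + b))
        margins = y * (X @ w + b)
        active = margins < 1.0 + pen          # points violating the robust margin
        gw = lam * w                          # ridge term keeps the problem strongly convex
        gb = 0.0
        if active.any():
            gw = gw + (-(y[active, None] * X[active]).sum(axis=0)
                       + active.sum() * gpen) / n
            gb = -y[active].sum() / n
        step = lr / np.sqrt(t)                # diminishing step size
        w -= step * gw
        b -= step * gb
    return w, b

if __name__ == "__main__":
    # Two noisy Gaussian blobs; the robust classifier should separate them.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(2.0, 1.0, (60, 2)), rng.normal(-2.0, 1.0, (60, 2))])
    y = np.array([1] * 60 + [-1] * 60)
    w, b = robust_svm_subgradient(X, y, rho=0.1, uncertainty="box")
    print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```

In the paper the models are solved exactly as convex programs; the subgradient loop above merely shows how the ρ‖w‖₁ / ρ‖w‖₂ margin corrections enter the loss.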
Journal article
Faccini, Daniel; Maggioni, Francesca; Potra, Florian
(2022). Robust and Distributionally Robust Optimization Models for Support Vector Machine [journal article - articolo]. In COMPUTERS & OPERATIONS RESEARCH. Retrieved from http://hdl.handle.net/10446/226109
File(s) attached to this record:
Faccinietal_1-s2.0-S0305054822001861-main.pdf

Archive administrators only

Description: Postprint
Version: publisher's version
License: Aisberg default license
File size: 1.86 MB
Format: Adobe PDF
FacciniMaggioniPotraPreprint.pdf

Under embargo until 12/07/2025

Version: postprint - refereed/accepted version
License: Creative Commons
File size: 1.93 MB
Format: Adobe PDF

Aisberg ©2008 Library services, Università degli studi di Bergamo | Terms of use

Use this identifier to cite or link to this document: http://hdl.handle.net/10446/226109
Citations
  • Scopus: 0