
Karush-Kuhn-Tucker conditions and Lagrangian approach for improving machine learning techniques: A survey and new developments

Ferrara, Massimiliano
Conceptualization
2024-01-01

Abstract

In this work we propose new proofs of some classical results in nonlinear programming, in particular the Karush-Kuhn-Tucker conditions and Lagrangian methods and functions. The study examines some notable features of these well-known tools, connecting them with a technical analysis of the maximal margin classifier, which is designed for linearly separable data, that is, data that can be separated by a hyperplane. In this setting we point out the central role these mathematical tools play in obtaining robustness in machine learning procedures, analyzing several support vector machine (SVM) models as they are used in various contexts and applications (e.g., the soft margin SVM and the maximum margin SVM). This paper is the first study in an ongoing research project on machine learning modeling that we will develop in future work. We also examine the problem of estimating the bias in a decision-making process, and a new decision function algorithm is introduced.
2024
Nonlinear Programming; KKT Conditions and Lagrangian; Support Vector Machines; Margin Classifier
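As a minimal illustration of the margin classifiers the abstract refers to (not the authors' own algorithm), the soft-margin SVM objective can be minimized by subgradient descent on toy linearly separable data; all data, rates, and iteration counts below are illustrative assumptions. With a small regularization weight, the solution approaches the maximal margin separator, and the points with active margin constraints play the role of the support vectors that the KKT conditions single out.

```python
import numpy as np

# Two linearly separable clusters in the plane (illustrative toy data).
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],
              [3.0, 3.0], [3.0, 4.0], [4.0, 3.0]])
y = np.array([-1.0, -1.0, -1.0, 1.0, 1.0, 1.0])

# Subgradient descent on the soft-margin objective
#   (lam/2)*||w||^2 + (1/n)*sum_i max(0, 1 - y_i*(w.x_i + b)).
# For small lam on separable data this approaches the maximal margin
# separator; points whose margin constraint is active (margin < 1)
# correspond to the support vectors with nonzero KKT multipliers.
w, b = np.zeros(2), 0.0
lam, lr, n = 1e-3, 0.1, len(X)
for _ in range(2000):
    margins = y * (X @ w + b)
    active = margins < 1.0          # points violating the unit margin
    grad_w = lam * w
    grad_b = 0.0
    if active.any():
        grad_w = grad_w - (y[active, None] * X[active]).sum(axis=0) / n
        grad_b = -y[active].sum() / n
    w -= lr * grad_w
    b -= lr * grad_b

print("w =", w, "b =", b)
print("geometric margin width:", 2.0 / np.linalg.norm(w))
print("predicted labels:", np.sign(X @ w + b))
```

The decision function sign(w·x + b) separates the two clusters, and the geometric margin 2/||w|| is (approximately) maximized, which is exactly the quantity the hard-margin KKT system optimizes.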
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/20.500.12318/143146
Citations
  • PubMed Central: ND
  • Scopus: ND
  • Web of Science: 0