Browsing by Author "Pirouz, Behzad"

    Feature Selection in Classification by means of Optimization and Multi-Objective Optimization
    (Università della Calabria, 2023-05-10) Pirouz, Behzad; Fortino, Giancarlo; Gaudioso, Manlio
The thesis is in the area of mathematical optimization with application to Machine Learning. The focus is on Feature Selection (FS) in the framework of binary classification via the Support Vector Machine (SVM) paradigm. We concentrate on the use of sparse optimization techniques, which are widely considered the tool of choice for tackling FS, and we study the problem in terms of both single- and multi-objective optimization. We first propose a novel Mixed-Integer Nonlinear Programming (MINLP) model for sparse optimization based on the polyhedral k-norm, and we introduce a new way to take the k-norm into account by setting up a model based on Fractional Programming (FP). We then address the continuous relaxation of the problem, which is reformulated via a DC (Difference of Convex) decomposition.

On the other hand, designing supervised learning systems is, in general, a multi-objective problem: it requires finding suitable trade-offs between several objectives, for example between the classification error on the training data (minimizing the squared error) and the sparsity of the separating hyperplane (minimizing the number of nonzero elements of its normal vector). A multi-objective optimization problem does not, in general, admit a single solution that is best for all objectives simultaneously; instead of a single solution there is a set of solutions, known as the Pareto-optimal solutions. We overview SVM models and the related Feature Selection in terms of multi-objective optimization. Our multi-objective approach considers two simultaneous objectives: minimizing the squared error and minimizing the number of nonzero elements of the normal vector of the separating hyperplane.

In this thesis we propose a multi-objective model for sparse optimization. Our primary purpose is to demonstrate the advantages of treating SVM models as multi-objective optimization problems: in the multi-objective case we obtain a set of Pareto-optimal solutions rather than the single solution of the single-objective case. The main contribution of the thesis is therefore twofold: first, we propose a new model for sparse optimization based on the polyhedral k-norm for SVM classification; second, we recast this new model in a multi-objective setting. The results of several numerical experiments on classification datasets are reported; all datasets were used with both the single-objective and the multi-objective models.
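A note on the k-norm device mentioned in the abstract: the sparsity requirement ||w||_0 <= k is typically expressed through the identity ||w||_1 - ||w||_[k] = 0, where ||w||_[k] denotes the sum of the k largest absolute components of w; both terms are convex polyhedral norms, which is what makes a DC (Difference of Convex) reformulation possible. To illustrate the bi-objective trade-off described above, the sketch below traces an approximate Pareto front between training error and the number of selected features. It is a minimal example assuming scikit-learn and synthetic data, and it uses an off-the-shelf L1-regularized linear SVM as a simple sparsity surrogate; it is not the polyhedral k-norm MINLP or the DC-based method developed in the thesis.

# Sketch: approximate Pareto front between training error and the number of
# nonzero weights of a linear SVM, using L1 regularization as a simple sparsity
# surrogate (not the k-norm MINLP / DC approach of the thesis).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Synthetic binary classification data: 30 features, only 5 of them informative.
X, y = make_classification(n_samples=200, n_features=30, n_informative=5,
                           random_state=0)

candidates = []
for C in np.logspace(-2, 2, 20):               # sweep the accuracy/sparsity trade-off
    clf = LinearSVC(penalty="l1", loss="squared_hinge", dual=False,
                    C=C, max_iter=10000).fit(X, y)
    error = 1.0 - clf.score(X, y)              # objective 1: training error
    nnz = int(np.count_nonzero(clf.coef_))     # objective 2: selected features
    candidates.append((error, nnz))

# Keep the non-dominated (Pareto-optimal) pairs: no other pair is at least as good
# in both objectives and strictly better in one.
pareto = sorted({p for p in candidates
                 if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                            for q in candidates)})
print(pareto)

In the thesis itself the sparsity objective is handled directly, through the polyhedral k-norm and its DC decomposition, rather than through the L1 surrogate swept here; the sketch only shows what a set of Pareto-optimal (error, sparsity) pairs looks like.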
