Sculpting Neural Networks: A Deep Dive into Dimension Reduction Strategies
Conference
65th ISI World Statistics Congress 2025
Format: CPS Abstract - WSC 2025
Keywords: dimension-reduction, machine learning, neural networks, pca
Session: CPS 11 - Dimension Reduction and Clustering Techniques for High-Dimensional Data
Wednesday 8 October 4 p.m. - 5 p.m. (Europe/Amsterdam)
Abstract
This research introduces the Siamese Fraternal Neural Network (SFNN), an approach that combines a Siamese Neural Network (SNN) with Principal Component Analysis (PCA). The SFNN framework is designed to harness the strengths of both techniques, providing a robust solution for complex predictive tasks: it learns similarities between data pairs via the SNN while applying PCA for dimensionality reduction within the hidden layers, thereby improving computational efficiency and lowering computational cost. The study assesses SFNN's predictive capabilities on a variety of structured and unstructured datasets. By combining the complementary strengths of SNN and PCA, SFNN addresses key challenges associated with high-dimensional data and offers a versatile tool for a wide range of applications. The outcomes of this research are expected to provide valuable insights and practical solutions, fostering further advances in machine learning.
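To make the described architecture concrete, the sketch below illustrates one plausible reading of the abstract: a shared-weight Siamese encoder trained with a contrastive loss on data pairs, with PCA applied to the hidden-layer embeddings to reduce their dimensionality. This is not the authors' implementation; the layer sizes, the contrastive loss, the PCA component count, and the synthetic data are illustrative assumptions.

```python
# Minimal illustrative sketch (assumptions noted above, not the SFNN of the paper):
# a Siamese encoder whose hidden representations are compressed with PCA
# before pairwise similarity is used downstream.
import numpy as np
import torch
import torch.nn as nn
from sklearn.decomposition import PCA

class Encoder(nn.Module):
    """Shared-weight branch of the Siamese network (sizes are assumptions)."""
    def __init__(self, in_dim=64, hidden_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

def contrastive_loss(d, y, margin=1.0):
    # y = 1 for similar pairs, 0 for dissimilar pairs; d = Euclidean distance.
    return (y * d.pow(2) + (1 - y) * torch.clamp(margin - d, min=0).pow(2)).mean()

# Synthetic paired data: two inputs per pair plus a similarity label.
rng = np.random.default_rng(0)
x1 = torch.tensor(rng.normal(size=(256, 64)), dtype=torch.float32)
x2 = torch.tensor(rng.normal(size=(256, 64)), dtype=torch.float32)
y = torch.tensor(rng.integers(0, 2, size=256), dtype=torch.float32)

encoder = Encoder()
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)

for epoch in range(5):
    h1, h2 = encoder(x1), encoder(x2)   # hidden representations from shared weights
    d = torch.norm(h1 - h2, dim=1)      # pairwise distance between the two branches
    loss = contrastive_loss(d, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# PCA compresses the hidden-layer embeddings so that similarity computations
# operate in a lower-dimensional space, reducing downstream computational cost.
with torch.no_grad():
    h_all = encoder(torch.cat([x1, x2])).numpy()
pca = PCA(n_components=8).fit(h_all)
z1 = pca.transform(encoder(x1).detach().numpy())
z2 = pca.transform(encoder(x2).detach().numpy())
print("reduced embedding shape:", z1.shape)   # (256, 8)
```

In this reading, the SNN supplies pair-similarity learning and PCA supplies the dimensionality reduction of the learned hidden representations; how tightly the two are interleaved during training is left open by the abstract.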