Conditional Gumbel-Softmax for constrained feature selection with application to node selection in wireless sensor networks

T Strypsteen, A Bertrand - arXiv preprint arXiv:2406.01162, 2024 - arxiv.org
In this paper, we introduce Conditional Gumbel-Softmax as a method to perform end-to-end
learning of the optimal feature subset for a given task and deep neural network (DNN) …
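
The snippet names the method but not its mechanics; one hedged reading of "conditional" sampling is that successive Gumbel-Softmax draws are conditioned on earlier ones so a constraint (here: distinct features) is respected. A minimal PyTorch sketch under that assumption; `select_k` and the masking scheme are illustrative, not the paper's code.

```python
import torch
import torch.nn.functional as F

def select_k(logits: torch.Tensor, k: int, tau: float = 1.0) -> torch.Tensor:
    """Draw k distinct relaxed one-hot selections from (n_features,) logits."""
    masked = logits.clone()
    rows = []
    for _ in range(k):
        # Straight-through Gumbel-Softmax: one-hot forward, soft gradients.
        w = F.gumbel_softmax(masked, tau=tau, hard=True)
        rows.append(w)
        # Condition the next draw on this one: forbid re-selecting the feature.
        masked = masked.masked_fill(w.bool(), float('-inf'))
    return torch.stack(rows)  # (k, n_features)
```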

A distributed neural network architecture for dynamic sensor selection with application to bandwidth-constrained body-sensor networks

T Strypsteen, A Bertrand - arXiv preprint arXiv:2308.08379, 2023 - arxiv.org
We propose a dynamic sensor selection approach for deep neural networks (DNNs), which
is able to derive an optimal sensor subset selection for each specific input sample instead of …
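
Dynamic (per-sample) selection suggests an input-dependent gate rather than a fixed subset. Below is a minimal sketch assuming each sensor gets an independent keep/drop decision from a cheap per-sample summary; `DynamicSensorGate` and the bandwidth remark are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicSensorGate(nn.Module):
    """Per-sample binary keep/drop gates over sensors (illustrative)."""
    def __init__(self, n_sensors: int, feat_dim: int, tau: float = 1.0):
        super().__init__()
        # Two logits (keep, drop) per sensor from a cheap input summary.
        self.scorer = nn.Linear(feat_dim, n_sensors * 2)
        self.tau = tau

    def forward(self, summary: torch.Tensor, x: torch.Tensor):
        # summary: (batch, feat_dim); x: (batch, n_sensors, d)
        logits = self.scorer(summary).view(-1, x.shape[1], 2)
        gates = F.gumbel_softmax(logits, tau=self.tau, hard=True)[..., 0]
        # gates: (batch, n_sensors) in {0, 1}; penalizing gates.mean() in
        # the loss is one way to respect a bandwidth budget.
        return x * gates.unsqueeze(-1), gates
```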

End-to-end learnable EEG channel selection for deep neural networks with Gumbel-softmax

T Strypsteen, A Bertrand - Journal of Neural Engineering, 2021 - iopscience.iop.org
Objective. To develop an efficient, embedded electroencephalogram (EEG) channel
selection approach for deep neural networks, allowing us to match the channel selection to …
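
The selection layer this title refers to can be sketched as K learnable logit vectors over the N channels, sampled with the Gumbel-Softmax relaxation during training and hardened to one-hot picks at inference. A minimal PyTorch sketch with illustrative names and shapes:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GumbelChannelSelector(nn.Module):
    """Selects k of n input channels via relaxed one-hot weights."""
    def __init__(self, n_channels: int, k_selected: int, tau: float = 1.0):
        super().__init__()
        # One logit vector over all channels per selected "slot".
        self.logits = nn.Parameter(torch.zeros(k_selected, n_channels))
        self.tau = tau

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, time)
        if self.training:
            # Relaxed one-hot rows; annealing tau toward 0 approaches hard picks.
            w = F.gumbel_softmax(self.logits, tau=self.tau, hard=False)
        else:
            w = F.one_hot(self.logits.argmax(-1), self.logits.shape[-1]).float()
        return torch.einsum('kn,bnt->bkt', w, x)  # (batch, k_selected, time)
```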

Group-Feature (Sensor) Selection With Controlled Redundancy Using Neural Networks

A Saha, NR Pal - arXiv preprint arXiv:2310.20524, 2023 - arxiv.org
In this paper, we present a novel embedded feature selection method based on a Multi-layer
Perceptron (MLP) network and generalize it for group-feature or sensor selection problems …
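
The snippet indicates an embedded method where whole sensors (groups of features) are selected through the MLP itself. One standard way to realize this, sketched below, is a group-lasso penalty on the first-layer weights; whether the paper uses exactly this form, and how its redundancy control enters, is not visible from the snippet.

```python
import torch

def group_lasso_penalty(w: torch.Tensor, groups: list[list[int]]) -> torch.Tensor:
    """Sum of L2 norms of first-layer weight columns, one norm per sensor.

    w: (hidden, n_inputs); groups[i] lists the input indices of sensor i.
    Driving a whole group's norm to zero deselects that sensor; the penalty
    is added to the task loss with a tunable weight.
    """
    return sum(w[:, g].norm(p=2) for g in groups)
```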

Accelerating Hessian-free optimization for deep neural networks by implicit preconditioning and sampling

TN Sainath, L Horesh, B Kingsbury… - … IEEE Workshop on …, 2013 - ieeexplore.ieee.org
Hessian-free training has become a popular parallel second order optimization technique
for Deep Neural Network training. This study aims at speeding up Hessian-free training, both …
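
Hessian-free training never materializes the Hessian; it relies on fast Hessian-vector products inside a conjugate-gradient inner solver. A minimal sketch of the standard double-backprop product (Pearlmutter's trick) in PyTorch, independent of the paper's specific preconditioning and sampling contributions:

```python
import torch

def hessian_vector_product(loss, params, vec):
    """Compute H @ vec by double backprop, without forming the Hessian."""
    grads = torch.autograd.grad(loss, params, create_graph=True)
    dot = sum((g * v).sum() for g, v in zip(grads, vec))
    return torch.autograd.grad(dot, params)
```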

Supervised feature selection through deep neural networks with pairwise connected structure

Y Huang, W Jin, Z Yu, B Li - Knowledge-Based Systems, 2020 - Elsevier
Feature selection is an important data preprocessing strategy that has been shown empirically
to reduce feature dimensionality and enhance the performance …
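
One plausible reading of the "pairwise connected structure" is a one-to-one input layer with a single learnable weight per feature, whose magnitudes act as importance scores; this is an assumption from the title, not the paper's verified architecture.

```python
import torch
import torch.nn as nn

class PairwiseConnectedLayer(nn.Module):
    """One learnable weight per input feature (a diagonal input layer)."""
    def __init__(self, n_features: int):
        super().__init__()
        self.w = nn.Parameter(torch.ones(n_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.w        # x: (batch, n_features), element-wise

    def importance(self) -> torch.Tensor:
        return self.w.abs()      # rank features by learned |w|
```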

Neural Capacitance: A New Perspective of Neural Network Selection via Edge Dynamics

C Jiang, T Pedapati, PY Chen, Y Sun, J Gao - arXiv preprint arXiv …, 2022 - arxiv.org
Efficient model selection for identifying a pre-trained neural network suitable for a
downstream task is a fundamental yet challenging problem in deep learning. Current practice …

Semi-supervised feature selection with soft label learning

C Zhang, L Zhu, D Shi, J Zheng… - IEEE/CAA Journal of …, 2022 - ieeexplore.ieee.org
With the rapid increase of high-dimensional data containing both labelled and unlabelled
samples, semi-supervised feature selection has received much attention in …

Disentangling neural architectures and weights: A case study in supervised classification

N Colombo, Y Gao - arXiv preprint arXiv:2009.05346, 2020 - arxiv.org
The history of deep learning has shown that human-designed problem-specific networks
can greatly improve the classification performance of general neural models. In most …