[HTML] Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation

E Oostwal, M Straat, M Biehl - Physica A: Statistical Mechanics and its …, 2021 - Elsevier
By applying concepts from the statistical physics of learning, we study layered neural
networks of rectified linear units (ReLU). The comparison with conventional, sigmoidal …
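
For orientation, the two activation functions compared in this work have the standard textbook definitions below (stated here as background, not quoted from the paper):

    \operatorname{ReLU}(x) = \max(0, x), \qquad \sigma(x) = \frac{1}{1 + e^{-x}}.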

[HTML] FDG-PET combined with learning vector quantization allows classification of neurodegenerative diseases and reveals the trajectory of idiopathic REM sleep …

R van Veen, SK Meles, RJ Renken, FE Reesink… - Computer Methods and …, 2022 - Elsevier
Background and Objectives: 18F-fluorodeoxyglucose (FDG) positron emission tomography
(PET) combined with principal component analysis (PCA) has been applied to identify …

sklvq: Scikit learning vector quantization

R Van Veen, M Biehl, GJ De Vries - Journal of Machine Learning Research, 2021 - jmlr.org
The sklvq package is an open-source Python implementation of a set of learning vector
quantization (LVQ) algorithms. In addition to providing the core functionality for the GLVQ …
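
A minimal usage sketch for sklvq, assuming the package exposes a scikit-learn-style GLVQ estimator at the top level with fit/predict methods; the import path and constructor defaults are assumptions to be checked against the package documentation:

    # GLVQ classification sketch; sklvq is assumed to follow the scikit-learn API.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklvq import GLVQ  # assumed top-level import

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = GLVQ()                      # default settings (assumed: one prototype per class)
    model.fit(X_train, y_train)
    accuracy = np.mean(model.predict(X_test) == y_test)
    print(f"test accuracy: {accuracy:.2f}")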

Quantum-inspired learning vector quantizers for prototype-based classification

T Villmann, A Engelsberger, J Ravichandran… - Neural Computing and …, 2022 - Springer
Prototype-based models like the Generalized Learning Vector Quantization (GLVQ) belong
to the class of interpretable classifiers. Moreover, quantum-inspired methods get more and …

Convex and concave envelopes of artificial neural network activation functions for deterministic global optimization

ME Wilhelm, C Wang, MD Stuber - Journal of Global Optimization, 2023 - Springer
In this work, we present general methods to construct convex/concave relaxations of the
activation functions that are commonly chosen for artificial neural networks (ANNs). The …
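
As a standard worked instance of such relaxations (given here for orientation, not quoted from the paper): ReLU is convex, so on a box domain [x_L, x_U] with x_L < 0 < x_U its convex envelope is the function itself, while its concave envelope is the secant line through the interval endpoints:

    g^{cv}(x) = \max(0, x), \qquad
    g^{cc}(x) = \frac{x_U\,(x - x_L)}{x_U - x_L}, \qquad x \in [x_L, x_U].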

Variants of DropConnect in learning vector quantization networks for evaluation of classification stability

J Ravichandran, M Kaden, S Saralajew, T Villmann - Neurocomputing, 2020 - Elsevier
Dropout and DropConnect are useful methods to prevent multilayer neural networks from
overfitting. In addition, it turns out that these tools can also be used to estimate the stability of …
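
A generic illustration of the DropConnect idea referenced here, i.e. randomly masking individual weights rather than whole units; this is a plain feed-forward sketch, not the LVQ-specific variants studied in the paper:

    # DropConnect on a single linear layer: zero each weight independently with
    # probability drop_prob and rescale so the expected pre-activation is unchanged.
    import numpy as np

    rng = np.random.default_rng(0)

    def dropconnect_forward(x, W, drop_prob=0.5):
        mask = rng.random(W.shape) >= drop_prob   # keep each weight with prob 1 - drop_prob
        return x @ (W * mask) / (1.0 - drop_prob)

    x = rng.normal(size=(4, 8))   # batch of 4 inputs with 8 features
    W = rng.normal(size=(8, 3))   # weights of a linear layer with 3 outputs
    print(dropconnect_forward(x, W).shape)  # -> (4, 3)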

[BOOK][B] The Shallow and the Deep: A biased introduction to neural networks and old school machine learning

M Biehl - 2023 - research.rug.nl
The Shallow and the Deep is a collection of lecture notes that offers an accessible
introduction to neural networks and machine learning in general. However, it was clear from …

[PDF] Quantum-Inspired Learning Vector Quantization for Classification Learning.

T Villmann, J Ravichandran, A Engelsberger… - ESANN, 2020 - esann.org
This paper introduces a variant of the prototype-based generalized learning vector
quantization algorithm (GLVQ) for classification learning, which is inspired by quantum …

Learning vector quantization with applications in neuroimaging and biomedicine

R van Veen - 2022 - research.rug.nl
PhD thesis, University of Groningen: Learning Vector Quantization with Applications in Neuroimaging and Biomedicine, by Rick van Veen.

New Prototype Concepts in Classification Learning

S Saralajew - 2020 - pub.uni-bielefeld.de
Machine learning algorithms are becoming more and more important in everyday
life. Applications in search engines, driver assistance systems, consumer electronics, and so …