Towards explainable deep neural networks (xDNN)

P Angelov, E Soares - Neural Networks, 2020 - Elsevier
In this paper, we propose an elegant solution that directly addresses the bottlenecks of
traditional deep learning approaches and offers an explainable internal architecture that …

The coming of age of interpretable and explainable machine learning models

PJG Lisboa, S Saralajew, A Vellido… - Neurocomputing, 2023 - Elsevier
Machine-learning-based systems are now part of a wide array of real-world
applications seamlessly embedded in the social realm. In the wake of this realization, strict …

Can learning vector quantization be an alternative to SVM and deep learning? Recent trends and advanced variants of learning vector quantization for classification …

T Villmann, A Bohnsack, M Kaden - Journal of Artificial Intelligence and …, 2017 - sciendo.com
Learning vector quantization (LVQ) is one of the most powerful approaches for prototype-based
classification of vector data, intuitively introduced by Kohonen. The prototype …

A precise and stable machine learning algorithm: eigenvalue classification (EigenClass)

U Erkan - Neural Computing and Applications, 2021 - Springer
In this study, a precise and efficient eigenvalue-based machine learning algorithm,
denoted the Eigenvalue Classification (EigenClass) algorithm, has been …

Graph kernel neural networks

L Cosmo, G Minello, A Bicciato… - … on Neural Networks …, 2024 - ieeexplore.ieee.org
The convolution operator at the core of many modern neural architectures can effectively be
seen as performing a dot product between an input matrix and a filter. While this is readily …

Alignment-free sequence comparison: A systematic survey from a machine learning perspective

KS Bohnsack, M Kaden, J Abel… - IEEE/ACM Transactions …, 2022 - ieeexplore.ieee.org
The availability of large amounts of biological sequence data generated during the last
decades, together with algorithmic and hardware improvements, has offered the possibility to …

[PDF][PDF] Efficient Representation of Biochemical Structures for Supervised and Unsupervised Machine Learning Models Using Multi-Sensoric Embeddings.

KS Bohnsack, A Engelsberger, M Kaden… - …, 2023 - scitepress.org
We present an approach to efficiently embed complex data objects from the chem- and
bioinformatics domain, like graph structures, into Euclidean vector spaces such that those …

Variants of dropconnect in learning vector quantization networks for evaluation of classification stability

J Ravichandran, M Kaden, S Saralajew, T Villmann - Neurocomputing, 2020 - Elsevier
Dropout and DropConnect are useful methods to prevent multilayer neural networks from
overfitting. In addition, it turns out that these tools can also be used to estimate the stability of …

Prototype-based neural network layers: incorporating vector quantization

S Saralajew, L Holdijk, M Rees, T Villmann - arXiv preprint arXiv …, 2018 - arxiv.org
Neural networks currently dominate the machine learning community and they do so for
good reasons. Their accuracy on complex tasks such as image classification is unrivaled at …

Fusion of deep learning architectures, multilayer feedforward networks and learning vector quantizers for deep classification learning

T Villmann, M Biehl, A Villmann… - 2017 12th international …, 2017 - ieeexplore.ieee.org
The advantages of prototype-based learning vector quantizers are the intuitive and simple
model adaptation as well as the easy interpretability of the prototypes as class …