A fractal belief KL divergence for decision fusion

J Zeng, F Xiao - Engineering Applications of Artificial Intelligence, 2023 - Elsevier
Abstract
Dempster–Shafer (D–S) evidence theory is useful in the realm of multi-source data fusion. However, counterintuitive results may be obtained when the basic probability assignments (BPAs) are highly conflicting. To overcome this flaw, this paper proposes a symmetric fractal-based belief Kullback–Leibler divergence (FBDSKL). It measures the divergence between BPAs and, in numerical examples, quantifies the conflict between two BPAs more effectively than existing belief divergence methods. Furthermore, the proposed FBDSKL is proved to possess desirable properties, including nonnegativity, nondegeneracy, and symmetry. To apply the FBDSKL divergence measure to practical problems, a novel FBDSKL-based multi-source data fusion (FBDSKL-MSDF) algorithm is designed. Through comparisons with well-known related methods, the proposed FBDSKL-MSDF algorithm is validated to be superior and more robust. Finally, the proposed FBDSKL-MSDF is applied to two real-world classification problems to verify its high practicability.
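The abstract does not specify the fractal construction behind FBDSKL, but the three claimed properties (nonnegativity, nondegeneracy, symmetry) can be illustrated with the plain symmetrized Kullback–Leibler divergence over discrete distributions. The sketch below is a simplified stand-in, not the authors' FBDSKL: the BPAs are flattened to probability vectors, and `sym_kl` is a hypothetical helper name.

```python
import math

def kl(p, q):
    # Kullback–Leibler divergence D(p || q) for discrete distributions;
    # terms with p_i = 0 contribute 0 by convention.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def sym_kl(p, q):
    # Symmetrized KL: (D(p||q) + D(q||p)) / 2. This is nonnegative,
    # symmetric in its arguments, and zero iff p == q (nondegeneracy),
    # mirroring the properties the abstract claims for FBDSKL.
    return 0.5 * (kl(p, q) + kl(q, p))

# Two toy probability vectors standing in for highly conflicting BPAs.
p = [0.7, 0.2, 0.1]
q = [0.1, 0.2, 0.7]

print(sym_kl(p, q) > 0.0)        # nonnegativity (and strict for p != q)
print(sym_kl(p, q) == sym_kl(q, p))  # symmetry
print(sym_kl(p, p) == 0.0)       # nondegeneracy: zero only on identical inputs
```

Unlike this simplified measure, the paper's FBDSKL additionally operates on fractal-transformed belief functions, which the abstract does not detail.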