Improved mayfly optimization deep stacked sparse auto encoder feature selection scorched gradient descent driven dropout XLM learning framework for software …

M Anbu
Concurrency and Computation: Practice and Experience, 2022, Wiley Online Library
Summary
Software testing is the process of improving software quality by classifying and removing defects during software development. Previously, several methods were used for software defect prediction, but no single method provided sufficient accuracy. To overcome this issue, an improved mayfly optimization with deep stacked sparse auto encoder feature selection scorched gradient descent driven dropout extreme learning machine framework (SDP‐IMFOFS‐GDDDXLMC) is proposed in this article for software defect prediction. Here, the IMFO method is used for feature selection. In the feature selection technique, attributes are selected from datasets such as PC1, PC4, and MC1 to obtain a probable minimal attribute set. Then, the GDDDXLMC approach is used to classify software as buggy or clean. The proposed approach is implemented on the MATLAB platform. Performance metrics such as accuracy, precision, F‐measure, sensitivity, specificity, and Matthews correlation coefficient (MCC) are examined to evaluate the proposed method. In simulation, the proposed SDP‐IMFOFS‐GDDDXLMC method attains higher accuracy of 99.75%, 97.85%, 95.13%, 14.89%, 16.34%, 17.89%, and 98.79%; higher sensitivity of 96.34%, 91.23%, 89.12%, 12.67%, 17.56%, 18.90%, and 87.25%; and higher specificity of 14.89%, 16.89%, 20.67%, 93.67%, 92.37%, 98.47%, and 94.78% compared with existing methods such as SDP‐MLP‐PSO, SDP‐BPNN‐RBFNN, SDP‐CNN‐RNN‐LSTM, SDP‐KBN‐LIME, SDP‐SLDeep‐LSTM, SDP‐K‐PCA‐ELM, and SDP‐CNN‐PHI forest, respectively.
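For orientation only, the sketch below shows a plain extreme learning machine binary classifier together with the confusion-matrix metrics named in the abstract (accuracy, precision, F‐measure, sensitivity, specificity, MCC), written in Python. It is not the authors' method or their MATLAB code: the IMFO feature selection, stacked sparse autoencoder, dropout, and gradient descent components of SDP‐IMFOFS‐GDDDXLMC are not reproduced, and every function name, parameter, and data value here is an illustrative assumption.

```python
# Minimal sketch (assumed names/parameters, synthetic data): a basic ELM
# classifier plus the defect-prediction metrics listed in the abstract.
# The paper's GDDDXLMC dropout/gradient-descent refinements are NOT included.
import numpy as np


def train_elm(X, y, n_hidden=64, seed=0):
    """Fit a basic ELM: random hidden layer, least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # sigmoid hidden activations
    beta = np.linalg.pinv(H) @ y                  # closed-form output weights
    return W, b, beta


def predict_elm(X, W, b, beta, threshold=0.5):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return (H @ beta >= threshold).astype(int)    # 1 = buggy, 0 = clean


def defect_metrics(y_true, y_pred):
    """Accuracy, precision, F-measure, sensitivity, specificity, MCC."""
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    precision = tp / max(tp + fp, 1)
    sensitivity = tp / max(tp + fn, 1)            # recall / true positive rate
    specificity = tn / max(tn + fp, 1)            # true negative rate
    f_measure = 2 * precision * sensitivity / max(precision + sensitivity, 1e-12)
    mcc_den = np.sqrt(float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))) or 1.0
    return {
        "accuracy": (tp + tn) / len(y_true),
        "precision": precision,
        "f_measure": f_measure,
        "sensitivity": sensitivity,
        "specificity": specificity,
        "mcc": (tp * tn - fp * fn) / mcc_den,
    }


if __name__ == "__main__":
    # Synthetic stand-in for a reduced attribute set (e.g., the kind of
    # PC1/PC4/MC1-style metrics the paper feeds its classifier); illustrative only.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 10))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
    W, b, beta = train_elm(X[:200], y[:200])
    print(defect_metrics(y[200:], predict_elm(X[200:], W, b, beta)))
```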