
Keywords

Learning rate
Decision tree
Machine learning

Abstract

Software applications have spread through our daily lives at an unprecedented pace and now control some of the most sensitive and critical operations within institutions, including automated systems such as traffic control, aviation, and self-driving cars, among many others. Identifying software defects in these systems remains a challenge for most software-producing companies. To develop high-quality and reliable software, companies have turned to defect prediction with machine learning, relying on historical project datasets. This study addresses software defect prediction using machine learning, specifically classification techniques. One of the classification techniques employed is eXtreme Gradient Boosting (XGBoost), a method suited to regression and classification analysis and based on gradient boosting decision trees (GBoost). XGBoost exposes several hyperparameters that can be fine-tuned to enhance the model's performance. The hyperparameter tuning method employed is grid search, validated thereafter with 10-fold cross-validation. The hyperparameters configured for XGBoost are n_estimators, max_depth, subsample, gamma, colsample_bylevel, min_child_weight, and learning_rate. The results of this study demonstrate that hyperparameter tuning improves the performance of the XGBoost algorithm in accurately classifying software defects.
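As a rough illustration of the tuning procedure described in the abstract, the sketch below pairs scikit-learn's GridSearchCV (with cv=10) with an XGBClassifier over the hyperparameters named above. The grid values, the synthetic make_classification data, and the accuracy scoring metric are assumptions for illustration only; the abstract does not specify the study's actual datasets or search ranges.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from xgboost import XGBClassifier

# Placeholder data; the study itself relies on historical project datasets.
X, y = make_classification(
    n_samples=1000, n_features=20, weights=[0.8, 0.2], random_state=42
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=42
)

# Illustrative search space over the hyperparameters listed in the abstract;
# the actual ranges used in the study are not given there.
param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [3, 6],
    "subsample": [0.8, 1.0],
    "gamma": [0, 1],
    "colsample_bylevel": [0.8, 1.0],
    "min_child_weight": [1, 5],
    "learning_rate": [0.05, 0.1],
}

# Grid search validated with 10-fold cross-validation, as in the study.
search = GridSearchCV(
    estimator=XGBClassifier(eval_metric="logloss"),
    param_grid=param_grid,
    scoring="accuracy",
    cv=10,
    n_jobs=-1,
)
search.fit(X_train, y_train)

print("Best hyperparameters:", search.best_params_)
print("Held-out accuracy:", search.best_estimator_.score(X_test, y_test))
```

In this setup the best hyperparameter combination found by the cross-validated grid search is refit on the full training split before being evaluated on the held-out data.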
https://doi.org/10.33899/csmj.2023.142739.1081