Main Subjects: Artificial Intelligence

Maintainability Prediction for Object-Oriented Software Systems Based on Intelligent Techniques: Literature Review

Anfal Abd Fadhil; Taghreed Riyadh Alreffaee

AL-Rafidain Journal of Computer Sciences and Mathematics, 2020, Volume 14, Issue 2, Pages 97-111
DOI: 10.33899/csmj.2020.167342

The maintainability of software is one of the most substantial aspects in assessing software product quality. It is defined as the ease with which existing software can be changed. In the literature, a great number of models have been suggested to predict and measure maintainability during the various stages of the Software Development Life Cycle, yet only a few attempts have been made to conduct a comparative study of these prediction models. This study outlines the basics of how maintainability is measured in object-oriented (OO) design, noting that maintainability is measured differently at each level. We also concentrate on the artificial intelligence techniques used in these studies.
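As a minimal illustration of metric-based prediction (not any specific model from the surveyed literature), the sketch below fits a least-squares line from one hypothetical OO metric, WMC (weighted methods per class), to an invented count of maintenance changes; all numbers are made up.

```python
# Hypothetical illustration: predict maintenance-change count from the WMC metric
# with ordinary least squares. The data points are invented for the sketch.
wmc = [5.0, 10.0, 15.0, 20.0]
changes = [2.0, 4.0, 6.0, 8.0]

n = len(wmc)
mx = sum(wmc) / n
my = sum(changes) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(wmc, changes))
         / sum((x - mx) ** 2 for x in wmc))
intercept = my - slope * mx

def predict(x):
    # Predicted maintenance effort for a class with the given WMC value.
    return slope * x + intercept

print(predict(12.0))
```

Real prediction models in the literature use many metrics and far richer learners; this only shows the shape of the metric-to-maintainability mapping.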

A Comparative Study of Methods for Separating Audio Signals

Riham J. Issa; Yusra Mohammad

AL-Rafidain Journal of Computer Sciences and Mathematics, 2020, Volume 14, Issue 2, Pages 39-49
DOI: 10.33899/csmj.2020.167345

Separating a signal from a mixture of signals is an essential task in many applications, including sound signal processing, speech processing systems, and medical signal processing. This paper presents a review of the sound source separation problem and of the methods used to extract features from audio signals. We also define the blind source separation (BSS) problem and compare several of the methods used to solve it.
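The core idea behind many blind source separation methods, maximizing the non-Gaussianity of the un-mixed output as in ICA, can be sketched in a few lines. The sources, mixing angle, and grid search below are illustrative assumptions, not the specific methods compared in the paper.

```python
import math

# Two independent toy sources: a tone and a noise-like sawtooth scramble.
N = 2000
s1 = [math.sin(0.05 * t) for t in range(N)]
s2 = [((t * 919) % 1000) / 500.0 - 1.0 for t in range(N)]

# Mix with an orthogonal rotation; the angle is unknown to the separator.
theta = 0.6
x1 = [math.cos(theta) * a - math.sin(theta) * b for a, b in zip(s1, s2)]
x2 = [math.sin(theta) * a + math.cos(theta) * b for a, b in zip(s1, s2)]

def excess_kurtosis(sig):
    # Fourth standardized moment minus 3: zero for a Gaussian signal.
    n = len(sig)
    m = sum(sig) / n
    var = sum((v - m) ** 2 for v in sig) / n
    m4 = sum((v - m) ** 4 for v in sig) / n
    return m4 / var ** 2 - 3.0

def unmix(angle):
    return [math.cos(angle) * a + math.sin(angle) * b for a, b in zip(x1, x2)]

# Blind search: keep the un-mixing angle whose output is most non-Gaussian.
best = max((k / 100.0 for k in range(158)),
           key=lambda ang: abs(excess_kurtosis(unmix(ang))))
print(round(best, 2))   # recovered angle, close to theta
```

Practical ICA replaces the grid search with fixed-point iterations and handles more than two channels, but the objective is the same.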

The Linguistic Connotations of the Word Light in the Holy Quran (An Analytical Study of Quranic Verses Using Artificial Intelligence Techniques)

Nima A. Al-Fakhry

AL-Rafidain Journal of Computer Sciences and Mathematics, 2020, Volume 14, Issue 2, Pages 13-24
DOI: 10.33899/csmj.2020.167343

The Holy Quran is a sea of words, articulations, phrases, regulations, laws, and judgments. Diving into its verses therefore requires a large amount of information from various aspects to attain the required knowledge. The word (Al-Noor) is one of the Quran's vocabulary items and enjoys a special place, a privacy that derives from the specificity and sanctity of the Quran. The word (Al-Noor) has one pronunciation but many meanings and senses.
The research seeks to identify which of God's lights ("science, guidance, kernels, and faith") is closest to and most strongly associated with verse (35) of Surat Al-Noor. This verse was chosen because it speaks about the Sultan of Allah Almighty and God's light. Finally, the research uses the subtractive clustering and weighted subtractive clustering algorithms, implemented in the Matlab language (2013), for the practical aspect of the study.
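For readers unfamiliar with the algorithm named above, here is a minimal sketch of plain subtractive clustering on toy 2-D points (stand-ins for verse feature vectors); the radii and stopping threshold are common textbook choices, not the study's settings.

```python
import math

# Toy 2-D points forming two visible groups (illustrative data only).
points = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (0.1, 0.0),
          (5.0, 5.0), (5.1, 5.2), (4.9, 5.1), (5.2, 5.0)]

r_a = 1.0                      # neighbourhood radius
alpha = 4.0 / (r_a ** 2)
r_b = 1.5 * r_a                # suppression radius (usually > r_a)
beta = 4.0 / (r_b ** 2)

def sqdist(p, q):
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

# Initial potential of each point: a density measure of its neighbours.
potential = [sum(math.exp(-alpha * sqdist(p, q)) for q in points) for p in points]

centers = []
first_peak = max(potential)
while True:
    k = max(range(len(points)), key=lambda i: potential[i])
    if potential[k] < 0.15 * first_peak:   # stop when remaining potential is low
        break
    centers.append(points[k])
    # Subtract potential around the new center so it is not picked again.
    potential = [potential[i]
                 - potential[k] * math.exp(-beta * sqdist(points[i], points[k]))
                 for i in range(len(points))]

print(len(centers))   # one center per visible group
```

The weighted variant used in the study modifies the potential with per-point weights; the selection-and-subtraction loop is the same.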

Medical Image Classification Using Different Machine Learning Algorithms

Sami H. Ismael; Shahab W. Kareem; Firas H. Almukhtar

AL-Rafidain Journal of Computer Sciences and Mathematics, 2020, Volume 14, Issue 1, Pages 135-147
DOI: 10.33899/csmj.2020.164682

The different types of white blood cells provide important data for diagnosing and identifying many diseases. Automating this task can save time and avoid errors in the identification process. In this paper, we explore whether shape features of the nucleus are sufficient to classify white blood cells. Accordingly, an automatic system is implemented that identifies and analyzes White Blood Cells (WBCs) in five categories (Basophil, Eosinophil, Lymphocyte, Monocyte, and Neutrophil). The system requires four steps: the first step segments the cell images, and the second scans each segmented image to prepare its dataset. Shape and texture features are extracted from the scanned images in the third step. Finally, different machine learning algorithms (K* classifier, Additive Regression, Bagging, Input Mapped Classifier, and Decision Table) are separately applied to the extracted shape and texture features, and the results of each algorithm are compared to select the best one according to different criteria.
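As a toy illustration of shape-feature classification, far simpler than the algorithms compared in the paper, the sketch below assigns a cell to the class whose feature centroid is nearest; the (area, circularity) numbers are invented.

```python
# Hypothetical shape features (area, circularity) per labelled cell.
train = [((120.0, 0.90), "lymphocyte"), ((130.0, 0.88), "lymphocyte"),
         ((300.0, 0.60), "neutrophil"), ((310.0, 0.55), "neutrophil")]

def centroid(rows):
    n = len(rows)
    return tuple(sum(r[i] for r in rows) / n for i in range(2))

labels = {lab for _, lab in train}
cents = {lab: centroid([f for f, l in train if l == lab]) for lab in labels}

def classify(feat):
    # Assign the class whose feature centroid is closest (squared Euclidean).
    return min(cents, key=lambda lab: sum((a - b) ** 2
                                          for a, b in zip(feat, cents[lab])))

print(classify((125.0, 0.89)))   # falls near the lymphocyte centroid
```

A real pipeline would normalize features first (here the area term dominates the distance) and use all five classes.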

HPPD: A Hybrid Parallel Framework of Partition-based and Density-based Clustering Algorithms in Data Streams

Ammar Thaher Abd Alazeez

AL-Rafidain Journal of Computer Sciences and Mathematics, 2020, Volume 14, Issue 1, Pages 67-82
DOI: 10.33899/csmj.2020.164677

Data stream clustering refers to grouping continuously arriving chunks of new data into continuously changing groups, enabling dynamic analysis of segmentation patterns. Until now, however, research on clustering methods has mainly been concerned with adapting methods designed for static datasets and with variants of those adapted methods. Such methods present only one type of final output cluster, i.e. either convex or non-convex clusters. This paper presents a novel two-phase parallel hybrid clustering algorithm (HPPD) that identifies convex and non-convex groups in the online stage and mixed groups in the offline stage of a data stream. In this work, we first receive the data stream and apply a pre-processing step to identify convex and non-convex clusters. Second, a modified EINCKM produces the online convex clusters and a modified EDDS produces the online non-convex clusters in a parallel scheme. Third, an adaptive merging strategy is applied in the offline stage to give the final composed output groups. The method is assessed on a synthetic dataset, and the experimental results confirm its effectiveness.

Collaboration Networks: University of Mosul Case Study

Basim Mohammed Mahmood; Nagham A. Sultan; Karam H. Thanoon; Dhiya Sh. Khadhim

AL-Rafidain Journal of Computer Sciences and Mathematics, 2020, Volume 14, Issue 1, Pages 117-133
DOI: 10.33899/csmj.2020.164679

Scientific research is currently considered one of the key factors in the development of our lives. It plays a significant role in managing our business, study, and work in a more flexible and convenient way. The most important aspect of scientific research is the level of collaboration among researchers, which should be maximized as much as possible in order to obtain more reliable solutions to our everyday issues. To this end, we need to understand the collaboration patterns among researchers and come up with convenient strategies for strengthening scientific collaboration. The scientific collaboration among University of Mosul researchers (our case in this study) has not yet been investigated or analyzed. In this work, we aim to reveal the patterns of scientific collaboration among the scientific colleges of the University of Mosul. We generate a co-authorship network for the university based on data collected from each individual researcher. The generated co-authorship network reveals many interesting facts about the collaboration patterns among the university's researchers.
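Building a co-authorship network from per-paper author lists can be sketched as follows; the author names and papers here are hypothetical, not the study's data.

```python
from itertools import combinations
from collections import defaultdict

# Hypothetical paper author lists (illustrative only).
papers = [["A", "B"], ["A", "C"], ["A", "B", "D"], ["C", "E"]]

edges = defaultdict(int)          # co-authorship edge -> number of joint papers
for authors in papers:
    for u, v in combinations(sorted(authors), 2):
        edges[(u, v)] += 1

degree = defaultdict(int)         # number of distinct collaborators per author
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

hub = max(degree, key=degree.get)
print(hub, degree[hub])           # the most-connected researcher
```

Network measures such as degree, components, and community structure are then read off this graph.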

Design of a Simulation System for Simplifying Boolean Equations Using Karnaugh Maps

Elham H. Aziz

AL-Rafidain Journal of Computer Sciences and Mathematics, 2020, Volume 14, Issue 1, Pages 97-115
DOI: 10.33899/csmj.2020.164680

Simulation is one of the most important techniques used for learning; it makes learning possible without cost and provides a good way to improve learners' practical skills. The purpose of this research was to design a program that simulates the process of simplifying Boolean expressions using Karnaugh maps, depending on the rules and procedures applied to a Boolean equation in order to minimize it and obtain a final optimal expression with the minimum number of variables, with a reduction in equipment that leads to reduced cost. The research recommends the use of modern methods in education, simulation programs being one such method, to improve e-learning and keep up with universities that combine e-learning with traditional education and make students more interactive in the educational process.
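A simulator of this kind ultimately has to check that a simplified expression matches the original truth table. The sketch below does that exhaustively for a 3-variable example whose K-map grouping yields A·C + A·B; the function chosen is illustrative.

```python
from itertools import product

# Minterms of a 3-variable function F(A, B, C); on a K-map the adjacent ones
# 5,7 and 6,7 group into the two-literal terms A*C and A*B.
minterms = {5, 6, 7}

def F(a, b, c):
    return (a << 2 | b << 1 | c) in minterms

def simplified(a, b, c):
    return (a and c) or (a and b)     # candidate minimal sum-of-products

# Exhaustively confirm the simplification over all 8 input rows.
ok = all(F(a, b, c) == bool(simplified(a, b, c))
         for a, b, c in product((0, 1), repeat=3))
print(ok)
```

The grouping step itself (finding the largest power-of-two blocks of adjacent ones) is what the described program animates for the learner.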

The Pandemic COVID-19 Infection Spreading Spatial Aspects: A Network-Based Software Approach

Basim Mohammed Mahmood; Marwah M. Dabdawb

AL-Rafidain Journal of Computer Sciences and Mathematics, 2020, Volume 14, Issue 1, Pages 159-170
DOI: 10.33899/csmj.2020.164684

Coronavirus, or what has been termed COVID-19, is an infectious disease that was recently classified as a pandemic. It is currently considered the most active and dangerous such disease, spreading rapidly around the world and causing thousands of deaths. COVID-19 spreads between people through contact with infected individuals when they sneeze or cough, or through droplets of saliva. In this article, we investigate the impact of spatial aspects and movement patterns on the spread of COVID-19 infection. We consider three aspects, namely mobility patterns, the impact of curfew (stay-at-home) policies, and the distribution of people within places. The results show that spatial aspects are among the factors that play a significant role in spreading the virus.
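A toy sketch of the spatial idea: infection can only jump between individuals within a contact radius, and a curfew effectively shrinks that radius. The positions and radii below are invented for illustration.

```python
# Spread infection over a proximity graph: person j catches it from person i
# whenever their distance is within the contact radius.
def spread(positions, radius, start=0):
    infected = {start}
    frontier = [start]
    while frontier:
        i = frontier.pop()
        for j, p in enumerate(positions):
            if j not in infected and abs(p - positions[i]) <= radius:
                infected.add(j)
                frontier.append(j)
    return len(infected)

people = [0.0, 1.0, 2.0, 3.5, 4.0, 8.0]      # 1-D positions, toy values
print(spread(people, radius=1.6))            # free movement: wide contacts
print(spread(people, radius=0.9))            # curfew: only close neighbours
```

Even in one dimension the chain effect is visible: shrinking the radius breaks the contact chain and confines the outbreak to the index case.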

Predicting Bank Loan Risks Using Machine Learning Algorithms

Maan Y. Alsaleem; Safwan O. Hasoon

AL-Rafidain Journal of Computer Sciences and Mathematics, 2020, Volume 14, Issue 1, Pages 149-158
DOI: 10.33899/csmj.2020.164686

Bank loans play a crucial role in the development of banks' investment business. Nowadays, many risk-related issues are associated with bank loans. With the advent of computerized systems, banks have become able to register borrowers' data according to their criteria. In fact, the tremendous amount of borrowers' data makes loan management a challenging task. Many studies have utilized data mining algorithms to classify loans in terms of repayment or non-repayment based on customers' financial history. Such algorithms can help banks make grant decisions for their customers. In this paper, the performance of machine learning algorithms for classifying bank loan risks is compared using standard criteria; the Multilayer Perceptron is then chosen, as it gives the best accuracy compared to the RandomForest, BayesNet, NaiveBayes, and DTJ48 algorithms.
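The comparison methodology, training several classifiers and ranking them by held-out accuracy, can be sketched with two deliberately simple stand-in classifiers on made-up borrower data; the paper's actual algorithms, such as the Multilayer Perceptron, are of course more capable.

```python
# Toy borrower records: (income, debt ratio) -> 1 repay / 0 default. Invented data.
train = [((5.0, 0.1), 1), ((6.0, 0.2), 1), ((1.0, 0.9), 0), ((1.5, 0.8), 0)]
test_set = [((5.5, 0.15), 1), ((4.8, 0.3), 1), ((1.2, 0.85), 0), ((2.0, 0.7), 0)]

def threshold_rule(x):
    return 1 if x[0] > 3.0 else 0            # income-only cut-off

def nearest_neighbour(x):
    # 1-NN: copy the label of the closest training borrower.
    feats, label = min(train,
                       key=lambda fl: sum((a - b) ** 2 for a, b in zip(x, fl[0])))
    return label

def accuracy(clf):
    return sum(clf(x) == y for x, y in test_set) / len(test_set)

scores = {"threshold": accuracy(threshold_rule),
          "1-NN": accuracy(nearest_neighbour)}
print(scores)                                # rank the candidate classifiers
```

On real data the candidates would also be compared on precision, recall, and similar standard criteria, not accuracy alone.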

Applying the JPEG 2000 Part One Standard to Image Compression

Maha Abdul Rahman Hasso; Sahlah Abed Ali

AL-Rafidain Journal of Computer Sciences and Mathematics, 2020, Volume 14, Issue 1, Pages 13-33
DOI: 10.33899/csmj.2020.164796

In this paper, an algorithm based on the JPEG2000 Part One standard for image compression is proposed. The proposed algorithm was implemented in the MATLAB 7.11 environment and applied to gray and color images of several types: natural, medical, graphics, and remote-sensing images. The Peak Signal-to-Noise Ratio (PSNR) was used to compare the results of the proposed algorithm with the biorthogonal Daubechies 5/3-tap and 9/7-tap filters. Another comparison was held against the results obtained by the ModJPEG and Color-SPECK algorithms. The processing results proved the efficient performance of the proposed algorithm.
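PSNR, the criterion used throughout the comparison, is straightforward to compute; the 4x4 pixel blocks below are invented toy values.

```python
import math

# PSNR between an original and a reconstructed 8-bit image block.
orig = [[52, 55, 61, 66], [70, 61, 64, 73], [63, 59, 55, 90], [67, 61, 68, 104]]
recon = [[52, 54, 61, 66], [70, 61, 65, 73], [63, 60, 55, 90], [67, 61, 68, 103]]

# Mean squared error over all 16 pixels.
mse = sum((o - r) ** 2
          for ro, rr in zip(orig, recon)
          for o, r in zip(ro, rr)) / 16

psnr = 10 * math.log10(255 ** 2 / mse)   # peak value 255 for 8-bit images
print(round(psnr, 2))                    # PSNR in dB
```

Higher PSNR means the reconstruction is closer to the original; the paper uses it to rank the 5/3 and 9/7 filter variants.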

Information Hiding Based on Chan-Vese Algorithm

Samia Sh. Lazar; Nadia M. Mohammed

AL-Rafidain Journal of Computer Sciences and Mathematics, 2011, Volume 8, Issue 2, Pages 99-110
DOI: 10.33899/csmj.2011.7876

Data transfer via the Internet has become easy as a result of great advances in networking technologies, and many people can now communicate with each other easily and quickly. Because the online environment is open and public, unauthorized parties can intercept information transmitted between any two parties and gain access to it. Hence there is a pressing need for steganography, the science of hiding secret information in a digital cover such as an image, so that ordinary or unauthorized observers cannot detect or perceive it. In this paper, a technique for hiding information in images is developed: first, the (PNG, BMP) cover image is segmented using the Chan-Vese algorithm, then the text is hidden in the segmented image depending on the segmented regions. The (PSNR, BER) measures are used to evaluate the technique's efficiency, and the algorithm is implemented in Matlab.
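The embedding step in such schemes is often a least-significant-bit (LSB) substitution. The sketch below hides text in a flat list of toy pixel values; in the paper, the Chan-Vese segmentation additionally selects which regions carry the payload.

```python
# Hide text in the least significant bits of grey-level pixel values.
def hide(pixels, text):
    bits = [int(b) for ch in text for b in format(ord(ch), "08b")]
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit   # overwrite only the lowest bit
    return out

def reveal(pixels, length):
    bits = "".join(str(p & 1) for p in pixels[:8 * length])
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

cover = list(range(100, 180))          # 80 toy grey-level pixels
stego = hide(cover, "Hi")
print(reveal(stego, 2))                # recovers the hidden text
```

Because each pixel changes by at most one grey level, the distortion stays small, which is exactly what the PSNR and BER measures quantify.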

A Hybrid Ant Colony Optimization Algorithm to Solve Assignment Problem by Hungarian Method

Isra N. Alkallak

AL-Rafidain Journal of Computer Sciences and Mathematics, 2009, Volume 6, Issue 2, Pages 159-175
DOI: 10.33899/csmj.2009.163805

This research studies ant colony optimization applied to an optimization problem, the assignment model problem, solved by the Hungarian method. The proposed heuristic algorithm combines ant colony optimization with the Hungarian method for the assignment problem. The ant colony optimization algorithm simulates the behavior of a real ant colony in finding the shortest of many paths to solve the problem. It depends on the path from the nest (the search problem) to the food (the optimal solution): ants deposit pheromone on the paths they take between nest and food so that other ants can smell it.
The experiments in this research show that the algorithm provides optimal solutions. It outperforms computationally and is an effective approach; the algorithm performs significantly better than the classical method, reducing the region of the search space considered and the computation compared to classical methods.
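For a small cost matrix, the optimal assignment that both the Hungarian method and the ACO heuristic aim for can be found by exhaustive search; the matrix below is invented.

```python
from itertools import permutations

# cost[i][j] = cost of assigning worker i to job j (toy numbers).
cost = [[9, 2, 7],
        [6, 4, 3],
        [5, 8, 1]]

# Exhaustive search over all worker-to-job permutations. This is fine for
# small n; the Hungarian method reaches the same optimum in polynomial time,
# and ACO approximates it on larger instances.
best_cost, best_assign = min(
    (sum(cost[i][p[i]] for i in range(3)), p)
    for p in permutations(range(3)))

print(best_cost, best_assign)   # minimal total cost and the chosen jobs
```

Here worker 0 takes job 1, worker 1 takes job 0, and worker 2 takes job 2, which is the assignment a correct solver must return.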

Comparison of Edge Detection Methods in Gray Images

Sobhi H. Hamdoun; Afzal A. Hassan

AL-Rafidain Journal of Computer Sciences and Mathematics, 2006, Volume 3, Issue 2, Pages 11-28
DOI: 10.33899/csmj.2006.164056

The methods of edge detection play an important role in many image processing applications as edge detection is regarded as an important stage in image processing and the extraction of certain information from it.
Therefore, this subject has been the focus of many studies by many authors. Many new edge detection techniques have been suggested that search for discontinuities in the color intensity of the image, which lead to the features of the image components.
Despite the presence of many edge detection methods that have proved their efficiency in certain fields and given good results in application, the performance of a method differs from one application to another; thus there is a need to evaluate the performance of each method to show its efficiency. The aim of this research is to evaluate the performance of edge detection by choosing five methods (Canny, Laplacian of Gaussian, Prewitt, Sobel, Roberts), applying each method to grayscale images to find out its performance, and writing computer programs for each. A subjective evaluation also compares the performance of these five methods using the Pratt Figure of Merit, calculating the percentage increase in detected edges, the percentage decrease in edge points, and the correctness of the edge position in each method.
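As a small illustration of one of the compared operators, the sketch below applies the Sobel kernels to a toy image containing a vertical step edge.

```python
# Sobel gradient magnitude (|Gx| + |Gy|) on a 5x5 image with a vertical step.
img = [[0, 0, 10, 10, 10] for _ in range(5)]

GX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal-gradient kernel
GY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical-gradient kernel

def conv_at(kernel, y, x):
    # 3x3 correlation of the kernel with the neighbourhood of (y, x).
    return sum(kernel[dy + 1][dx + 1] * img[y + dy][x + dx]
               for dy in (-1, 0, 1) for dx in (-1, 0, 1))

edges = [[abs(conv_at(GX, y, x)) + abs(conv_at(GY, y, x))
          for x in range(1, 4)]             # interior pixels only
         for y in range(1, 4)]
print(edges[1])   # [40, 40, 0]: strong response flanking the step, none in the flat area
```

Prewitt and Roberts differ only in their kernel weights and sizes, which is why such methods can be compared on the same images with the Pratt Figure of Merit.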

Efficiency of Artificial Neural Networks (Perceptron Network) in the Diagnosis of Thyroid Diseases

Suher A. Dawood; Laheeb M. Ibrahim; Nabil D. Kharofa

AL-Rafidain Journal of Computer Sciences and Mathematics, 2006, Volume 3, Issue 1, Pages 11-22
DOI: 10.33899/csmj.2006.164042

The thyroid gland software obtained through this research is considered an effective system for diagnosing the thyroid gland automatically. This is done through a complementary database, flexible and easy to work with, holding data on patients under observation at Hazim Al-Hafith Hospital for Oncology & Nuclear Medicine in Mosul. The software was tested on information about 200 patients, which was stored in the thyroid database. Thyroid gland disease was then diagnosed using an artificial neural network (Perceptron) able to recognize the disease at a good recognition ratio, close to the doctors' diagnoses, depending on signs and symptoms, which may enable doctors to rely on it for the correct diagnosis of the disease.
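A single-layer perceptron of the kind named in the title can be sketched as follows; the (TSH, T4) feature values and labels are invented illustrations, not the hospital data.

```python
# Perceptron learning rule on made-up hormone readings: (TSH, T4) -> diagnosis.
data = [((0.1, 20.0), 1), ((0.2, 18.0), 1),   # label 1: hyperthyroid-like
        ((2.0, 8.0), 0), ((2.5, 7.0), 0)]     # label 0: normal-like

w = [0.0, 0.0]
b = 0.0
for _ in range(20):                            # a few epochs over the toy set
    for (x1, x2), label in data:
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = label - pred                     # update only on mistakes
        w[0] += err * x1
        w[1] += err * x2
        b += err

def diagnose(tsh, t4):
    return "hyperthyroid" if w[0] * tsh + w[1] * t4 + b > 0 else "normal"

print(diagnose(0.15, 19.0), diagnose(2.2, 7.5))
```

The real system trains on many sign-and-symptom features; a perceptron converges like this only because the classes are linearly separable.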

A Modified Heuristic Procedure for NP-Complete Problems Supported by Genetic Algorithms

Najla Akram Al-Saati

AL-Rafidain Journal of Computer Sciences and Mathematics, 2004, Volume 1, Issue 1, Pages 120-137
DOI: 10.33899/csmj.2004.164101

This work is based on modifying an intelligent heuristic rule used in solving NP-Complete problems: a Flow Shop assignment heuristic is studied and modified to solve a well-known classic Artificial Intelligence problem, the Traveling Salesman Problem. For this modification to be carried out successfully, the problem's mathematical formulation had to be studied carefully, along with the possibility of reformulating the problem to be more suitable for the heuristic procedure. This may require some changes in the heuristic procedure itself; these adjustments were due to noticeable differences such as the symmetry property present in the Traveling Salesman Problem environment, among others.
A Genetic Algorithm is added to improve the results obtained by the heuristic, where crossover and mutation procedures provide better chances for a near-optimal solution to be improved towards the optimal solution.
The test problem uses cities that lie on a regular square grid, which simplifies the calculation of the distance traveled between any two cities. Programs were written in the C programming language, and timers were used to measure the elapsed calculation time in order to assess the efficiency of the program.
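For a toy grid instance, the optimal tour that the heuristic and the genetic algorithm approximate can be computed by brute force; enumeration is only feasible at this tiny size.

```python
from itertools import permutations
import math

# Cities on a regular unit grid, echoing the paper's test setting (toy 2x3 grid).
cities = [(0, 0), (1, 0), (2, 0), (0, 1), (1, 1), (2, 1)]

def tour_length(order):
    # Closed-tour length: sum of consecutive distances plus the return leg.
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

# Exhaustive search with city 0 fixed to remove rotational duplicates.
best = min(((0,) + p for p in permutations(range(1, 6))), key=tour_length)
print(round(tour_length(best), 4))   # the perimeter tour of length 6
```

A GA would evolve permutations like `best` via crossover and mutation instead of enumerating all of them, which is what makes larger grids tractable.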

Fast Backpropagation Neural Network for VQ-Image Compression

Basil S. Mahmood; Omaima N. AL-Allaf

AL-Rafidain Journal of Computer Sciences and Mathematics, 2004, Volume 1, Issue 1, Pages 96-118
DOI: 10.33899/csmj.2004.164100

The problem inherent to any digital image is the large amount of bandwidth required for transmission or storage. This has driven the research area of image compression to develop algorithms that compress images to lower data rates with better quality.  Artificial neural networks are becoming very attractive in image processing where high computational performance and parallel architectures are required.
In this work, a three-layer backpropagation neural network (BPNN) is designed to compress images using the vector quantization (VQ) technique. The outputs of the hidden layer represent the codebook used in vector quantization; this is therefore a new method of generating a VQ codebook. A fast backpropagation algorithm (FBP) is built and tested on the designed BPNN. Results show that, for the same compression ratio and signal-to-noise ratio as the ordinary backpropagation algorithm, FBP can speed up the neural system by more than 50. The system is used for both compression and decompression of any image, with the FBP algorithm used to train the designed BPNN. The efficiency of the designed BPNN also comes from reducing the chance of errors occurring during transmission of the compressed image through an analog channel (the BPNN can be used to enhance any noisy compressed image that has been corrupted during such transmission). The simulation of the BPNN image compression system is performed using the Borland C++ Ver 3.5 programming language. The compression system has been applied to well-known images such as Lena, Carena, and Car, and also deals with BMP graphic format images.
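A VQ codebook of the kind the hidden layer learns can be illustrated with a plain Lloyd (k-means-style) refinement on 1-D toy blocks; this sketches what a codebook is, not the BPNN itself.

```python
# Vector quantization sketch: a 2-entry codebook compresses 1-D "pixel blocks"
# by storing only the index of the nearest codeword.
blocks = [10.0, 11.0, 9.0, 50.0, 52.0, 48.0]
codebook = [0.0, 100.0]                      # deliberately poor initial codewords

for _ in range(10):                          # Lloyd iterations
    # Assign each block to its nearest codeword...
    assign = [min(range(2), key=lambda k: abs(b - codebook[k])) for b in blocks]
    # ...then move each codeword to the mean of its assigned blocks.
    for k in range(2):
        members = [b for b, a in zip(blocks, assign) if a == k]
        if members:
            codebook[k] = sum(members) / len(members)

indices = [min(range(2), key=lambda k: abs(b - codebook[k])) for b in blocks]
print(codebook, indices)   # [10.0, 50.0] [0, 0, 0, 1, 1, 1]
```

Transmitting the short index stream plus the codebook instead of the raw blocks is the compression; in the paper's scheme the BPNN's hidden-layer activations play the role of this codebook.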