Volume 10, Issue 1 (2013)
Articles
Abstract: Protection of digital multimedia content has become an increasingly important issue for content owners and service providers, and watermarking is identified as a major means of achieving copyright protection. This work proposes a blind watermarking scheme based on the Discrete Wavelet Transform (DWT). Watermark components are embedded in the LL subband of the 4th DWT level of the host image by quantizing the LL subband coefficients, which improves watermark robustness. A Genetic Algorithm (GA) is used to optimize the quantization step size and the strength factors. The host image is a 512×512 grayscale image and the watermark is a 32×32 binary logo. The proposed scheme was tested against most known attacks and proved robust while still producing a high-quality watermarked image. MATLAB was used to implement the watermarking task.
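A minimal sketch of this kind of quantization-based LL-subband embedding (in Python with PyWavelets rather than the authors' MATLAB; the quantization-index scheme, the Haar wavelet, and the step size q=12 are illustrative assumptions, since the paper tunes these parameters with a GA):

```python
import numpy as np
import pywt

def embed_bit(coeff, bit, q):
    # Quantization index modulation: snap the coefficient onto the even
    # (bit 0) or odd (bit 1) multiple of the step q.
    k = int(np.round(coeff / q))
    if k % 2 != bit:
        k += 1
    return k * q

def embed_watermark(host, wm_bits, q=12.0):
    # 4-level DWT of a 512x512 host: the level-4 LL subband is 32x32,
    # exactly one coefficient per bit of a 32x32 binary logo.
    coeffs = pywt.wavedec2(host.astype(float), 'haar', level=4)
    ll = coeffs[0].ravel()
    for i, bit in enumerate(wm_bits):
        ll[i] = embed_bit(ll[i], bit, q)
    coeffs[0] = ll.reshape(coeffs[0].shape)
    return pywt.waverec2(coeffs, 'haar')

def extract_watermark(img, n_bits, q=12.0):
    # Blind extraction: only the step size q is needed, not the host.
    ll = pywt.wavedec2(img.astype(float), 'haar', level=4)[0].ravel()
    return [int(np.round(ll[i] / q)) % 2 for i in range(n_bits)]

host = np.random.default_rng(0).integers(0, 256, (512, 512)).astype(float)
bits = [1, 0, 1, 1]
assert extract_watermark(embed_watermark(host, bits), 4) == bits
```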
Abstract: Recently, many cases of deformities and health problems affecting newborns have been recorded as a result of known pollutants and radiation. One of these problems concerns the growth of the child, both in terms of height and weight and in terms of the natural growth of head circumference: any abnormal increase or decrease in head circumference measurement is a sign of a problem. This research addresses the natural growth of the baby's head circumference from the first month until the age of a year and a half. Artificial neural networks were trained on the normal values of head circumference growth, based on the internationally adopted medical growth chart. Results showed the efficiency and accuracy of these networks in distinguishing normal cases from abnormal ones.
Abstract: The Genetic Algorithm has been hybridized with classical optimization methods. Hybridization was done in three ways: first by using the Fletcher-Reeves conjugate gradient algorithm, second by using the steepest descent method, and lastly by creating the initial population of the genetic algorithm from the iterates of a conjugate gradient method. The numerical results were encouraging.
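The third hybridization is the easiest to sketch. Below is a toy Python illustration (not the paper's code; the test function, the fixed step size, and the GA operators are all assumptions): Fletcher-Reeves conjugate gradient iterates seed the GA's initial population.

```python
import numpy as np

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    return np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                     200*(x[1] - x[0]**2)])

def fr_iterates(x0, n_iter=30, step=1e-3):
    # Fletcher-Reeves CG; every iterate is kept so it can seed the GA.
    x, g = x0.copy(), rosenbrock_grad(x0)
    d, path = -g, [x0.copy()]
    for _ in range(n_iter):
        x = x + step * d                  # fixed small step for simplicity
        g_new = rosenbrock_grad(x)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves formula
        d, g = -g_new + beta * d, g_new
        path.append(x.copy())
    return path

def ga(pop, n_gen=200, sigma=0.05):
    # Tiny real-coded GA: truncation selection + Gaussian mutation.
    rng = np.random.default_rng(0)
    for _ in range(n_gen):
        fit = np.array([rosenbrock(p) for p in pop])
        parents = pop[np.argsort(fit)[:len(pop)//2]]
        children = parents + rng.normal(0, sigma, parents.shape)
        pop = np.vstack([parents, children])
    return min(pop, key=rosenbrock)

seed = np.array(fr_iterates(np.array([-1.0, 1.0])))
print(ga(seed))   # GA refined from CG iterates
```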
Abstract: This paper presents a medical application based on digital image processing and Artificial Neural Networks (ANN), which can recognize three types of Hereditary Hemolytic Anemia (HHA) that affect the Red Blood Cells (RBCs) and change their shape. Three Feed Forward Back Propagation Learning (FFBPL) neural networks are used in a hierarchical approach to achieve this goal. The essence of this research is to segment each red blood cell into a separate image and then extract some interesting features from each image in order to present them to the neural networks, which in turn decide whether the RBC is infected or not. The results showed a recognition rate of 92.38%.
Abstract: Among the various forms of malware, botnets are emerging as the most serious threat. Botnets are remotely controlled by attackers, and their members are located in homes, schools, businesses, and governments around the world. This paper surveys botnets and how they are detected: it clarifies botnet history, the botnet lifecycle, and botnet detection techniques, and proposes software able to detect the Koobface botnet, which attacks the Facebook website.
Abstract: Data mining is the process of extracting hidden patterns from data. One of the most important activities in data mining is association rule mining, and a new direction for data mining research is the privacy of mining: privacy-preserving data mining is a new research trend for protecting private data in data mining and statistical databases. Data mining can be applied to centralized or distributed databases. Most efficient approaches for mining distributed databases suppose that all of the data at each site can be shared, but privacy concerns may prevent the sites from directly sharing the data and some types of information about it. Privacy Preserving Data Mining (PPDM) has become increasingly popular because it allows sharing of privacy-sensitive data for analysis purposes. In this paper, the problem of privacy-preserving association rule mining in a horizontally distributed database is addressed by proposing a system that computes the global frequent itemsets or association rules from different sites without disclosing individual transactions. In addition, a new algorithm is proposed to hide sensitive frequent itemsets or sensitive association rules from the global frequent itemsets by hiding them at each site individually; this is done by modifying the original database of each site so as to decrease the support of each sensitive itemset or association rule. Experimental results show that the proposed algorithm hides rules in a distributed system with good execution time and limited side effects, and that the proposed system can calculate the global frequent itemsets from different sites while preserving the privacy of each site.
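The abstract does not spell out the protocol, but a standard building block for computing a global support count across horizontally partitioned sites, without revealing any site's local count, is a ring-based secure sum; a minimal sketch (the site counts and the frequency threshold are hypothetical):

```python
import random

def secure_global_support(local_counts, modulus=1_000_003):
    # Ring-based secure sum: the initiating site masks its count with a
    # random value; each site adds its own count mod m as the running
    # total passes around the ring; the initiator removes the mask.
    # No site ever sees another site's individual count.
    r = random.randrange(modulus)
    running = (r + local_counts[0]) % modulus
    for c in local_counts[1:]:
        running = (running + c) % modulus
    return (running - r) % modulus

# Example: local supports of one candidate itemset at three sites.
sites = [120, 85, 210]
total = secure_global_support(sites)
print(total, "globally frequent" if total >= 300 else "not frequent")
```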
Abstract: The main objective of speech coding is to allow the transmission of speech over a digital channel with the highest speech quality and the lowest possible bit rate, besides serving security purposes. In this paper, the speech signal (which is one-dimensional) is reshaped into a two-dimensional array to make it suitable for the contourlet transform. The EZC (Embedded Zerotree Contourlet) algorithm is then applied, followed by Huffman coding of the EZC output and Run Length Encoding (RLE). This approach enables coding and compression with highly accurate recovery of the information; quality measures applied to the reconstructed signal show high similarity between the original and reconstructed signals.
Abstract: Mobile technology is developing rapidly; this development has led to the production of multimedia smartphones supporting wireless LAN (WLAN), which is widely deployed because of its ease of use, flexibility, application sharing, and support for multimedia transmission. In this paper, a system was designed and implemented to stream audio from a computer and play it on a smartphone using the client/server model. The work deals with real-time audio using the Real-Time Streaming Protocol (RTSP), and it succeeded in sending voice over wireless network environments across heterogeneous operating systems, Linux and Android.
Abstract: Security has become an important issue for networks, and intrusion detection technology is an effective approach to dealing with the problems of network security. In this paper, we present an intrusion detection model based on PCA and a multilayer perceptron (MLP). The key idea is to exploit the different features of the NSL-KDD data set, choose the best features, and use a neural network to classify intrusions. The model is able to distinguish attacks from normal connections. Training and testing data were obtained from the complete NSL-KDD intrusion detection evaluation data set.
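A minimal sketch of this PCA-plus-MLP pipeline with scikit-learn (the component count, the hidden layer size, and the random stand-in for numeric NSL-KDD features are assumptions; the paper does not state its exact configuration):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in for a numeric NSL-KDD feature matrix; y: 0 = normal, 1 = attack.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(1000, 41)), rng.integers(0, 2, 1000)

model = make_pipeline(
    StandardScaler(),                 # scale features before PCA
    PCA(n_components=10),             # keep the strongest components
    MLPClassifier(hidden_layer_sizes=(30,), max_iter=300, random_state=0),
)
model.fit(X_train, y_train)
print(model.predict(X_train[:5]))    # 0/1: normal vs. attack
```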
Abstract: Software quality is still a vague and multifaceted concept that means different things to different people. Metrics for object-oriented design focus on measurements applied to class and design characteristics. These measurements allow designers to assess the software early in the process, making changes that reduce complexity and improve the continuing capability of the design. This paper focuses on a set of object-oriented metrics that can be used to measure the quality of an object-oriented design. We carefully study metrics for object-oriented design, with a focus on the MOOD model.
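As a concrete taste of the MOOD suite, the Method Hiding Factor (MHF) is the ratio of hidden (non-public) methods to all methods across the design; a small sketch with hypothetical classes (the Attribute Hiding Factor, AHF, is computed analogously over attributes):

```python
# Hypothetical design summary: per class, its public and hidden methods.
classes = {
    "Account":  {"public": ["deposit", "withdraw"], "hidden": ["audit"]},
    "Customer": {"public": ["name"],                "hidden": ["ssn_check"]},
}

def mhf(classes):
    # MHF = hidden methods / all methods, over the whole design.
    hidden = sum(len(c["hidden"]) for c in classes.values())
    total = hidden + sum(len(c["public"]) for c in classes.values())
    return hidden / total

print(f"MHF = {mhf(classes):.2f}")  # closer to 1 means better encapsulation
```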
Abstract: Data warehousing and on-line analytical processing (OLAP) are essential elements of decision support, which has increasingly become a focus of the database industry. A distributed OLAP system is designed that uses multiple microcomputers on a local area network; introducing distribution technology into the OLAP system makes it possible to decompose complicated queries and analyses across different servers. This paper reviews the theoretical concepts associated with data warehouses and OLAP systems, and implements several measures such as data cube design, a distribution algorithm, and partitioning of the data warehouse; a decision support system (DSS) is built to answer complicated queries. Practical results show that distributing data to multiple servers with the OLAP system is faster under the proposed algorithm, which uses a client-server architecture to distribute the data warehouse. Statistical analysis concepts are applied to the current work to obtain predictable results suitable for the DSS.
Abstract: The development and growth of the Internet, and the rapid expansion of the World Wide Web and local network systems, have changed the computing world in the last decade. Nowadays, as more people use the Internet, their computers and the valuable data they contain have become more exposed to attackers, so there is an increasing need to protect computers and networks from attacks and unauthorized access by means of network intrusion classification and detection systems. This work uses the classification and detection abilities of Artificial Intelligence Techniques (AITs) to recognize intrusions (attacks) and to detect new attacks; the algorithms are used as multi-class and binary classifiers for network intrusions. Unsupervised and supervised fuzzy clustering algorithms (Fuzzy C-Means FCM, Gustafson-Kessel GK, and Possibilistic C-Means PCM) were applied to classify intrusions into 23 classes according to attack subtype, into 5 classes according to attack type (Normal, DoS, Probe, U2R, R2L), and into 2 classes (Normal and Attack); the same algorithms were also used to detect intrusions. Another technique, an artificial neural network (ANN) represented by the counter propagation network (CPN), which uses hybrid learning (supervised and unsupervised), was applied to the same 23-, 5-, and 2-class problems and to intrusion detection. We then combined fuzzy C-means with the two CPN layers (the Kohonen layer and the Grossberg layer) to produce the proposed approach, called the fuzzy counter propagation network (FCPN), which was likewise applied to classify network intrusions into 23, 5, and 2 classes and to detect intrusions. The DARPA 1999 (Defense Advanced Research Projects Agency) dataset, represented by the Knowledge Discovery and Data Mining (KDD) Cup 99 dataset, was used for both training and testing. The evaluated approaches obtained high classification and detection rates with low false alarm rates, and the proposed FCPN performed best compared with the other approaches and with previous work. The experiments were run on a laptop with a 2.27 GHz CPU and 2.00 GB of RAM.
Abstract: The Semantic Web is an extension of the current web. It will change the way we use the World Wide Web (WWW) by giving machines the capability to process and infer data on the web. It started as a vision and has become a future trend for the web. Given the huge amount of data scattered across web pages today, adding semantic meaning to the data stored in these pages has become necessary for the next age of information technology. The Semantic Web will bring structure to the meaningful content of web pages, creating an environment where software agents roaming from page to page can readily carry out sophisticated tasks for users. Several tools and new technologies have emerged to help bring this vision to reality. In this paper, the Semantic Web is defined and described along with its layered architecture and supporting technologies and tools. An example shows how to use these tools to semantically represent a data model. Finally, the challenges and difficulties faced in building this web and making it an extension of the current web are discussed.
Abstract: In distributed real-time systems, tasks must meet their deadlines even in the presence of hardware/software faults. Fault tolerance in distributed real-time systems refers to the ability of the system to meet task deadlines and to detect task failures and recover from them. In this paper, we consider the problem of fault tolerance and develop a fault tolerance protocol called DRT-FTIP (Distributed Real Time - Fault Tolerance Integrity Protocol). This protocol increases the integrity of scheduling in distributed real-time systems.
Abstract: Cuneiform is the most ancient writing system in the world, yet cuneiform symbol recognition has received little clear attention despite its importance. This research builds an algorithm for cuneiform symbol recognition. First, Sumerian texts were entered through a scanner and initial preprocessing operations such as segmentation were performed to cut the text and obtain individual cuneiform symbols. Features were then extracted for each symbol using vertical and horizontal projections, the centre of gravity, and connected components. Because of the large number of cuneiform symbols, similar symbols are clustered with the K-means algorithm, and multilayer neural networks are then used to classify symbols within the same cluster. The proposed algorithm gave good results.
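A sketch of the projection and centre-of-gravity features described above (the binarization convention and the toy glyph are assumptions; connected-component counts would be appended to the vector similarly):

```python
import numpy as np

def symbol_features(glyph):
    # Feature vector for one binarized symbol image (1 = ink):
    # vertical/horizontal projections plus the centre of gravity.
    v_proj = glyph.sum(axis=0)       # ink per column
    h_proj = glyph.sum(axis=1)       # ink per row
    ys, xs = np.nonzero(glyph)
    cog = (ys.mean(), xs.mean())     # centre of gravity (row, col)
    return np.concatenate([v_proj, h_proj, cog])

glyph = np.zeros((8, 8), dtype=int)
glyph[2:6, 3] = 1                    # a toy vertical stroke
print(symbol_features(glyph))
```

Vectors like these would be clustered with K-means, with one neural network per cluster doing the final classification.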
Abstract: [...] random and have the ability to store a large amount of information. DNA is sent either in natural form in a test tube (after being processed) or in digital form (a series of nitrogenous bases); in both cases, the DNA can be vulnerable to intrusion by an unauthorized party. This research aims to authenticate the acid, so that the receiving side can verify its credibility and confirm that it has not been altered.
Abstract: This research aims to implement a distributed database approach through an ERP system, a new application of information technology. Owing to the scarcity of Arabic studies tackling ERP system integration, the research approaches the system through suggested models and designs a proposed electronic program to implement it. The research attempts to answer this question: what is the appropriate mechanism for implementing ERP systems in a fabric and weaving factory? The results indicate a set of conclusions: the suggested ERP system is an important means of linking and integrating all parts of the factory. In light of these conclusions, the research provides a number of consistent recommendations.
Abstract: This paper presents a new practical authentication system based on one of the biometric features: hand geometry. The proposed system captures a human hand image and extracts 50 features, then uses these features to create a database for the system's user authentication and recognition stages. The paper presents a complete and clear study of the basic fundamentals of the open-set principle and shows the effect of population size on the recognition rate. Good results were obtained, with a recognition rate of more than 91% in this application. The MATLAB 7.9.0 (2009b) programming language was used to implement the algorithms because of its facilities for processing digital images.
Abstract: In this research, an algorithm is suggested for studying the speech signal properties of smokers and non-smokers and then verifying whether a person is a smoker based on his speech signal. A database contains 30 speech signals, 15 belonging to smokers and 15 to non-smokers, for males only. Formant frequencies such as F1 and F2, computed using the LPC algorithm, are adopted as the characteristic properties of the speech signal for separating the two classes. The algorithm consists of two stages: a database preparation stage and a speaker state classification stage. The absolute, Euclidean, and D1 distances were adopted as measures for evaluating the performance of the algorithm, and they gave convergent results.
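A minimal sketch of LPC-based formant estimation (using librosa's LPC fit; the model order and the synthetic frame are assumptions, not the paper's data): the angles of the complex roots of the LPC polynomial give candidate formant frequencies.

```python
import numpy as np
import librosa

def formants(frame, sr, order=12):
    # Fit an LPC model, then convert the angles of the upper-half-plane
    # roots of the polynomial into frequencies in Hz.
    a = librosa.lpc(frame, order=order)
    roots = [r for r in np.roots(a) if r.imag > 0]
    freqs = sorted(np.angle(r) * sr / (2 * np.pi) for r in roots)
    return freqs[:2]                 # candidates for F1, F2

sr = 16000
t = np.arange(1024) / sr
frame = np.sin(2*np.pi*700*t) + 0.5*np.sin(2*np.pi*1200*t)  # toy "vowel"
print(formants(frame.astype(float), sr))
```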
Abstract: Hiding information is an effective solution for protecting copyright and confidentiality: it allows a person to send data inside a cover image to another person without any third party knowing about the transmission, which makes methods of delivering secret messages very important. This research provides a way to hide data (a text file): the data is first encrypted using the Keyword Mixed Transposition method, and the resulting ciphertext is embedded in the low-high (LH) coefficients of the wavelet transform, yielding a good-quality image and allowing the embedded message to be fully recovered and decoded without relying on the original image. The method was applied to digital images, producing stego images with a high correlation coefficient compared with the originals, in addition to only small differences in the computed measurements (SNR, PSNR, MSE).
Abstract: This paper addresses the multi-period single-item inventory problem with stochastic demands, in which the demand distribution of each period is known but varies from period to period over N replenishment periods. The main idea is to calculate the expected total minimum cost, the optimal quantity, and the optimal replenishment periods using probabilistic dynamic programming. The results showed that the optimal replenishment periods are (1, 10, 28), the optimal quantity is 34974, and the expected total minimum cost is 883.487.
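A toy backward-recursion sketch of probabilistic dynamic programming for this kind of problem (the demand distribution, cost constants, and capacity are hypothetical, not the paper's data): the value function minimizes ordering plus expected holding/shortage cost from each period onward.

```python
from functools import lru_cache

periods = 3
demand_pmf = {1: 0.3, 2: 0.5, 3: 0.2}   # per-period demand distribution
K, h, p = 10.0, 1.0, 4.0                # setup, holding, shortage costs
MAX = 6                                 # storage capacity

@lru_cache(maxsize=None)
def f(t, stock):
    # Minimum expected cost from period t onward given starting stock;
    # returns (cost, optimal order quantity for this period).
    if t == periods:
        return (0.0, 0)
    best = (float("inf"), 0)
    for q in range(MAX - stock + 1):
        cost = K if q > 0 else 0.0
        for d, pr in demand_pmf.items():
            left = stock + q - d
            stage = h * left if left >= 0 else p * (-left)
            cost += pr * (stage + f(t + 1, max(left, 0))[0])
        if cost < best[0]:
            best = (cost, q)
    return best

print(f(0, 0))   # (expected minimum total cost, first-period order)
```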
Abstract: In this study we build a model for the intervals of occurrence of viral hepatitis type C in Nineveh province using time-series analysis, compare it with the geometric process model, and fit an adequate model from the Box-Jenkins family to the data. The research found that the intervals between hepatitis C cases follow a decreasing geometric process, which makes it impossible to predict these cases or to construct a predictive function with that model, whereas the proposed ARIMA models are appropriate for the intervals of occurrence of hepatitis C cases; comparison of these models shows that the ARIMA(1, 1, 5) model is the best proposed model for these data.
Abstract: This study identifies the variables that influence the result of the Internet and Computing Core Certification (IC3) examination conducted at the Computer and Internet Center at the University of Mosul. The response, the examinee's score in the computer applications examination, was studied against seven explanatory variables: the score of the computer basics examination, the examinee's sex, the examinee's computer experience, the examinee's specialization, participation in courses, the degree to be obtained through study, and repeated participation in the examination. Four statistical procedures were applied to select the best regression equation involving the explanatory variables that have a real effect on the dependent variable: forward selection, backward elimination, stepwise regression, and all possible regressions, using comparison measures that include the mean square error (MSE) and Mallows' Cp statistic. The best model turned out to have no constant term b0 and involves the variables: the score of the computer basics examination, the examinee's sex, the examinee's specialization, and the degree to be obtained through study.
Abstract: This paper presents a new method for fingerprint recognition that works with various sizes of fingerprint images. The proposed algorithm was applied to more than 30 fingerprint samples with good results. It begins by applying enhancement operations to the fingerprint image, using a median filter to eliminate unwanted noise around the fingerprint. A thinning operation is then applied to the enhanced image, and co-occurrence matrices are computed for the resulting image. The properties of the co-occurrence matrices are then used as inputs to a neural network for the recognition process; a back propagation network is used to speed up recognition. The recognition rate was about 100%.
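A sketch of the co-occurrence-matrix feature step using scikit-image (the distances, angles, and chosen properties are assumptions; these feature vectors would feed the back propagation network):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(img):
    # Texture features from grey-level co-occurrence matrices, usable
    # as the input vector of a back propagation network.
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Stand-in for an enhanced, thinned fingerprint image.
img = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(np.uint8)
print(glcm_features(img))
```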
Abstract: In this paper we use a non-parametric density estimator (the kernel estimator) to estimate the probability density function of hand gesture image data and warped hand gesture data. We examine the kernel estimator's curve and compare it with the curve of the normal distribution. Programs were written in MATLAB (R2009a).
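For reference, the kernel estimator with a Gaussian kernel is f(x) = (1/nh) * sum_i K((x - x_i)/h); a minimal sketch (the bandwidth and the stand-in data are assumptions, not the paper's gesture features):

```python
import numpy as np

def gaussian_kde(x, data, h):
    # Kernel density estimate with a Gaussian kernel K and bandwidth h.
    u = (x[:, None] - data[None, :]) / h
    k = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    return k.mean(axis=1) / h

data = np.random.default_rng(0).normal(5.0, 1.5, 200)  # stand-in features
grid = np.linspace(0, 10, 101)
dens = gaussian_kde(grid, data, h=0.5)
print(grid[np.argmax(dens)])    # mode of the estimated density
```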
Abstract: The present research focuses on pattern matching for English letters using a probabilistic search algorithm called the Artificial Immune Network (AIN). The research demonstrates the algorithm's ability to match the original (ideal) pattern of a letter against deformed patterns, since the AIN is well suited to tasks that learn from examples: it applies to problems with large search spaces and many variables, can be solved quickly and easily, and provides solutions quite near the ideal for the patterns used. The database contains a file holding the data of each original (ideal) pattern of an English letter. The pattern recognition operation (template matching) achieved 94% using the Artificial Immune Network; the practical results were obtained using MATLAB 2008.
Abstract: This paper deals with PageRank, which is used in the analysis of web links. It gives an introduction to the PageRank concept along with some basic theoretical aspects related to it, and presents three common methods of calculating PageRank: the direct method, the Markov chain method, and the power method. To validate them, these methods were applied to the websites of the centers of the University of Mosul; because of the lack of realistic data, hypothetical data were used for modeling and analyzing the links between the sites of the centers affiliated with the University of Mosul. The practical application shows that the three methods give consistent results.
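Of the three methods, the power method scales best; a minimal sketch on a hypothetical four-site link graph (the damping factor 0.85 is the usual convention, assumed here):

```python
import numpy as np

def pagerank_power(adj, d=0.85, tol=1e-10):
    # Power method on the Google matrix G = d*M + (1-d)/n, where M is
    # the column-stochastic link matrix of the adjacency graph.
    n = adj.shape[0]
    out = adj.sum(axis=0)
    M = adj / np.where(out == 0, 1, out)    # normalize columns
    M[:, out == 0] = 1.0 / n                # dangling nodes link everywhere
    r = np.full(n, 1.0 / n)
    while True:
        r_new = d * M @ r + (1 - d) / n
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# Hypothetical link graph: adj[i, j] = 1 if site j links to site i.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [0, 1, 0, 1],
                [1, 0, 0, 0]], dtype=float)
print(pagerank_power(adj))
```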
Abstract: With the development of means of communication, computer science, and information exchange via electronic networks, an urgent need has emerged to protect exchanged information, and encryption has played a prominent role in this area. However, with the development of intrusion, hackers have become able to access information and change it, which showed the need for more sophisticated and more confidential technology to preserve information. Hence the popularity of steganography, in which the sent information is made invisible by hiding it inside carrier media such as audio, images, text, and video. This paper applies the idea of hiding an image message inside an image using the least significant bit algorithm, with a new encryption approach based on a genetic algorithm. To increase the security of the message, it is encrypted before hiding; the genetic algorithm is then used to generate the random numbers employed in the concealment process, for the highest degree of randomness, which in turn increases the strength of both encryption and concealment. The study achieved this by adopting the recommended approach in such cases.
Abstract: This research discusses the concept of model suitability by examining the mathematical behavior of three neural network models (GRNN, BPNN, PNN), applied to two types of medical data (osteoporosis and hearing impairment) that differ in classification method and in input and output spaces. The application of these data, and the suitability of the models in terms of their domain and range, shows that the PNN is the best at diagnosing the auditory data by average MSE, the GRNN is better at diagnosing the crisp bone (osteoporosis) data, which are more complex, and the BPNN generalizes best, especially when the test data are large compared with the training data.
Abstract: Concealment is one of the most important means used by security institutions for critical communications in all countries of the world, as it provides high security, especially over communication networks and the Internet. The improved concealment algorithm transforms the confidential message into binary form and then encrypts it using a key agreed upon by the two parties (sender and recipient). The cover image is then divided into blocks of size 8×8 and the standard deviation (STD) of each block is computed; the smallest and largest standard deviation values are found, along with the median value. The blocks whose standard deviation is less than or equal to the median value become the concealment key (i.e., they are adopted as hiding locations), and every bit of the message is embedded into the LSB of each selected block. The efficiency measures PSNR, MSE, and BER were used to evaluate the improved algorithm, and MATLAB was adopted for implementation.
Abstract: The research deals with the implementation of a new algorithm, called random scrambling, to encrypt multimedia files. To increase the security of the transmitted files, a secret key is used to prevent unauthorized persons from extracting them. The encryption was applied to image, sound, and video files. MATLAB was used to implement the algorithm because of the facilities it provides for dealing with multimedia files as well as its GUI. Experimental results demonstrate the efficiency of the algorithm in encrypting multimedia files.
Abstract: With the evolution and complexity of life, there is a need to use images in many scientific fields, such as weather forecasting, medicine, and engineering. These images suffer a lot of distortion during capture, because cameras are widely available and often in the hands of people without enough experience in capturing images with good lighting and quality, so it has become important to use methods that improve these images; among the best of these methods is contrast enhancement. In this research, a traditional adaptive contrast enhancement filter (ACE_mean) was applied, and this algorithm was developed into two new hybrid contrast enhancement algorithms: ACE_median and ACE_max&min. The two hybrid algorithms were obtained by merging the traditional ACE_mean algorithm with another practice for improving images, smoothing enhancement, which is used to remove noise from images. The results of each method were compared with those of the other algorithms to show the best of them.
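A minimal sketch of the adaptive contrast enhancement family described above (the gain and window size are assumptions): each pixel's deviation from a local statistic is amplified, and swapping the local mean for a local median gives the median-based hybrid variant.

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def ace(img, size=7, gain=2.0, local_stat=uniform_filter):
    # Adaptive contrast enhancement: amplify each pixel's deviation from
    # a local statistic. uniform_filter gives an ACE_mean-style filter;
    # pass median_filter for a median-based hybrid.
    img = img.astype(float)
    m = local_stat(img, size)
    out = m + gain * (img - m)
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.random.default_rng(0).integers(80, 170, (64, 64)).astype(np.uint8)
print(ace(img).std(), ">", img.std())                   # mean variant
print(ace(img, local_stat=median_filter).std())         # median variant
```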
Abstract: An improvement to a steganography method for hiding data in grey-level images is proposed in this paper. The method improves on the earlier Least Significant Bit (LSB) method by using the 5th and 6th bits of the pixel value for insertion, retrieving the message from the same bits. The idea deals with images of different sizes and formats (BMP, JPG, PNG) to obtain better and more efficient results than the original LSB method. Pixel locations are selected using a pseudo-random number generator, which uses the same key for insertion and for retrieval. Experiments show the superiority of the new idea through the performance measures PSNR, BER, and NC.
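A minimal sketch of the keyed pseudo-random pixel selection with insertion in the 5th bit (bit position 4; the key and message are illustrative, and the paper additionally uses the 6th bit):

```python
import numpy as np

def pixel_locs(n_pixels, n_bits, key):
    # Key-seeded PRNG: sender and receiver regenerate the same locations.
    return np.random.default_rng(key).choice(n_pixels, n_bits, replace=False)

def embed(img, bits, key):
    flat = img.flatten()
    for loc, b in zip(pixel_locs(flat.size, len(bits), key), bits):
        # Clear bit 4 (the 5th bit), then write the message bit into it.
        flat[loc] = (flat[loc] & ~np.uint8(1 << 4)) | np.uint8(b << 4)
    return flat.reshape(img.shape)

def extract(img, n_bits, key):
    flat = img.ravel()
    return [int(flat[loc] >> 4) & 1
            for loc in pixel_locs(flat.size, n_bits, key)]

img = np.random.default_rng(1).integers(0, 256, (16, 16)).astype(np.uint8)
msg = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(img, msg, key=42)
assert extract(stego, len(msg), key=42) == msg
```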
Abstract: The increase in security attacks and unauthorized intrusions has made network security one of the main subjects to be considered in the present data communication environment, and intrusion detection systems are one of the suitable solutions to prevent and detect such attacks. This paper aims to design and implement a Network Intrusion Detection System (NIDS) based on a genetic algorithm. To get rid of redundant and inappropriate features, principal component analysis (PCA) is used for feature selection. The complete NSL-KDD dataset is used for training and testing, and a number of different experiments were carried out. The experimental results show that the proposed system based on GA, using PCA to select five features of NSL-KDD, is able to speed up intrusion detection, minimize CPU time cost, and reduce training and testing time. The C# programming language was used for the implementation.
Abstract: Steganography is the art of hiding data within data, such as hiding a text message within an image, audio, or video file; it is a newer method used as a substitute for the known encryption technology. Hidden messages are distinguished by reaching their destination completely confidentially, unlike encrypted messages, which, although they can never be decoded without access to the encryption key, can still be identified as encrypted messages. In this work, a new algorithm is proposed that compresses and encodes the data using Huffman coding, then hides it in an audio file using a secret key to distribute the encoded data within the well-known LSB approach; the output is saved in a new audio file. In the recovery phase, the values are retrieved and decoded by the same method. Results show high accuracy, both in auditory perception measures comparing the original file with the file carrying the embedded data, and in the known performance metrics (signal-to-noise ratio, correlation coefficient, mean square error, and peak signal-to-noise ratio).
Abstract: In this work, a new hiding algorithm is proposed that uses two different domains (spatial and frequency) to provide security and protection for the transferred data. The data, a text message, is encrypted using a key with the direct standard method; the spatial domain is then used to hide it inside an audio file, and this file is in turn hidden inside a host audio file using the Discrete Cosine Transform (DCT). The text message data was fully retrieved after decryption, with a normalized correlation value equal to one.
Abstract: The world is now witnessing much development in the communication field, especially the Internet; therefore, it is very important to find security methods for files sent from one place to another. One of these security approaches is steganography, which can transfer messages within transmission media (files) without being noticed by any eavesdropper. This paper provides a newly suggested method for hiding text in an audio file (WAV file): first, the text file is read to extract the message to be hidden; for more security, this message is encrypted (or coded) before hiding it inside the sound media (in the time domain). The algorithm was applied to more than one example, and good results were obtained without losing the text message or noticing any noise in the cover file.
Abstract: Information and communication techniques show efficient technical capabilities for managing and controlling distributed operations among various factories. For this reason, industrial companies have started moving their operations to a cooperative mode to exploit these capabilities in enhancing production operations, achieving speed in product design and manufacturing. The importance of this research is that a cooperative factory owns intellectual capital, in the form of accumulated experience in designing distinctive car spare parts, that can be exploited by the other factories. This goal can be achieved by adopting an agent for each main operational activity and considering the agent a means of connection among the cooperative factories, in addition to connecting two industrial companies in a cooperative structure with an integrated database for controlling and managing production activities. The study also provides a methodology for advanced manufacturing systems within the framework of virtual manufacturing, and builds the application and programming model, to benefit Iraqi industrial companies in dealing with recession and in enhancing their competitive position among Arab and international companies. The research presents a proposed system model called the Approach for Designing Distributed Manufacturing Systems (ADMS), which deals with factories' distributed systems, including databases and design tools, distributed across a computer network. The system serves incoming queries about spare parts, sent to the main computer (server) from the other connected computers (servers), without any conflict, by using network techniques and the capabilities of the database language. A server/server model was designed using Visual FoxPro as the database management system and the AutoCAD package for spare part design. The research samples are TOYOTA car spare parts, namely the wheel gears inside the gearbox, collected from the Technical Institute labs in Mosul.
Abstract: The aim of this research is to use neural networks in the field of future forecasting, to project the results of women's jumping competitions in the international Olympics for 2016-2024. An expert system named (AAA) was designed using neural networks for forecasting, over a data series from 1984 to 2012, which represents 8 Olympic cycles. The data represent the first three winners in the running competitions (100 m, 200 m, 400 m, 100 m hurdles, 400 m hurdles, 4×100 relay, 4×400 relay). The programs prepared for this research were written in C++. The system forecasts three future Games (2016, 2020, 2024), the Olympic cycle taking place every 4 years. The results showed that the forecast values obtained using neural networks are better than those of the traditional methods used before. This paper depends on the results of the athletes who took Olympic medals in the women's jumping events (long jump, triple jump, high jump, and pole vault) over 8 Olympic cycles, from the 1984 cycle to the last Olympic cycle in 2012; the 1984 cycle was used as the beginning of the study, being the first Olympic cycle considered.
Abstract: Recently the wavelet transform has gained a lot of popularity in the field of signal and image processing due to its capability of providing both time and frequency information simultaneously. This paper focuses on the two-dimensional Discrete Wavelet Transform (2D-DWT), based on the conventional convolution approach, for image processing, implemented on an FPGA (Field Programmable Gate Array) using VHDL, since much research in recent years has targeted this hardware. Two VHDL architectures are proposed to implement the conventional 5/3-tap biorthogonal filter bank: a simple, straightforward one and an optimized one that substitutes the multipliers used for scaling with shift-add operations. The optimized architecture was designed and implemented on a Xilinx XC3S500E Spartan-3E FPGA.
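The paper implements the 5/3 filter bank by convolution, but the reason shift-add can replace the scaling multipliers is that the 5/3 coefficients are dyadic rationals. The lifting form of the same transform, sketched below in Python (an illustration, not the authors' VHDL; edge samples are clamped rather than mirrored for brevity), makes this explicit by using only adds and shifts:

```python
def dwt53(x):
    # One level of the Le Gall 5/3 DWT in lifting form: only additions
    # and shifts, which is why the optimized hardware needs no multipliers.
    n = len(x)
    # Predict step: detail (high-pass) coefficients.
    d = [x[2*i + 1] - ((x[2*i] + x[min(2*i + 2, n - 1)]) >> 1)
         for i in range(n // 2)]
    # Update step: approximation (low-pass) coefficients.
    s = [x[2*i] + ((d[max(i - 1, 0)] + d[i] + 2) >> 2)
         for i in range(n // 2)]
    return s, d   # apply to rows then columns for the separable 2-D DWT

print(dwt53([10, 12, 14, 90, 14, 12, 10, 8]))
```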
Abstract: In this work, an algorithm is proposed for selecting the best band and reducing the high dimensionality of remote sensing data, based on multiple algorithms; each one was carried out and its results studied independently before combining them into the proposed algorithm. In the principal component analysis algorithm, the covariance matrix of the processed bands is computed, and the eigenvectors are then found using Jacobi's method; the best band is the one with the highest eigenvector value. The algorithm was applied to several groups of multispectral images from the Thematic Mapper sensor: for the first group, the sixth band was the best, because the value of its eigenvector was the biggest; for the second group, the second band was the best; and for the third group, the fifth band was the best (the sixth, infrared, band was separated out in all three groups). The wavelet transform algorithm was applied, for one level of analysis, to select the best band according to the least mean square error. To evaluate the band selection, the K-means algorithm was used to classify the images; a new way was proposed to determine the centers, which is important for accurate classification, specifying the initial centers by finding the maximum and minimum values and taking the mean between them, repeatedly, until the required number of centers is reached. This classification was applied to three groups of multispectral images, classifying the full set of bands to produce a single band. Finally, a new algorithm was constructed from the previous three: it applies the wavelet transform to the multispectral images, finds the signal-to-noise ratio based on the variance of each band, sorts the bands in descending order, chooses the bands with the highest values as the best bands, and applies principal component analysis to them; after finding the eigenvectors and selecting the highest values, classification is performed. Applying the proposed algorithm showed that it is the best, with high efficiency and accuracy both in classification and in finding the best band.
Abstract: Estimation models in software engineering are used to predict important future features of a software project, such as the effort required to develop it. Software failures are mainly due to faulty project management practices, and software project effort estimation is an important step in managing large projects; continuous change in software projects makes effort estimation even more challenging. The main objective of this paper is to find a model that gives more accurate estimates. We used the Intermediate COCOMO model, categorized as the best of the traditional algorithmic effort estimation techniques, together with artificial neural network approaches (FFNN, CNN, ENN, RBFN), because the ability of an ANN (Artificial Neural Network) to model a complex set of relationships between the dependent variable (effort) and the independent variables (cost drivers) makes it a potential estimation tool. This paper presents a performance analysis of the ANNs used in effort estimation. The networks were created and simulated with the MATLAB NNTool using the NASA aerospace dataset, which contains the features and actual effort of 60 software projects. The results show that the neural networks in general improve on the performance of traditional COCOMO; the ENN was the best of the neural networks, the CNN was the next best, and COCOMO was the worst of the methods used.
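For reference, Intermediate COCOMO estimates effort as a * KLOC^b * EAF person-months, where EAF is the product of the 15 cost-driver multipliers; a minimal sketch (the driver values shown are hypothetical):

```python
# Intermediate COCOMO coefficients (a, b) per development mode.
MODES = {"organic": (3.2, 1.05),
         "semi-detached": (3.0, 1.12),
         "embedded": (2.8, 1.20)}

def cocomo_effort(kloc, cost_drivers, mode="organic"):
    # Effort (person-months) = a * KLOC**b * EAF, with EAF the product
    # of the cost-driver multipliers (e.g. RELY, CPLX, ...).
    a, b = MODES[mode]
    eaf = 1.0
    for m in cost_drivers:
        eaf *= m
    return a * kloc**b * eaf

print(cocomo_effort(32.0, [1.15, 0.85, 1.06]))  # hypothetical project
```

An ANN-based estimator replaces this fixed functional form with a network trained on (cost drivers, KLOC) inputs against actual effort, which is what the compared FFNN, CNN, ENN, and RBFN models do.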
Abstract: This research presents a practical application of the process of assessing the overall risk of projects, following the risk management steps of software engineering. The most famous life cycle in this area, the Spiral Model, forms the core of the process of building a system based on customer information gathered in the form of hearings and conversations. The research relied on the EA tool (Enterprise Architect), built by Sparx Systems, to represent the analysis stages, as it is currently the basis most widely used in the world and the most recent. The basic idea of this work is to find the aggregate risk rate of a project by using a fuzzy model over fuzzy numbers, following Lee's algorithm. The system was applied in practice to a bank (public administration, northern region), on data for the analysis and risk assessment of short-term and long-term bank credit.
Abstract: Reverse engineering presents a solution to a major problem in the development and maintenance of legacy software, namely the process of understanding it, a difficult task because most legacy software lacks proper documentation or a correct design model. The Unified Modeling Language plays an important role in extracting the specifications of software, in accordance with the principles of reverse engineering, and modeling them using one of its diagram types, the class diagram. A reverse-engineered class diagram is an abstract representation giving an overview of the software structure; it does not give full information about the internal details and relationships of software components. In this research, a computer-aided software engineering tool called RMDT (Relational Meta Data Table) has been constructed. It is based on an interpreter that analyzes the entered software's source code to extract information that assists in understanding its structure and clarifies its components and the relationships binding its parts (the internal structure of the entered software). The RMDT tool represents the information in tables designed in a highly flexible manner, suitable for future use when applying software re-engineering to the entered software. Furthermore, the research studied and tested several of the most common software engineering tools used for reverse engineering (Reverse, ArgoUML, Rational Rose, Enterprise Architect (EA), class2uml, Together), focusing on how these tools produce a class diagram from software source code written in Java; the produced class diagram includes the number of classes, the relationship types among classes, and the common classes. The results obtained from the RMDT tool were compared with those obtained from the others. The tables produced by RMDT include all the information required to recover the design, as they are used to produce a class diagram, providing class, method, and variable names, method parameter names, interfaces, relationships (association, dependency, and generalization), and visibility, details not found in the class diagrams produced by the other tools.
Abstract: Computer users need advanced methods for accessing stored information at high speed, with accurate retrieval of information from among the millions of files stored on a personal computer, on request. This research aims to design and build a search engine that works on a personal computer or an intranet to retrieve required files, using natural language dialogue between the user and the computer. It relies on keywords as an essential semantic element, unlike the search facilities available in operating systems, which support only path and file name. The search engine program (Search Engine for PC and Intranet - SPCI) covers the most important file types that the user interacts with on an ongoing basis, especially text files, where a text file can be retrieved just by mentioning a word or phrase within the text. The SPCI program was developed under Microsoft Visual C# version 3.0.
Abstract: In this research, the tools and techniques of artificial intelligence were studied and employed in software engineering, using Particle Swarm Optimization (PSO) and Cat Swarm Optimization (CSO) to automatically generate optimal test cases for software written in C++. This enables the corporation developing the program to save time and costs as well as ensuring the quality of the test process, which is estimated at 50% of the product cost. The software engineering tool Generate Test Suite (GTS TOOL) was constructed and modeled with the aid of the computer; it generates optimal test cases automatically, supports drawing the control flow graphs and paths inside the program, and tests each path using CSO and PSO. The proposed tool succeeded in generating optimal test cases for several programs in a very short time: the average time to generate the test cases was 4 minutes using PSO and 1.2 minutes using CSO, so the performance of CSO was much better than that of PSO.
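A minimal PSO sketch of the underlying idea (the branch-distance fitness and all constants are illustrative assumptions, not the tool's actual objective): particles search the program's input space for a test case that drives a target branch condition toward truth.

```python
import numpy as np

def branch_distance(x):
    # Toy fitness: distance to satisfying the branch `if (2*a == b)`,
    # so a test case covering that path minimizes |2a - b|.
    a, b = x
    return abs(2 * a - b)

def pso(fitness, dim=2, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    pos = rng.uniform(-100, 100, (n, dim))
    vel = np.zeros((n, dim))
    pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        # Standard velocity update: inertia + cognitive + social terms.
        vel = w*vel + c1*r1*(pbest - pos) + c2*r2*(gbest - pos)
        pos += vel
        f = np.array([fitness(p) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

print(pso(branch_distance))   # inputs (a, b) with b close to 2a
```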