Bulletin of the Transilvania University of Braşov, Series III: Mathematics, Informatics, Physics, Vol 9(58)


SEMINAL QUALITY EVALUATION WITH RBF NEURAL NETWORK

Abdulkader HELWAN1, Adnan KHASHMAN*2, Ebenezer O. OLANIYI3, Oyebade K. OYEDOTUN4 and Oyetade A. OYEDOTUN5

Abstract

The orthodox system of diagnosis in medicine requires that laboratory procedures be performed to obtain the results of diagnostic queries. While this practice is standard, the field of machine learning is now revolutionizing medical diagnosis. Data (medical histories) of different patients archived over long periods can be used to make predictions on new cases with reasonable accuracy using suitable machine learning methods. Moreover, in many instances where the current condition of a patient does not yet justify the cost of expensive laboratory test procedures, machine learning methods can be used to learn from past medical data, with which new cases can be diagnosed; the cost of this approach is more reasonable and justifiable. In this work, we apply a radial basis function neural network to the evaluation of semen quality (viability) using attributes relating to the patients. The performance of the considered model is assessed with precision, recall, accuracy and F1-score based on a 10-fold cross-validation scheme.

2010 Mathematics Subject Classification: Primary 60G25, 62M20; Secondary 93E11, 93E10.

Key words: Semen quality, evaluation, diagnosis, prediction, radial basis function neural network.

1 Near East University, Lefkosa, Mersin-10, North Cyprus, Turkey, e-mail: [email protected]
2 *Corresponding author, European Centre for Research and Academic Affairs (ECRAA), Lefkosa, North Cyprus; Engineering Faculty, Final International University, Girne, Mersin 10, Turkey, e-mail: [email protected]
3 European Centre for Research and Academic Affairs, Lefkosa, Mersin-10, North Cyprus, Turkey, e-mail: [email protected]
4 European Centre for Research and Academic Affairs, Lefkosa, Mersin-10, North Cyprus, Turkey, e-mail: [email protected]
5 Microbiology, Federal University of Agriculture, Abeokuta, Nigeria, e-mail: [email protected]

1 Introduction

Modern lifestyle is rapidly changing many aspects of human life, and the quest to keep up with our now fast-paced environment is driving us towards living conditions that adversely affect our health. One of these is the purported decline in seminal fluid quality, which is claimed to be related to environmental factors such as pollution and dietary change [1], [2]. Recently, many health organizations and agencies have been raising awareness of this health issue. It therefore becomes necessary to carry out semen analysis to determine whether its quality suggests that fertilization capability is unaffected; that is, such analysis aims to determine the viability of the seminal fluid. Generally, laboratory tests are employed for the analysis of semen quality. These laboratory tests are expensive and often tedious [3]. However, there is an emerging trend of employing machine learning methods for profiling semen in order to ascertain viability. For example, Linneberg, Salamon, Svarer et al. employed machine learning for the evaluation of the human sperm head based on morphological features [4]. In that work, shape discrimination was achieved using Fourier analysis and a neural network, and a framework was developed for classifying processed sperm cells as normal or abnormal; a test error of 25% was reported. More recently, Gil, Girela, De Juan et al. applied artificial intelligence methods to the prediction of seminal quality [5]. They employed techniques such as decision trees, multilayer perceptrons and support vector machines for learning extracted features (attributes) which are characteristic of seminal quality; the reported experimental results suggest that such artificial intelligence techniques are capable of assessing seminal fluid quality with reasonable precision, recall and accuracy.
In this work, we consider some critical factors that have been shown to relate to the quality (or viability) of seminal fluid: the season in which the examination is carried out (winter, spring, summer, fall), age of the subject (18-36 years), childhood diseases (i.e. chicken pox, measles, mumps, polio), accident or serious trauma, surgical intervention, history of high fevers within the last year, frequency of alcohol consumption, smoking habit, and number of hours spent sitting per day, for classifying semen as normal or altered [6]. Note that normal translates as viable and altered as non-viable. In particular, we employ a radial basis function neural network (RBFNN) to learn the mapping from the aforementioned factors (attributes) to the classes normal or altered. Table 1 shows the details of the considered attributes. The attributes in Table 1 are normalized into the range 0 to 1, a value range suitable for neural network input.
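The normalization step described above can be sketched as a simple min-max scaling of each attribute column; the sample rows and raw value encodings below are illustrative only, not the authors' exact encoding of the fertility attributes:

```python
import numpy as np

# Two hypothetical raw attribute rows (season code, age, childhood disease,
# accident, surgery, high fever, alcohol, smoking, sitting hours); the
# values and encodings here are illustrative only.
raw = np.array([
    [1.0, 30.0, 0.0, 1.0, 0.0, 0.0, 0.8, 0.0, 16.0],
    [4.0, 18.0, 1.0, 0.0, 1.0, 1.0, 0.2, 1.0, 2.0],
])

# Min-max scaling of each attribute column into the range [0, 1]
col_min = raw.min(axis=0)
col_max = raw.max(axis=0)
scaled = (raw - col_min) / (col_max - col_min)
```

After scaling, every attribute lies in [0, 1], which matches the input range the network expects.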


Table 1. Fertility data set with details of attributes and range of values

2 Radial basis function neural network (RBFNN)

The radial basis function neural network (RBFNN) relies on the learning capability of interconnected artificial neurons, as is evident in many applications [7]-[15]. An RBFNN typically has three layers: an input layer, one hidden layer and an output layer. The input layer presents the input attributes to the network. The hidden layer contains neurons which are positioned in the space defined by the training data using some pre-defined technique. The hidden layer neurons compose radial basis functions centred on points of the same dimensionality as the training data: each hidden neuron computes the Euclidean distance from its centre to the input, then applies a radial basis function to that distance to obtain its output. The output layer solely performs a linear combination of the hidden layer outputs using the weights associated with each hidden layer neuron. Generally, radial basis functions are chosen to have the convenient property that their responses (outputs) decrease monotonically with distance from their central points. The RBFNN can be seen as a universal function approximator relying on the expansion of input data into a higher dimensional space [16]; it also features a simpler and much faster training scheme than the back-propagation neural network (BPNN). One of the highlights of the RBFNN is that input data are projected onto a higher dimensional feature space by the hidden layer neurons. According to Cover's theorem, the probability of data being linearly separable increases when features are expanded onto a higher dimensional hidden space.


For example, consider a dataset $\{X_n, t_n\}$, where $X_n$ are input pattern vectors and $t_n$ are the corresponding target vectors. Hidden layer neurons are centred on each training point, giving a set of $n$ basis functions with activations of the form $\varphi(\lVert X_n - X_c \rVert)$ [17], where $\varphi(\cdot)$ is the non-linear radial basis function and $X_c$ are the hidden neuron centres. Since the output layer neurons compute linear combinations of the hidden layer activations, the output can be written as Equation (1), where $j$ indexes the hidden layer neurons, $w_j$ are the hidden-to-output weights and $n$ is the number of hidden layer basis functions [18]:

$$ Y = \sum_{j=1}^{n} w_j \, \varphi(\lVert X_n - X_c \rVert) \qquad (1) $$

The aim of learning in RBFNN is to obtain a set of weights $w_j$ which allows the function $Y$ to pass through the data points contained in the training data.

$$ \begin{bmatrix} \varphi_{11} & \varphi_{12} & \cdots & \varphi_{1N} \\ \varphi_{21} & \varphi_{22} & \cdots & \varphi_{2N} \\ \vdots & \vdots & \ddots & \vdots \\ \varphi_{N1} & \varphi_{N2} & \cdots & \varphi_{NN} \end{bmatrix} \begin{bmatrix} w_1 \\ w_2 \\ \vdots \\ w_N \end{bmatrix} = \begin{bmatrix} t_1 \\ t_2 \\ \vdots \\ t_N \end{bmatrix} \qquad (2) $$

From multivariate interpolation analysis, given a training dataset $\{x_n, t_n\}$ with $N$ distinct examples, where $x_n \in \mathbb{R}^m$ and $t_n \in \mathbb{R}$, the aim is to find a function $F : \mathbb{R}^m \to \mathbb{R}$ that satisfies $F(x_n) = t_n$ for all $N$ examples; hence we can write the system of simultaneous linear equations of Equation (2), where $\varphi_{nc} = \varphi(\lVert X_n - X_c \rVert)$ for $n, c = 1, 2, \ldots, N$. Letting $w = [w_1, w_2, \ldots, w_N]^{\mathsf{T}}$ and $t = [t_1, t_2, \ldots, t_N]^{\mathsf{T}}$, Equation (2) can be rewritten as Equation (3):

$$ \varphi w = t \qquad (3) $$

where $\varphi$ is referred to as the interpolation matrix. Note that when $\varphi$ is a non-singular matrix, $w$ can be obtained analytically as in Equation (4):

$$ w = \varphi^{-1} t \qquad (4) $$

The critical point in obtaining $w$ is ensuring that $\varphi$ is indeed a non-singular matrix. Fortunately, this can be guaranteed for several types of radial basis functions under specific constraints. The most commonly used radial basis function, and the one used within this work, is the Gaussian; Equation (5) expresses the Gaussian function [19]:

$$ \varphi(r) = \exp\left(\frac{-r^2}{2\sigma^2}\right) \qquad (5) $$

where $r = \lVert X_n - X_c \rVert$ and $\sigma$ is the smoothing parameter (spread constant) of the Gaussian function.
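Equations (1)-(5) can be illustrated with a minimal numpy sketch of exact Gaussian RBF interpolation: build the interpolation matrix, solve for the weights, and verify the network reproduces the targets at the training points. The toy training set below is purely illustrative:

```python
import numpy as np

def gaussian(r, sigma=1.0):
    # Equation (5): Gaussian radial basis function
    return np.exp(-r**2 / (2.0 * sigma**2))

# Toy training set {X_n, t_n}; values are illustrative only
X = np.array([[0.0], [0.5], [1.0]])
t = np.array([0.0, 1.0, 0.0])

# Interpolation matrix phi_nc = phi(||X_n - X_c||), as in Equation (2)
dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
phi = gaussian(dists)

# Equation (4): w = phi^{-1} t (phi is non-singular for distinct centres)
w = np.linalg.solve(phi, t)

# Equation (1): network output is a linear combination of basis functions
Y = phi @ w
```

Because the Gaussian interpolation matrix on distinct centres is non-singular, `Y` matches the targets `t` exactly at the training points.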

3 Experiments

A radial basis function neural network is designed to map the aforementioned semen attributes (model inputs) to viability (model output). The model has nine input attributes, as discussed in Section 1, and two output neurons to accommodate the two viability classes, i.e. normal (viable) and altered (non-viable). The Gaussian function is used in the hidden layer, while the logistic-sigmoid function, given in Equation (6), is used in the output layer. Figure 1 shows the topology of the designed neural network.

$$ O_k = \frac{1}{1 + \exp(-TP_k)} \qquad (6) $$

where $TP_k$ is the total potential of output neuron $k$, which can be computed using Equation (7), with $O_j$ the output of hidden layer neuron $j$:

$$ TP_k = \sum_{j=1}^{m} w_{kj} O_j \qquad (7) $$

Several experiments are performed to determine a suitable value for the spread constant. The number of hidden neurons is set equal to the number of training data points (i.e. 100 hidden neurons), as in the typical RBFNN architecture. There are also 2 output layer neurons, each responding maximally to one of the two classes within this work, i.e. normal (viable) semen and altered (non-viable) semen. Note that since the dataset is small (100 data points), a 10-fold cross-validation scheme is implemented for training in order to better capture the distribution of the input attributes.
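A forward pass of the described architecture (a Gaussian hidden layer centred on the training points, followed by logistic-sigmoid output neurons) can be sketched as follows; the random centres and weights below stand in for a trained model and are not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

n_train, n_inputs, n_outputs = 100, 9, 2
sigma = 1.0                     # spread constant (as for RBFNN5 in the text)

# Hidden-layer centres: one per training point, as described in the text;
# random placeholders here rather than the actual fertility data.
centres = rng.random((n_train, n_inputs))
W = rng.standard_normal((n_train, n_outputs))   # hidden-to-output weights

def forward(x):
    # Gaussian activations on Euclidean distances to the centres
    r = np.linalg.norm(centres - x, axis=1)
    O_j = np.exp(-r**2 / (2.0 * sigma**2))
    # Equation (7): total potentials; Equation (6): logistic-sigmoid outputs
    TP_k = O_j @ W
    return 1.0 / (1.0 + np.exp(-TP_k))

out = forward(rng.random(n_inputs))
pred = int(out.argmax())        # 0: normal (viable), 1: altered (non-viable)
```

The predicted class is simply the output neuron with the maximal response.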

Figure 1: The designed radial basis function neural network topology


That is, the data is partitioned into 10 different segments; 9 of the 10 segments are used for learning, while the remaining segment is used for testing the generalization of the model. This procedure is repeated 10 times by selecting different combinations of segments for training and the left-out segment for testing. The average training time for the RBFNN is 0.402 seconds. Table 2 shows the training performances of the models with different spread constant values. Note that the performances reported within this work are averages (and standard deviations) over the 10 experiments of the 10-fold cross-validation scheme discussed earlier. It can be seen that RBFNN5, with a spread constant of 1.0, achieved the highest performance on many of the performance metrics. The performance metrics reported within this work include precision, recall, F1-score and accuracy (Tables 2, 3 and 4).
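The 10-fold partitioning described above can be sketched in a few lines; the placeholder score stands in for actual model training and evaluation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100                                      # dataset size (fertility data)
folds = rng.permutation(n).reshape(10, 10)   # 10 disjoint segments of 10

fold_scores = []
for k in range(10):
    test_idx = folds[k]                      # 1 held-out segment for testing
    train_idx = np.concatenate([folds[j] for j in range(10) if j != k])
    # Model fitting and evaluation would happen here; a constant
    # placeholder score is recorded instead.
    fold_scores.append(len(test_idx) / n)

# Averages and standard deviations, as reported in Tables 2-4
mean_score = float(np.mean(fold_scores))
std_score = float(np.std(fold_scores))
```

Each of the 10 iterations trains on 90 points and tests on the held-out 10, and the reported figures are the mean and standard deviation across folds.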

Table 2. Performance parameters on training data for RBFNNs with different spread constants, where the expressions for the performance metrics are: precision = TP/(TP+FP); recall = TP/(TP+FN); F1-score = 2(precision × recall)/(precision + recall); accuracy = (TP+TN)/(TP+TN+FP+FN).
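The metric expressions in the caption of Table 2 translate directly into code; the confusion-matrix counts below are illustrative and not taken from the paper's tables:

```python
def metrics(tp, tn, fp, fn):
    # Expressions from the caption of Table 2
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    return precision, recall, f1, accuracy

# Illustrative confusion-matrix counts (not from the paper's tables)
p, r, f1, acc = metrics(tp=8, tn=85, fp=2, fn=5)
```

With these counts, precision is 8/10 and accuracy is 93/100, showing how accuracy can look strong even when recall (8/13) is modest on the minority class.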

4 Experimental results and discussion

The RBFNN models described in Section 3 are built using training data obtained from the 10-fold cross-validation scheme also described in the same section. However, it is not enough to rely on the performance of such models based solely on training data; to ascertain that the trained models have good generalization capability, we also report the performances of the trained models on testing data obtained from the aforementioned 10-fold cross-validation scheme. Table 3 shows the averages and standard deviations of the models on the considered performance metrics. From Table 3, it can be seen that RBFNN2, with a spread constant of 0.3, achieved the highest performances on many of the performance metrics. Also, although RBFNN5 achieved the highest performances on many performance metrics on the training data (Table 2), there is a drastic decline in its performance on the testing data (Table 3). Hence, it can be said that overfitting is observed in RBFNN5. Conversely, the performances of RBFNN2 on both training and testing data are quite decent; hence, suitable learning is observed in RBFNN2. Furthermore, in Table 4, we provide a comparative analysis of the best experimental results obtained within this work (i.e. from RBFNN2) against an earlier work which employed machine learning methods such as the multilayer perceptron (MLP), support vector machine (SVM) and decision tree (DT) [5].

Table 3. Performance parameters on testing data for RBFNNs with different spread constants

Table 4. Comparison of experimental results with an earlier work

Table 4 shows that the best model obtained within this work provides competitive performance in comparison with the models from the aforementioned earlier work. More importantly, we consider the time required to build the models: Table 4 also reports the average training time for our RBFNN model and estimated training times for the models from the earlier work. Although the earlier work did not report training times for its models, it is not unreasonable to present our estimates in Table 4. That is, the training times for the MLP and SVM models are far greater than one second; the training time for the DT cannot be estimated (though it should be very small) and is therefore denoted 'XX'; while the training time for our RBFNN model is approximately 0.4 seconds. Hence, when the performance metrics of the model presented within this work are compared with the models from the earlier work relative to training times, it can be seen that the RBFNN is promising for such an application.

5 Conclusion

Laboratory procedures are usually carried out to determine the quality of seminal fluid. However, artificial intelligence methods are increasingly being adopted in many decision support systems. Moreover, they provide a relatively inexpensive approach to medical diagnosis compared with laboratory setups. In this work, we consider the problem of semen quality analysis. We employ a radial basis function neural network to learn the classification of patients' attributes which emerging research suggests are related to semen quality (i.e. concentration), which inevitably affects fertilization capability. The presented framework allows the evaluation of seminal fluid to determine whether it is normal or altered. We evaluate the performance of the designed radial basis function neural network using metrics which include precision, recall, F1-score and accuracy. The experimental results obtained from the employed model based on a 10-fold cross-validation scheme are promising and suggest that it can be successfully deployed as an application.

References

[1] Carlsen, E., Giwercman, A., Keiding, N. and Skakkebæk, N. E., Evidence for decreasing quality of semen during past 50 years, BMJ. 305 (1992), 609-613.

[2] Sharpe, R. M. and Skakkebaek, N. E., Are oestrogens involved in falling sperm counts and disorders of the male reproductive tract?, The Lancet. 341 (1993), 1392-1396.

[3] Vasan, S. S., Semen analysis and sperm function tests: How much to test?, Indian Journal of Urology. 27, no. 1, (2011), 41.

[4] Linneberg, C., Salamon, P., Svarer, C., Hansen, L. K. and Meyrowitsch, J., Towards semen quality assessment using neural networks, In IEEE Workshop on Neural Networks for Signal Processing IV (1994), 509-517.

[5] Gil, D., Girela, J. L., De Juan, J., Gomez-Torres, M. J. and Johnsson, M., Predicting seminal quality with artificial intelligence methods, Expert Systems with Applications. 39, no. 16, (2012), 12564-12573.

[6] Gil, D. and Girela, J. L., Fertility data set. UCI Machine Learning Repository, available online: https://archive.ics.uci.edu/ml/datasets/Fertility (2013).

[7] Khashman, A., IBCIS: Intelligent blood cell identification system, Progress in Natural Science. 18, no. 10, (2008), 1309-1314.

[8] Khashman, A. and Dimililer, K., Neural Networks Arbitration for Optimum DCT Image Compression, Appl. Math. Lett. 18, no. 10, (2008), 1309-1314.


[9] Khashman, A. and Al-Zgoul, E., Image Segmentation of Blood Cells in Leukemia Patients, 4th WSEAS International Conference on Computer Engineering and Applications (CEA'10), Cambridge, MA, USA, 27-29 January (2010).

[10] Khashman, A., Blood Cell Identification Using Emotional Neural Networks, Journal of Information Science and Engineering. 6, no. 6, (2009), 1737-1751.

[11] Khashman, A., Application of an Emotional Neural Network to Facial Recognition, Neural Computing and Applications. 18, no. 4, (2009), 309-320.

[12] Khashman, A. and Dimililer, K., Comparison Criteria for Optimum Image Compression, Proceedings of the IEEE International Conference on Computer as a Tool (EUROCON'05), Serbia & Montenegro, 21-24 November (2005).

[13] Khashman, A., Face Recognition Using Neural Networks and Pattern Averaging, International Symposium on Neural Networks, China (2006), 98-103.

[14] Khashman, A. and Nwulu, N., Support Vector Machines versus Back Propagation Algorithm for Oil Price Prediction, International Symposium on Neural Networks, China (2011).

[15] Oyedotun, O. K. and Khashman, A., Document segmentation using textural features summarization and feedforward neural network, Applied Intelligence. 45, no. 1, (2016), 198-212.

[16] Babu, G. S. and Suresh, S., Sequential projection-based metacognitive learning in a radial basis function network for classification problems, IEEE Transactions on Neural Networks and Learning Systems. 24, no. 2, (2013), 194-206.

[17] Billings, S. A., Wei, H. L. and Balikhin, M. A., Generalized multiscale radial basis function networks, Neural Networks. 20, no. 10, (2007), 1081-1094.

[18] Ng, W. W., Dorado, A., Yeung, D. S., Pedrycz, W. and Izquierdo, E., Image classification with the use of radial basis function neural networks and the minimization of the localized generalization error, Pattern Recognition. 40, no. 1, (2007), 19-32.

[19] Cârstea, C., Optimization Techniques in Project Controlling, Ovidius University Annals Economic Sciences Series, Ovidius University Press. 13 (2013), 428-432.

[20] Chang, G. W., Chen, C. I. and Teng, Y. F., Radial-basis-function-based neural network for harmonic detection, IEEE Transactions on Industrial Electronics. 57, no. 6, (2010), 2171-2179.

