OpenCV MLP is not working as the docs say it should.

I am new to OpenCV and in particular to the multilayer perceptron (MLP). I have looked through the docs and I am a bit confused on some points. Basically, I am trying to build a basic neural network that outputs 1 when the input is -1 and also outputs 1 when the input is 1. I am using CvANN_MLP::SIGMOID_SYM as the activation function of each neuron, with parameters α = 1 and β = 1. Thus, according to the doc, the output of the MLP should always be inside [-1; 1]. But I am able to get an output greater than 1 with the following code:

    #include <opencv2/core/core.hpp>
    #include <opencv2/ml/ml.hpp>
    #include <iostream>

    using namespace cv;
    using namespace std;

    int main()
    {
        // creation of the multilayer perceptron: 1 input, 2 hidden, 1 output neuron
        Mat layers = cv::Mat(3, 1, CV_32SC1);
        layers.row(0) = Scalar(1);
        layers.row(1) = Scalar(2);
        layers.row(2) = Scalar(1);

        CvANN_MLP mlp;
        mlp.create(layers, CvANN_MLP::SIGMOID_SYM, 1, 1);

        // the training inputs and outputs
        Mat trainingData(2, 1, CV_32FC1);
        Mat trainingClasses(2, 1, CV_32FC1);
        trainingData.at<float>(Point(0, 0)) = 1;
        trainingData.at<float>(Point(0, 1)) = -1;
        trainingClasses.at<float>(Point(0, 0)) = 1;
        trainingClasses.at<float>(Point(0, 1)) = 1;

        // the training params
        CvANN_MLP_TrainParams params;
        CvTermCriteria criteria;
        criteria.max_iter = 1;
        criteria.epsilon = 0.00001f;
        criteria.type = CV_TERMCRIT_ITER | CV_TERMCRIT_EPS;
        params.train_method = CvANN_MLP_TrainParams::BACKPROP;
        params.bp_dw_scale = 1.0f;
        params.bp_moment_scale = 1.0f;
        params.term_crit = criteria;

        mlp.train(trainingData, trainingClasses, Mat(), Mat(), params);

        // predict on the training samples and print the raw network outputs
        for (int j = 0; j < trainingData.rows; j++) {
            Mat input = trainingData.row(j);
            Mat output(1, 1, CV_32FC1);
            mlp.predict(input, output);
            cout << output.at<float>(0, 0) << " ";
        }
        cout << endl;

        return 0;
    }

This code prints the following to the console: "1.06762 0.972991".
We can see that the MLP outputs 1.06762 instead of 1 when the input is equal to 1. Am I doing something wrong?
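For context, here is a small standalone sketch (my own check, not part of the program above) that just evaluates the symmetric sigmoid formula from the docs, f(x) = β · (1 − e^(−αx)) / (1 + e^(−αx)), with α = β = 1. Every value it prints stays inside [-1, 1], which is why the 1.06762 above surprises me:

    // Standalone check of the symmetric sigmoid described in the CvANN_MLP docs:
    // f(x) = beta * (1 - exp(-alpha * x)) / (1 + exp(-alpha * x))
    #include <cmath>
    #include <iostream>

    static double sigmoidSym(double x, double alpha = 1.0, double beta = 1.0)
    {
        return beta * (1.0 - std::exp(-alpha * x)) / (1.0 + std::exp(-alpha * x));
    }

    int main()
    {
        // Sample a wide range of inputs; the printed values never leave [-1, 1].
        for (double x = -10.0; x <= 10.0; x += 2.5)
            std::cout << "f(" << x << ") = " << sigmoidSym(x) << std::endl;
        return 0;
    }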
