
Artificial Intelligence (K Nearest Neighbor) in OpenCV

In pattern recognition, the k-nearest neighbor algorithm (k-NN) is a method for classifying objects based on the closest training examples in the feature space. k-NN is a type of instance-based learning, or lazy learning, where the function is only approximated locally and all computation is deferred until classification. The k-nearest neighbor algorithm is among the simplest of all machine learning algorithms: an object is classified by a majority vote of its neighbors and assigned to the class most common among its k nearest neighbors (k is a positive integer, typically small). If k = 1, the object is simply assigned to the class of its single nearest neighbor.
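To make the voting rule concrete, here is a minimal, self-contained sketch in plain C++ (no OpenCV) that classifies a 2-D query point by a majority vote among its k nearest training points. The sample coordinates and labels are made up purely for illustration:

#include <algorithm>
#include <iostream>
#include <map>
#include <utility>
#include <vector>

struct Sample { float x, y; int label; };

// Classify the query point (qx, qy) by a majority vote of its k nearest
// training samples (assumes k <= train.size()).
int knnClassify(const std::vector<Sample>& train, float qx, float qy, int k)
{
    // Pair each training sample with its squared Euclidean distance to the query.
    std::vector<std::pair<float,int> > dist;
    for (const Sample& s : train) {
        float dx = s.x - qx, dy = s.y - qy;
        dist.push_back({dx*dx + dy*dy, s.label});
    }
    // Partially sort so the first k entries are the k nearest neighbors.
    std::partial_sort(dist.begin(), dist.begin() + k, dist.end());
    // Count votes per class among those k neighbors.
    std::map<int,int> votes;
    for (int i = 0; i < k; i++) votes[dist[i].second]++;
    // Return the class with the most votes.
    int best = -1, bestCount = -1;
    for (auto& v : votes)
        if (v.second > bestCount) { bestCount = v.second; best = v.first; }
    return best;
}

int main()
{
    // Two made-up clusters: class 0 near (1,1), class 1 near (8,8).
    std::vector<Sample> train = { {1,1,0}, {2,1,0}, {1,2,0}, {8,8,1}, {9,8,1}, {8,9,1} };
    std::cout << "predicted class: " << knnClassify(train, 7.5f, 8.5f, 3) << std::endl;
    return 0;
}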

The k-NN algorithm can also be adapted to estimate continuous variables. One such implementation uses an inverse-distance-weighted average of the k nearest multivariate neighbors. The algorithm proceeds as follows:
  1. Compute the Euclidean or Mahalanobis distance from the target point to each of the sampled points.
  2. Order the samples by increasing distance.
  3. Choose a heuristically optimal k based on the RMSE obtained by cross-validation.
  4. Calculate an inverse-distance-weighted average over the k nearest multivariate neighbors.
Using a weighted k-NN also significantly improves the results: the class (or, in regression problems, the value) of each of the k nearest points is multiplied by a weight proportional to the inverse of the distance between that point and the point whose class is to be predicted.
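The weighting step is easiest to see in code. Below is a minimal sketch of inverse-distance-weighted k-NN regression in plain C++, covering steps 1, 2 and 4 above (choosing k itself, step 3, would be done by cross-validation); the Euclidean distance and the sample data are assumptions for illustration:

#include <algorithm>
#include <cmath>
#include <iostream>
#include <utility>
#include <vector>

struct RSample { float x, y; float value; };

// Estimate a continuous value at (qx, qy) as the inverse-distance-weighted
// average of the k nearest training samples (assumes k <= train.size()).
float knnRegress(const std::vector<RSample>& train, float qx, float qy, int k)
{
    std::vector<std::pair<float,float> > dist;   // (Euclidean distance, value)
    for (const RSample& s : train) {
        float dx = s.x - qx, dy = s.y - qy;
        dist.push_back({std::sqrt(dx*dx + dy*dy), s.value});
    }
    // Steps 1-2: sort so the first k entries are the k nearest neighbors.
    std::partial_sort(dist.begin(), dist.begin() + k, dist.end());
    // Step 4: weight each neighbor's value by the inverse of its distance.
    float num = 0.f, den = 0.f;
    for (int i = 0; i < k; i++) {
        // An exact hit has zero distance and an infinite weight; return it directly.
        if (dist[i].first == 0.f) return dist[i].second;
        float w = 1.f / dist[i].first;
        num += w * dist[i].second;
        den += w;
    }
    return num / den;
}

int main()
{
    std::vector<RSample> train = { {0,0,1.f}, {1,0,2.f}, {0,1,2.f}, {5,5,10.f} };
    std::cout << "estimate: " << knnRegress(train, 0.5f, 0.5f, 3) << std::endl;
    return 0;
}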


// NOTE: requires the OpenCV 2.x ml module. evaluate() and plot_binary() are
// helper functions defined elsewhere in this series of posts; the declarations
// below only restate the signatures this class assumes.
#include <opencv2/core/core.hpp>
#include <opencv2/ml/ml.hpp>
#include <iostream>

using namespace std;

float evaluate(cv::Mat& predicted, cv::Mat& actual);            // fraction of correct predictions
void plot_binary(cv::Mat& data, cv::Mat& classes, string name); // 2-D scatter plot of the labels

class atsKNN{
public :
    void knn(cv::Mat& trainingData, cv::Mat& trainingClasses, cv::Mat& testData, cv::Mat& testClasses, int K)
    {
        // Train on the sample/label matrices (cv::KNearest is OpenCV 2.x's CvKNearest).
        cv::KNearest knn(trainingData, trainingClasses, cv::Mat(), false, K);
        cv::Mat predicted(testClasses.rows, 1, CV_32F);
        for(int i = 0; i < testData.rows; i++) {
            // Classify each test sample by a majority vote of its K nearest neighbors.
            const cv::Mat sample = testData.row(i);
            predicted.at<float>(i,0) = knn.find_nearest(sample, K);
        }

        float percentage = evaluate(predicted, testClasses) * 100;
        cout << "K Nearest Neighbor Evaluated Accuracy = " << percentage << "%" << endl;
        prediction = predicted;
    }
    void showplot(cv::Mat testData)
    {
        plot_binary(testData, prediction, "Predictions KNN");
    }
private:
    cv::Mat prediction;
};
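For completeness, here is a hypothetical driver showing how the class might be called. The matrix dimensions and the data-filling step are placeholders; what matters is that CvKNearest expects CV_32F matrices with one sample per row and one label per row:

int main()
{
    int numTrain = 200, numTest = 50;
    // One 2-D sample per row, one label per row; types must be CV_32F.
    cv::Mat trainingData(numTrain, 2, CV_32F);
    cv::Mat trainingClasses(numTrain, 1, CV_32F);
    cv::Mat testData(numTest, 2, CV_32F);
    cv::Mat testClasses(numTest, 1, CV_32F);

    // ... fill the four matrices with your own samples and labels here ...

    atsKNN classifier;
    classifier.knn(trainingData, trainingClasses, testData, testClasses, 3); // K = 3
    classifier.showplot(testData);
    return 0;
}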



The simplicity of the algorithm comes at a price: you need to provide a lot of training samples to ensure enough vectors fall within the K neighborhood of each query. Because every training sample is stored and searched at classification time, k-NN is not optimized for speed or space; in addition to consuming a lot of memory, it is slow to classify.

Comments

  1. Thanks for your post! It is really informative. If a neighbor has several parameters (variables), what would the code look like?

  2. Hey, I need your help. I want code that detects the red color in an image using a KNN classifier.

    Thank you

