In Depth: Classification in Machine Learning

Classification in Machine Learning

1. The k-Nearest Neighbors (k-NN) algorithm:
  • Training data:
Apple: (Weight: 150g, Color Intensity: 7)
Orange: (Weight: 170g, Color Intensity: 6)
Lemon: (Weight: 120g, Color Intensity: 10)
  • New data point:
Unknown Fruit: (Weight: 160g, Color Intensity: 8)
  • Distance calculations:
 The Euclidean distance between two points (x1, y1) and (x2, y2) in 2-dimensional space is given by:
d = √((x2 − x1)² + (y2 − y1)²)
Distance to Apple = √((160 − 150)² + (8 − 7)²) = √101 ≈ 10.05
Distance to Orange = √((160 − 170)² + (8 − 6)²) = √104 ≈ 10.20
Distance to Lemon = √((160 − 120)² + (8 − 10)²) = √1604 ≈ 40.05
  • Result:
Based on these distances, the unknown fruit is closest to the Apple (distance ≈ 10.05), then the Orange (≈ 10.20), and farthest from the Lemon (≈ 40.05). If we use k-NN with k = 1, we classify the unknown fruit as an Apple. If k = 3, we take the majority class among the three nearest fruits.
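The distance calculation and nearest-neighbour vote above can be sketched in Python (the dataset and feature values are taken straight from the example):

```python
import math
from collections import Counter

# Training data from the example: (weight in g, color intensity) -> label
training = [
    ((150, 7), "Apple"),
    ((170, 6), "Orange"),
    ((120, 10), "Lemon"),
]

def euclidean(p, q):
    """Straight-line distance between two 2-D points."""
    return math.sqrt((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)

def knn_classify(point, data, k=1):
    """Return the majority label among the k nearest neighbours."""
    neighbours = sorted(data, key=lambda item: euclidean(point, item[0]))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

unknown = (160, 8)
for features, label in training:
    print(f"Distance to {label}: {euclidean(unknown, features):.2f}")

print(knn_classify(unknown, training, k=1))  # Apple
```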



2. Naive Bayes Classifier:


So, here the computed score for yes-fever is 0.17 and for no-fever is 0.13. Since 0.17 > 0.13, if a person has both flu and COVID, it is very likely that the person also has a fever.
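Since the worked probability table did not survive in this post, here is a minimal Naive Bayes sketch. Every prior and per-symptom likelihood below is an illustrative assumption, chosen only so the unnormalised scores come out to the 0.17 and 0.13 quoted above:

```python
# Hypothetical probability estimates -- the original worked table did not
# survive in this post, so every number below is an illustrative assumption.
prior_yes = 0.4                                  # P(fever = yes)
prior_no = 0.6                                   # P(fever = no)
likelihood_yes = {"flu": 0.8, "covid": 0.53}     # P(symptom | fever = yes)
likelihood_no = {"flu": 0.4, "covid": 0.54}      # P(symptom | fever = no)

def naive_bayes_score(prior, likelihoods, evidence):
    """Unnormalised posterior: prior times the product of the
    per-feature likelihoods (the 'naive' independence assumption)."""
    score = prior
    for feature in evidence:
        score *= likelihoods[feature]
    return score

evidence = ["flu", "covid"]
score_yes = naive_bayes_score(prior_yes, likelihood_yes, evidence)
score_no = naive_bayes_score(prior_no, likelihood_no, evidence)
print(round(score_yes, 2), round(score_no, 2))  # 0.17 0.13 -> predict fever = yes
```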

3. Logistic regression:

Logistic regression is used when the target or dependent variable is binary, meaning it has only two possible outcomes (e.g., yes/no, pass/fail, true/false). Logistic regression uses a logistic function, called the sigmoid function, to map predictions to probabilities. The sigmoid function is an S-shaped curve that converts any real value to a value between 0 and 1:

y = 1 / (1 + e^(−(a0 + a1·x)))

where:
x = input value
y = predicted output (a probability between 0 and 1)
a0 = bias or intercept term
a1 = coefficient for input (x)

We estimate the values of a0 and a1 by maximum likelihood estimation, which has no closed-form solution and is typically solved with an iterative optimizer.
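A minimal sketch of the sigmoid and a logistic-regression prediction; the coefficient values a0 and a1 below are made up for illustration, not fitted:

```python
import math

def sigmoid(z):
    """S-shaped curve mapping any real value into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def predict_probability(x, a0, a1):
    """Logistic-regression probability of the positive class for input x."""
    return sigmoid(a0 + a1 * x)

# a0 and a1 here are illustrative; in practice they are fitted by
# maximum likelihood estimation on training data.
p = predict_probability(2.0, a0=-1.0, a1=1.5)
print(round(p, 3))  # 0.881 -> positive class at a 0.5 cutoff
```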

4. Decision tree:
In a Decision tree, there are two types of nodes: decision nodes and leaf nodes. Decision nodes are used to make a decision and have multiple branches, whereas leaf nodes are the outputs of those decisions and do not contain any further branches.
Example: suppose a candidate has a job offer and wants to decide whether or not to accept it.
Since the information gain of the weather attribute is the highest, it is taken as the root node.
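Information gain is the drop in entropy achieved by a split: the attribute with the highest gain becomes the root. A small sketch with illustrative labels:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(parent_labels, splits):
    """Entropy reduction from splitting the parent into the given subsets."""
    total = len(parent_labels)
    weighted = sum(len(s) / total * entropy(s) for s in splits)
    return entropy(parent_labels) - weighted

# Illustrative accept/reject decisions, split by some candidate attribute.
parent = ["yes", "yes", "yes", "no", "no", "no"]
split = [["yes", "yes", "yes"], ["no", "no", "no"]]  # a perfect split
print(information_gain(parent, split))  # 1.0: a perfectly separating attribute
```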



5. Random forest:
  • A random forest is like an army of decision trees working together.
  • Each decision tree in the forest is trained on a slightly different subset of the data.
  • The final output is based on the majority vote (for classification) or average (for regression) of all the trees' outputs.
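The bullets above can be sketched as a toy forest: each "tree" is a depth-1 threshold stump trained on its own bootstrap sample, and the forest classifies by majority vote. The dataset and the stump design are illustrative assumptions:

```python
import random
from collections import Counter

random.seed(0)  # reproducible bootstrap samples

def bootstrap(data):
    """Sample the dataset with replacement -- one 'bag' per tree."""
    return [random.choice(data) for _ in data]

def train_stump(data):
    """A depth-1 'tree': pick the threshold on x whose rule
    'predict yes when x >= threshold' best fits the labels."""
    best_threshold, best_accuracy = None, -1.0
    for threshold, _ in data:
        hits = sum(("yes" if x >= threshold else "no") == label
                   for x, label in data)
        accuracy = hits / len(data)
        if accuracy > best_accuracy:
            best_threshold, best_accuracy = threshold, accuracy
    return best_threshold

def forest_predict(stumps, x):
    """Majority vote over all trees in the forest."""
    votes = Counter("yes" if x >= t else "no" for t in stumps)
    return votes.most_common(1)[0][0]

# Toy data: the true rule is "yes" exactly when x >= 5.
data = [(x, "yes" if x >= 5 else "no") for x in range(10)]
stumps = [train_stump(bootstrap(data)) for _ in range(7)]
print(forest_predict(stumps, 9))  # "yes": every learned threshold is <= 9
```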
