MATLAB File Help: cv.LogisticRegression
cv.LogisticRegression

Logistic Regression classifier

Logistic Regression

ML implements logistic regression, which is a probabilistic classification technique. Logistic Regression is a binary classification algorithm that is closely related to Support Vector Machines (SVM). Like SVM, Logistic Regression can be extended to multi-class classification problems such as digit recognition (i.e. recognizing digits like 0, 1, 2, 3, ... from the given images). This version of Logistic Regression supports both binary and multi-class classification (for multi-class it creates multiple 2-class classifiers). To train the logistic regression classifier, Batch Gradient Descent and Mini-Batch Gradient Descent algorithms are used (see [BatchDesWiki]). Logistic Regression is a discriminative classifier (see [LogRegTomMitch] for more details). Logistic Regression is implemented as a C++ class in cv.LogisticRegression.

In Logistic Regression, we try to optimize the training parameter theta such that the hypothesis 0 <= h_theta(x) <= 1 is achieved. We have h_theta(x) = g(theta^T x), where g(z) = 1/(1+e^(-z)) is the logistic or sigmoid function. The term "Logistic" in Logistic Regression refers to this function. For given data of a binary classification problem of classes 0 and 1, one can determine that a given data instance belongs to class 1 if h_theta(x) >= 0.5, or to class 0 if h_theta(x) < 0.5.
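
As a rough illustration of this decision rule, the following MATLAB sketch evaluates the sigmoid hypothesis for a single feature vector; the values of theta and x are made up for illustration and are not produced by cv.LogisticRegression:

% Illustrative values only; in practice theta is learned during training
theta = [-1.5; 2.0; 0.5];          % parameter vector (bias term first)
x     = [1; 0.8; 1.2];             % feature vector (leading 1 for the bias)
g     = @(z) 1 ./ (1 + exp(-z));   % logistic (sigmoid) function
h     = g(theta' * x);             % hypothesis value, 0 <= h <= 1
label = double(h >= 0.5);          % class 1 if h >= 0.5, otherwise class 0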

In Logistic Regression, choosing the right training parameters is of utmost importance for reducing the training error and ensuring high training accuracy.

Example

A sample set of training parameters for the Logistic Regression classifier can be initialized as follows:

lr = cv.LogisticRegression();
lr.LearningRate = 0.05;
lr.Iterations = 1000;
lr.Regularization = 'L2';
lr.TrainMethod = 'MiniBatch';
lr.MiniBatchSize = 10;
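
Once the parameters are set, the classifier can be trained and used with the train and predict methods listed below. A minimal sketch, assuming samples is a single-precision N-by-d feature matrix and responses a vector of 0/1 class labels (the data here is randomly generated purely for illustration):

% Illustrative, randomly generated training data
samples   = single(randn(100, 2));                        % N-by-d feature matrix
responses = single(samples(:,1) + samples(:,2) > 0);      % 0/1 class labels
lr.train(samples, responses);                             % train the classifier
pred = lr.predict(samples);                               % predict labels
acc  = nnz(pred(:) == responses(:)) / numel(responses);   % training accuracy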

References

[LogRegWiki]:
http://en.wikipedia.org/wiki/Logistic_regression

[BatchDesWiki]:
http://en.wikipedia.org/wiki/Gradient_descent_optimization

[LogRegTomMitch]:
"Generative and Discriminative Classifiers: Naive Bayes and Logistic Regression" in Machine Learning, Tom Mitchell. http://www.cs.cmu.edu/~tom/NewChapters.html

[RenMalik2003]:
"Learning a Classification Model for Segmentation". Proc. CVPR, Nice, France (2003).

See also
Class Details
Superclasses handle
Sealed false
Construct on load false
Constructor Summary
LogisticRegression Creates/trains a logistic regression model 
Property Summary
Iterations Number of iterations. 
LearningRate The learning rate of the optimization algorithm. 
MiniBatchSize Number of training samples taken in each step of Mini-Batch Gradient Descent. 
Regularization Kind of regularization to be applied. 
TermCriteria Termination criteria of the training algorithm. 
TrainMethod Kind of training method used to train the classifier. 
id Object ID 
Method Summary
  addlistener Add listener for event. 
  calcError Computes error on the training or test dataset 
  clear Clears the algorithm state 
  delete Destructor 
  empty Returns true if the algorithm is empty 
  eq == (EQ) Test handle equality. 
  findobj Find objects matching specified conditions. 
  findprop Find property of MATLAB handle object. 
  ge >= (GE) Greater than or equal relation for handles. 
  getDefaultName Returns the algorithm string identifier 
  getLearntThetas Returns the trained parameters 
  getVarCount Returns the number of variables in training samples 
  gt > (GT) Greater than relation for handles. 
  isClassifier Returns true if the model is a classifier 
  isTrained Returns true if the model is trained 
Sealed   isvalid Test handle validity. 
  le <= (LE) Less than or equal relation for handles. 
  load Loads algorithm from a file or a string 
  lt < (LT) Less than relation for handles. 
  ne ~= (NE) Not equal relation for handles. 
  notify Notify listeners of event. 
  predict Predicts responses for input samples 
  save Saves the algorithm parameters to a file or a string 
  train Trains the statistical model
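
A trained model can also be persisted and restored with the save and load methods listed above, for example (the file name is illustrative):

lr.save('lr_model.xml');              % write the model parameters to a file
lr2 = cv.LogisticRegression();        % create a new, empty classifier
lr2.load('lr_model.xml');             % restore the saved model
thetas = lr2.getLearntThetas();       % inspect the learned parameters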