Linear Regression and Logistic Regression both belong to the Supervised Machine Learning family, so both learn from a labeled dataset. The major difference between the two is what they are used for: Linear Regression solves regression problems, whereas Logistic Regression solves classification problems.

Linear Regression

  • It solves regression problems.
  • It is mandatory to have a linear relationship between input variables and the output variable.
  • Types of Linear Regression:
    • Simple Regression: It finds a linear relationship between one independent variable and one dependent variable. 
    • Multivariable Regression: It finds a linear relationship between more than one independent variable and one dependent variable. 
  • The output or the dependent variable is continuous.
  • It uses a straight line to show the best fit.
  • It is used when the output variable is continuous, for example age, salary, weight, etc.
  • To reduce the errors and arrive at the best possible fit, it uses least-squares estimation.
  • The equation for this algorithm is of the form Y = mX + c (see the fitting sketch after this list).
  • It requires minimal tuning.
  • It runs fast compared to many other algorithms.
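As a minimal sketch of the least-squares fit described above (assuming scikit-learn is available; the years-of-experience vs. salary numbers are made up purely for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Made-up experience vs. salary data, purely for illustration.
X = np.array([[1], [2], [3], [4], [5]])             # independent variable (years of experience)
y = np.array([30000, 35000, 41000, 46000, 52000])   # dependent variable (continuous salary)

model = LinearRegression()                           # ordinary least-squares linear regression
model.fit(X, y)

print("slope m:", model.coef_[0])                    # slope of the best-fit line Y = mX + c
print("intercept c:", model.intercept_)
print("predicted salary at 6 years:", model.predict([[6]])[0])
```

The fitted slope and intercept define the straight line of best fit, and predictions are simply points on that line.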

Applications of Linear Regression:

  • Risk Assessment: It assesses the risks involved in insurance, for instance which age groups tend to file more insurance claims.
  • Predicting Sales: It is used by companies to forecast sales based on observed trends.

Logistic Regression

  • It solves classification problems.
  • It is not mandatory to have a linear relationship between input variables and the output variable.
  • The output or the dependent variable takes discrete values.
  • It is used when the output variable is discrete, for example 0 or 1, True or False, etc.
  • Types of Logistic Regression:
    • Binary Logistic Regression: When the output variable has 2 possible outcomes.
    • Multinomial Logistic Regression: When the output variable has 3 or more possible outcomes with no natural ordering.
    • Ordinal Logistic Regression: When the output variable has 3 or more possible outcomes with a natural ordering.
  • The equation for this algorithm is the sigmoid (logistic) function, Y = 1 / (1 + e^(-X)).
  • To arrive at the solution it uses Maximum Likelihood Estimation.
  • It is a robust algorithm.
  • It uses an S-shaped (sigmoid) curve to show the best fit, as illustrated in the sketch after this list.
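As a minimal sketch of binary logistic regression (assuming scikit-learn is available; the hours-studied vs. pass/fail data is made up purely for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Made-up hours-studied vs. pass/fail data, purely for illustration.
X = np.array([[1], [2], [3], [4], [5], [6]])   # independent variable (hours studied)
y = np.array([0, 0, 0, 1, 1, 1])               # dependent variable (0 = fail, 1 = pass)

model = LogisticRegression()                    # fitted by maximum likelihood estimation
model.fit(X, y)

p = model.predict_proba([[3.5]])[0, 1]          # probability of class 1, produced by the sigmoid
print("P(pass | 3.5 hours studied):", p)
print("predicted class:", model.predict([[3.5]])[0])

# The same probability computed directly from the sigmoid Y = 1 / (1 + e^(-z)):
z = model.intercept_[0] + model.coef_[0, 0] * 3.5
print("manual sigmoid value:", 1 / (1 + np.exp(-z)))
```

The model outputs a probability between 0 and 1; thresholding that probability (by default at 0.5) gives the discrete class label.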

Applications of Logistic Regression:

  • Marketing: Predicting whether a subsidiary of a company will make a profit or a loss.
  • Spam Detection: Classifying whether an email is spam or not spam, as Gmail does.
  • Human Resources: Predicting employee absenteeism patterns.

Difference between Linear Regression and Logistic Regression

LINEAR REGRESSION | LOGISTIC REGRESSION
Used for solving regression problems. | Used for solving classification problems.
Predicts the output for continuous variables. | Predicts the output for categorical variables.
Output must be a continuous value, for instance age, salary, etc. | Output must be a categorical value, for instance 0 or 1, True or False, etc.
It is mandatory to have a linear relationship between the dependent and independent variables. | It is not mandatory to have a linear relationship between the dependent and independent variables.
Assumes a Gaussian distribution of the dependent variable. | Assumes a binomial distribution of the dependent variable.
Based on least-squares estimation. | Based on maximum likelihood estimation.
Uses a straight line of best fit. | Uses an S-curve (sigmoid function).
Equation used: Y = mX + c | Equation used: Y = 1 / (1 + e^(-X))
Applications: risk assessment, predictive analytics. | Applications: weather forecasting, identifying risk factors for diseases.
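
To make the contrast in the table concrete, the sketch below fits both models on the same single feature (assuming scikit-learn; the age, salary, and claim values are made up purely for illustration). The linear model returns a continuous value from a straight line, while the logistic model returns a class probability from the sigmoid curve.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

X = np.array([[20], [25], [30], [35], [40], [45]])               # shared independent variable (age)
y_salary = np.array([25000, 30000, 38000, 45000, 50000, 58000])  # continuous target for linear regression
y_claim = np.array([0, 0, 0, 1, 1, 1])                           # categorical target for logistic regression

lin = LinearRegression().fit(X, y_salary)    # least-squares estimation -> straight line
log = LogisticRegression().fit(X, y_claim)   # maximum likelihood estimation -> sigmoid curve

print("linear prediction at age 33:", lin.predict([[33]])[0])           # an unbounded continuous value
print("logistic P(claim | age 33):", log.predict_proba([[33]])[0, 1])   # a probability between 0 and 1
```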