# Linear Regression: Definition, Types, Examples

Linear regression has been used extensively across industries for prediction. It is an analytical method used in data science to study data and predict the value of an unknown dependent variable based on one or more known independent variables.

This article aims to cover the definition of linear regression and its types with examples for better understanding.

### What is linear regression?

Linear regression is a statistical method used to model the linear relationship between variables. It quantifies the correlation between one or more independent variables and the dependent variable.

The two types of variables are:

- Independent variable: the variable whose value is known and is used to determine the value of the dependent variable.
- Dependent variable: the variable whose value is being predicted, which changes with the independent variable.

For example, the value of ‘A’, the dependent variable, can be determined through ‘B’, the independent variable.

The graphical representation of linear regression is a straight-line graph, which assumes a linear relationship between the variables. The method fits the line that minimises the discrepancies between the predicted and the actual outputs.

### What are the uses of linear regression?

Linear regression has been used in scientific and academic research, behavioural science, and business. Some of the uses of linear regression are:

- To predict future values, such as estimating revenue when investment changes.
- To forecast future trends in both science and commerce.
- To understand the strength of the relationship between the independent (predictor) variable(s) and the dependent variable.

### What are the types of linear regression?

There are two types of linear regression: simple and multiple.

### Simple linear regression

Simple linear regression is used when there is only one independent variable, which is used to determine the dependent variable.

The equation for simple linear regression is:

A = 𝛽0 + 𝛽1B + ε

In the given equation, A is the dependent variable (the value being predicted), 𝛽0 is the intercept, 𝛽1 is the slope, B is the independent variable assumed to be affecting A, and epsilon (ε) is the error term: the part of A that our estimate does not explain.
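As a minimal sketch of how the intercept and slope can be estimated by least squares (using NumPy and made-up study-hours data; the variable names follow the equation above):

```python
import numpy as np

# Hypothetical data: hours studied (B) vs marks scored (A)
B = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
A = np.array([52.0, 55.0, 61.0, 64.0, 68.0])

# Fit A = b0 + b1*B by least squares; polyfit returns [slope, intercept]
b1, b0 = np.polyfit(B, A, deg=1)

predicted = b0 + b1 * B      # the model's estimate of A for each B
residuals = A - predicted    # the error term epsilon for each point

print(f"intercept b0 = {b0:.2f}, slope b1 = {b1:.2f}")
```

Once fitted, a prediction for a new value of B is simply `b0 + b1 * B_new`.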

Simple linear regression makes four assumptions:

- Independence: the data has been collected using scientific methods without any bias, and the observations have no hidden relationships with one another.
- Homoscedasticity: the error size (the value of epsilon, ε) does not change significantly across the independent variable values.
- Normality: the errors are symmetrically distributed and approximately normal.
- Linearity: the relationship between the independent and dependent variables is linear, not a curve or hyperbola.
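Two of these assumptions can be checked directly from the residuals of a fitted model. A rough sketch, assuming hypothetical residual values from a fit are already available:

```python
import numpy as np

# Hypothetical residuals (A minus predicted A) from a fitted simple regression
residuals = np.array([0.5, -0.3, 0.1, -0.4, 0.2, -0.1])
B = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])  # independent variable values

# Least-squares residuals should average to (near) zero
print("mean residual:", residuals.mean())

# Rough homoscedasticity check: compare the error spread in the lower
# and upper halves of the independent-variable range
order = np.argsort(B)
half = len(B) // 2
low_spread = residuals[order[:half]].std()
high_spread = residuals[order[half:]].std()
print("spread (low B):", low_spread, "spread (high B):", high_spread)
```

If the two spreads differ greatly, the constant-error-size assumption is in doubt; a residual scatter plot makes the same check visually.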

Simple linear regression is used to determine the value of a dependent variable that is directly influenced by a single independent variable. Simple examples would be determining the marks a student may score based on the hours of study, or the increase in antibodies against the number of viral cells.

### Multiple linear regression

Multiple linear regression quantifies the relationship between two or more independent variables and the dependent variable. The independent variables may be continuous values or grouped (categorical) values.

The equation for multiple linear regression is:

A = 𝛽0 + 𝛽1B1 + 𝛽2B2 + … + 𝛽nBn + ε

The number of coefficients grows with the number of predictors. As in simple linear regression:

- A is the dependent variable (the value being predicted).
- 𝛽0 is the intercept.
- 𝛽1B1 is the first independent term: the slope (𝛽1) of the first independent or predictor variable (B1).
- … stands for the terms of the remaining predictors between the second and the last.
- 𝛽nBn is the last independent term.
- ε is the error value.

The above equation expresses the linear relationship between the dependent variable (A) and all the predictor variables.
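The coefficients can be estimated by least squares just as in the simple case. A minimal sketch with NumPy, using synthetic data generated from known coefficients (so the recovered values can be checked):

```python
import numpy as np

# Hypothetical data: two predictors B1, B2 and a dependent variable A,
# generated from known coefficients beta0=3.0, beta1=2.0, beta2=0.5
B1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
B2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
A = 3.0 + 2.0 * B1 + 0.5 * B2

# Design matrix: a column of ones for the intercept, then each predictor
X = np.column_stack([np.ones_like(B1), B1, B2])

# Solve for [beta0, beta1, beta2] by least squares
coeffs, *_ = np.linalg.lstsq(X, A, rcond=None)
print("beta0, beta1, beta2 =", coeffs)
```

With more predictors, the design matrix simply gains more columns; the fitting step is unchanged.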

### Examples of linear regression

Linear regression is used in multiple scenarios:

Linear regression can be used to predict the sales of a product based on the advertising done through different media channels. Since there are several predictors, this employs the multiple linear regression model.

In medicine, simple linear regression may be used to determine the blood sugar level of a patient after administration of a particular dosage of insulin. It can similarly be used to determine the relationship between different drugs and a patient’s vitals.

### Conclusion

Data science has become an indispensable part of multiple industries, and so has linear regression, an important aspect of data science. Knowledge of data science is therefore a high-demand skill, and a good way to kick-start your career is to earn a data science certification. Learning linear regression makes it easy to calculate predicted values and estimate the condition of commercial or scientific variables.

Imarticus Learning offers a Postgraduate Analytics degree, a data science course with a job guarantee. The PG Data Science and Analytics course is a job-assured programme trusted by industry leaders for the resources and quality of training provided. In this programme, students learn data visualisation with Python, Tableau and Power BI, which also helps learners adapt to ongoing upskilling requirements.
