A. Introduction
Multiple regression is an extension of simple regression. Whereas simple regression has one dependent variable (Y) and one independent variable (X), multiple regression has one dependent variable (Y) and two or more independent variables (X1, X2, …). Multiple regression analysis is a statistical method for estimating the effect of two or more independent variables on one dependent variable, and for testing whether a causal relationship exists between the independent variables and the dependent variable.
The multiple regression equation therefore contains one dependent variable and more than one independent variable. Regression analysis is used to determine the effect of the independent variables X1, X2, X3, X4, X5, X6 on the dependent variable Y. Multiple regression in LISREL has assumptions that must be met, namely normality and the absence of multicollinearity (Ghozali, 2008: 36). The most fundamental assumption in multivariate analysis is normality: the distribution of the data on a single metric variable should follow a normal distribution. If the data distribution does not form a normal distribution, the data are not normal; conversely, the data are said to be normal if they do. If the normality assumption is violated and the deviation from normality is large, all subsequent statistical test results are invalid, because the t-test and related statistics are calculated under the assumption of normally distributed data.
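Because the validity of the t- and F-tests rests on the normality assumption, it is worth checking that assumption before interpreting the model. A minimal sketch, assuming SciPy is available; the Shapiro–Wilk test used here is one common choice, not one prescribed by the source text, and the sample data are made up for illustration:

```python
import numpy as np
from scipy import stats

# Hypothetical sample; in practice, test the metric variable
# (or the model residuals) whose normality is in question.
rng = np.random.default_rng(42)
sample = rng.normal(loc=50, scale=10, size=100)

# Shapiro-Wilk test: Ho = the data come from a normal distribution.
result = stats.shapiro(sample)
print(f"W = {result.statistic:.4f}, p-value = {result.pvalue:.4f}")

# A p-value above the chosen alpha (e.g. 0.05) means normality is
# not rejected, so t- and F-test results can be taken at face value.
normal_enough = result.pvalue > 0.05
```

If the test rejects normality and the deviation is large, the t- and F-test results described later in this chapter should not be trusted.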
B. Multiple Regression
In essence, multiple regression analysis is like performing several simple regression analyses, one for each predictor. It becomes more complicated because the predictors are often correlated with each other, and this correlation affects each predictor's relationship with the criterion. This is why the results of a regression analysis using more than one predictor differ from separate regression analyses for each predictor. The differences appear, for example, in the estimates of b and R2.
As previously explained, when parameter estimation in regression involves more than two predictors, the correlation between predictors must be taken into account. This is reflected in the formulas for each parameter. In this chapter, the explanation of multiple regression analysis involves only two predictors for ease of explanation. The prediction equation to be estimated is therefore:
Y = a + b1X1 + b2X2 + e
Where :
Y = dependent variable
X1, X2 = independent variables
a = constant
b1, b2 = independent variable coefficients
e = error
Another form of the multiple regression equation, written for the predicted value (without the error term), is:
Ŷ = a + b1X1 + b2X2
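The equation above can be estimated directly by ordinary least squares. A minimal sketch using NumPy, with made-up data generated so that the true coefficients are known (a = 1, b1 = 2, b2 = 3):

```python
import numpy as np

# Hypothetical observations: Y is generated as 1 + 2*X1 + 3*X2,
# so least squares should recover a = 1, b1 = 2, b2 = 3.
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
Y = 1 + 2 * X1 + 3 * X2

# Design matrix: a column of ones for the constant a, then X1 and X2.
X = np.column_stack([np.ones_like(X1), X1, X2])

# Solve min ||Y - X @ coef||^2 for coef = (a, b1, b2).
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
a, b1, b2 = coef
print(f"Y = {a:.2f} + {b1:.2f}*X1 + {b2:.2f}*X2")
```

The manual steps below arrive at the same coefficients by solving the normal equations by hand.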
The steps for answering multiple regression are as follows:
Step 1. Make Ha and Ho in sentence form
Step 2. Create Ha and Ho in statistical form
Step 3. Create a helper table to calculate statistical figures
Step 4. Calculate the values for the equations b1, b2, and a using the formula
the least squares method, i.e. by solving the normal equations.
The normal equations if there are 2 independent variables:
ΣY = a·n + b1·ΣX1 + b2·ΣX2
ΣX1Y = a·ΣX1 + b1·ΣX1² + b2·ΣX1X2
ΣX2Y = a·ΣX2 + b1·ΣX1X2 + b2·ΣX2²
The normal equations if there are 3 independent variables:
ΣY = a·n + b1·ΣX1 + b2·ΣX2 + b3·ΣX3
ΣX1Y = a·ΣX1 + b1·ΣX1² + b2·ΣX1X2 + b3·ΣX1X3
ΣX2Y = a·ΣX2 + b1·ΣX1X2 + b2·ΣX2² + b3·ΣX2X3
ΣX3Y = a·ΣX3 + b1·ΣX1X3 + b2·ΣX2X3 + b3·ΣX3²
Step 5. Substitute the sums computed above into the normal equations and solve for b1, b2, and a
Step 6. Compute the multiple correlation coefficient R with the formula:
R = √((b1·Σx1y + b2·Σx2y) / Σy²)
where lower-case x1, x2, and y denote deviations from their respective means.
Step 7. Square R to obtain the coefficient of determination R2
Step 8. Calculate Fcount using the formula:
Fcount = (R2 / m) / ((1 – R2) / (n – m – 1))
Step 9. Calculate Ftable with formula
F table = F (1 – α) (dk numerator, dk denominator)
dk numerator = m (the number of independent variables)
dk denominator = n – m – 1 (where n is the number of observations)
then look up the value in the F table and apply the test criteria for Ho, namely:
Ha : significant (there is a significant effect)
Ho : not significant (there is no significant effect)
If Fcount ≤ Ftable, then Ho is accepted (the effect is not significant); if Fcount > Ftable, then Ho is rejected (the effect is significant).
Step 10. Make a conclusion
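The steps above can be worked through end to end in code. A minimal sketch in pure Python for two predictors, using hypothetical data and the deviation-score solution of the normal equations; the critical value Ftable = 19.00 for α = 0.05 with dk = (2, 2) is taken from a standard F table:

```python
# Hypothetical data (n = 5 observations, m = 2 independent variables).
X1 = [1, 2, 3, 4, 5]
X2 = [2, 1, 3, 5, 4]
Y = [6, 5, 11, 14, 14]
n, m = len(Y), 2

# Step 3: helper sums in deviation-score form (x1 = X1 - mean(X1), etc.).
mx1, mx2, my = sum(X1) / n, sum(X2) / n, sum(Y) / n
x1 = [v - mx1 for v in X1]
x2 = [v - mx2 for v in X2]
y = [v - my for v in Y]
s_x1x1 = sum(v * v for v in x1)               # Σx1²
s_x2x2 = sum(v * v for v in x2)               # Σx2²
s_x1x2 = sum(u * v for u, v in zip(x1, x2))   # Σx1x2
s_x1y = sum(u * v for u, v in zip(x1, y))     # Σx1y
s_x2y = sum(u * v for u, v in zip(x2, y))     # Σx2y
s_yy = sum(v * v for v in y)                  # Σy²

# Steps 4-5: solve the normal equations for b1 and b2, then a.
den = s_x1x1 * s_x2x2 - s_x1x2 ** 2
b1 = (s_x2x2 * s_x1y - s_x1x2 * s_x2y) / den
b2 = (s_x1x1 * s_x2y - s_x1x2 * s_x1y) / den
a = my - b1 * mx1 - b2 * mx2

# Steps 6-7: multiple correlation R and coefficient of determination R².
R2 = (b1 * s_x1y + b2 * s_x2y) / s_yy
R = R2 ** 0.5

# Step 8: Fcount = (R²/m) / ((1 - R²)/(n - m - 1)).
F = (R2 / m) / ((1 - R2) / (n - m - 1))

# Step 9: compare with Ftable = F(0.95; dk = 2, 2) = 19.00 from a table.
F_table = 19.00
significant = F > F_table  # Ho rejected -> the effect is significant

# Step 10: conclusion.
print(f"Y = {a:.2f} + {b1:.3f}*X1 + {b2:.3f}*X2")
print(f"R2 = {R2:.4f}, Fcount = {F:.2f}, significant = {significant}")
```

For this data set the fitted equation is Y = 1.50 + 1.167·X1 + 1.667·X2 with R2 ≈ 0.98 and Fcount ≈ 48.33 > 19.00, so Ho is rejected and the combined effect of X1 and X2 on Y is significant.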
Source:
Biantoro, A. W., & Kholil, M. Research Statistics: Manual Analysis and IBM SPSS.