Free Statistics

Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 08 Dec 2016 11:54:12 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/08/t14811957450b1nklnnwyk1kh7.htm/, Retrieved Sun, 28 Apr 2024 11:26:56 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=298338, Retrieved Sun, 28 Apr 2024 11:26:56 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 93
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Multiple regressi...] [2016-12-08 10:54:12] [36884fbde1107444791dd71ee0072a5a] [Current]
Dataseries X:
5	4	4	4	14
5	NA	4	4	19
4	3	3	2	17
4	3	3	3	17
5	4	4	3	15
5	3	4	3	20
5	4	2	3	15
5	4	2	4	19
5	2	2	4	15
5	1	2	4	15
4	4	3	2	19
5	4	3	2	NA
5	4	5	4	20
5	5	4	5	18
4	4	3	4	15
5	1	4	4	14
3	4	4	2	20
5	4	NA	NA	NA
5	2	NA	2	16
5	3	4	5	16
5	3	NA	4	16
NA	2	3	1	10
3	1	3	5	19
4	3	2	3	19
4	2	2	4	16
4	NA	3	4	15
5	4	3	2	18
4	4	3	4	17
5	2	4	2	19
4	3	4	3	17
5	4	3	4	NA
4	4	4	4	19
4	4	3	4	20
4	3	4	4	5
5	4	3	4	19
5	4	3	4	16
5	4	3	5	15
5	4	3	4	16
2	3	2	4	18
4	3	5	3	16
4	4	3	4	15
4	2	1	4	17
5	3	2	3	NA
5	4	2	2	20
5	4	3	5	19
4	3	2	4	7
4	2	3	3	13
5	3	5	4	16
5	3	4	4	16
5	4	5	4	NA
4	3	2	3	18
4	3	4	4	18
5	3	3	4	16
5	3	3	4	17
5	3	2	4	19
4	5	3	5	16
5	4	2	4	19
5	NA	4	2	13
4	3	NA	4	16
4	4	3	5	13
5	4	1	2	12
5	1	1	3	17
4	4	3	4	17
4	3	NA	3	17
5	3	2	4	16
3	4	3	4	16
3	2	4	4	14
5	4	3	5	16
4	5	4	3	13
4	4	4	4	16
5	4	3	4	14
5	4	4	4	20
4	NA	4	4	12
5	4	3	4	13
4	2	3	4	18
4	4	5	4	14
4	2	2	4	19
5	5	4	4	18
4	5	3	3	14
4	2	3	3	18
4	4	3	2	19
4	3	4	2	15
4	3	4	2	14
2	3	NA	3	17
4	4	5	4	19
4	4	3	4	13
5	3	4	4	19
4	3	3	4	18
5	4	5	4	20
4	4	4	4	15
4	2	4	4	15
3	3	4	2	15
4	3	4	3	20
2	3	2	2	15
4	4	3	3	19
5	4	4	4	18
3	4	3	5	18
4	4	3	4	15
5	5	5	5	20
2	4	3	3	17
5	3	1	5	12
5	4	3	4	18
5	4	4	5	19
4	2	2	2	20
4	3	3	3	NA
5	3	4	4	17
5	3	4	5	15
4	4	4	4	16
4	4	4	5	18
5	4	NA	5	18
5	4	4	5	14
5	3	3	4	15
4	3	3	4	12
5	3	3	4	17
4	2	NA	4	14
5	3	4	4	18
4	2	2	4	17
5	4	5	5	17
5	5	2	5	20
4	3	2	5	16
4	3	2	4	14
4	3	3	4	15
5	2	3	4	18
5	3	4	5	20
4	3	NA	4	17
4	3	4	4	17
5	4	3	4	17
5	4	4	4	17
4	3	4	2	15
4	4	3	4	17
4	1	3	2	18
4	5	5	4	17
5	4	4	3	20
5	3	3	5	15
4	5	3	2	16
NA	4	3	4	15
4	3	3	3	18
4	NA	NA	NA	11
3	4	3	3	15
4	4	2	4	18
5	3	4	5	20
4	2	4	3	19
4	4	4	2	14
5	3	5	5	16
3	3	2	4	15
4	4	2	4	17
1	2	3	2	18
5	3	3	5	20
4	4	2	3	17
5	4	4	3	18
3	3	2	3	15
4	4	3	4	16
4	4	NA	4	11
4	3	3	4	15
4	2	3	4	18
5	4	4	4	17
5	2	2	4	16
5	3	5	5	12
5	4	4	3	19
4	3	3	NA	18
5	2	5	4	15
5	4	2	4	17
4	1	4	5	19
3	5	4	3	18
4	4	4	4	19
4	3	3	2	16
5	4	5	5	16
4	4	3	4	16
4	3	3	3	14




Summary of computational transaction

Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 8 seconds
R Server: Big Analytics Cloud Computing Center

Source: https://freestatistics.org/blog/index.php?pk=298338&T=0








Multiple Linear Regression - Estimated Regression Equation

K1[t] = 2.38799 + 0.0420172 K2[t] + 0.0566008 K3[t] + 0.28681 K4[t] + 0.028565 ITH[t] - 0.00507587 `K1(t-1)`[t] + 0.117242 `K1(t-2)`[t] - 0.0113644 `K1(t-3)`[t] - 0.0636007 `K1(t-4)`[t] - 0.0287444 `K1(t-5)`[t] + e[t]

Warning: you did not specify the column number of the endogenous series! The first column was selected by default.

Source: https://freestatistics.org/blog/index.php?pk=298338&T=1
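
A minimal sketch of how a comparable fit could be reproduced in R is given below. This is not the module's own code (that code is listed at the end of this page); the data-frame name df, the assumption that its five columns are named K1, K2, K3, K4 and ITH, and the lag-column names are all illustrative.

# Sketch only: rebuild the lagged design and refit the reported OLS model.
df <- na.omit(df)                        # listwise deletion, as in the module
lagmat <- embed(df$K1, 6)                # col 1 = K1(t), cols 2..6 = K1(t-1)..K1(t-5)
d <- data.frame(K1 = lagmat[, 1],
                df[-(1:5), c("K2", "K3", "K4", "ITH")],
                lagmat[, 2:6])
names(d)[6:10] <- paste0("K1_lag", 1:5)  # illustrative names for the lag columns
mylm <- lm(K1 ~ ., data = d)             # OLS with intercept, as in the equation above
summary(mylm)                            # parameter table, R-squared, F-test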








Multiple Linear Regression - Ordinary Least Squares

Variable      Parameter   S.D.      T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +2.388      0.9595    +2.4890e+00                  0.01407          0.007035
K2            +0.04202    0.0666    +6.3090e-01                  0.5292           0.2646
K3            +0.0566     0.06659   +8.4990e-01                  0.3969           0.1985
K4            +0.2868     0.06863   +4.1790e+00                  5.309e-05        2.655e-05
ITH           +0.02857    0.02524   +1.1320e+00                  0.2597           0.1299
`K1(t-1)`     -0.005076   0.08109   -6.2600e-02                  0.9502           0.4751
`K1(t-2)`     +0.1172     0.08101   +1.4470e+00                  0.1502           0.07511
`K1(t-3)`     -0.01136    0.08154   -1.3940e-01                  0.8894           0.4447
`K1(t-4)`     -0.0636     0.08208   -7.7490e-01                  0.4398           0.2199
`K1(t-5)`     -0.02874    0.08172   -3.5170e-01                  0.7256           0.3628

Source: https://freestatistics.org/blog/index.php?pk=298338&T=2
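
The test columns of this table can be recovered from a fitted model object. The sketch below assumes mylm from the previous sketch and that the reported 1-tail p-value is simply half of the 2-tail value.

# Sketch: recompute the T-STAT and p-value columns from the fitted model mylm.
ct    <- coef(summary(mylm))                    # Estimate, Std. Error, t value, Pr(>|t|)
tstat <- ct[, "t value"]
p2    <- 2 * pt(abs(tstat), df.residual(mylm), lower.tail = FALSE)  # 2-tail p-value
p1    <- p2 / 2                                                     # 1-tail p-value (assumed)
cbind(Parameter = ct[, 1], S.D. = ct[, 2], T.STAT = tstat, p.2tail = p2, p.1tail = p1)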








Multiple Linear Regression - Regression Statistics

Multiple R                     0.3938
R-squared                      0.1551
Adjusted R-squared             0.09701
F-TEST (value)                 2.671
F-TEST (DF numerator)          9
F-TEST (DF denominator)        131
p-value                        0.007034

Multiple Linear Regression - Residual Statistics

Residual Standard Deviation    0.7178
Sum Squared Residuals          67.5

Source: https://freestatistics.org/blog/index.php?pk=298338&T=3
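
These statistics map one-to-one onto components of summary() applied to the fitted model; a short sketch, again using the hypothetical mylm from the first sketch:

# Sketch: read the regression and residual statistics off summary(mylm).
s <- summary(mylm)
sqrt(s$r.squared)            # Multiple R
s$r.squared                  # R-squared
s$adj.r.squared              # Adjusted R-squared
s$fstatistic                 # F-TEST value, DF numerator, DF denominator
pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3], lower.tail = FALSE)  # p-value
s$sigma                      # Residual Standard Deviation
sum(residuals(mylm)^2)       # Sum Squared Residuals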








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Source: https://freestatistics.org/blog/index.php?pk=298338&T=4

Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 5 4.075 0.9246
2 5 4.494 0.5061
3 5 4.232 0.768
4 5 4.161 0.8388
5 4 3.885 0.1155
6 5 4.605 0.395
7 5 4.698 0.3023
8 4 4.355-0.3552
9 5 4.315 0.6855
10 3 3.881-0.8812
11 5 4.695 0.3047
12 3 4.448-1.448
13 4 4.134-0.1343
14 4 4.13-0.1296
15 5 3.824 1.176
16 4 4.422-0.4223
17 5 3.995 1.005
18 4 4.104-0.1039
19 4 4.56-0.5599
20 4 4.438-0.4381
21 4 4.001-0.0007358
22 5 4.421 0.5791
23 5 4.359 0.6411
24 5 4.734 0.2656
25 5 4.465 0.5352
26 2 4.36-2.36
27 4 4.172-0.172
28 4 3.997 0.00279
29 4 4.126-0.1257
30 5 3.946 1.054
31 5 4.789 0.2111
32 4 4.12-0.1204
33 4 4.013-0.01331
34 5 4.36 0.6398
35 5 4.281 0.7189
36 4 4.119-0.1191
37 4 4.542-0.5416
38 5 4.247 0.753
39 5 4.253 0.7469
40 5 4.434 0.5655
41 4 4.794-0.7936
42 5 4.435 0.5647
43 4 4.456-0.4563
44 5 3.588 1.412
45 5 3.821 1.179
46 4 4.441-0.4411
47 5 4.342 0.6575
48 3 4.284-1.284
49 3 4.309-1.309
50 5 4.498 0.5015
51 4 3.916 0.08447
52 4 4.584-0.584
53 5 4.388 0.6122
54 5 4.495 0.5051
55 5 4.362 0.6383
56 4 4.438-0.4379
57 4 4.462-0.4623
58 4 4.205-0.2053
59 5 4.427 0.5726
60 4 4.028-0.02823
61 4 4.168-0.1675
62 4 3.865 0.1353
63 4 3.713 0.2872
64 4 3.719 0.2809
65 4 4.563-0.5629
66 4 4.278-0.2783
67 5 4.464 0.5358
68 4 4.374-0.374
69 5 4.709 0.2913
70 4 4.376-0.3756
71 4 4.362-0.3616
72 3 3.736-0.7363
73 4 4.147-0.1475
74 2 3.517-1.517
75 4 4.184-0.1844
76 5 4.307 0.6932
77 3 4.754-1.754
78 4 4.585-0.5849
79 5 4.849 0.1509
80 2 4.048-2.048
81 5 4.544 0.4565
82 5 4.199 0.8011
83 5 4.864 0.1356
84 4 3.963 0.03677
85 5 4.507 0.4931
86 5 4.528 0.472
87 4 4.44-0.4404
88 4 4.842-0.8416
89 5 4.575 0.4247
90 5 4.196 0.804
91 4 4.291-0.2911
92 5 4.456 0.5436
93 5 4.356 0.6444
94 4 4.272-0.2717
95 5 4.87 0.1303
96 5 4.67 0.3296
97 4 4.572-0.572
98 4 4.285-0.2854
99 4 4.218-0.2184
100 5 4.245 0.7553
101 5 4.746 0.2542
102 4 4.519-0.5193
103 5 4.498 0.5016
104 5 4.369 0.6309
105 4 3.796 0.2038
106 4 4.47-0.4697
107 4 3.646 0.3535
108 4 4.455-0.4554
109 5 4.219 0.7807
110 5 4.575 0.4249
111 4 3.945 0.05547
112 4 4.198-0.1981
113 3 3.974-0.9736
114 4 4.277-0.2772
115 5 4.634 0.3664
116 4 4.142-0.1417
117 4 3.971 0.02934
118 5 4.739 0.2607
119 3 4.168-1.168
120 4 4.43-0.4295
121 1 3.635-2.635
122 5 4.686 0.314
123 4 3.727 0.2728
124 5 4.371 0.629
125 3 4.124-1.124
126 4 4.509-0.5089
127 4 4.136-0.136
128 4 4.285-0.2848
129 5 4.484 0.516
130 5 4.247 0.753
131 5 4.72 0.2801
132 5 4.32 0.6797
133 5 4.402 0.5982
134 5 4.344 0.6556
135 4 4.675-0.6755
136 3 4.246-1.246
137 4 4.408-0.4076
138 4 3.539 0.4613
139 5 4.747 0.2534
140 4 4.422-0.4225
141 4 4.124-0.124
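
Each row pairs the observed value of the endogenous series with its in-sample fitted (interpolated) value and the corresponding residual. A sketch of how the same three columns could be assembled from a fitted lm object such as the hypothetical mylm:

# Sketch: actuals, interpolation (fitted values) and residuals in one table.
cbind(Actuals       = model.response(model.frame(mylm)),
      Interpolation = fitted(mylm),
      Residuals     = residuals(mylm))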








Goldfeld-Quandt test for Heteroskedasticity
Source: https://freestatistics.org/blog/index.php?pk=298338&T=5

p-values under the alternative hypothesis:
breakpoint index   greater   2-sided   less
13 0.3169 0.6338 0.6831
14 0.2156 0.4311 0.7844
15 0.6368 0.7264 0.3632
16 0.5182 0.9635 0.4818
17 0.4282 0.8563 0.5718
18 0.3699 0.7398 0.6301
19 0.4128 0.8256 0.5872
20 0.3213 0.6426 0.6787
21 0.4192 0.8384 0.5808
22 0.4509 0.9018 0.5491
23 0.4174 0.8349 0.5826
24 0.3552 0.7104 0.6448
25 0.2885 0.5769 0.7115
26 0.8975 0.2051 0.1025
27 0.8772 0.2456 0.1228
28 0.8771 0.2458 0.1229
29 0.8401 0.3197 0.1599
30 0.8189 0.3622 0.1811
31 0.7787 0.4427 0.2213
32 0.7776 0.4448 0.2224
33 0.7368 0.5265 0.2632
34 0.7287 0.5425 0.2713
35 0.7163 0.5674 0.2837
36 0.6721 0.6558 0.3279
37 0.6596 0.6808 0.3404
38 0.6863 0.6274 0.3137
39 0.6847 0.6307 0.3153
40 0.6563 0.6875 0.3437
41 0.6554 0.6892 0.3446
42 0.657 0.686 0.343
43 0.6157 0.7685 0.3843
44 0.7026 0.5947 0.2974
45 0.7694 0.4612 0.2306
46 0.7435 0.5129 0.2565
47 0.7328 0.5344 0.2672
48 0.8075 0.3849 0.1925
49 0.8939 0.2122 0.1061
50 0.8809 0.2382 0.1191
51 0.8535 0.2929 0.1465
52 0.8502 0.2997 0.1498
53 0.8333 0.3334 0.1667
54 0.8222 0.3555 0.1778
55 0.8175 0.3651 0.1825
56 0.7924 0.4151 0.2076
57 0.768 0.4641 0.232
58 0.7292 0.5416 0.2708
59 0.715 0.5699 0.285
60 0.6826 0.6349 0.3174
61 0.6444 0.7113 0.3556
62 0.6098 0.7804 0.3902
63 0.5724 0.8552 0.4276
64 0.5382 0.9235 0.4618
65 0.52 0.96 0.48
66 0.4824 0.9647 0.5176
67 0.4601 0.9203 0.5399
68 0.422 0.8439 0.578
69 0.383 0.7659 0.617
70 0.3461 0.6922 0.6539
71 0.3088 0.6175 0.6912
72 0.3217 0.6434 0.6783
73 0.2804 0.5609 0.7196
74 0.4566 0.9132 0.5434
75 0.4167 0.8334 0.5833
76 0.4104 0.8208 0.5896
77 0.6473 0.7054 0.3527
78 0.6339 0.7321 0.3661
79 0.5936 0.8127 0.4064
80 0.8689 0.2623 0.1311
81 0.8745 0.2509 0.1255
82 0.8685 0.2629 0.1315
83 0.8568 0.2864 0.1432
84 0.8406 0.3189 0.1594
85 0.8185 0.3631 0.1815
86 0.7961 0.4079 0.2039
87 0.7814 0.4372 0.2186
88 0.796 0.4079 0.204
89 0.7673 0.4654 0.2327
90 0.7784 0.4433 0.2216
91 0.7375 0.5249 0.2625
92 0.7359 0.5283 0.2641
93 0.7216 0.5569 0.2784
94 0.677 0.6461 0.323
95 0.6266 0.7469 0.3734
96 0.5793 0.8414 0.4207
97 0.5704 0.8592 0.4296
98 0.5195 0.9611 0.4805
99 0.4655 0.9311 0.5345
100 0.4758 0.9516 0.5242
101 0.4239 0.8478 0.5761
102 0.4137 0.8274 0.5863
103 0.4052 0.8104 0.5948
104 0.391 0.782 0.609
105 0.3375 0.6749 0.6625
106 0.2927 0.5854 0.7073
107 0.347 0.694 0.653
108 0.3558 0.7116 0.6442
109 0.369 0.7379 0.631
110 0.315 0.6301 0.685
111 0.2594 0.5188 0.7406
112 0.2288 0.4577 0.7712
113 0.2217 0.4434 0.7783
114 0.174 0.3481 0.826
115 0.1358 0.2716 0.8642
116 0.09963 0.1993 0.9004
117 0.07804 0.1561 0.922
118 0.05686 0.1137 0.9431
119 0.1529 0.3058 0.8471
120 0.1204 0.2408 0.8796
121 0.4736 0.9473 0.5264
122 0.5299 0.9402 0.4701
123 0.471 0.942 0.529
124 0.3772 0.7543 0.6228
125 0.38 0.7599 0.62
126 0.2697 0.5394 0.7303
127 0.1936 0.3872 0.8064
128 0.1092 0.2184 0.8908
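
Each row gives the Goldfeld-Quandt p-value at one candidate breakpoint under the three alternative hypotheses. A hedged sketch of such a sweep with lmtest::gqtest() follows; the breakpoint range k+3 to n-k-3 (13 to 128 here) is an assumption inferred from the table, and mylm is the hypothetical fit from the first sketch.

# Sketch: Goldfeld-Quandt p-values over a range of breakpoints.
library(lmtest)
n <- nobs(mylm)          # 141 usable observations in this analysis
k <- length(coef(mylm))  # 10 estimated parameters
gq <- t(sapply((k + 3):(n - k - 3), function(bp) c(
  bp,
  gqtest(mylm, point = bp, alternative = "greater")$p.value,
  gqtest(mylm, point = bp, alternative = "two.sided")$p.value,
  gqtest(mylm, point = bp, alternative = "less")$p.value)))
colnames(gq) <- c("breakpoint", "greater", "two.sided", "less")
head(gq)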








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity

Description              # significant tests   % significant tests   OK/NOK
1% type I error level    0                     0                     OK
5% type I error level    0                     0                     OK
10% type I error level   0                     0                     OK

Source: https://freestatistics.org/blog/index.php?pk=298338&T=6
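
The meta analysis counts how many of the breakpoint tests above are significant at each type I error level. A one-line sketch reusing the gq matrix from the previous sketch; whether the module counts the 2-sided or the one-sided p-values is an assumption here.

# Sketch: number of significant 2-sided Goldfeld-Quandt tests per error level.
sapply(c(0.01, 0.05, 0.10), function(a) sum(gq[, "two.sided"] < a))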








Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 0.30653, df1 = 2, df2 = 129, p-value = 0.7365

Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.3639, df1 = 18, df2 = 113, p-value = 0.1637

Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.1847, df1 = 2, df2 = 129, p-value = 0.3091

Source: https://freestatistics.org/blog/index.php?pk=298338&T=7
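
The three variants correspond to the type argument of lmtest::resettest(); a sketch applied to the hypothetical mylm:

# Sketch: Ramsey RESET tests with powers 2 and 3 of the fitted values,
# the regressors, and the principal components, respectively.
library(lmtest)
resettest(mylm, power = 2:3, type = "fitted")
resettest(mylm, power = 2:3, type = "regressor")
resettest(mylm, power = 2:3, type = "princomp")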








Variance Inflation Factors (Multicollinearity)
Source: https://freestatistics.org/blog/index.php?pk=298338&T=8
> vif
       K2        K3        K4       ITH `K1(t-1)` `K1(t-2)` `K1(t-3)` `K1(t-4)` 
 1.072192  1.121545  1.035371  1.027422  1.024381  1.027083  1.035669  1.049466 
`K1(t-5)` 
 1.045113 
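
These factors can be reproduced with car::vif() on the same fitted model; a sketch using the hypothetical mylm:

# Sketch: variance inflation factors of the regressors.
library(car)
vif(mylm)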

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298338&T=8

Parameters (Session):
par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 5 ; par5 = 0 ;
Parameters (R input):
par1 = ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 5 ; par5 = 0 ;
R code (references can be found in the software module):
par5 <- '0'
par4 <- '5'  # number of lags of the endogenous variable; matches the session parameter par4 = 5
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '1'  # column number of the endogenous series
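# packages used below: lmtest (Goldfeld-Quandt and RESET tests), car (vif, qqPlot), MASS (studres), lattice (densityplot)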
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
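# y is the data matrix supplied by the module; transpose it so each series is a column, drop incomplete rows, and put the endogenous column (par1) first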
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
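# optional differencing of all series, depending on the par3 setting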
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
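# append par4 lags of the endogenous variable as extra regressors; the usable sample shrinks by par4 observations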
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
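# likewise, append par5 seasonal lags (multiples of 12 periods) of the endogenous variable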
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
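# seasonal dummies: 11 monthly or 3 quarterly dummy variables, depending on par2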
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
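# fit the OLS regression: lm() applied to the data frame regresses the first column on all remaining columns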
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
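# if there are more than 25 observations, run the Goldfeld-Quandt test at every admissible breakpoint, store the p-values for the three alternatives, and count rejections at the 1%, 5% and 10% levels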
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
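# diagnostic plots: actuals and interpolation, residuals, histogram and density of the (studentized) residuals, QQ plot, residual lag plot, ACF/PACF, standard lm diagnostics, and the Goldfeld-Quandt p-values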
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
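# write the HTML output tables with the helper functions (table.start, table.element, ...) loaded from the 'createtable' file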
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
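# Ramsey RESET tests for powers 2 and 3 of the fitted values, the regressors, and the principal components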
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
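# variance inflation factors as a multicollinearity check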
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')