Free Statistics

Author's title:

Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sat, 03 Dec 2016 14:58:11 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/03/t1480773598v1r76qhfyqdi1r1.htm/, Retrieved Sun, 05 May 2024 12:12:30 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=297625, Retrieved Sun, 05 May 2024 12:12:30 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 115
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [regressiemodel paper] [2016-12-03 13:58:11] [b7b12d6257d20c3ae3b596da588d7d29] [Current]

Dataseries X:
4	2	4	3	5	4	13	0
5	3	3	4	5	4	16	1
4	4	5	4	5	4	17	1
3	4	3	3	4	4	15	1
4	4	5	4	5	4	16	0
3	4	4	4	5	5	16	1
3	4	4	3	3	4	18	1
3	4	5	4	4	4	16	1
4	5	4	4	5	5	17	0
4	5	5	4	5	5	17	0
4	4	2	4	5	4	17	0
4	4	5	3	5	4	15	0
4	4	4	3	4	5	16	0
3	3	5	4	4	5	14	0
4	4	5	4	2	5	16	0
3	4	5	4	4	5	17	1
3	4	5	4	4	5	16	0
5	5	4	3	4	4	15	1
4	4	4	4	5	4	17	1
3	4	5	3	4	5	16	1
4	4	4	4	5	5	15	1
4	4	5	4	4	5	16	0
4	4	5	4	4	4	15	1
4	4	5	4	4	5	17	1
3	4	4	4	4	4	14	0
3	4	4	3	5	5	16	0
4	4	4	4	4	4	15	0
2	4	5	4	5	5	16	0
5	4	4	4	4	4	16	0
4	3	5	4	4	4	13	0
4	5	5	4	5	5	15	0
5	4	5	4	4	5	17	1
4	3	5	4	NA	5	15	0
2	3	5	4	5	4	13	0
4	5	2	4	4	4	17	1
3	4	5	4	4	4	15	0
4	3	5	3	4	5	14	0
4	3	3	4	4	4	14	1
4	4	5	4	4	4	18	0
5	4	4	4	4	4	15	0
4	5	5	4	5	5	17	1
3	3	4	4	4	4	13	1
5	5	5	3	5	5	16	1
5	4	5	3	4	4	15	1
4	4	4	3	4	5	15	1
4	4	4	4	4	4	16	1
3	5	5	3	3	4	15	1
4	4	4	4	5	4	13	1
4	5	5	4	4	4	17	1
5	5	2	4	5	4	18	0
5	5	5	4	4	4	17	1
4	3	5	4	5	5	11	0
4	3	4	3	4	5	14	1
4	4	5	4	4	4	13	1
3	4	4	3	3	4	15	0
3	4	4	4	4	3	17	1
4	4	4	3	5	4	16	0
4	4	4	4	5	4	15	0
5	5	3	4	5	5	17	1
2	4	4	4	5	5	16	0
4	4	4	4	5	5	16	1
3	4	4	4	2	4	16	1
4	4	5	4	5	5	15	1
4	2	4	4	4	4	12	1
4	4	4	3	5	3	17	0
4	4	4	3	5	4	14	1
5	4	5	3	3	5	14	0
3	4	4	3	5	5	16	1
3	4	4	3	4	5	15	0
4	5	5	5	5	4	15	1
4	4	3	4	NA	4	13	1
4	4	4	4	4	4	13	1
4	4	4	5	5	4	17	0
3	4	3	4	4	4	15	1
4	4	4	4	5	4	16	0
3	4	5	3	5	5	14	1
3	3	5	4	4	5	15	0
4	3	5	4	4	4	17	1
4	4	5	4	4	5	16	0
3	3	3	4	4	4	10	1
4	4	4	4	5	4	16	1
4	4	3	4	5	5	17	0
4	4	4	4	5	5	17	0
5	4	4	4	4	4	20	0
5	4	3	5	4	5	17	1
4	4	5	4	5	5	18	1
3	4	5	4	4	5	15	1
3	NA	4	4	4	4	17	1
4	2	3	3	4	4	14	1
4	4	5	4	4	3	15	0
4	4	5	4	4	5	17	1
4	4	4	4	5	4	16	0
4	5	4	4	5	3	17	0
3	4	4	3	5	5	15	0
4	4	5	4	4	5	16	0
5	4	3	4	4	5	18	1
5	4	5	5	4	5	18	1
4	5	4	4	5	5	16	1
5	3	4	4	5	5	17	0
4	4	5	4	4	5	15	1
5	4	4	4	4	5	13	1
3	4	4	3	NA	4	15	0
5	4	4	5	5	5	17	0
4	4	5	3	NA	5	16	1
4	4	3	3	4	3	16	1
4	4	5	4	4	4	15	1
4	4	5	4	4	4	16	1
3	4	5	4	5	3	16	0
4	4	4	4	4	4	13	0
4	4	4	3	4	5	15	0
3	3	4	3	5	5	12	1
4	4	4	3	4	4	19	1
3	4	5	4	4	4	16	1
4	4	5	4	3	4	16	0
5	4	5	1	5	5	17	0
5	4	5	4	5	5	16	1
4	4	4	4	4	3	14	0
4	4	5	3	4	4	15	0
3	4	4	3	4	5	14	1
4	4	4	4	4	4	16	0
4	4	4	4	5	4	15	0
4	5	3	4	4	4	17	0
3	4	4	4	4	4	15	1
4	4	4	3	4	4	16	1
4	4	4	4	4	5	16	1
3	4	3	3	4	4	15	1
4	4	4	3	4	3	15	1
3	2	4	2	4	4	11	0
4	4	4	3	5	4	16	0
5	4	4	3	5	4	18	1
2	4	4	3	3	5	13	1
3	3	4	4	4	4	11	1
4	4	4	3	4	4	16	0
5	5	4	4	5	4	18	1
4	5	5	4	4	4	15	0
5	5	5	5	5	4	19	0
4	5	5	4	5	5	17	0
4	4	4	3	4	5	13	0
3	4	5	4	5	4	14	1
4	4	5	4	4	4	16	0
4	4	2	4	4	4	13	0
4	4	3	4	5	5	17	0
4	4	4	4	5	5	14	0
5	4	5	3	5	4	19	1
4	3	5	4	4	4	14	0
4	4	5	4	4	4	16	1
3	3	2	3	4	4	12	1
4	5	5	4	4	3	16	1
4	4	4	3	4	4	16	1
4	4	4	4	4	5	15	0
3	4	5	3	5	5	12	0
4	4	5	4	4	5	15	0
5	4	5	4	5	4	17	0
4	4	5	4	3	4	13	1
2	3	5	4	4	4	15	1
4	4	4	4	4	5	18	1
4	3	4	3	5	5	15	1
4	4	4	4	4	3	18	0
4	5	5	5	4	4	15	1
5	4	3	4	4	4	15	0
5	4	4	3	4	4	16	1
3	3	1	4	5	5	13	0
4	4	4	4	4	5	16	0
4	4	4	4	5	4	13	0
2	3	4	5	5	4	16	0
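
The block above is the raw, tab-separated data series; a few entries are NA. A minimal R sketch for loading it is given below. The column names (SK1-SK6, the sum score TVDCSUM, and the binary variable G) are an assumption inferred from the estimated regression equation further down, and the file name is hypothetical; this is not the module's own raw input code.

dat <- read.table("dataseries_x.txt", sep = "\t", na.strings = "NA",
                  col.names = c("SK1", "SK2", "SK3", "SK4", "SK5", "SK6", "TVDCSUM", "G"))
dat <- na.omit(dat)   # drop rows containing NA; 160 complete cases remain (cf. the actuals table below)
str(dat)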




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 7 seconds
R Server: Big Analytics Cloud Computing Center

Source: https://freestatistics.org/blog/index.php?pk=297625&T=0







Multiple Linear Regression - Estimated Regression Equation
TVDCSUM[t] = + 6.63848 + 0.570679SK1[t] + 1.14791SK2[t] + 0.0640941SK3[t] + 0.240909SK4[t] + 0.209946SK5[t] -0.00302854SK6[t] + 0.0247021G[t] + e[t]

Source: https://freestatistics.org/blog/index.php?pk=297625&T=1
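
The equation above is an ordinary least squares fit of TVDCSUM on the six SK items and G. A minimal sketch of an equivalent fit in R, assuming the data frame dat from the loading sketch above (this is not the module's raw input):

fit <- lm(TVDCSUM ~ SK1 + SK2 + SK3 + SK4 + SK5 + SK6 + G, data = dat)
coef(fit)   # should reproduce the intercept 6.63848 and the slopes listed in the equation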








Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter    S.D.      T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +6.638       1.586     +4.1860e+00                   4.794e-05        2.397e-05
SK1           +0.5707      0.1655    +3.4480e+00                   0.000732         0.000366
SK2           +1.148       0.202     +5.6840e+00                   6.54e-08         3.27e-08
SK3           +0.06409     0.1479    +4.3340e-01                   0.6653           0.3327
SK4           +0.2409      0.2013    +1.1970e+00                   0.2332           0.1166
SK5           +0.2099      0.1924    +1.0910e+00                   0.2769           0.1384
SK6           -0.003028    0.1972    -1.5360e-02                   0.9878           0.4939
G             +0.0247      0.2331    +1.0600e-01                   0.9158           0.4579

Source: https://freestatistics.org/blog/index.php?pk=297625&T=2
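
The table above is the standard OLS coefficient summary: estimate, standard error, t statistic for H0: parameter = 0, and the corresponding p-values; the 1-tail p-value is simply half the 2-tail value. A sketch of how to obtain the same quantities from the fit object assumed in the previous sketch:

tab <- summary(fit)$coefficients                       # Estimate, Std. Error, t value, Pr(>|t|)
cbind(tab, "1-tail p-value" = tab[, "Pr(>|t|)"] / 2)   # append the one-sided p-values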








Multiple Linear Regression - Regression Statistics
Multiple R: 0.5629
R-squared: 0.3168
Adjusted R-squared: 0.2854
F-TEST (value): 10.07
F-TEST (DF numerator): 7
F-TEST (DF denominator): 152
p-value: 2.581e-10
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1.453
Sum Squared Residuals: 321

Source: https://freestatistics.org/blog/index.php?pk=297625&T=3
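
The summary statistics above can be read off a standard lm summary. A sketch, assuming the fit object from the earlier sketch:

s <- summary(fit)
sqrt(s$r.squared)        # Multiple R, about 0.5629
s$r.squared              # R-squared, about 0.3168
s$adj.r.squared          # Adjusted R-squared, about 0.2854
s$fstatistic             # F = 10.07 on 7 and 152 degrees of freedom
pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3], lower.tail = FALSE)   # F-test p-value
s$sigma                  # residual standard deviation, about 1.453
sum(residuals(fit)^2)    # sum of squared residuals, about 321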








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 13 13.23-0.2337
2 16 15.15 0.8461
3 17 15.86 1.141
4 15 14.71 0.2904
5 16 15.83 0.1654
6 16 15.22 0.7785
7 18 14.56 3.436
8 16 15.08 0.9214
9 17 16.92 0.08464
10 17 16.98 0.02054
11 17 15.64 1.358
12 15 15.59-0.5937
13 16 15.32 0.6834
14 14 13.9 0.097
15 16 15.2 0.7983
16 17 15.08 1.924
17 16 15.05 0.9491
18 15 17.06-2.063
19 17 15.8 1.205
20 16 14.83 1.165
21 15 15.79-0.7922
22 16 15.62 0.3784
23 15 15.65-0.6493
24 17 15.65 1.354
25 14 14.99-0.9899
26 16 14.96 1.044
27 15 15.56-0.5605
28 16 14.69 1.31
29 16 16.13-0.1312
30 13 14.48-1.477
31 15 16.98-1.979
32 17 16.22 0.783
33 13 13.55-0.5453
34 17 16.61 0.395
35 15 15.05-0.05395
36 14 14.23-0.2328
37 14 14.37-0.3732
38 18 15.62 2.375
39 15 16.13-1.131
40 17 17-0.00416
41 13 13.87-0.8666
42 16 17.33-1.334
43 15 15.98-0.9791
44 15 15.34-0.3413
45 16 15.59 0.4148
46 15 15.78-0.7757
47 13 15.8-2.795
48 17 16.8 0.2028
49 18 17.36 0.6391
50 17 17.37-0.3679
51 11 14.68-3.684
52 14 14.19-0.1934
53 13 15.65-2.649
54 15 14.54 0.461
55 17 15.02 1.982
56 16 15.53 0.4704
57 15 15.77-0.7705
58 17 17.45-0.4467
59 16 14.63 1.374
60 16 15.79 0.2078
61 16 14.59 1.405
62 15 15.86-0.8562
63 12 13.29-1.289
64 17 15.53 1.467
65 14 15.55-1.554
66 14 15.74-1.741
67 16 14.98 1.019
68 15 14.75 0.2541
69 15 17.25-2.248
70 13 15.59-2.585
71 17 16.01 0.9886
72 15 14.95 0.04954
73 16 15.77 0.2295
74 14 15.04-1.045
75 15 13.9 1.097
76 17 14.5 2.499
77 16 15.62 0.3784
78 10 13.8-3.803
79 16 15.8 0.2048
80 17 15.7 1.297
81 17 15.77 1.233
82 20 16.13 3.869
83 17 16.33 0.6703
84 18 15.86 2.144
85 15 15.08-0.07562
86 14 12.98 1.016
87 15 15.63-0.6277
88 17 15.65 1.354
89 16 15.77 0.2295
90 17 16.92 0.07858
91 15 14.96 0.04414
92 16 15.62 0.3784
93 18 16.09 1.911
94 18 16.46 1.542
95 16 16.94-0.9401
96 17 15.19 1.81
97 15 15.65-0.6463
98 13 16.15-3.153
99 17 16.58 0.421
100 16 15.28 0.7167
101 15 15.65-0.6493
102 16 15.65 0.3507
103 16 15.27 0.7331
104 13 15.56-2.561
105 15 15.32-0.3166
106 12 13.83-1.833
107 19 15.34 3.656
108 16 15.08 0.9214
109 16 15.41 0.5853
110 17 15.68 1.321
111 16 16.43-0.4269
112 14 15.56-1.564
113 15 15.38-0.3837
114 14 14.77-0.7706
115 16 15.56 0.4395
116 15 15.77-0.7705
117 17 16.64 0.3556
118 15 15.01-0.01455
119 16 15.34 0.6557
120 16 15.58 0.4178
121 15 14.71 0.2904
122 15 15.35-0.3474
123 11 12.21-1.212
124 16 15.53 0.4704
125 18 16.12 1.875
126 13 13.99-0.99
127 11 13.87-2.867
128 16 15.32 0.6804
129 18 17.51 0.4862
130 15 16.77-1.773
131 19 17.79 1.206
132 17 16.98 0.02054
133 13 15.32-2.317
134 14 15.29-1.289
135 16 15.62 0.3754
136 13 15.43-2.432
137 17 15.7 1.297
138 14 15.77-1.767
139 19 16.19 2.811
140 14 14.48-0.4767
141 16 15.65 0.3507
142 12 13.5-1.498
143 16 16.8-0.8003
144 16 15.34 0.6557
145 15 15.56-0.5575
146 12 15.02-3.02
147 15 15.62-0.6216
148 17 16.41 0.5947
149 13 15.44-2.439
150 15 13.36 1.64
151 18 15.58 2.418
152 15 14.4 0.5967
153 18 15.56 2.436
154 15 17.04-2.038
155 15 16.07-1.067
156 16 15.91 0.08499
157 13 13.86-0.8566
158 16 15.56 0.4425
159 13 15.77-2.77
160 16 13.72 2.278

Source: https://freestatistics.org/blog/index.php?pk=297625&T=4
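
The interpolation (in-sample forecast) column above holds the fitted values of the regression, and the residual column holds the corresponding prediction errors. A sketch of the same quantities, assuming the fit and data objects from the earlier sketches:

round(data.frame(Actuals       = dat$TVDCSUM,
                 Interpolation = fitted(fit),
                 Residuals     = residuals(fit)), 4)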








Goldfeld-Quandt test for Heteroskedasticity
breakpoint index   p-value (alternative: greater)   p-value (alternative: 2-sided)   p-value (alternative: less)
11 0.3327 0.6655 0.6673
12 0.1859 0.3717 0.8141
13 0.1006 0.2013 0.8994
14 0.06782 0.1356 0.9322
15 0.07122 0.1424 0.9288
16 0.05453 0.1091 0.9455
17 0.03242 0.06485 0.9676
18 0.1227 0.2454 0.8773
19 0.0844 0.1688 0.9156
20 0.05833 0.1167 0.9417
21 0.05584 0.1117 0.9442
22 0.03491 0.06983 0.9651
23 0.03697 0.07393 0.963
24 0.03287 0.06574 0.9671
25 0.06947 0.1389 0.9305
26 0.05102 0.102 0.949
27 0.03762 0.07524 0.9624
28 0.02599 0.05198 0.974
29 0.01744 0.03488 0.9826
30 0.01998 0.03997 0.98
31 0.02599 0.05197 0.974
32 0.01958 0.03916 0.9804
33 0.01683 0.03365 0.9832
34 0.01223 0.02447 0.9878
35 0.007818 0.01564 0.9922
36 0.005282 0.01056 0.9947
37 0.005513 0.01103 0.9945
38 0.02334 0.04668 0.9767
39 0.01803 0.03606 0.982
40 0.01252 0.02503 0.9875
41 0.01653 0.03307 0.9835
42 0.01519 0.03037 0.9848
43 0.01143 0.02285 0.9886
44 0.009073 0.01815 0.9909
45 0.006129 0.01226 0.9939
46 0.005769 0.01154 0.9942
47 0.01953 0.03905 0.9805
48 0.01425 0.0285 0.9858
49 0.01185 0.02371 0.9881
50 0.008363 0.01673 0.9916
51 0.05233 0.1047 0.9477
52 0.04035 0.08071 0.9596
53 0.07458 0.1492 0.9254
54 0.06023 0.1205 0.9398
55 0.07165 0.1433 0.9283
56 0.06085 0.1217 0.9392
57 0.04936 0.09873 0.9506
58 0.03862 0.07725 0.9614
59 0.0345 0.069 0.9655
60 0.02602 0.05203 0.974
61 0.0253 0.05061 0.9747
62 0.02051 0.04102 0.9795
63 0.02001 0.04002 0.98
64 0.02346 0.04692 0.9765
65 0.02547 0.05094 0.9745
66 0.02606 0.05212 0.9739
67 0.02198 0.04395 0.978
68 0.01772 0.03544 0.9823
69 0.02597 0.05193 0.974
70 0.05233 0.1047 0.9477
71 0.04884 0.09769 0.9512
72 0.04223 0.08445 0.9578
73 0.03301 0.06603 0.967
74 0.02943 0.05886 0.9706
75 0.02659 0.05319 0.9734
76 0.05544 0.1109 0.9446
77 0.04489 0.08977 0.9551
78 0.2043 0.4086 0.7957
79 0.1761 0.3523 0.8239
80 0.1705 0.3411 0.8295
81 0.1642 0.3284 0.8358
82 0.4175 0.835 0.5825
83 0.3792 0.7584 0.6208
84 0.4374 0.8748 0.5626
85 0.3939 0.7877 0.6061
86 0.3657 0.7314 0.6343
87 0.3288 0.6576 0.6712
88 0.3277 0.6554 0.6723
89 0.2866 0.5732 0.7134
90 0.2476 0.4953 0.7524
91 0.2168 0.4335 0.7832
92 0.1905 0.381 0.8095
93 0.212 0.424 0.788
94 0.2192 0.4383 0.7808
95 0.1978 0.3955 0.8022
96 0.2151 0.4303 0.7849
97 0.1855 0.3711 0.8145
98 0.3223 0.6447 0.6777
99 0.2836 0.5672 0.7164
100 0.2504 0.5008 0.7496
101 0.221 0.442 0.779
102 0.1886 0.3771 0.8114
103 0.1654 0.3308 0.8346
104 0.2272 0.4545 0.7728
105 0.194 0.3881 0.806
106 0.2171 0.4342 0.7829
107 0.4503 0.9006 0.5497
108 0.427 0.854 0.573
109 0.4136 0.8272 0.5864
110 0.4071 0.8143 0.5929
111 0.3846 0.7693 0.6154
112 0.3795 0.7591 0.6205
113 0.3325 0.665 0.6675
114 0.2938 0.5876 0.7062
115 0.2627 0.5254 0.7373
116 0.2339 0.4679 0.7661
117 0.2214 0.4427 0.7786
118 0.1848 0.3696 0.8152
119 0.1597 0.3193 0.8403
120 0.1324 0.2649 0.8676
121 0.1182 0.2363 0.8818
122 0.09298 0.186 0.907
123 0.08261 0.1652 0.9174
124 0.06502 0.13 0.935
125 0.06386 0.1277 0.9361
126 0.0646 0.1292 0.9354
127 0.1478 0.2957 0.8522
128 0.1647 0.3294 0.8353
129 0.1303 0.2606 0.8697
130 0.1093 0.2186 0.8907
131 0.08995 0.1799 0.9101
132 0.08727 0.1745 0.9127
133 0.07732 0.1546 0.9227
134 0.07789 0.1558 0.9221
135 0.0617 0.1234 0.9383
136 0.06 0.12 0.94
137 0.07176 0.1435 0.9282
138 0.06243 0.1249 0.9376
139 0.09356 0.1871 0.9064
140 0.1052 0.2103 0.8948
141 0.07236 0.1447 0.9276
142 0.07789 0.1558 0.9221
143 0.05289 0.1058 0.9471
144 0.04703 0.09405 0.953
145 0.02797 0.05595 0.972
146 0.0222 0.0444 0.9778
147 0.0124 0.0248 0.9876
148 0.006021 0.01204 0.994
149 0.06733 0.1347 0.9327

Source: https://freestatistics.org/blog/index.php?pk=297625&T=5
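
Each row above is a separate Goldfeld-Quandt test that splits the ordered sample at the given breakpoint and compares the residual variances of the two sub-samples; in the table the "less" p-value equals one minus the "greater" p-value, and the 2-sided value is twice the smaller of the two. A minimal sketch using lmtest::gqtest (an assumption: the module uses its own implementation, so small numerical differences are possible; dat is the data frame from the loading sketch):

library(lmtest)
# one Goldfeld-Quandt test per admissible breakpoint, alternative "greater"
p.greater <- sapply(11:149, function(bp)
  gqtest(TVDCSUM ~ SK1 + SK2 + SK3 + SK4 + SK5 + SK6 + G,
         point = bp, alternative = "greater", data = dat)$p.value)
head(cbind(breakpoint = 11:149, greater = p.greater))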








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     0                     0                     OK
5% type I error level     29                    0.208633              NOK
10% type I error level    53                    0.381295              NOK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 &  0 & OK \tabularnewline
5% type I error level & 29 & 0.208633 & NOK \tabularnewline
10% type I error level & 53 & 0.381295 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297625&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C] 0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]29[/C][C]0.208633[/C][C]NOK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]53[/C][C]0.381295[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297625&T=6
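
The meta analysis simply counts how many of the 139 two-sided Goldfeld-Quandt p-values fall below each type I error level and flags NOK whenever the observed fraction exceeds the level itself (29/139 ≈ 0.2086 at 5%, 53/139 ≈ 0.3813 at 10%). A sketch, assuming the gqtable data frame from the sketch above:

# count and fraction of significant two-sided Goldfeld-Quandt tests per error level
alpha <- c(0.01, 0.05, 0.10)
nsig  <- sapply(alpha, function(a) sum(gqtable$two.sided < a))
data.frame(level = alpha,
           significant = nsig,
           fraction = nsig / nrow(gqtable),
           verdict = ifelse(nsig / nrow(gqtable) < alpha, 'OK', 'NOK'))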

Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.5777, df1 = 2, df2 = 150, p-value = 0.2099
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.0871, df1 = 14, df2 = 138, p-value = 0.3745
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.1848, df1 = 2, df2 = 150, p-value = 0.3086

\begin{tabular}{lllllllll}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.5777, df1 = 2, df2 = 150, p-value = 0.2099
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.0871, df1 = 14, df2 = 138, p-value = 0.3745
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.1848, df1 = 2, df2 = 150, p-value = 0.3086
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=297625&T=7

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.5777, df1 = 2, df2 = 150, p-value = 0.2099
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.0871, df1 = 14, df2 = 138, p-value = 0.3745
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.1848, df1 = 2, df2 = 150, p-value = 0.3086
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=297625&T=7
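
All three RESET variants are computed with lmtest::resettest on the fitted model, augmenting it with squares and cubes of, respectively, the fitted values, the regressors, and the first principal component of the regressors; none of the p-values (0.21, 0.37, 0.31) points to a misspecified functional form. A minimal sketch (the same calls as in the module code below):

library(lmtest)
# Ramsey RESET tests with powers 2 and 3
resettest(mylm, power = 2:3, type = 'fitted')     # squares/cubes of the fitted values
resettest(mylm, power = 2:3, type = 'regressor')  # squares/cubes of the regressors
resettest(mylm, power = 2:3, type = 'princomp')   # squares/cubes of the first principal component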

Variance Inflation Factors (Multicollinearity)
> vif
     SK1      SK2      SK3      SK4      SK5      SK6        G 
1.093131 1.133760 1.045661 1.045248 1.070926 1.032815 1.028859 

\begin{tabular}{lllllllll}
\hline
Variance Inflation Factors (Multicollinearity) \tabularnewline
> vif
     SK1      SK2      SK3      SK4      SK5      SK6        G 
1.093131 1.133760 1.045661 1.045248 1.070926 1.032815 1.028859 
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=297625&T=8

[TABLE]
[ROW][C]Variance Inflation Factors (Multicollinearity)[/C][/ROW]
[ROW][C]
> vif
     SK1      SK2      SK3      SK4      SK5      SK6        G 
1.093131 1.133760 1.045661 1.045248 1.070926 1.032815 1.028859 
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=297625&T=8
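
The variance inflation factors come from car::vif applied to the fitted model; all values are below 1.14, so multicollinearity among the seven regressors is negligible (common rules of thumb only flag VIFs above 5 or 10). A minimal sketch, followed by an illustrative check of where the SK1 value comes from; the auxiliary regression uses the df data frame built in the module code and the column names shown in the output above.

library(car)
# variance inflation factor of each regressor in the fitted model
vif(mylm)

# illustration: VIF(SK1) = 1 / (1 - R^2), where R^2 comes from regressing SK1
# on the other regressors only (not on the dependent variable)
r2 <- summary(lm(SK1 ~ SK2 + SK3 + SK4 + SK5 + SK6 + G, data = df))$r.squared
1 / (1 - r2)   # should reproduce 1.093131 (up to rounding)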

Parameters (Session):
par1 = 7 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 7 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ;
R code (references can be found in the software module):
par5 <- ''
par4 <- ''
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '7'
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
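# transpose the data so rows are observations, drop rows with missing values,
# and move the endogenous column (par1) to the front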
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
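# optional differencing of all series (par3): first differences, seasonal differences (s=12),
# or both; none of these branches applies in this run (par3 = 'No Linear Trend')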
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
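# optionally append lagged copies of column par1 as extra regressors: par4 ordinary lags
# and par5 seasonal (s=12) lags; skipped here because par4 = par5 = 0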
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
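# optionally add monthly (M1-M11) or quarterly (Q1-Q3) dummy variables;
# skipped here (par2 = 'Do not include Seasonal Dummies')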
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
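# optionally append a linear trend column t; skipped here (par3 = 'No Linear Trend')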
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
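# Goldfeld-Quandt test at every admissible breakpoint (k+3 .. n-k-3): store the p-values
# for the 'greater', 'two.sided' and 'less' alternatives and count the significant
# two-sided tests at the 1%, 5% and 10% levels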
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
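# diagnostic plots: actuals and interpolation, residuals, histogram of studentized residuals,
# residual density, QQ plot, residual lag plot, residual ACF/PACF, standard lm diagnostics,
# and the profile of two-sided Goldfeld-Quandt p-values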
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
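# assemble the output tables with the createtable helpers: estimated equation, OLS coefficients,
# regression and residual statistics, actuals/interpolation/residuals, the Goldfeld-Quandt
# tables, the RESET tests, and the variance inflation factors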
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')