Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sat, 10 Dec 2016 18:23:10 +0100
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/10/t148139060858c6pldr5v3lvpx.htm/, Retrieved Mon, 06 May 2024 03:05:07 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=298736, Retrieved Mon, 06 May 2024 03:05:07 +0000
Original text written by user: (none)
Is private? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 64
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
- [Multiple Regression] [Multiple regressi...] [2016-12-10 17:23:10] [2e11ca31a00cf8de75c33c1af2d59434] [Current]
Dataseries X (tab-separated columns: TVSUM, SK1, SK2, SK4, SK5, ALG2):
13	4	2	3	5	1
16	5	3	4	5	1
17	4	4	4	5	1
15	3	4	3	4	1
16	4	4	4	5	1
16	3	4	4	5	1
17	3	4	3	3	1
16	3	4	4	4	1
17	4	5	4	5	0
17	4	5	4	5	1
17	4	4	4	5	1
15	4	4	3	5	0
16	4	4	3	4	0
14	3	3	4	4	1
16	4	4	4	2	1
17	3	4	4	4	1
16	3	4	4	4	1
NA	5	5	5	5	0
15	5	5	3	4	1
17	4	4	4	5	1
16	3	4	3	4	1
15	4	4	4	5	1
16	4	4	4	4	1
15	4	4	4	4	1
17	4	4	4	4	1
15	3	4	4	4	1
16	3	4	3	5	0
15	4	4	4	4	0
16	2	4	4	5	1
16	5	4	4	4	1
13	4	3	4	4	1
15	4	5	4	5	0
17	5	4	4	4	1
15	4	3	4	4	0
13	2	3	4	5	1
17	4	5	4	4	1
15	3	4	4	4	1
14	4	3	3	4	0
14	4	3	4	4	0
18	4	4	4	4	0
15	5	4	4	4	0
17	4	5	4	5	1
13	3	3	4	4	0
16	5	5	3	5	1
15	5	4	3	4	0
15	4	4	3	4	1
16	4	4	4	4	1
15	3	5	3	3	0
13	4	4	4	5	1
NA	2	3	2	3	1
17	4	5	4	4	0
17	5	5	4	5	0
17	5	5	4	4	0
11	4	3	4	5	0
14	4	3	3	4	1
13	4	4	4	4	1
15	3	4	3	3	1
17	3	4	4	4	0
16	4	4	3	5	0
15	4	4	4	5	1
17	5	5	4	5	1
16	2	4	4	5	0
16	4	4	4	5	1
16	3	4	4	2	1
15	4	4	4	5	1
12	4	2	4	4	1
17	4	4	3	5	1
14	4	4	3	5	1
14	5	4	3	3	1
16	3	4	3	5	1
15	3	4	3	4	1
15	4	5	5	5	1
13	4	4	4	4	1
13	4	4	4	4	0
17	4	4	5	5	1
15	3	4	4	4	1
16	4	4	4	5	1
14	3	4	3	5	1
15	3	3	4	4	1
17	4	3	4	4	1
16	4	4	4	4	1
12	3	3	4	4	1
16	4	4	4	5	1
17	4	4	4	5	0
17	4	4	4	5	1
20	5	4	4	4	1
17	5	4	5	4	1
18	4	4	4	5	1
15	3	4	4	4	1
17	3	4	4	4	1
14	4	2	3	4	1
15	4	4	4	4	0
17	4	4	4	4	1
16	4	4	4	5	1
17	4	5	4	5	0
15	3	4	3	5	1
16	4	4	4	4	1
18	5	4	4	4	1
18	5	4	5	4	1
16	4	5	4	5	1
NA	3	4	4	4	1
17	5	3	4	5	1
15	4	4	4	4	1
13	5	4	4	4	1
15	3	4	3	4	1
17	5	4	5	5	1
16	4	4	3	4	0
16	4	4	3	4	1
15	4	4	4	4	1
16	4	4	4	4	1
16	3	4	4	5	1
13	4	4	4	4	1
15	4	4	3	4	0
12	3	3	3	5	1
19	4	4	3	4	1
16	3	4	4	4	1
16	4	4	4	3	1
17	5	4	5	5	1
16	5	4	4	5	0
14	4	4	4	4	0
15	4	4	3	4	0
14	3	4	3	4	1
16	4	4	4	4	0
15	4	4	4	5	1
17	4	5	4	4	1
15	3	4	4	4	1
16	4	4	3	4	1
16	4	4	4	4	1
15	3	4	3	4	0
15	4	4	3	4	0
11	3	2	2	4	1
16	4	4	3	5	0
18	5	4	3	5	1
13	2	4	3	3	0
11	3	3	4	4	1
16	4	4	3	4	1
18	5	5	4	5	0
NA	NA	NA	NA	NA	1
15	4	5	4	4	1
19	5	5	5	5	1
17	4	5	4	5	1
13	4	4	3	4	0
14	3	4	4	5	0
16	4	4	4	4	1
13	4	4	4	4	1
17	4	4	4	5	1
14	4	4	4	5	1
19	5	4	3	5	1
14	4	3	4	4	1
16	4	4	4	4	1
12	3	3	3	4	1
16	4	5	4	4	1
16	4	4	3	4	1
15	4	4	4	4	0
12	3	4	3	5	0
15	4	4	4	4	1
17	5	4	4	5	0
13	4	4	4	3	1
15	2	3	4	4	1
18	4	4	4	4	0
15	4	3	3	5	1
18	4	4	4	4	0
15	4	5	5	4	1
15	5	4	4	4	1
16	5	4	3	4	1
13	3	3	4	5	1
16	4	4	4	4	1
13	4	4	4	5	1
16	2	3	5	5	1
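
The regression reported below can be reproduced from this series with base R. The following is only a minimal sketch, not the module's own raw R code (that is available through the raw-input link in the transaction summary); the file name dataseries_x.txt is a hypothetical placeholder for the data block above, and the column names are taken from the variable names in the output.

# Sketch: read the tab-separated series and fit the multiple regression reported below.
dat <- read.table("dataseries_x.txt", sep = "\t", header = FALSE, na.strings = "NA",
                  col.names = c("TVSUM", "SK1", "SK2", "SK4", "SK5", "ALG2"))
dat <- na.omit(dat)   # 4 rows contain NA; 165 complete cases remain (159 residual df + 6 parameters)
fit <- lm(TVSUM ~ SK1 + SK2 + SK4 + SK5 + ALG2, data = dat)
summary(fit)          # coefficients, R-squared and F-test as in the tables below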





\begin{tabular}{ll}
\hline
\multicolumn{2}{l}{Summary of computational transaction} \tabularnewline
Raw Input & view raw input (R code) \tabularnewline
Raw Output & view raw output of R engine \tabularnewline
Computing time & 10 seconds \tabularnewline
R Server & Big Analytics Cloud Computing Center \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298736&T=0








Multiple Linear Regression - Estimated Regression Equation
TVSUM[t] = 6.84284 + 0.499442 SK1[t] + 1.12986 SK2[t] + 0.335972 SK4[t] + 0.181257 SK5[t] + 0.24936 ALG2[t] + e[t]

Source: https://freestatistics.org/blog/index.php?pk=298736&T=1
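
As a quick sanity check, plugging the first observation of the series (SK1 = 4, SK2 = 2, SK4 = 3, SK5 = 5, ALG2 = 1) into this equation reproduces the first fitted value and residual reported in the Actuals, Interpolation, and Residuals table further below.

# Hand-check of the estimated equation for observation 1 (TVSUM = 13).
b  <- c(6.84284, 0.499442, 1.12986, 0.335972, 0.181257, 0.24936)  # intercept and slopes
x1 <- c(1, 4, 2, 3, 5, 1)                                         # leading 1 for the intercept
sum(b * x1)        # fitted value, approximately 13.26
13 - sum(b * x1)   # residual, approximately -0.2639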









\begin{tabular}{llllll}
\hline
\multicolumn{6}{l}{Multiple Linear Regression - Ordinary Least Squares} \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & +6.843 &  1.2 & +5.7010e+00 &  5.641e-08 &  2.821e-08 \tabularnewline
SK1 & +0.4994 &  0.1579 & +3.1620e+00 &  0.001875 &  0.0009373 \tabularnewline
SK2 & +1.13 &  0.1914 & +5.9030e+00 &  2.091e-08 &  1.045e-08 \tabularnewline
SK4 & +0.336 &  0.2072 & +1.6220e+00 &  0.1068 &  0.05342 \tabularnewline
SK5 & +0.1813 &  0.1802 & +1.0060e+00 &  0.3162 &  0.1581 \tabularnewline
ALG2 & +0.2494 &  0.2531 & +9.8510e-01 &  0.3261 &  0.163 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298736&T=2
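
The last three columns are linked in the usual way: the t-statistic is the parameter estimate divided by its standard deviation (S.D. column), the 2-tail p-value follows from the t distribution with 159 residual degrees of freedom (see the F-test table below), and the 1-tail p-value is half of the 2-tail value. A hand-check for SK1, using the rounded values shown above:

t_sk1  <- 0.499442 / 0.1579               # t-statistic, approximately 3.162
p2_sk1 <- 2 * pt(-abs(t_sk1), df = 159)   # 2-tail p-value, approximately 0.0019
p1_sk1 <- p2_sk1 / 2                      # 1-tail p-value, approximately 0.00094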










\begin{tabular}{ll}
\hline
\multicolumn{2}{l}{Multiple Linear Regression - Regression Statistics} \tabularnewline
Multiple R &  0.56 \tabularnewline
R-squared &  0.3136 \tabularnewline
Adjusted R-squared &  0.292 \tabularnewline
F-TEST (value) &  14.53 \tabularnewline
F-TEST (DF numerator) & 5 \tabularnewline
F-TEST (DF denominator) & 159 \tabularnewline
p-value &  1.014e-11 \tabularnewline
\multicolumn{2}{l}{Multiple Linear Regression - Residual Statistics} \tabularnewline
Residual Standard Deviation &  1.395 \tabularnewline
Sum Squared Residuals &  309.6 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298736&T=3
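
These summary statistics are mutually consistent, which can be verified from the reported (rounded) values with n = 165 observations and k = 5 regressors:

R2 <- 0.3136; n <- 165; k <- 5
sqrt(R2)                               # multiple R, approximately 0.56
1 - (1 - R2) * (n - 1) / (n - k - 1)   # adjusted R-squared, approximately 0.292
(R2 / k) / ((1 - R2) / (n - k - 1))    # F-test value, approximately 14.5 on (5, 159) df
sqrt(309.6 / (n - k - 1))              # residual standard deviation, approximately 1.395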










\begin{tabular}{llll}
\hline
\multicolumn{4}{l}{Multiple Linear Regression - Actuals, Interpolation, and Residuals} \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 &  13 &  13.26 & -0.2639 \tabularnewline
2 &  16 &  15.23 &  0.7708 \tabularnewline
3 &  17 &  15.86 &  1.14 \tabularnewline
4 &  15 &  14.84 &  0.1571 \tabularnewline
5 &  16 &  15.86 &  0.1404 \tabularnewline
6 &  16 &  15.36 &  0.6398 \tabularnewline
7 &  17 &  14.66 &  2.338 \tabularnewline
8 &  16 &  15.18 &  0.8211 \tabularnewline
9 &  17 &  16.74 &  0.2599 \tabularnewline
10 &  17 &  16.99 &  0.01054 \tabularnewline
11 &  17 &  15.86 &  1.14 \tabularnewline
12 &  15 &  15.27 & -0.2743 \tabularnewline
13 &  16 &  15.09 &  0.907 \tabularnewline
14 &  14 &  14.05 & -0.04904 \tabularnewline
15 &  16 &  15.32 &  0.6842 \tabularnewline
16 &  17 &  15.18 &  1.821 \tabularnewline
17 &  16 &  15.18 &  0.8211 \tabularnewline
18 &  15 &  16.97 & -1.972 \tabularnewline
19 &  17 &  15.86 &  1.14 \tabularnewline
20 &  16 &  14.84 &  1.157 \tabularnewline
21 &  15 &  15.86 & -0.8596 \tabularnewline
22 &  16 &  15.68 &  0.3217 \tabularnewline
23 &  15 &  15.68 & -0.6783 \tabularnewline
24 &  17 &  15.68 &  1.322 \tabularnewline
25 &  15 &  15.18 & -0.1789 \tabularnewline
26 &  16 &  14.77 &  1.225 \tabularnewline
27 &  15 &  15.43 & -0.429 \tabularnewline
28 &  16 &  14.86 &  1.139 \tabularnewline
29 &  16 &  16.18 & -0.1778 \tabularnewline
30 &  13 &  14.55 & -1.548 \tabularnewline
31 &  15 &  16.74 & -1.74 \tabularnewline
32 &  17 &  16.18 &  0.8222 \tabularnewline
33 &  15 &  14.3 &  0.7009 \tabularnewline
34 &  13 &  13.73 & -0.7309 \tabularnewline
35 &  17 &  16.81 &  0.1918 \tabularnewline
36 &  15 &  15.18 & -0.1789 \tabularnewline
37 &  14 &  13.96 &  0.03686 \tabularnewline
38 &  14 &  14.3 & -0.2991 \tabularnewline
39 &  18 &  15.43 &  2.571 \tabularnewline
40 &  15 &  15.93 & -0.9284 \tabularnewline
41 &  17 &  16.99 &  0.01054 \tabularnewline
42 &  13 &  13.8 & -0.7997 \tabularnewline
43 &  16 &  17.15 & -1.153 \tabularnewline
44 &  15 &  15.59 & -0.5924 \tabularnewline
45 &  15 &  15.34 & -0.3424 \tabularnewline
46 &  16 &  15.68 &  0.3217 \tabularnewline
47 &  15 &  15.54 & -0.5422 \tabularnewline
48 &  13 &  15.86 & -2.86 \tabularnewline
49 &  17 &  16.56 &  0.4412 \tabularnewline
50 &  17 &  17.24 & -0.2395 \tabularnewline
51 &  17 &  17.06 & -0.05828 \tabularnewline
52 &  11 &  14.48 & -3.48 \tabularnewline
53 &  14 &  14.21 & -0.2125 \tabularnewline
54 &  13 &  15.68 & -2.678 \tabularnewline
55 &  15 &  14.66 &  0.3383 \tabularnewline
56 &  17 &  14.93 &  2.07 \tabularnewline
57 &  16 &  15.27 &  0.7257 \tabularnewline
58 &  15 &  15.86 & -0.8596 \tabularnewline
59 &  17 &  17.49 & -0.4889 \tabularnewline
60 &  16 &  14.61 &  1.389 \tabularnewline
61 &  16 &  15.86 &  0.1404 \tabularnewline
62 &  16 &  14.82 &  1.184 \tabularnewline
63 &  15 &  15.86 & -0.8596 \tabularnewline
64 &  12 &  13.42 & -1.419 \tabularnewline
65 &  17 &  15.52 &  1.476 \tabularnewline
66 &  14 &  15.52 & -1.524 \tabularnewline
67 &  14 &  15.66 & -1.661 \tabularnewline
68 &  16 &  15.02 &  0.9758 \tabularnewline
69 &  15 &  14.84 &  0.1571 \tabularnewline
70 &  15 &  17.33 & -2.325 \tabularnewline
71 &  13 &  15.68 & -2.678 \tabularnewline
72 &  13 &  15.43 & -2.429 \tabularnewline
73 &  17 &  16.2 &  0.8044 \tabularnewline
74 &  15 &  15.18 & -0.1789 \tabularnewline
75 &  16 &  15.86 &  0.1404 \tabularnewline
76 &  14 &  15.02 & -1.024 \tabularnewline
77 &  15 &  14.05 &  0.951 \tabularnewline
78 &  17 &  14.55 &  2.452 \tabularnewline
79 &  16 &  15.68 &  0.3217 \tabularnewline
80 &  12 &  14.05 & -2.049 \tabularnewline
81 &  16 &  15.86 &  0.1404 \tabularnewline
82 &  17 &  15.61 &  1.39 \tabularnewline
83 &  17 &  15.86 &  1.14 \tabularnewline
84 &  20 &  16.18 &  3.822 \tabularnewline
85 &  17 &  16.51 &  0.4862 \tabularnewline
86 &  18 &  15.86 &  2.14 \tabularnewline
87 &  15 &  15.18 & -0.1789 \tabularnewline
88 &  17 &  15.18 &  1.821 \tabularnewline
89 &  14 &  13.08 &  0.9174 \tabularnewline
90 &  15 &  15.43 & -0.429 \tabularnewline
91 &  17 &  15.68 &  1.322 \tabularnewline
92 &  16 &  15.86 &  0.1404 \tabularnewline
93 &  17 &  16.74 &  0.2599 \tabularnewline
94 &  15 &  15.02 & -0.02418 \tabularnewline
95 &  16 &  15.68 &  0.3217 \tabularnewline
96 &  18 &  16.18 &  1.822 \tabularnewline
97 &  18 &  16.51 &  1.486 \tabularnewline
98 &  16 &  16.99 & -0.9895 \tabularnewline
99 &  17 &  15.23 &  1.771 \tabularnewline
100 &  15 &  15.68 & -0.6783 \tabularnewline
101 &  13 &  16.18 & -3.178 \tabularnewline
102 &  15 &  14.84 &  0.1571 \tabularnewline
103 &  17 &  16.7 &  0.305 \tabularnewline
104 &  16 &  15.09 &  0.907 \tabularnewline
105 &  16 &  15.34 &  0.6576 \tabularnewline
106 &  15 &  15.68 & -0.6783 \tabularnewline
107 &  16 &  15.68 &  0.3217 \tabularnewline
108 &  16 &  15.36 &  0.6398 \tabularnewline
109 &  13 &  15.68 & -2.678 \tabularnewline
110 &  15 &  15.09 & -0.09301 \tabularnewline
111 &  12 &  13.89 & -1.894 \tabularnewline
112 &  19 &  15.34 &  3.658 \tabularnewline
113 &  16 &  15.18 &  0.8211 \tabularnewline
114 &  16 &  15.5 &  0.5029 \tabularnewline
115 &  17 &  16.7 &  0.305 \tabularnewline
116 &  16 &  16.11 & -0.1097 \tabularnewline
117 &  14 &  15.43 & -1.429 \tabularnewline
118 &  15 &  15.09 & -0.09301 \tabularnewline
119 &  14 &  14.84 & -0.8429 \tabularnewline
120 &  16 &  15.43 &  0.571 \tabularnewline
121 &  15 &  15.86 & -0.8596 \tabularnewline
122 &  17 &  16.81 &  0.1918 \tabularnewline
123 &  15 &  15.18 & -0.1789 \tabularnewline
124 &  16 &  15.34 &  0.6576 \tabularnewline
125 &  16 &  15.68 &  0.3217 \tabularnewline
126 &  15 &  14.59 &  0.4064 \tabularnewline
127 &  15 &  15.09 & -0.09301 \tabularnewline
128 &  11 &  12.25 & -1.247 \tabularnewline
129 &  16 &  15.27 &  0.7257 \tabularnewline
130 &  18 &  16.02 &  1.977 \tabularnewline
131 &  13 &  13.91 & -0.9129 \tabularnewline
132 &  11 &  14.05 & -3.049 \tabularnewline
133 &  16 &  15.34 &  0.6576 \tabularnewline
134 &  18 &  17.24 &  0.7605 \tabularnewline
135 &  15 &  16.81 & -1.808 \tabularnewline
136 &  19 &  17.82 &  1.175 \tabularnewline
137 &  17 &  16.99 &  0.01054 \tabularnewline
138 &  13 &  15.09 & -2.093 \tabularnewline
139 &  14 &  15.11 & -1.111 \tabularnewline
140 &  16 &  15.68 &  0.3217 \tabularnewline
141 &  13 &  15.68 & -2.678 \tabularnewline
142 &  17 &  15.86 &  1.14 \tabularnewline
143 &  14 &  15.86 & -1.86 \tabularnewline
144 &  19 &  16.02 &  2.977 \tabularnewline
145 &  14 &  14.55 & -0.5485 \tabularnewline
146 &  16 &  15.68 &  0.3217 \tabularnewline
147 &  12 &  13.71 & -1.713 \tabularnewline
148 &  16 &  16.81 & -0.8082 \tabularnewline
149 &  16 &  15.34 &  0.6576 \tabularnewline
150 &  15 &  15.43 & -0.429 \tabularnewline
151 &  12 &  14.77 & -2.775 \tabularnewline
152 &  15 &  15.68 & -0.6783 \tabularnewline
153 &  17 &  16.11 &  0.8903 \tabularnewline
154 &  13 &  15.5 & -2.497 \tabularnewline
155 &  15 &  13.55 &  1.45 \tabularnewline
156 &  18 &  15.43 &  2.571 \tabularnewline
157 &  15 &  14.39 &  0.6062 \tabularnewline
158 &  18 &  15.43 &  2.571 \tabularnewline
159 &  15 &  17.14 & -2.144 \tabularnewline
160 &  15 &  16.18 & -1.178 \tabularnewline
161 &  16 &  15.84 &  0.1582 \tabularnewline
162 &  13 &  14.23 & -1.23 \tabularnewline
163 &  16 &  15.68 &  0.3217 \tabularnewline
164 &  13 &  15.86 & -2.86 \tabularnewline
165 &  16 &  14.07 &  1.933 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298736&T=4
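
This table is simply the observed TVSUM values together with the model's fitted values and residuals. Assuming the data frame dat and model fit from the sketch after the data series, it can be reproduced as follows:

tab <- data.frame(Actuals       = dat$TVSUM,
                  Interpolation = round(fitted(fit), 2),
                  Residuals     = round(residuals(fit), 4))
head(tab)   # first row: 13, 13.26, -0.2639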










\begin{tabular}{llll}
\hline
\multicolumn{4}{l}{Goldfeld-Quandt test for Heteroskedasticity} \tabularnewline
 & \multicolumn{3}{l}{p-values (Alternative Hypothesis)} \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
9 &  0.1405 &  0.281 &  0.8595 \tabularnewline
10 &  0.06798 &  0.136 &  0.932 \tabularnewline
11 &  0.04305 &  0.0861 &  0.9569 \tabularnewline
12 &  0.01743 &  0.03486 &  0.9826 \tabularnewline
13 &  0.00651 &  0.01302 &  0.9935 \tabularnewline
14 &  0.00923 &  0.01846 &  0.9908 \tabularnewline
15 &  0.0171 &  0.03419 &  0.9829 \tabularnewline
16 &  0.0182 &  0.03641 &  0.9818 \tabularnewline
17 &  0.009079 &  0.01816 &  0.9909 \tabularnewline
18 &  0.03233 &  0.06467 &  0.9677 \tabularnewline
19 &  0.02439 &  0.04877 &  0.9756 \tabularnewline
20 &  0.01649 &  0.03299 &  0.9835 \tabularnewline
21 &  0.01801 &  0.03601 &  0.982 \tabularnewline
22 &  0.01046 &  0.02093 &  0.9895 \tabularnewline
23 &  0.01037 &  0.02074 &  0.9896 \tabularnewline
24 &  0.008727 &  0.01745 &  0.9913 \tabularnewline
25 &  0.009399 &  0.0188 &  0.9906 \tabularnewline
26 &  0.006058 &  0.01212 &  0.9939 \tabularnewline
27 &  0.005386 &  0.01077 &  0.9946 \tabularnewline
28 &  0.003407 &  0.006815 &  0.9966 \tabularnewline
29 &  0.001927 &  0.003855 &  0.9981 \tabularnewline
30 &  0.004896 &  0.009792 &  0.9951 \tabularnewline
31 &  0.008914 &  0.01783 &  0.9911 \tabularnewline
32 &  0.008657 &  0.01731 &  0.9913 \tabularnewline
33 &  0.006065 &  0.01213 &  0.9939 \tabularnewline
34 &  0.009647 &  0.01929 &  0.9904 \tabularnewline
35 &  0.006218 &  0.01244 &  0.9938 \tabularnewline
36 &  0.004732 &  0.009464 &  0.9953 \tabularnewline
37 &  0.002988 &  0.005976 &  0.997 \tabularnewline
38 &  0.001911 &  0.003821 &  0.9981 \tabularnewline
39 &  0.007366 &  0.01473 &  0.9926 \tabularnewline
40 &  0.00607 &  0.01214 &  0.9939 \tabularnewline
41 &  0.003947 &  0.007894 &  0.9961 \tabularnewline
42 &  0.003949 &  0.007898 &  0.9961 \tabularnewline
43 &  0.003409 &  0.006818 &  0.9966 \tabularnewline
44 &  0.002311 &  0.004622 &  0.9977 \tabularnewline
45 &  0.001591 &  0.003182 &  0.9984 \tabularnewline
46 &  0.001004 &  0.002008 &  0.999 \tabularnewline
47 &  0.0008791 &  0.001758 &  0.9991 \tabularnewline
48 &  0.005786 &  0.01157 &  0.9942 \tabularnewline
49 &  0.004057 &  0.008115 &  0.9959 \tabularnewline
50 &  0.002764 &  0.005528 &  0.9972 \tabularnewline
51 &  0.001825 &  0.003651 &  0.9982 \tabularnewline
52 &  0.01498 &  0.02996 &  0.985 \tabularnewline
53 &  0.01081 &  0.02161 &  0.9892 \tabularnewline
54 &  0.03283 &  0.06567 &  0.9672 \tabularnewline
55 &  0.0258 &  0.0516 &  0.9742 \tabularnewline
56 &  0.03363 &  0.06727 &  0.9664 \tabularnewline
57 &  0.02897 &  0.05795 &  0.971 \tabularnewline
58 &  0.02361 &  0.04722 &  0.9764 \tabularnewline
59 &  0.01786 &  0.03573 &  0.9821 \tabularnewline
60 &  0.01618 &  0.03237 &  0.9838 \tabularnewline
61 &  0.01203 &  0.02407 &  0.988 \tabularnewline
62 &  0.01087 &  0.02174 &  0.9891 \tabularnewline
63 &  0.008746 &  0.01749 &  0.9913 \tabularnewline
64 &  0.008582 &  0.01716 &  0.9914 \tabularnewline
65 &  0.00994 &  0.01988 &  0.9901 \tabularnewline
66 &  0.01101 &  0.02202 &  0.989 \tabularnewline
67 &  0.01212 &  0.02424 &  0.9879 \tabularnewline
68 &  0.01006 &  0.02013 &  0.9899 \tabularnewline
69 &  0.007615 &  0.01523 &  0.9924 \tabularnewline
70 &  0.01336 &  0.02672 &  0.9866 \tabularnewline
71 &  0.02977 &  0.05954 &  0.9702 \tabularnewline
72 &  0.05237 &  0.1047 &  0.9476 \tabularnewline
73 &  0.04968 &  0.09937 &  0.9503 \tabularnewline
74 &  0.04019 &  0.08039 &  0.9598 \tabularnewline
75 &  0.03175 &  0.06351 &  0.9682 \tabularnewline
76 &  0.03034 &  0.06068 &  0.9697 \tabularnewline
77 &  0.0265 &  0.053 &  0.9735 \tabularnewline
78 &  0.05206 &  0.1041 &  0.9479 \tabularnewline
79 &  0.04176 &  0.08351 &  0.9582 \tabularnewline
80 &  0.05906 &  0.1181 &  0.9409 \tabularnewline
81 &  0.04741 &  0.09483 &  0.9526 \tabularnewline
82 &  0.04879 &  0.09759 &  0.9512 \tabularnewline
83 &  0.04674 &  0.09347 &  0.9533 \tabularnewline
84 &  0.1977 &  0.3954 &  0.8023 \tabularnewline
85 &  0.1727 &  0.3454 &  0.8273 \tabularnewline
86 &  0.2156 &  0.4311 &  0.7844 \tabularnewline
87 &  0.1869 &  0.3738 &  0.8131 \tabularnewline
88 &  0.2181 &  0.4362 &  0.7819 \tabularnewline
89 &  0.1983 &  0.3965 &  0.8017 \tabularnewline
90 &  0.1704 &  0.3407 &  0.8296 \tabularnewline
91 &  0.1699 &  0.3399 &  0.8301 \tabularnewline
92 &  0.1427 &  0.2853 &  0.8573 \tabularnewline
93 &  0.1191 &  0.2382 &  0.8809 \tabularnewline
94 &  0.09843 &  0.1969 &  0.9016 \tabularnewline
95 &  0.0814 &  0.1628 &  0.9186 \tabularnewline
96 &  0.09435 &  0.1887 &  0.9056 \tabularnewline
97 &  0.09828 &  0.1966 &  0.9017 \tabularnewline
98 &  0.0888 &  0.1776 &  0.9112 \tabularnewline
99 &  0.1 &  0.2 &  0.9 \tabularnewline
100 &  0.08478 &  0.1696 &  0.9152 \tabularnewline
101 &  0.1817 &  0.3634 &  0.8183 \tabularnewline
102 &  0.1552 &  0.3104 &  0.8448 \tabularnewline
103 &  0.1297 &  0.2594 &  0.8703 \tabularnewline
104 &  0.1155 &  0.2311 &  0.8845 \tabularnewline
105 &  0.09923 &  0.1985 &  0.9008 \tabularnewline
106 &  0.08344 &  0.1669 &  0.9166 \tabularnewline
107 &  0.06834 &  0.1367 &  0.9317 \tabularnewline
108 &  0.05873 &  0.1175 &  0.9413 \tabularnewline
109 &  0.09857 &  0.1971 &  0.9014 \tabularnewline
110 &  0.07926 &  0.1585 &  0.9207 \tabularnewline
111 &  0.09083 &  0.1817 &  0.9092 \tabularnewline
112 &  0.273 &  0.546 &  0.727 \tabularnewline
113 &  0.2663 &  0.5325 &  0.7337 \tabularnewline
114 &  0.248 &  0.4959 &  0.752 \tabularnewline
115 &  0.2116 &  0.4231 &  0.7884 \tabularnewline
116 &  0.1862 &  0.3725 &  0.8138 \tabularnewline
117 &  0.1886 &  0.3771 &  0.8114 \tabularnewline
118 &  0.1564 &  0.3127 &  0.8436 \tabularnewline
119 &  0.1333 &  0.2665 &  0.8668 \tabularnewline
120 &  0.1104 &  0.2208 &  0.8896 \tabularnewline
121 &  0.0962 &  0.1924 &  0.9038 \tabularnewline
122 &  0.08068 &  0.1614 &  0.9193 \tabularnewline
123 &  0.06675 &  0.1335 &  0.9332 \tabularnewline
124 &  0.05934 &  0.1187 &  0.9407 \tabularnewline
125 &  0.04818 &  0.09635 &  0.9518 \tabularnewline
126 &  0.04043 &  0.08085 &  0.9596 \tabularnewline
127 &  0.03002 &  0.06003 &  0.97 \tabularnewline
128 &  0.02547 &  0.05094 &  0.9745 \tabularnewline
129 &  0.01923 &  0.03845 &  0.9808 \tabularnewline
130 &  0.02249 &  0.04497 &  0.9775 \tabularnewline
131 &  0.01945 &  0.0389 &  0.9805 \tabularnewline
132 &  0.04421 &  0.08842 &  0.9558 \tabularnewline
133 &  0.04333 &  0.08665 &  0.9567 \tabularnewline
134 &  0.033 &  0.06599 &  0.967 \tabularnewline
135 &  0.02782 &  0.05563 &  0.9722 \tabularnewline
136 &  0.02283 &  0.04566 &  0.9772 \tabularnewline
137 &  0.01937 &  0.03874 &  0.9806 \tabularnewline
138 &  0.02417 &  0.04835 &  0.9758 \tabularnewline
139 &  0.02025 &  0.04049 &  0.9798 \tabularnewline
140 &  0.0151 &  0.0302 &  0.9849 \tabularnewline
141 &  0.02412 &  0.04825 &  0.9759 \tabularnewline
142 &  0.0233 &  0.04659 &  0.9767 \tabularnewline
143 &  0.02272 &  0.04543 &  0.9773 \tabularnewline
144 &  0.1141 &  0.2282 &  0.8859 \tabularnewline
145 &  0.1063 &  0.2126 &  0.8937 \tabularnewline
146 &  0.08278 &  0.1656 &  0.9172 \tabularnewline
147 &  0.0982 &  0.1964 &  0.9018 \tabularnewline
148 &  0.1158 &  0.2316 &  0.8842 \tabularnewline
149 &  0.1891 &  0.3781 &  0.8109 \tabularnewline
150 &  0.22 &  0.4401 &  0.78 \tabularnewline
151 &  0.4395 &  0.8789 &  0.5605 \tabularnewline
152 &  0.3399 &  0.6797 &  0.6601 \tabularnewline
153 &  0.2616 &  0.5232 &  0.7384 \tabularnewline
154 &  0.5667 &  0.8666 &  0.4333 \tabularnewline
155 &  0.6792 &  0.6416 &  0.3208 \tabularnewline
156 &  0.5206 &  0.9587 &  0.4794 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298736&T=5
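
A table of this kind can be approximated with the gqtest() function from the lmtest package (available on CRAN), refitting the same model on the observations before and after each candidate breakpoint. This is only a sketch under that assumption; it need not match the module's exact splitting rule, and it reuses the data frame dat from the earlier sketch.

library(lmtest)
bp <- 9:156   # candidate breakpoint indices, as in the first column above
p_greater <- sapply(bp, function(i)
  gqtest(TVSUM ~ SK1 + SK2 + SK4 + SK5 + ALG2, data = dat,
         point = i, alternative = "greater")$p.value)
head(cbind(breakpoint = bp, greater = p_greater))   # compare with the "greater" column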









Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description                # significant tests   % significant tests   OK/NOK
1% type I error level      16                    0.1081                NOK
5% type I error level      65                    0.439189              NOK
10% type I error level     89                    0.601351              NOK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 16 &  0.1081 & NOK \tabularnewline
5% type I error level & 65 & 0.439189 & NOK \tabularnewline
10% type I error level & 89 & 0.601351 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298736&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]16[/C][C] 0.1081[/C][C]NOK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]65[/C][C]0.439189[/C][C]NOK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]89[/C][C]0.601351[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=298736&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298736&T=6
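The counts and percentages above can be recomputed from the two-sided Goldfeld-Quandt p-values. A minimal sketch, assuming the matrix gqarr filled by the R code at the end of this page (its second column holds the two-sided p-values):

p2 <- gqarr[, 2]                          # two-sided p-values, one per breakpoint
for (a in c(0.01, 0.05, 0.10)) {
  cat(a, sum(p2 < a), signif(sum(p2 < a) / length(p2), 6), '\n')
}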


Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.0596, df1 = 2, df2 = 157, p-value = 0.1309
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.3373, df1 = 10, df2 = 149, p-value = 0.2156
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.57315, df1 = 2, df2 = 157, p-value = 0.5649

\begin{tabular}{lllllllll}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.0596, df1 = 2, df2 = 157, p-value = 0.1309
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.3373, df1 = 10, df2 = 149, p-value = 0.2156
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.57315, df1 = 2, df2 = 157, p-value = 0.5649
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=298736&T=7

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.0596, df1 = 2, df2 = 157, p-value = 0.1309
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.3373, df1 = 10, df2 = 149, p-value = 0.2156
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.57315, df1 = 2, df2 = 157, p-value = 0.5649
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=298736&T=7

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298736&T=7
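The three RESET statistics above are produced by lmtest's resettest applied to the fitted model. A minimal sketch, assuming mylm from the R code at the end of this page:

library(lmtest)
resettest(mylm, power = 2:3, type = 'fitted')     # powers of the fitted values
resettest(mylm, power = 2:3, type = 'regressor')  # powers of the regressors
resettest(mylm, power = 2:3, type = 'princomp')   # powers of the principal components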


Variance Inflation Factors (Multicollinearity)
> vif
     SK1      SK2      SK4      SK5     ALG2 
1.098982 1.121640 1.097277 1.028088 1.046235 

\begin{tabular}{lllllllll}
\hline
Variance Inflation Factors (Multicollinearity) \tabularnewline
> vif
     SK1      SK2      SK4      SK5     ALG2 
1.098982 1.121640 1.097277 1.028088 1.046235 
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=298736&T=8

[TABLE]
[ROW][C]Variance Inflation Factors (Multicollinearity)[/C][/ROW]
[ROW][C]
> vif
     SK1      SK2      SK4      SK5     ALG2 
1.098982 1.121640 1.097277 1.028088 1.046235 
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=298736&T=8

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298736&T=8
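The variance inflation factors above come from car's vif applied to the fitted model. A minimal sketch, assuming mylm from the R code at the end of this page; as a common rule of thumb, values well above 5 or 10 would indicate problematic multicollinearity, which the values near 1 reported here do not.

library(car)
vif(mylm)   # one factor per regressor: SK1, SK2, SK4, SK5, ALG2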


Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
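# optional differencing of all columns, depending on par3 (first differences, seasonal differences, or both)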
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
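# optional lagged terms of column par1: par4 non-seasonal lags and par5 seasonal (12-period) lags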
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
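# optional seasonal dummy variables, depending on par2 (11 monthly or 3 quarterly dummies)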
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
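# Goldfeld-Quandt test at every admissible breakpoint (only when n > 25); count 2-sided p-values below 1%, 5%, 10%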
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
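# diagnostic plots: actuals and interpolation, residuals, histogram/density of (studentized) residuals, QQ plot, residual lag plot, ACF/PACF, lm diagnostics, GQ p-values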
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
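# assemble the output tables: estimated equation, OLS estimates, regression/residual statistics, actuals/interpolation/residuals, Goldfeld-Quandt results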
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
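# Ramsey RESET tests for powers 2 and 3 of fitted values, regressors, and principal components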
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
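# variance inflation factors of the regressors (multicollinearity diagnostic)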
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')