Author                      : (the author of this computation has been verified)
R Software Module           : rwasp_multipleregression.wasp
Title produced by software  : Multiple Regression
Date of computation         : Sun, 06 Dec 2015 20:49:45 +0000
Cite this page as follows   : Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2015/Dec/06/t1449435083ao4yqrfrcgrl036.htm/, Retrieved Thu, 16 May 2024 21:52:45 +0000
                              Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=285321, Retrieved Thu, 16 May 2024 21:52:45 +0000
IsPrivate?                  : No (this computation is public)
User-defined keywords       : regressie 2
Estimated Impact            : 65
Family                      : (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-  [Multiple Regression] [Paper 2] [2015-12-06 20:49:45] [fb7ef44ef6cdfac67cf9078e3093d323] [Current]
Dataseries X:
-5	-6	50	19	-29
-1	-3	53	20	-29
-2	-4	50	21	-29
-5	-7	50	20	-27
-4	-7	51	21	-29
-6	-7	53	19	-24
-2	-3	49	22	-29
-2	0	54	20	-21
-2	-5	57	18	-20
-2	-3	58	16	-26
2	3	56	17	-19
1	2	60	18	-22
-8	-7	55	19	-22
-1	-1	54	18	-15
1	0	52	20	-16
-1	-3	55	21	-22
2	4	56	18	-21
2	2	54	19	-11
1	3	53	19	-10
-1	0	59	19	-6
-2	-10	62	21	-8
-2	-10	63	19	-15
-1	-9	64	19	-16
-8	-22	75	17	-24
-4	-16	77	16	-27
-6	-18	79	16	-33
-3	-14	77	17	-29
-3	-12	82	16	-34
-7	-17	83	15	-37
-9	-23	81	16	-31
-11	-28	78	16	-33
-13	-31	79	16	-25
-11	-21	79	18	-27
-9	-19	73	19	-21
-17	-22	72	16	-32
-22	-22	67	16	-31
-25	-25	67	16	-32
-20	-16	50	18	-30
-24	-22	45	16	-34
-24	-21	39	15	-35
-22	-10	39	15	-37
-19	-7	37	16	-32
-18	-5	30	18	-28
-17	-4	24	16	-26
-11	7	27	19	-24
-11	6	19	19	-27
-12	3	19	18	-26
-10	10	25	17	-27
-15	0	16	19	-27
-15	-2	20	22	-24
-15	-1	25	19	-28
-13	2	34	19	-23
-8	8	39	16	-23
-13	-6	40	18	-29
-9	-4	38	20	-25
-7	4	42	17	-24
-4	7	46	17	-20
-4	3	48	17	-22
-2	3	51	20	-24
0	8	55	21	-27
-2	3	52	19	-25
-3	-3	55	18	-26
1	4	58	20	-24
-2	-5	72	17	-26
-1	-1	70	15	-22
1	5	70	17	-20
-3	0	63	18	-26
-4	-6	66	20	-22
-9	-13	65	19	-29
-9	-15	55	20	-30
-7	-8	57	22	-26
-14	-20	60	20	-30
-12	-10	63	21	-33
-16	-22	65	19	-33
-20	-25	61	22	-31
-12	-10	65	19	-36
-12	-8	63	21	-43
-10	-9	59	19	-40
-10	-5	56	21	-38
-13	-7	54	18	-41
-16	-11	56	18	-38
-14	-11	54	20	-40
-17	-16	58	19	-41
-24	-28	59	19	-45
-25	-27	60	17	-54
-23	-23	57	18	-47
-17	-10	54	17	-44
-24	-22	52	18	-47
-20	-15	50	19	-47
-19	-14	51	17	-45
-18	-12	47	19	-42
-16	-10	51	19	-42
-12	1	46	17	-39
-7	9	44	19	-35
-6	7	39	21	-29
-6	9	43	20	-37
-5	7	46	19	-35
-4	12	43	21	-32
-4	10	34	20	-33
-8	7	36	18	-37
-9	4	34	18	-36
-6	5	38	16	-34
-7	5	32	18	-38
-10	-1	38	19	-33
-11	-5	30	18	-41
-11	-6	17	18	-39
-12	-9	14	17	-40
-14	-15	18	18	-42
-12	-10	18	19	-45
-9	-5	13	18	-39
-5	2	9	19	-44
-6	-1	12	19	-44
-6	0	19	20	-43
-3	4	20	21	-39
-2	8	25	17	-38
-6	-1	26	20	-43
-6	-4	29	21	-46
-10	-10	28	18	-42
-8	-6	30	19	-45
-4	-2	38	20	-46
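
The series above contain 120 observations each; the first column is the endogenous series consumentenvertrouwen (par1 = 1) and the remaining columns are the regressors in the order in which they appear in the estimated equation below. Because four lags of the endogenous series are added as regressors (par4 = 4), the first four observations are lost when the lagged terms are constructed, leaving 116 observations for estimation; this matches the 107 = 116 - 9 residual degrees of freedom reported in the regression statistics.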




Summary of computational transaction
Raw Input        : view raw input (R code)
Raw Output       : view raw output of R engine
Computing time   : 5 seconds
R Server         : 'George Udny Yule' @ yule.wessa.net
Source: https://freestatistics.org/blog/index.php?pk=285321&T=0










Multiple Linear Regression - Estimated Regression Equation
consumentenvertrouwen[t] = -4.85582 + 0.326317 vooruitz_economie[t] + 0.06684 cons_prijzen_12m[t] + 0.0562382 fin_sit_gezinnen[t] + 0.00554493 gunstig_sparen[t] + 0.421374 `consumentenvertrouwen(t-1)`[t] + 0.0621938 `consumentenvertrouwen(t-2)`[t] + 0.0879151 `consumentenvertrouwen(t-3)`[t] + 0.130799 `consumentenvertrouwen(t-4)`[t] + e[t]
Source: https://freestatistics.org/blog/index.php?pk=285321&T=1
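
For reference, the fit can be reproduced with a plain lm() call. The sketch below is illustrative only: it assumes the five series of 'Dataseries X' are held in a data frame named df whose columns carry the variable names used in the equation, and the lag helper lagit is introduced here for the example (it is not part of the module's code).

lagit <- function(v, k) c(rep(NA, k), head(v, -k))  # shift a series down by k observations
df$lag1 <- lagit(df$consumentenvertrouwen, 1)
df$lag2 <- lagit(df$consumentenvertrouwen, 2)
df$lag3 <- lagit(df$consumentenvertrouwen, 3)
df$lag4 <- lagit(df$consumentenvertrouwen, 4)
fit <- lm(consumentenvertrouwen ~ vooruitz_economie + cons_prijzen_12m + fin_sit_gezinnen +
          gunstig_sparen + lag1 + lag2 + lag3 + lag4,
          data = na.omit(df))  # the first four rows are lost to the lags, leaving 116 observations
summary(fit)  # coefficients should agree with the OLS table below up to rounding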










Multiple Linear Regression - Ordinary Least Squares
Variable                       Parameter   S.D.      T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)                    -4.856      2.999     -1.619                       0.1084           0.05418
vooruitz_economie              +0.3263     0.03667   +8.897                       1.582e-14        7.908e-15
cons_prijzen_12m               +0.06684    0.01768   +3.781                       0.0002573        0.0001286
fin_sit_gezinnen               +0.05624    0.1365    +0.4121                      0.6811           0.3405
gunstig_sparen                 +0.005545   0.02975   +0.1864                      0.8525           0.4262
`consumentenvertrouwen(t-1)`   +0.4214     0.08091   +5.208                       9.313e-07        4.656e-07
`consumentenvertrouwen(t-2)`   +0.06219    0.09026   +0.689                       0.4923           0.2461
`consumentenvertrouwen(t-3)`   +0.08791    0.09095   +0.9666                      0.3359           0.168
`consumentenvertrouwen(t-4)`   +0.1308     0.07687   +1.702                       0.09172          0.04586
Source: https://freestatistics.org/blog/index.php?pk=285321&T=2
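
The T-STAT column is the estimated parameter divided by its standard deviation, and the 1-tail p-value is simply half the 2-tail value. A quick check against the table, using the 107 residual degrees of freedom reported further down:

2 * pt(8.897, df = 107, lower.tail = FALSE)  # 2-tail p-value for vooruitz_economie, ~1.6e-14 (table: 1.582e-14)
pt(8.897, df = 107, lower.tail = FALSE)      # 1-tail p-value, half the 2-tail value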










Multiple Linear Regression - Regression Statistics
Multiple R                     0.9467
R-squared                      0.8962
Adjusted R-squared             0.8884
F-TEST (value)                 115.4
F-TEST (DF numerator)          8
F-TEST (DF denominator)        107
p-value                        0

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation    2.348
Sum Squared Residuals          589.9
Source: https://freestatistics.org/blog/index.php?pk=285321&T=3
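
These figures are mutually consistent: with n = 116 usable observations and k = 9 estimated coefficients (the intercept plus eight regressors), the adjusted R-squared and the residual standard deviation follow directly from the R-squared and the sum of squared residuals:

1 - (1 - 0.8962) * (116 - 1) / (116 - 9)  # adjusted R-squared, ~0.8884
sqrt(589.9 / (116 - 9))                   # residual standard deviation, ~2.348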











\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & -4 & -5.684 &  1.684 \tabularnewline
2 & -6 & -4.965 & -1.035 \tabularnewline
3 & -2 & -4.961 &  2.961 \tabularnewline
4 & -2 & -2.46 &  0.4597 \tabularnewline
5 & -2 & -3.794 &  1.794 \tabularnewline
6 & -2 & -3.13 &  1.13 \tabularnewline
7 &  2 & -0.6877 &  2.688 \tabularnewline
8 &  1 &  0.9784 &  0.02155 \tabularnewline
9 & -8 & -2.409 & -5.591 \tabularnewline
10 & -1 & -4.038 &  3.038 \tabularnewline
11 &  1 & -0.9135 &  1.913 \tabularnewline
12 & -1 & -1.313 &  0.3129 \tabularnewline
13 &  2 & -0.4052 &  2.405 \tabularnewline
14 &  2 &  1.151 &  0.8486 \tabularnewline
15 &  1 &  1.689 & -0.6887 \tabularnewline
16 & -1 &  0.7138 & -1.714 \tabularnewline
17 & -2 & -2.76 &  0.76 \tabularnewline
18 & -2 & -3.478 &  1.478 \tabularnewline
19 & -1 & -3.459 &  2.459 \tabularnewline
20 & -8 & -7.051 & -0.9488 \tabularnewline
21 & -4 & -8.051 &  4.051 \tabularnewline
22 & -6 & -7.265 &  1.265 \tabularnewline
23 & -3 & -7.093 &  4.093 \tabularnewline
24 & -3 & -5.615 &  2.615 \tabularnewline
25 & -7 & -6.718 & -0.2815 \tabularnewline
26 & -9 & -10.4 &  1.404 \tabularnewline
27 & -11 & -12.95 &  1.946 \tabularnewline
28 & -13 & -15.13 &  2.133 \tabularnewline
29 & -11 & -13.43 &  2.434 \tabularnewline
30 & -9 & -12.81 &  3.812 \tabularnewline
31 & -17 & -13.56 & -3.442 \tabularnewline
32 & -22 & -17.22 & -4.781 \tabularnewline
33 & -25 & -20.37 & -4.629 \tabularnewline
34 & -20 & -20.46 &  0.4633 \tabularnewline
35 & -24 & -22.46 & -1.544 \tabularnewline
36 & -24 & -24.88 &  0.8845 \tabularnewline
37 & -22 & -21.51 & -0.4923 \tabularnewline
38 & -19 & -19.43 &  0.4334 \tabularnewline
39 & -18 & -18.25 &  0.2487 \tabularnewline
40 & -17 & -17.64 &  0.641 \tabularnewline
41 & -11 & -12.66 &  1.662 \tabularnewline
42 & -11 & -10.47 & -0.5308 \tabularnewline
43 & -12 & -10.91 & -1.093 \tabularnewline
44 & -10 & -8.047 & -1.953 \tabularnewline
45 & -15 & -10.23 & -4.767 \tabularnewline
46 & -15 & -12.5 & -2.496 \tabularnewline
47 & -15 & -12.3 & -2.7 \tabularnewline
48 & -13 & -10.87 & -2.13 \tabularnewline
49 & -8 & -8.558 &  0.5577 \tabularnewline
50 & -13 & -10.75 & -2.251 \tabularnewline
51 & -9 & -11.72 &  2.715 \tabularnewline
52 & -7 & -6.925 & -0.0751 \tabularnewline
53 & -4 & -4.35 &  0.3505 \tabularnewline
54 & -4 & -4.447 &  0.447 \tabularnewline
55 & -2 & -3.203 &  1.203 \tabularnewline
56 &  0 &  0.1034 & -0.1034 \tabularnewline
57 & -2 & -0.4705 & -1.529 \tabularnewline
58 & -3 & -2.832 & -0.1678 \tabularnewline
59 &  1 & -0.3323 &  1.332 \tabularnewline
60 & -2 & -0.8041 & -1.196 \tabularnewline
61 & -1 & -1.088 &  0.08765 \tabularnewline
62 &  1 &  1.449 & -0.4495 \tabularnewline
63 & -3 &  0.5374 & -3.537 \tabularnewline
64 & -4 & -2.951 & -1.049 \tabularnewline
65 & -9 & -5.761 & -3.239 \tabularnewline
66 & -9 & -9.29 &  0.29 \tabularnewline
67 & -7 & -7.66 &  0.6596 \tabularnewline
68 & -14 & -11.24 & -2.763 \tabularnewline
69 & -12 & -11.21 & -0.7869 \tabularnewline
70 & -16 & -14.52 & -1.476 \tabularnewline
71 & -20 & -17.51 & -2.494 \tabularnewline
72 & -12 & -15.21 &  3.214 \tabularnewline
73 & -12 & -11.59 & -0.4105 \tabularnewline
74 & -10 & -12.66 &  2.656 \tabularnewline
75 & -10 & -10.41 &  0.4051 \tabularnewline
76 & -13 & -10.21 & -2.794 \tabularnewline
77 & -16 & -12.45 & -3.551 \tabularnewline
78 & -14 & -13.67 & -0.3294 \tabularnewline
79 & -17 & -14.7 & -2.296 \tabularnewline
80 & -24 & -20.37 & -3.629 \tabularnewline
81 & -25 & -23.49 & -1.507 \tabularnewline
82 & -23 & -23.15 &  0.1523 \tabularnewline
83 & -17 & -19.38 &  2.378 \tabularnewline
84 & -24 & -21.74 & -2.262 \tabularnewline
85 & -20 & -22.06 &  2.063 \tabularnewline
86 & -19 & -19.73 &  0.732 \tabularnewline
87 & -18 & -18.38 &  0.378 \tabularnewline
88 & -16 & -17.54 &  1.538 \tabularnewline
89 & -12 & -12.86 &  0.8629 \tabularnewline
90 & -7 & -8.223 &  1.223 \tabularnewline
91 & -6 & -6.402 &  0.4016 \tabularnewline
92 & -6 & -4.237 & -1.763 \tabularnewline
93 & -5 & -3.709 & -1.291 \tabularnewline
94 & -4 & -0.9855 & -3.015 \tabularnewline
95 & -4 & -1.687 & -2.313 \tabularnewline
96 & -8 & -2.517 & -5.483 \tabularnewline
97 & -9 & -5.091 & -3.909 \tabularnewline
98 & -6 & -5.138 & -0.8622 \tabularnewline
99 & -7 & -4.598 & -2.402 \tabularnewline
100 & -10 & -6.917 & -3.083 \tabularnewline
101 & -11 & -10.05 & -0.949 \tabularnewline
102 & -11 & -11.54 &  0.5387 \tabularnewline
103 & -12 & -13.24 &  1.237 \tabularnewline
104 & -14 & -15.78 &  1.784 \tabularnewline
105 & -12 & -15.15 &  3.148 \tabularnewline
106 & -9 & -13.24 &  4.243 \tabularnewline
107 & -5 & -10.12 &  5.116 \tabularnewline
108 & -6 & -9.108 &  3.108 \tabularnewline
109 & -6 & -7.9 &  1.9 \tabularnewline
110 & -3 & -5.767 &  2.767 \tabularnewline
111 & -2 & -2.648 &  0.6477 \tabularnewline
112 & -6 & -4.9 & -1.1 \tabularnewline
113 & -6 & -6.998 &  0.998 \tabularnewline
114 & -10 & -8.938 & -1.062 \tabularnewline
115 & -8 & -9.366 &  1.366 \tabularnewline
116 & -4 & -7.404 &  3.404 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=285321&T=4
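
The 'Interpolation (Forecast)' column contains the in-sample fitted values, so every row satisfies Actuals = Interpolation + Residual; for example, in the first row -5.684 + 1.684 = -4. In the R code at the bottom of this page the fitted values are obtained as x[i] - mysum$resid[i].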









Goldfeld-Quandt test for Heteroskedasticity
p-values by alternative hypothesis
breakpoint index   greater   2-sided   less
12 0.545 0.9101 0.455
13 0.4116 0.8231 0.5884
14 0.3433 0.6866 0.6567
15 0.2337 0.4674 0.7663
16 0.1503 0.3005 0.8497
17 0.1866 0.3733 0.8134
18 0.1373 0.2746 0.8627
19 0.09547 0.1909 0.9045
20 0.06026 0.1205 0.9397
21 0.05473 0.1095 0.9453
22 0.0443 0.08861 0.9557
23 0.03546 0.07091 0.9645
24 0.02571 0.05142 0.9743
25 0.02106 0.04211 0.9789
26 0.01423 0.02846 0.9858
27 0.009692 0.01938 0.9903
28 0.009056 0.01811 0.9909
29 0.03424 0.06848 0.9658
30 0.04045 0.08091 0.9595
31 0.07706 0.1541 0.9229
32 0.3155 0.631 0.6845
33 0.5734 0.8531 0.4266
34 0.5267 0.9466 0.4733
35 0.69 0.6199 0.31
36 0.8325 0.3349 0.1675
37 0.7966 0.4067 0.2034
38 0.7508 0.4984 0.2492
39 0.7132 0.5736 0.2868
40 0.705 0.59 0.295
41 0.6588 0.6825 0.3412
42 0.6038 0.7924 0.3962
43 0.5518 0.8965 0.4482
44 0.55 0.9001 0.45
45 0.6436 0.7128 0.3564
46 0.659 0.6819 0.341
47 0.7038 0.5924 0.2962
48 0.7838 0.4325 0.2162
49 0.7493 0.5014 0.2507
50 0.7524 0.4952 0.2476
51 0.7479 0.5042 0.2521
52 0.7127 0.5746 0.2873
53 0.6721 0.6558 0.3279
54 0.6185 0.7631 0.3815
55 0.5767 0.8465 0.4233
56 0.5676 0.8648 0.4324
57 0.5369 0.9262 0.4631
58 0.4944 0.9888 0.5056
59 0.4857 0.9713 0.5143
60 0.5086 0.9828 0.4914
61 0.6196 0.7608 0.3804
62 0.726 0.5479 0.274
63 0.7735 0.453 0.2265
64 0.8388 0.3224 0.1612
65 0.8499 0.3003 0.1501
66 0.8851 0.2298 0.1149
67 0.873 0.2539 0.127
68 0.8723 0.2555 0.1277
69 0.8685 0.263 0.1315
70 0.8847 0.2305 0.1153
71 0.8701 0.2599 0.1299
72 0.9607 0.07868 0.03934
73 0.9467 0.1066 0.05332
74 0.9985 0.002911 0.001455
75 0.9986 0.002809 0.001405
76 0.9983 0.003373 0.001686
77 0.9981 0.003736 0.001868
78 0.9973 0.005489 0.002745
79 0.996 0.007995 0.003997
80 0.9948 0.01041 0.005203
81 0.992 0.01593 0.007963
82 0.9898 0.02033 0.01017
83 0.9864 0.02727 0.01363
84 0.9888 0.02239 0.01119
85 0.9842 0.03152 0.01576
86 0.9935 0.01297 0.006487
87 0.9894 0.02112 0.01056
88 0.9853 0.02936 0.01468
89 0.9784 0.04313 0.02157
90 0.9675 0.06508 0.03254
91 0.9574 0.08511 0.04256
92 0.9404 0.1193 0.05964
93 0.9301 0.1398 0.06989
94 0.9088 0.1824 0.09118
95 0.8641 0.2718 0.1359
96 0.9448 0.1103 0.05517
97 0.9585 0.083 0.0415
98 0.9447 0.1106 0.05528
99 0.9926 0.01486 0.007429
100 0.9847 0.03054 0.01527
101 0.9888 0.02238 0.01119
102 0.9924 0.01514 0.007572
103 0.9794 0.0411 0.02055
104 0.9543 0.09134 0.04567

Source: https://freestatistics.org/blog/index.php?pk=285321&T=5









Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     6                     0.06452               NOK
5% type I error level     25                    0.268817              NOK
10% type I error level    35                    0.376344              NOK
Source: https://freestatistics.org/blog/index.php?pk=285321&T=6
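
The '% significant tests' column is the count of significant 2-sided tests divided by the number of Goldfeld-Quandt tests performed, one for each breakpoint from 12 through 104 (93 tests in total):

c(6, 25, 35) / 93  # 0.06452 0.26882 0.37634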






Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 4 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 4 ; par5 = 0 ;
R code (references can be found in the software module). In this module par1 selects the column of the endogenous series, par2 controls the seasonal dummies, par3 the trend/differencing option, par4 the number of lags of the endogenous variable, and par5 the number of seasonal (s = 12) lags:
par5 <- '0'
par4 <- '4'  # number of lags of the endogenous series used in this computation (see Parameters above)
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '1'  # column number of the endogenous series
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y))  # y holds the data series supplied by the server (one series per row); drop incomplete rows
k <- length(x[1,])  # number of series (columns)
n <- length(x[,1])  # number of observations (rows)
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {  # append par4 lagged copies of the endogenous series as additional regressors
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))  # OLS fit: the first column (the endogenous series) is regressed on all remaining columns
(mysum <- summary(mylm))
if (n > n25) {  # Goldfeld-Quandt tests at every admissible breakpoint (only when there are enough observations)
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))  # one row per breakpoint, one column per alternative hypothesis
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
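
Note that the script expects the data to be injected by the R server as y (one series per row) and the table.* helpers to be loaded from the server-side file 'createtable', so it does not run unmodified on a local machine. A minimal sketch of how y could be supplied locally, assuming the 'Dataseries X' block above is saved tab-separated as data.txt with its columns in the order used in the regression equation (the file name and the column order are assumptions made for this example):

dat <- read.table('data.txt', header = FALSE, sep = '\t',
                  col.names = c('consumentenvertrouwen', 'vooruitz_economie',
                                'cons_prijzen_12m', 'fin_sit_gezinnen', 'gunstig_sparen'))
y <- t(dat)  # the script transposes with t(y), so the series must be stored in rows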