Author: verified (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 25 Nov 2013 14:41:59 -0500
Cite this page as follows:
- Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2013/Nov/25/t1385408576f6xjp9yxthbw1pr.htm/, Retrieved Mon, 29 Apr 2024 19:42:35 +0000
- Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=228410, Retrieved Mon, 29 Apr 2024 19:42:35 +0000
IsPrivate? No (this computation is public)
Estimated Impact: 85
Family (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data):
- [Multiple Regression] [HPC Retail Sales] [2008-03-08 13:40:54] [1c0f2c85e8a48e42648374b3bcceca26]
- RMPD [Multiple Regression] [WS8: Multi Regres...] [2013-11-25 19:41:59] [0d4b5c001fcd12491258e86d922016e4] [Current]

Dataseries X (columns: HPC, Dummies1, Dummies2):
13328	0	0
12873	0	0
14000	0	0
13477	0	0
14237	0	0
13674	0	0
13529	0	0
14058	0	0
12975	0	0
14326	0	0
14008	0	0
16193	1	0
14483	0	0
14011	0	0
15057	0	0
14884	0	0
15414	0	0
14440	0	0
14900	0	0
15074	0	0
14442	0	0
15307	0	0
14938	0	0
17193	0	0
15528	0	0
14765	0	0
15838	0	0
15723	0	0
16150	0	0
15486	0	0
15986	0	1
15983	0	1
15692	0	1
16490	0	1
15686	0	1
18897	0	1
16316	0	1
15636	0	1
17163	0	1
16534	0	1
16518	0	1
16375	0	1
16290	0	1
16352	0	1
15943	0	1
16362	0	1
16393	0	1
19051	0	1
16747	0	1
16320	0	1
17910	0	1
16961	0	1
17480	0	1
17049	0	1
16879	0	1
17473	0	1
16998	0	1
17307	0	1
17418	0	1
20169	0	1
17871	0	1
17226	0	1
19062	0	1
17804	0	1
19100	0	1
18522	0	1
18060	0	1
18869	0	1
18127	0	1
18871	0	1
18890	0	1
21263	0	1
19547	0	1
18450	0	1
20254	0	1
19240	0	1
20216	0	1
19420	0	1
19415	0	1
20018	0	1
18652	0	1
19978	0	1
19509	0	1
21971	0	1




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 10 seconds
R Server: 'Herman Ole Andreas Wold' @ wold.wessa.net
Source: https://freestatistics.org/blog/index.php?pk=228410&T=0


Multiple Linear Regression - Estimated Regression Equation
HPC[t] = 15561.9 - 338.956 Dummies1[t] - 203.848 Dummies2[t] - 2176.45 M1[t] - 2905.72 M2[t] - 1557.56 M3[t] - 2304.26 M4[t] - 1743.38 M5[t] - 2416.94 M6[t] - 2455.37 M7[t] - 2140.78 M8[t] - 2935.62 M9[t] - 2186.17 M10[t] - 2524.01 M11[t] + 80.8393 t + e[t]
Source: https://freestatistics.org/blog/index.php?pk=228410&T=1
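
As a quick consistency check, evaluating this equation at t = 1 (an M1 observation with Dummies1 = Dummies2 = 0) reproduces the first interpolated value reported in the Actuals table further down:

\[ \widehat{\mathrm{HPC}}[1] = 15561.9 - 2176.45 + 80.8393 \times 1 \approx 13466.3 . \]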


Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	15561.9	140.078	111.1	1.53514e-79	7.6757e-80
Dummies1	-338.956	321.595	-1.054	0.295568	0.147784
Dummies2	-203.848	120.72	-1.689	0.0958113	0.0479056
M1	-2176.45	164.4	-13.24	1.28748e-20	6.4374e-21
M2	-2905.72	164.208	-17.7	2.38199e-27	1.191e-27
M3	-1557.56	164.049	-9.494	3.78664e-14	1.89332e-14
M4	-2304.26	163.926	-14.06	6.20127e-22	3.10063e-22
M5	-1743.38	163.837	-10.64	3.38444e-16	1.69222e-16
M6	-2416.94	163.782	-14.76	4.94534e-23	2.47267e-23
M7	-2455.37	163.614	-15.01	2.03532e-23	1.01766e-23
M8	-2140.78	163.422	-13.1	2.17457e-20	1.08728e-20
M9	-2935.62	163.265	-17.98	9.59025e-28	4.79513e-28
M10	-2186.17	163.142	-13.4	7.02111e-21	3.51055e-21
M11	-2524.01	163.053	-15.48	3.88437e-24	1.94219e-24
t	80.8393	2.38284	33.93	9.01111e-45	4.50555e-45
Source: https://freestatistics.org/blog/index.php?pk=228410&T=2
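
This table is filled from the coefficient matrix returned by summary() of the fitted model (see the R code at the bottom of the page). A condensed sketch of that step, reusing the mylm object defined there, would be:

tab <- summary(mylm)$coefficients          # columns: Estimate, Std. Error, t value, Pr(>|t|)
cbind(Parameter  = signif(tab[, 1], 6),
      S.D.       = signif(tab[, 2], 6),
      `T-STAT`   = signif(tab[, 3], 4),
      `2-tail p` = signif(tab[, 4], 6),
      `1-tail p` = signif(tab[, 4] / 2, 6)) # the 1-tail column is half the 2-tail p-value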


Multiple Linear Regression - Regression Statistics
Multiple R: 0.991758
R-squared: 0.983585
Adjusted R-squared: 0.980254
F-TEST (value): 295.317
F-TEST (DF numerator): 14
F-TEST (DF denominator): 69
p-value: 0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 292.582
Sum Squared Residuals: 5906690
Source: https://freestatistics.org/blog/index.php?pk=228410&T=3
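
Two standard least-squares identities tie these figures together: the sum of squared residuals equals the squared residual standard deviation times the denominator degrees of freedom, and the adjusted R-squared follows from R-squared with n = 84 observations and 14 regressors plus an intercept:

\[ 292.582^2 \times 69 \approx 5.9067 \times 10^{6} \approx \text{Sum Squared Residuals}, \qquad
\bar{R}^2 = 1 - (1 - 0.983585)\,\frac{84 - 1}{84 - 15} = 0.980254 . \]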


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	13328	13466.3	-138.269
2	12873	12817.8	55.1592
3	14000	14246.8	-246.841
4	13477	13581	-103.984
5	14237	14222.7	14.302
6	13674	13630	44.0163
7	13529	13672.4	-143.39
8	14058	14067.8	-9.81904
9	12975	13353.8	-378.819
10	14326	14184.1	141.895
11	14008	13927.1	80.8952
12	16193	16193	4.51635e-15
13	14483	14436.3	46.659
14	14011	13787.9	223.088
15	15057	15216.9	-159.912
16	14884	14551.1	332.945
17	15414	15192.8	221.23
18	14440	14600.1	-160.055
19	14900	14642.5	257.538
20	15074	15037.9	36.1093
21	14442	14323.9	118.109
22	15307	15154.2	152.824
23	14938	14897.2	40.8236
24	17193	17502	-309.027
25	15528	15406.4	121.587
26	14765	14758	7.0159
27	15838	16187	-348.984
28	15723	15521.1	201.873
29	16150	16162.8	-12.8412
30	15486	15570.1	-84.127
31	15986	15408.7	577.314
32	15983	15804.1	178.885
33	15692	15090.1	601.885
34	16490	15920.4	569.6
35	15686	15663.4	22.5995
36	18897	18268.3	628.749
37	16316	16172.6	143.363
38	15636	15524.2	111.792
39	17163	16953.2	209.792
40	16534	16287.4	246.649
41	16518	16929.1	-411.065
42	16375	16336.4	38.6489
43	16290	16378.8	-88.7578
44	16352	16774.2	-422.186
45	15943	16060.2	-117.186
46	16362	16890.5	-528.472
47	16393	16633.5	-240.472
48	19051	19238.3	-187.323
49	16747	17142.7	-395.708
50	16320	16494.3	-174.28
51	17910	17923.3	-13.2798
52	16961	17257.4	-296.423
53	17480	17899.1	-419.137
54	17049	17306.4	-257.423
55	16879	17348.8	-469.829
56	17473	17744.3	-271.258
57	16998	17030.3	-32.2581
58	17307	17860.5	-553.544
59	17418	17603.5	-185.544
60	20169	20208.4	-39.3946
61	17871	18112.8	-241.78
62	17226	17464.4	-238.351
63	19062	18893.4	168.649
64	17804	18227.5	-423.494
65	19100	18869.2	230.791
66	18522	18276.5	245.506
67	18060	18318.9	-258.901
68	18869	18714.3	154.67
69	18127	18000.3	126.67
70	18871	18830.6	40.3846
71	18890	18573.6	316.385
72	21263	21178.5	84.5338
73	19547	19082.9	464.148
74	18450	18434.4	15.5769
75	20254	19863.4	390.577
76	19240	19197.6	42.434
77	20216	19839.3	376.72
78	19420	19246.6	173.434
79	19415	19289	126.027
80	20018	19684.4	333.599
81	18652	18970.4	-318.401
82	19978	19800.7	177.313
83	19509	19543.7	-34.6871
84	21971	22148.5	-177.538
Source: https://freestatistics.org/blog/index.php?pk=228410&T=4
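
Each residual (prediction error) is simply the actual value minus the interpolated (fitted) value; for the first observation, for example,

\[ e[1] = 13328 - 13466.3 \approx -138.3 , \]

which matches the reported -138.269 up to the rounding of the displayed fitted value.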


Goldfeld-Quandt test for Heteroskedasticity
p-values by alternative hypothesis
breakpoint index	greater	2-sided	less
18	0.211261	0.422521	0.788739
19	0.142105	0.28421	0.857895
20	0.074135	0.14827	0.925865
21	0.0652703	0.130541	0.93473
22	0.039146	0.078292	0.960854
23	0.0248024	0.0496048	0.975198
24	0.0118815	0.0237629	0.988119
25	0.00546383	0.0109277	0.994536
26	0.0071898	0.0143796	0.99281
27	0.00827501	0.01655	0.991725
28	0.00416847	0.00833694	0.995832
29	0.00310378	0.00620755	0.996896
30	0.0015533	0.00310661	0.998447
31	0.00125845	0.0025169	0.998742
32	0.00112253	0.00224505	0.998877
33	0.0025907	0.0051814	0.997409
34	0.00442093	0.00884186	0.995579
35	0.0101013	0.0202027	0.989899
36	0.0706692	0.141338	0.929331
37	0.0869521	0.173904	0.913048
38	0.131716	0.263433	0.868284
39	0.110389	0.220777	0.889611
40	0.23966	0.479321	0.76034
41	0.527144	0.945712	0.472856
42	0.500761	0.998478	0.499239
43	0.633142	0.733715	0.366858
44	0.714656	0.570688	0.285344
45	0.725795	0.54841	0.274205
46	0.846956	0.306088	0.153044
47	0.815153	0.369694	0.184847
48	0.801888	0.396224	0.198112
49	0.792846	0.414309	0.207154
50	0.759352	0.481297	0.240648
51	0.698831	0.602337	0.301169
52	0.675787	0.648426	0.324213
53	0.711344	0.577312	0.288656
54	0.658196	0.683609	0.341804
55	0.625688	0.748625	0.374312
56	0.591418	0.817165	0.408582
57	0.554003	0.891994	0.445997
58	0.639226	0.721549	0.360774
59	0.564964	0.870071	0.435036
60	0.482561	0.965121	0.517439
61	0.624201	0.751599	0.375799
62	0.539578	0.920844	0.460422
63	0.486098	0.972196	0.513902
64	0.545321	0.909357	0.454679
65	0.468874	0.937749	0.531126
66	0.333802	0.667604	0.666198
Source: https://freestatistics.org/blog/index.php?pk=228410&T=5
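
Each row of this table is one Goldfeld-Quandt test on the fitted model, computed with lmtest::gqtest() in the loop shown in the R code at the bottom of the page. For instance, the 2-sided entry at breakpoint 42 corresponds to a call of the form:

gqtest(mylm, point = 42, alternative = 'two.sided')$p.value   # 0.998478 in the table above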


Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description	# significant tests	% significant tests	OK/NOK
1% type I error level	7	0.142857	NOK
5% type I error level	13	0.265306	NOK
10% type I error level	14	0.285714	NOK
Source: https://freestatistics.org/blog/index.php?pk=228410&T=6
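
The second column divides the count of significant tests by the 49 Goldfeld-Quandt tests performed (breakpoints 18 through 66); despite the "%" label, the values are reported as fractions:

\[ \frac{7}{49} = 0.142857, \qquad \frac{13}{49} = 0.265306, \qquad \frac{14}{49} = 0.285714 . \]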


Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
par3 <- 'Linear Trend'
par2 <- 'Include Monthly Dummies'
par1 <- '1'
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
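# y holds the 'Dataseries X' values entered above (supplied by the module before this
# code runs, one row per variable); transposing puts the observations in rows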
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
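# Optionally replace every series by its first differences (not used in this run,
# where par3 = 'Linear Trend')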
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
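# Add 11 monthly dummy variables M1..M11; the 12th month of each year serves as the
# reference category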
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
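# Append a linear trend column t = 1, 2, ..., n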
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
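# Fit the OLS regression: the first column of df (here HPC) is regressed on all
# remaining columns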
(mylm <- lm(df))
(mysum <- summary(mylm))
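# Goldfeld-Quandt tests at every admissible breakpoint, for the three alternative
# hypotheses reported in the table above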
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
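# Diagnostic plots (actuals and interpolation, residuals, histogram, density, Q-Q plot,
# lag plot, ACF/PACF, lm diagnostics, GQ p-values) are written to test0.png ... test9.png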
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
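# The remainder of the script assembles the tables shown above with the module's
# table helper functions (table.start, table.element, table.save, ...)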
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}