Statistical Computations at FreeStatistics.org

Author: verified (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 23 Jan 2017 09:32:58 +0100
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2017/Jan/23/t1485160389ztgm1f29bi4k8ak.htm/, Retrieved Wed, 15 May 2024 10:52:48 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=303905, Retrieved Wed, 15 May 2024 10:52:48 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 74
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [] [2017-01-23 08:32:58] [863feeaf19a0ddfce7bd9c25059c4d8a] [Current]
Dataseries X:
22 13 22 4 2 4
24 16 24 5 3 3
21 17 26 4 4 5
21 NA 21 3 4 3
24 NA 26 4 4 5
20 16 25 3 4 4
22 NA 21 3 4 4
20 NA 24 3 4 5
19 NA 27 4 5 4
23 17 28 4 5 5
21 17 23 4 4 2
19 15 25 4 4 5
19 16 24 4 4 4
21 14 24 3 3 5
21 16 24 4 4 5
22 17 25 3 4 5
22 NA 25 3 4 5
19 NA NA NA NA 5
21 NA 25 5 5 4
21 NA 25 4 4 4
21 16 24 3 4 5
20 NA 26 4 4 4
22 16 26 4 4 5
22 NA 25 4 4 5
24 NA 26 4 4 5
21 NA 23 3 4 4
19 16 24 3 4 4
19 15 24 4 4 4
23 16 25 2 4 5
21 16 25 5 4 4
21 13 24 4 3 5
19 15 28 4 5 5
21 17 27 5 4 5
19 NA NA 4 3 5
21 13 23 2 3 5
21 17 23 4 5 2
23 NA 24 3 4 5
19 14 24 4 3 5
19 14 22 4 3 3
19 18 25 4 4 5
18 NA 25 5 4 4
22 17 28 4 5 5
18 13 22 3 3 4
22 16 28 5 5 5
18 15 25 5 4 5
22 15 24 4 4 4
22 NA 24 4 4 4
19 15 23 3 5 5
22 13 25 4 4 4
25 NA NA 2 3 4
19 17 26 4 5 5
19 NA 25 5 5 2
19 NA 27 5 5 5
19 11 26 4 3 5
21 14 23 4 3 4
21 13 25 4 4 5
20 NA 21 3 4 4
19 17 22 3 4 4
19 16 24 4 4 4
22 NA 25 4 4 4
26 17 27 5 5 3
19 16 24 2 4 4
21 16 26 4 4 4
21 16 21 3 4 4
20 15 27 4 4 5
23 12 22 4 2 4
22 17 23 4 4 4
22 14 24 4 4 4
22 14 25 5 4 5
21 16 24 3 4 4
21 NA 23 3 4 4
22 NA 28 4 5 5
23 NA NA 4 4 3
18 NA 24 4 4 4
24 NA 26 4 4 4
22 15 22 3 4 3
21 16 25 4 4 4
21 14 25 3 4 5
21 15 24 3 3 5
23 17 24 4 3 5
21 NA 26 4 4 5
23 10 21 3 3 3
21 NA 25 4 4 4
19 17 25 4 4 3
21 NA 26 4 4 4
21 20 25 5 4 4
21 17 26 5 4 3
23 18 27 4 4 5
23 NA 25 3 4 5
20 17 NA 3 NA 4
20 14 20 4 2 3
19 NA 24 4 4 5
23 17 26 4 4 5
22 NA 25 4 4 4
19 17 25 4 5 4
23 NA 24 3 4 4
22 16 26 4 4 5
22 18 25 5 4 3
21 18 28 5 4 5
21 16 27 4 5 4
21 NA 25 3 4 5
21 NA 26 5 3 4
22 15 26 4 4 5
25 13 26 5 4 4
21 NA NA 3 4 4
23 NA 28 5 4 4
19 NA NA 4 4 5
22 NA 21 4 4 3
20 NA 25 4 4 5
21 16 25 4 4 5
25 NA 24 3 4 5
21 NA 24 4 4 4
19 NA 24 4 4 4
23 12 23 3 3 4
22 NA 23 4 4 4
21 16 24 3 4 5
24 16 24 4 4 5
21 NA 25 5 4 5
19 16 28 5 4 5
18 14 23 4 4 4
19 15 24 4 4 5
20 14 23 3 4 4
19 NA 24 4 4 4
22 15 25 4 4 4
21 NA 24 4 5 3
22 15 23 3 4 4
24 16 23 4 4 4
28 NA 25 4 4 4
19 NA 21 3 4 3
18 NA 22 4 4 4
23 11 19 3 2 4
19 NA 24 4 4 4
23 18 25 5 4 4
19 NA 21 2 4 4
22 11 22 3 3 4
21 NA 23 4 4 4
19 18 27 5 5 4
22 NA NA NA NA 2
21 15 26 4 5 5
23 19 29 5 5 5
22 17 28 4 5 5
19 NA 24 4 4 4
19 14 25 3 4 5
21 NA 25 4 4 5
22 13 22 4 4 2
21 17 25 4 4 3
20 14 26 4 4 4
23 19 26 5 4 5
22 14 24 4 3 5
23 NA 25 4 4 5
22 NA 19 3 3 2
21 16 25 4 5 5
20 16 23 4 4 4
18 15 25 4 4 4
18 12 25 3 4 5
20 NA 26 4 4 5
19 17 27 5 4 5
21 NA 24 4 4 5
24 NA 22 2 3 5
19 18 25 4 4 4
20 15 24 4 3 4
19 18 23 4 4 4
23 15 27 4 5 5
22 NA 24 5 4 3
21 NA 24 5 4 4
24 NA 21 3 3 1
21 16 25 4 4 4
21 NA 25 4 4 4
22 16 23 2 3 4
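
Note: the series above contains many missing values (NA). The software applies listwise deletion before fitting, which is why the regression output below is based on 102 complete observations (F-TEST denominator DF = 96 = 102 - 6 estimated parameters). A minimal sketch of that preprocessing step, assuming the six columns are named as in the estimated equation below and that the pasted block is held in a character string called dataseries_text (a hypothetical name used only for illustration):

raw <- read.table(text = dataseries_text, na.strings = 'NA',
                  col.names = c('Bevr_Leeftijd', 'TVDC', 'SKEOUSUM',
                                'SKEOU1', 'SKEOU2', 'SKEOU3'))
df <- na.omit(raw)   # listwise deletion: keep only rows without any NA
nrow(df)             # 102 complete cases enter the regression below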




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 7 seconds
R Server: Big Analytics Cloud Computing Center

Source: https://freestatistics.org/blog/index.php?pk=303905&T=0







Multiple Linear Regression - Estimated Regression Equation
Bevr_Leeftijd[t] = 18.871 - 0.000499896 TVDC[t] + 0.184702 SKEOUSUM[t] + 0.146966 SKEOU1[t] - 0.44122 SKEOU2[t] - 0.298817 SKEOU3[t] + e[t]
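
A minimal sketch that reproduces this fit, assuming df is the complete-case data frame from the sketch above (the actual module builds its model matrix internally; see the R code at the bottom of this page):

mylm <- lm(Bevr_Leeftijd ~ TVDC + SKEOUSUM + SKEOU1 + SKEOU2 + SKEOU3, data = df)
coef(mylm)   # should return 18.871, -0.000499896, 0.184702, 0.146966, -0.44122, -0.298817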

Source: https://freestatistics.org/blog/index.php?pk=303905&T=1







Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter    S.D.     T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +18.87       2.464    +7.6590e+00                  1.489e-11        7.446e-12
TVDC          -0.0004999   0.1124   -4.4470e-03                  0.9965           0.4982
SKEOUSUM      +0.1847      0.1643   +1.1240e+00                  0.2638           0.1319
SKEOU1        +0.147       0.2921   +5.0310e-01                  0.6161           0.308
SKEOU2        -0.4412      0.3549   -1.2430e+00                  0.2169           0.1084
SKEOU3        -0.2988      0.2767   -1.0800e+00                  0.283            0.1415
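
The columns above are the usual OLS summary output; the 1-tail p-value is simply the 2-tail p-value halved, exactly as in the module's R code. A short sketch, assuming mylm from above:

mysum <- summary(mylm)
mysum$coefficients            # Estimate, Std. Error, t value, Pr(>|t|)
mysum$coefficients[, 4] / 2   # the 1-tail p-values reported in the last column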

Source: https://freestatistics.org/blog/index.php?pk=303905&T=2







Multiple Linear Regression - Regression Statistics
Multiple R:                   0.1972
R-squared:                    0.03889
Adjusted R-squared:          -0.01117
F-TEST (value):               0.7769
F-TEST (DF numerator):        5
F-TEST (DF denominator):      96
p-value:                      0.5688

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation:  1.689
Sum Squared Residuals:        273.8
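
These statistics all come from the fitted-model summary; the overall p-value is the upper tail of the F distribution at the reported F value and degrees of freedom, as computed in the module's code. A short sketch, assuming mysum and mylm from above:

sqrt(mysum$r.squared)         # Multiple R         = 0.1972
mysum$r.squared               # R-squared          = 0.03889
mysum$adj.r.squared           # Adjusted R-squared = -0.01117
mysum$fstatistic              # F = 0.7769 on 5 and 96 DF
1 - pf(mysum$fstatistic[1], mysum$fstatistic[2], mysum$fstatistic[3])   # p-value = 0.5688
mysum$sigma                   # Residual Standard Deviation = 1.689
sum(residuals(mylm)^2)        # Sum Squared Residuals       = 273.8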

Source: https://freestatistics.org/blog/index.php?pk=303905&T=3







Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 22 21.44 0.5619
2 24 21.81 2.189
3 21 20.99 0.00631
4 20 20.96-0.9613
5 23 20.92 2.078
6 21 21.34-0.336
7 19 20.81-1.81
8 19 20.92-1.924
9 21 20.92 0.07996
10 21 20.62 0.3752
11 22 20.66 1.338
12 21 20.48 0.5222
13 22 20.99 1.006
14 19 20.78-1.777
15 19 20.92-1.924
16 23 20.52 2.484
17 21 21.26-0.2553
18 21 21.07-0.06751
19 19 20.92-1.923
20 21 21.33-0.3254
21 21 20.59 0.4111
22 21 20.89 0.1052
23 19 21.07-2.067
24 19 21.3-2.295
25 19 20.81-1.808
26 22 20.92 1.078
27 18 20.85-2.85
28 22 21.07 0.9307
29 18 20.96-2.957
30 22 20.92 1.076
31 19 19.85-0.8524
32 22 21.11 0.8902
33 19 20.55-1.552
34 19 21.44-2.438
35 21 21.18-0.1811
36 21 20.81 0.189
37 19 20.41-1.407
38 19 20.92-1.924
39 26 21.48 4.518
40 19 20.63-1.63
41 21 21.29-0.293
42 21 20.22 0.7775
43 20 21.18-1.179
44 23 21.44 1.561
45 22 20.74 1.262
46 22 20.92 1.075
47 22 20.96 1.043
48 21 20.78 0.2234
49 22 20.71 1.293
50 21 21.11-0.1083
51 21 20.66 0.3365
52 21 20.92 0.08046
53 23 21.07 1.934
54 23 20.97 2.034
55 19 21.41-2.407
56 21 21.25-0.2533
57 21 21.74-0.7383
58 23 21.18 1.822
59 20 21.37-1.367
60 23 20.99 2.006
61 19 20.67-1.667
62 22 20.99 1.006
63 22 21.55 0.4469
64 21 21.51-0.5096
65 21 21.04-0.03649
66 22 20.99 1.005
67 25 21.44 3.559
68 21 20.81 0.1905
69 23 21.04 1.965
70 21 20.48 0.5222
71 24 20.62 3.375
72 19 21.51-2.511
73 18 20.74-2.74
74 19 20.63-1.625
75 20 20.59-0.5929
76 22 21.11 0.8912
77 22 20.59 1.408
78 24 20.74 3.261
79 23 20.74 2.262
80 23 21.25 1.746
81 22 20.85 1.149
82 19 21.18-2.182
83 21 20.55 0.4465
84 23 21.25 1.747
85 22 20.92 1.078
86 19 20.66-1.664
87 22 21.15 0.8467
88 21 21.41-0.4066
89 20 21.29-1.294
90 23 21.14 1.86
91 22 21.07 0.933
92 21 20.37 0.6317
93 20 20.74-0.7389
94 18 21.11-3.109
95 18 20.66-2.665
96 19 21.33-2.325
97 19 21.11-2.107
98 20 21.37-1.365
99 19 20.74-1.738
100 23 20.74 2.262
101 21 21.11-0.1083
102 22 20.89 1.114
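
Each row of this table is simply the observed value, the in-sample fitted value ("interpolation"), and their difference (the residual). A one-line sketch, assuming mylm from above:

head(cbind(Actuals = fitted(mylm) + residuals(mylm), Interpolation = fitted(mylm), Residuals = residuals(mylm)))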

Source: https://freestatistics.org/blog/index.php?pk=303905&T=4







Goldfeld-Quandt test for Heteroskedasticity
p-values for each Alternative Hypothesis
breakpoint index   greater   2-sided   less
9 0.1419 0.2837 0.8581
10 0.4028 0.8056 0.5972
11 0.335 0.67 0.665
12 0.2853 0.5705 0.7147
13 0.1895 0.379 0.8105
14 0.1479 0.2959 0.8521
15 0.0963 0.1926 0.9037
16 0.1572 0.3143 0.8428
17 0.1041 0.2082 0.8959
18 0.07347 0.1469 0.9265
19 0.05187 0.1037 0.9481
20 0.07358 0.1472 0.9264
21 0.04938 0.09876 0.9506
22 0.07389 0.1478 0.9261
23 0.08523 0.1705 0.9148
24 0.08508 0.1702 0.9149
25 0.1386 0.2772 0.8614
26 0.1112 0.2224 0.8888
27 0.1283 0.2565 0.8717
28 0.1211 0.2422 0.8789
29 0.133 0.266 0.867
30 0.1631 0.3263 0.8369
31 0.1574 0.3147 0.8426
32 0.1532 0.3065 0.8468
33 0.1387 0.2774 0.8613
34 0.1701 0.3402 0.8299
35 0.1388 0.2775 0.8612
36 0.1285 0.2571 0.8715
37 0.1097 0.2194 0.8903
38 0.1117 0.2233 0.8883
39 0.3709 0.7418 0.6291
40 0.3957 0.7914 0.6043
41 0.3513 0.7027 0.6487
42 0.3703 0.7405 0.6297
43 0.3488 0.6975 0.6512
44 0.3855 0.7711 0.6145
45 0.3688 0.7377 0.6312
46 0.3494 0.6988 0.6506
47 0.3302 0.6605 0.6698
48 0.2786 0.5573 0.7214
49 0.2663 0.5326 0.7337
50 0.2199 0.4397 0.7801
51 0.1825 0.365 0.8175
52 0.1482 0.2963 0.8518
53 0.1497 0.2995 0.8503
54 0.1798 0.3597 0.8202
55 0.2253 0.4506 0.7747
56 0.1837 0.3674 0.8163
57 0.1536 0.3072 0.8464
58 0.1548 0.3097 0.8451
59 0.1479 0.2958 0.8521
60 0.1576 0.3151 0.8424
61 0.1526 0.3052 0.8474
62 0.1293 0.2586 0.8707
63 0.1015 0.203 0.8985
64 0.07946 0.1589 0.9205
65 0.0598 0.1196 0.9402
66 0.04838 0.09676 0.9516
67 0.1396 0.2791 0.8604
68 0.1077 0.2155 0.8923
69 0.1242 0.2483 0.8758
70 0.09664 0.1933 0.9034
71 0.1699 0.3399 0.8301
72 0.1841 0.3683 0.8159
73 0.2634 0.5268 0.7366
74 0.2828 0.5656 0.7172
75 0.2398 0.4796 0.7602
76 0.2074 0.4148 0.7926
77 0.1783 0.3565 0.8217
78 0.2655 0.531 0.7345
79 0.2891 0.5782 0.7109
80 0.3066 0.6132 0.6934
81 0.3199 0.6398 0.6801
82 0.3773 0.7545 0.6227
83 0.3032 0.6064 0.6968
84 0.2737 0.5473 0.7263
85 0.2343 0.4686 0.7657
86 0.199 0.398 0.801
87 0.2931 0.5861 0.7069
88 0.2827 0.5655 0.7173
89 0.2366 0.4733 0.7634
90 0.1984 0.3968 0.8016
91 0.2502 0.5004 0.7498
92 0.1621 0.3243 0.8379
93 0.118 0.2361 0.882
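
Each row above is a Goldfeld-Quandt test of the fitted model split at the given breakpoint, with p-values under the three alternatives. A sketch of a single row, assuming mylm from above and the lmtest package used by the module:

library(lmtest)
gqtest(mylm, point = 9, alternative = 'two.sided')$p.value   # about 0.2837, the 2-sided entry of the first row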

Source: https://freestatistics.org/blog/index.php?pk=303905&T=5







Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description              # significant tests   proportion of significant tests   OK/NOK
1% type I error level    0                     0                                 OK
5% type I error level    0                     0                                 OK
10% type I error level   2                     0.0235294                         OK
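
The table above summarizes the 85 breakpoint tests (indices 9 through 93): 2 of the 85 two-sided p-values fall below 0.10, and 2/85 = 0.0235294, so the reported value is a proportion rather than a percentage. A sketch, where gq_p2 is a hypothetical vector holding those 85 two-sided p-values:

sum(gq_p2 < 0.10)                    # 2 significant tests at the 10% level
sum(gq_p2 < 0.10) / length(gq_p2)    # 0.0235294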

Source: https://freestatistics.org/blog/index.php?pk=303905&T=6







Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 0.49046, df1 = 2, df2 = 94, p-value = 0.6139
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.70849, df1 = 10, df2 = 86, p-value = 0.7141
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 2.7535, df1 = 2, df2 = 94, p-value = 0.06885
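
The three tests above are computed with lmtest::resettest on the fitted model, adding powers 2 and 3 of the fitted values, of the regressors, and of the first principal component of the regressors, respectively. A sketch, assuming mylm from above:

library(lmtest)
resettest(mylm, power = 2:3, type = 'fitted')      # RESET = 0.49046, p = 0.6139
resettest(mylm, power = 2:3, type = 'regressor')   # RESET = 0.70849, p = 0.7141
resettest(mylm, power = 2:3, type = 'princomp')    # RESET = 2.7535,  p = 0.06885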

Source: https://freestatistics.org/blog/index.php?pk=303905&T=7







Variance Inflation Factors (Multicollinearity)
> vif
    TVDC SKEOUSUM   SKEOU1   SKEOU2   SKEOU3 
1.574278 3.330100 1.633175 2.129250 1.599800 
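
The factors above are computed with car::vif on the fitted model; each one equals 1/(1 - R_j^2), where R_j^2 is obtained by regressing that predictor on the remaining predictors. A sketch, assuming mylm and df from above:

library(car)
vif(mylm)   # TVDC 1.57, SKEOUSUM 3.33, SKEOU1 1.63, SKEOU2 2.13, SKEOU3 1.60
1 / (1 - summary(lm(SKEOUSUM ~ TVDC + SKEOU1 + SKEOU2 + SKEOU3, data = df))$r.squared)   # reproduces the SKEOUSUM value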

Source: https://freestatistics.org/blog/index.php?pk=303905&T=8



Parameters (Session):
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
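# transpose the uploaded data block and drop every row that contains a missing
# value (listwise deletion) before any further transformation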
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
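# fit the OLS model: lm() applied to a data frame regresses the first column
# (the endogenous series selected by par1) on all remaining columns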
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
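# run the Goldfeld-Quandt test at every admissible breakpoint and count how many
# 2-sided p-values are significant at the 1%, 5% and 10% levels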
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')