Free Statistics


Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 23 Jan 2017 11:11:20 +0100
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2017/Jan/23/t1485166286zjczsioi715ucgp.htm/, Retrieved Wed, 15 May 2024 16:33:17 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=304819, Retrieved Wed, 15 May 2024 16:33:17 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 82
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [] [2017-01-23 10:11:20] [2e11ca31a00cf8de75c33c1af2d59434] [Current]

Dataseries X:
14 22 13 4 2 3 5 4
19 24 16 5 3 4 5 4
17 21 17 4 4 4 5 4
17 21 NA 3 4 3 4 4
15 24 NA 4 4 4 5 4
20 20 16 3 4 4 5 5
15 22 NA 3 4 3 3 4
19 20 NA 3 4 4 4 4
15 19 NA 4 5 4 5 5
15 23 17 4 5 4 5 5
19 21 17 4 4 4 5 4
NA 19 15 4 4 3 5 4
20 19 16 4 4 3 4 5
18 21 14 3 3 4 4 5
15 21 16 4 4 4 2 5
14 22 17 3 4 4 4 5
20 22 NA 3 4 4 4 5
NA 19 NA NA NA NA 5 5
16 21 NA 5 5 3 4 4
16 21 NA 4 4 4 5 4
16 21 16 3 4 3 4 5
10 20 NA 4 4 4 5 5
19 22 16 4 4 4 4 5
19 22 NA 4 4 4 4 4
16 24 NA 4 4 4 4 5
15 21 NA 3 4 4 4 4
18 19 16 3 4 3 5 5
17 19 15 4 4 4 4 4
19 23 16 2 4 4 5 5
17 21 16 5 4 4 4 4
NA 21 13 4 3 4 4 4
19 19 15 4 5 4 5 5
20 21 17 5 4 4 4 5
5 19 NA 4 3 4 NA 5
19 21 13 2 3 4 5 4
16 21 17 4 5 4 4 4
15 23 NA 3 4 4 4 4
16 19 14 4 3 3 4 5
18 19 14 4 3 4 4 4
16 19 18 4 4 4 4 4
15 18 NA 5 4 4 4 4
17 22 17 4 5 4 5 5
NA 18 13 3 3 4 4 4
20 22 16 5 5 3 5 5
19 18 15 5 4 3 4 4
7 22 15 4 4 3 4 5
13 22 NA 4 4 4 4 4
16 19 15 3 5 3 3 4
16 22 13 4 4 4 5 4
NA 25 NA 2 3 2 NA 4
18 19 17 4 5 4 4 4
18 19 NA 5 5 4 5 4
16 19 NA 5 5 4 4 4
17 19 11 4 3 4 5 5
19 21 14 4 3 3 4 5
16 21 13 4 4 4 4 4
19 20 NA 3 4 3 3 4
13 19 17 3 4 4 4 3
16 19 16 4 4 3 5 4
13 22 NA 4 4 4 5 4
12 26 17 5 5 4 5 5
17 19 16 2 4 4 5 5
17 21 16 4 4 4 5 5
17 21 16 3 4 4 2 4
16 20 15 4 4 4 5 5
16 23 12 4 2 4 4 4
14 22 17 4 4 3 5 3
16 22 14 4 4 3 5 4
13 22 14 5 4 3 3 5
16 21 16 3 4 3 5 5
14 21 NA 3 4 3 4 5
20 22 NA 4 5 5 5 4
12 23 NA 4 4 4 NA 4
13 18 NA 4 4 4 4 4
18 24 NA 4 4 5 5 4
14 22 15 3 4 4 4 4
19 21 16 4 4 4 5 4
18 21 14 3 4 3 5 5
14 21 15 3 3 4 4 5
18 23 17 4 3 4 4 4
19 21 NA 4 4 4 4 5
15 23 10 3 3 4 4 4
14 21 NA 4 4 4 5 4
17 19 17 4 4 4 5 5
19 21 NA 4 4 4 5 5
13 21 20 5 4 4 4 4
19 21 17 5 4 5 4 5
18 23 18 4 4 4 5 5
20 23 NA 3 4 4 4 5
15 20 17 3 NA 4 4 4
15 20 14 4 2 3 4 4
15 19 NA 4 4 4 4 3
20 23 17 4 4 4 4 5
15 22 NA 4 4 4 5 4
19 19 17 4 5 4 5 3
18 23 NA 3 4 3 5 5
18 22 16 4 4 4 4 5
15 22 18 5 4 4 4 5
20 21 18 5 4 5 4 5
17 21 16 4 5 4 5 5
12 21 NA 3 4 4 4 5
18 21 NA 5 3 4 5 5
19 22 15 4 4 4 4 5
20 25 13 5 4 4 4 5
NA 21 NA 3 4 3 NA 4
17 23 NA 5 4 5 5 5
15 19 NA 4 4 3 NA 5
16 22 NA 4 4 3 4 3
18 20 NA 4 4 4 4 4
18 21 16 4 4 4 4 4
14 25 NA 3 4 4 5 3
15 21 NA 4 4 4 4 4
12 19 NA 4 4 3 4 5
17 23 12 3 3 3 5 5
14 22 NA 4 4 3 4 4
18 21 16 3 4 4 4 4
17 24 16 4 4 4 3 4
17 21 NA 5 4 1 5 5
20 19 16 5 4 4 5 5
16 18 14 4 4 4 4 3
14 19 15 4 4 3 4 4
15 20 14 3 4 3 4 5
18 19 NA 4 4 4 4 4
20 22 15 4 4 4 5 4
17 21 NA 4 5 4 4 4
17 22 15 3 4 4 4 4
17 24 16 4 4 3 4 4
17 28 NA 4 4 4 4 5
15 19 NA 3 4 3 4 4
17 18 NA 4 4 3 4 3
18 23 11 3 2 2 4 4
17 19 NA 4 4 3 5 4
20 23 18 5 4 3 5 4
15 19 NA 2 4 3 3 5
16 22 11 3 3 4 4 4
15 21 NA 4 4 3 4 4
18 19 18 5 5 4 5 4
11 22 NA NA NA NA NA NA
15 21 15 4 5 4 4 4
18 23 19 5 5 5 5 4
20 22 17 4 5 4 5 5
19 19 NA 4 4 3 4 5
14 19 14 3 4 4 5 4
16 21 NA 4 4 4 4 4
15 22 13 4 4 4 4 4
17 21 17 4 4 4 5 5
18 20 14 4 4 4 5 5
20 23 19 5 4 3 5 4
17 22 14 4 3 4 4 4
18 23 NA 4 4 4 4 4
15 22 NA 3 3 3 4 4
16 21 16 4 5 4 4 3
11 20 16 4 4 3 4 4
15 18 15 4 4 4 4 5
18 18 12 3 4 3 5 5
17 20 NA 4 4 4 4 5
16 19 17 5 4 4 5 4
12 21 NA 4 4 4 3 4
19 24 NA 2 3 4 4 4
18 19 18 4 4 4 4 5
15 20 15 4 3 3 5 5
17 19 18 4 4 4 4 3
19 23 15 4 5 5 4 4
18 22 NA 5 4 4 4 4
19 21 NA 5 4 3 4 4
16 24 NA 3 3 4 5 5
16 21 16 4 4 4 4 5
16 21 NA 4 4 4 5 4
14 22 16 2 3 5 5 4




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 7 seconds
R Server: Big Analytics Cloud Computing Center

Multiple Linear Regression - Estimated Regression Equation
ITHSUM[t] = 8.11988 - 0.0377059 Bevr_Leeftijd[t] + 0.00760815 TVDC[t] + 0.413666 SKEOU1[t] - 0.0133935 SKEOU2[t] + 0.716207 SKEOU4[t] + 0.690623 SKEOU5[t] + 0.481387 SKEOU6[t] + e[t]
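For readers who want to rerun this fit outside the module, a minimal R sketch is given below. It assumes (this ordering is an inference, not stated explicitly on the page) that the eight columns of Dataseries X are, in order, ITHSUM, Bevr_Leeftijd, TVDC, SKEOU1, SKEOU2, SKEOU4, SKEOU5 and SKEOU6, and that the series has been saved to a text file; the file name dataseries_x.txt is hypothetical. Rows containing NA are dropped, as in the module's na.omit step.

# Sketch: refit the reported OLS model from the raw data (column order assumed, file name hypothetical)
vars <- c("ITHSUM", "Bevr_Leeftijd", "TVDC", "SKEOU1", "SKEOU2", "SKEOU4", "SKEOU5", "SKEOU6")
df <- read.table("dataseries_x.txt", col.names = vars)
df <- na.omit(df)                  # listwise deletion of incomplete rows (99 complete rows here)
mylm <- lm(ITHSUM ~ ., data = df)  # regress ITHSUM on all other columns
summary(mylm)                      # coefficients should reproduce the equation above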


Multiple Linear Regression - Ordinary Least Squares
Variable        Parameter   S.D.     T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)     +8.12       4.072    +1.9940e+00                  0.04916          0.02458
Bevr_Leeftijd   -0.03771    0.1389   -2.7140e-01                  0.7867           0.3933
TVDC            +0.007608   0.1518   +5.0110e-02                  0.9601           0.4801
SKEOU1          +0.4137     0.3331   +1.2420e+00                  0.2175           0.1088
SKEOU2          -0.01339    0.3942   -3.3980e-02                  0.973            0.4865
SKEOU4          +0.7162     0.436    +1.6430e+00                  0.1039           0.05195
SKEOU5          +0.6906     0.3543   +1.9490e+00                  0.05435          0.02717
SKEOU6          +0.4814     0.3774   +1.2760e+00                  0.2053           0.1027
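The 1-tail p-values are simply the 2-tail p-values divided by two, exactly as in the module's own code (mysum$coefficients[i,4]/2). A short sketch, assuming the fitted model mylm from above:

coefs <- summary(mylm)$coefficients        # columns: Estimate, Std. Error, t value, Pr(>|t|)
cbind(coefs, "1-tail p" = coefs[, 4] / 2)  # append the one-sided p-value shown in the table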


Multiple Linear Regression - Regression Statistics
Multiple R: 0.3274
R-squared: 0.1072
Adjusted R-squared: 0.03853
F-TEST (value): 1.561
F-TEST (DF numerator): 7
F-TEST (DF denominator): 91
p-value: 0.1571
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 2.24
Sum Squared Residuals: 456.6
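All of these statistics can be read off the fitted model object; a sketch mirroring the module's own formulas (assuming mylm from above):

mysum <- summary(mylm)
sqrt(mysum$r.squared)      # Multiple R
mysum$r.squared            # R-squared
mysum$adj.r.squared        # Adjusted R-squared
f <- mysum$fstatistic      # F value, DF numerator, DF denominator
1 - pf(f[1], f[2], f[3])   # p-value of the overall F-test
mysum$sigma                # Residual Standard Deviation
sum(residuals(mylm)^2)     # Sum Squared Residuals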


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 14 16.54-2.544
2 19 17.61 1.392
3 17 17.3-0.302
4 20 17.4 2.6
5 15 17.69-2.695
6 19 17.3 1.698
7 20 16.44 3.556
8 18 16.67 1.33
9 15 15.7-0.7039
10 14 16.64-2.641
11 16 15.96 0.04474
12 19 17.05 1.953
13 18 16.72 1.279
14 17 16.67 0.3285
15 19 16.87 2.127
16 17 17.02-0.01741
17 19 17.83 1.17
18 20 17.51 2.494
19 19 16.46 2.542
20 16 16.6-0.598
21 16 16.44-0.4425
22 18 16.68 1.323
23 16 16.69-0.6944
24 17 17.73-0.7323
25 20 17.42 2.578
26 19 16.41 2.593
27 7 16.32-9.324
28 16 14.84 1.162
29 16 17.23-1.234
30 18 16.67 1.327
31 17 17.83-0.8265
32 19 16.37 2.633
33 16 16.58-0.5809
34 13 15.79-2.792
35 16 16.65-0.6536
36 12 18-5.995
37 17 17.02-0.02383
38 17 17.78-0.7758
39 17 14.81 2.191
40 16 17.81-1.806
41 16 16.52-0.5247
42 14 16.07-2.067
43 16 16.53-0.5252
44 13 16.04-3.039
45 16 16.65-0.6459
46 14 16.14-2.145
47 19 17.29 1.706
48 18 16.63 1.369
49 14 16.68-2.677
50 18 16.55 1.451
51 15 16.08-1.082
52 17 17.86-0.8588
53 13 17.05-4.048
54 19 18.22 0.7774
55 18 17.72 0.2844
56 15 15.94-0.9368
57 20 17.02 2.983
58 19 16.88 2.117
59 18 17.05 0.9526
60 15 17.48-2.476
61 20 18.23 1.77
62 17 17.76-0.7624
63 19 17.04 1.96
64 20 17.33 2.675
65 18 16.6 1.396
66 17 16.55 0.4466
67 18 16.19 1.81
68 17 15.8 1.2
69 20 18.26 1.735
70 16 16.22-0.2203
71 14 15.96-1.955
72 15 15.98-0.9777
73 20 17.25 2.751
74 17 16.14 0.8552
75 17 15.77 1.226
76 18 14.67 3.329
77 20 16.93 3.068
78 16 16.13-0.1277
79 18 17.79 0.2147
80 15 16.58-1.583
81 18 18.36-0.3583
82 20 17.73 2.268
83 14 16.94-2.941
84 15 16.54-1.543
85 17 17.78-0.7834
86 18 17.8 0.2018
87 20 16.94 3.061
88 17 16.56 0.4358
89 16 16.11-0.109
90 11 15.93-4.925
91 15 17.19-2.191
92 18 16.73 1.271
93 16 17.79-1.791
94 18 17.18 0.8242
95 15 17.1-2.103
96 17 16.21 0.787
97 19 17.22 1.776
98 16 17.09-1.085
99 14 17.16-3.159
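In this table the interpolation (forecast) column is the fitted value and the residual is Actuals minus Interpolation. It can be regenerated with a sketch like the following (assuming mylm and df from the earlier sketch):

pred <- data.frame(index         = seq_len(nrow(df)),
                   actual        = df$ITHSUM,
                   interpolation = fitted(mylm),
                   residual      = residuals(mylm))
head(signif(as.matrix(pred), 6))  # e.g. row 1: actual 14, interpolation 16.54, residual -2.544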


Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis
breakpoint index   greater   2-sided   less
11 0.583 0.8341 0.417
12 0.4751 0.9501 0.5249
13 0.3629 0.7258 0.6371
14 0.2419 0.4837 0.7581
15 0.5728 0.8544 0.4272
16 0.4622 0.9244 0.5378
17 0.3623 0.7246 0.6377
18 0.2815 0.5629 0.7185
19 0.3154 0.6308 0.6846
20 0.2424 0.4848 0.7576
21 0.2162 0.4323 0.7838
22 0.1615 0.3231 0.8385
23 0.1584 0.3168 0.8416
24 0.1228 0.2455 0.8772
25 0.1614 0.3228 0.8386
26 0.1515 0.3031 0.8485
27 0.9246 0.1508 0.07541
28 0.9243 0.1514 0.0757
29 0.9033 0.1933 0.09667
30 0.8773 0.2454 0.1227
31 0.8642 0.2716 0.1358
32 0.8918 0.2164 0.1082
33 0.858 0.2839 0.142
34 0.8901 0.2197 0.1099
35 0.8644 0.2713 0.1356
36 0.9722 0.05565 0.02782
37 0.9677 0.06452 0.03226
38 0.9582 0.08358 0.04179
39 0.97 0.05994 0.02997
40 0.9693 0.06134 0.03067
41 0.9578 0.08436 0.04218
42 0.9583 0.08336 0.04168
43 0.9512 0.09753 0.04876
44 0.9623 0.07544 0.03772
45 0.9487 0.1026 0.05131
46 0.9439 0.1121 0.05607
47 0.9357 0.1286 0.0643
48 0.9245 0.1509 0.07547
49 0.93 0.14 0.06999
50 0.9219 0.1561 0.07807
51 0.9106 0.1787 0.08935
52 0.8937 0.2126 0.1063
53 0.9497 0.1006 0.05029
54 0.9341 0.1318 0.06592
55 0.9165 0.1671 0.08353
56 0.8928 0.2144 0.1072
57 0.9174 0.1652 0.08261
58 0.922 0.156 0.07801
59 0.9005 0.199 0.09949
60 0.9327 0.1345 0.06726
61 0.921 0.1581 0.07904
62 0.9003 0.1994 0.09968
63 0.891 0.218 0.109
64 0.8885 0.223 0.1115
65 0.8713 0.2574 0.1287
66 0.8385 0.323 0.1615
67 0.8575 0.2849 0.1425
68 0.8252 0.3496 0.1748
69 0.8153 0.3693 0.1847
70 0.7945 0.4111 0.2055
71 0.7618 0.4764 0.2382
72 0.7075 0.5851 0.2925
73 0.7238 0.5523 0.2762
74 0.6715 0.657 0.3285
75 0.6288 0.7423 0.3712
76 0.6679 0.6643 0.3321
77 0.6586 0.6829 0.3414
78 0.6124 0.7752 0.3876
79 0.5241 0.9517 0.4759
80 0.4866 0.9732 0.5134
81 0.4611 0.9222 0.5389
82 0.3789 0.7579 0.6211
83 0.3402 0.6804 0.6598
84 0.2729 0.5458 0.7271
85 0.1987 0.3974 0.8013
86 0.1246 0.2492 0.8754
87 0.1826 0.3651 0.8174
88 0.2048 0.4097 0.7952
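Each row is a Goldfeld-Quandt test with the ordered sample split at the indicated breakpoint; the module loops gqtest() from the lmtest package over all admissible breakpoints and stores the p-value for each alternative. One row can be reproduced as in the sketch below (breakpoint 36 is just an example; mylm is assumed from above):

library(lmtest)
sapply(c("greater", "two.sided", "less"),
       function(alt) gqtest(mylm, point = 36, alternative = alt)$p.value)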


Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description              # significant tests   % significant tests   OK/NOK
1% type I error level    0                     0                     OK
5% type I error level    0                     0                     OK
10% type I error level   9                     0.115385              NOK
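The meta analysis counts how many of the 2-sided p-values above fall below each type I error level and reports the proportion (here 9 of 78 tests, i.e. 0.115385, at the 10% level). A sketch, where gq_p is assumed to hold the vector of 78 two-sided p-values:

alphas <- c(0.01, 0.05, 0.10)
counts <- sapply(alphas, function(a) sum(gq_p < a))   # significant tests per level
props  <- counts / length(gq_p)                       # proportion of significant tests
ifelse(props < alphas, "OK", "NOK")                   # module's OK/NOK rule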


Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.8313, df1 = 2, df2 = 89, p-value = 0.06425
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.84467, df1 = 14, df2 = 77, p-value = 0.6196
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.79247, df1 = 2, df2 = 89, p-value = 0.4559
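The three RESET variants above are produced by resettest() from the lmtest package; a minimal sketch (assuming mylm from above):

library(lmtest)
resettest(mylm, power = 2:3, type = "fitted")     # powers of the fitted values
resettest(mylm, power = 2:3, type = "regressor")  # powers of the regressors
resettest(mylm, power = 2:3, type = "princomp")   # powers of the first principal component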


Variance Inflation Factors (Multicollinearity)
> vif
Bevr_Leeftijd          TVDC        SKEOU1        SKEOU2        SKEOU4 
     1.046149      1.574638      1.189404      1.440390      1.123632 
       SKEOU5        SKEOU6 
     1.023528      1.022214 
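The variance inflation factors are computed with vif() from the car package. All values here are below 1.6, so multicollinearity is not a concern under the usual rules of thumb (a VIF above 5 or 10 is commonly taken as problematic). A one-line sketch, assuming mylm from above:

library(car)
vif(mylm)  # one VIF per regressor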



Parameters (Session):
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
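# Optional transformations of the data matrix, depending on par3:
# first differences, seasonal differences (s=12), or both.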
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
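# Goldfeld-Quandt heteroskedasticity test at every admissible breakpoint
# (only computed when n > 25); the 2-sided p-values feed the meta analysis below.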
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
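# Diagnostic plots: actuals and interpolation, residuals, histogram and density of
# (studentized) residuals, QQ plot, residual lag plot, residual ACF/PACF, and the
# standard lm diagnostic panel.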
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
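# Build the output tables (estimated equation, OLS coefficients, regression and residual
# statistics, actuals/interpolation/residuals, Goldfeld-Quandt results and meta analysis,
# RESET tests, VIF) with the site's createtable helpers.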
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')