Free Statistics



Author: *Unverified author*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Fri, 16 Dec 2016 14:46:01 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/16/t14818962178bz2upqaerykr2e.htm/, Retrieved Thu, 02 May 2024 19:34:37 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=300268, Retrieved Thu, 02 May 2024 19:34:37 +0000
IsPrivate? No (this computation is public)
Estimated Impact: 63
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [ITHSUM, IKSUM, TV...] [2016-12-16 13:46:01] [d41d8cd98f00b204e9800998ecf8427e] [Current]
Dataseries X:
14	18	13
19	19	16
17	18	17
20	19	16
15	20	17
19	14	17
20	18	16
18	19	14
15	16	16
14	18	17
16	19	16
19	18	16
18	19	16
17	15	15
19	17	16
17	17	16
19	17	15
20	19	17
19	19	13
16	20	17
16	17	14
18	16	14
16	16	18
17	16	17
20	17	16
19	18	15
7	16	15
16	16	13
18	19	17
17	19	11
19	17	14
16	17	13
13	16	17
16	16	16
12	17	17
17	18	16
17	18	16
17	18	16
16	19	15
16	14	12
14	13	17
16	18	14
13	16	14
16	15	16
14	17	15
19	19	16
18	19	14
14	20	15
18	19	17
15	16	10
17	15	17
13	16	20
19	16	17
18	20	18
15	18	17
15	15	14
20	16	17
19	18	17
18	20	16
15	18	18
20	20	18
17	14	16
19	20	15
20	14	13
18	17	16
17	15	12
18	16	16
17	20	16
20	20	16
16	18	14
14	17	15
15	19	14
20	19	15
17	18	15
17	17	16
18	17	11
20	16	18
16	14	11
18	13	18
15	17	15
18	18	19
20	16	17
14	19	14
15	17	13
17	16	17
18	17	14
20	17	19
17	17	14
16	20	16
11	19	16
15	16	15
18	19	12
16	19	17
18	19	18
15	16	15
17	18	18
19	16	15
16	17	16
14	18	16
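
To reproduce the fit outside this module, a minimal R sketch is given below. It assumes that the three tab-separated columns above are ITHSUM, IKSUM and TVDCSUM in that order (as suggested by the keyword list and by par1 = 2 selecting the second column as the endogenous series), and that the data have been saved to a file called dataseries.txt; both the column order and the file name are assumptions, not part of the original computation.

# read the three tab-separated columns (assumed order: ITHSUM, IKSUM, TVDCSUM)
df <- read.table('dataseries.txt', header=FALSE,
                 col.names=c('ITHSUM', 'IKSUM', 'TVDCSUM'))
# ordinary least squares with IKSUM as the endogenous series
mylm <- lm(IKSUM ~ ITHSUM + TVDCSUM, data=df)
summary(mylm)  # coefficients, R-squared and F-test as reported in the tables below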




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 9 seconds
R Server: Big Analytics Cloud Computing Center


Multiple Linear Regression - Estimated Regression Equation
IKSUM[t] = 14.5127 + 0.0876192 ITHSUM[t] + 0.087953 TVDCSUM[t] + e[t]
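
As a check, for the first observation (ITHSUM = 14, TVDCSUM = 13) the interpolation is 14.5127 + 0.0876192*14 + 0.087953*13 ≈ 16.88; with the observed IKSUM of 18 this leaves a residual of about 1.12, matching the first row of the Actuals, Interpolation, and Residuals table below.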


Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	+14.51	1.842	+7.8800e+00	5.086e-12	2.543e-12
ITHSUM	+0.08762	0.07648	+1.1460e+00	0.2548	0.1274
TVDCSUM	+0.08795	0.0934	+9.4160e-01	0.3487	0.1744
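
This table is the coefficient matrix of R's summary() applied to the fitted lm object; the 1-tail p-value column is simply the 2-tail p-value halved (the R code at the end of this page uses mysum$coefficients[i,4]/2). A minimal sketch, assuming the mylm object from the sketch above:

mysum <- summary(mylm)
mysum$coefficients           # Estimate, Std. Error, t value, Pr(>|t|) (2-tail)
mysum$coefficients[, 4] / 2  # 1-tail p-values as reported above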


Multiple Linear Regression - Regression Statistics
Multiple R 0.158
R-squared 0.02497
Adjusted R-squared 0.00466
F-TEST (value) 1.229
F-TEST (DF numerator) 2
F-TEST (DF denominator) 96
p-value 0.297
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation 1.724
Sum Squared Residuals 285.3
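
All of these figures derive from the same summary object (see the R code at the end of this page): Multiple R is the square root of R-squared (sqrt(0.02497) ≈ 0.158), the p-value is the upper tail of the F(2, 96) distribution evaluated at 1.229, and the sum of squared residuals equals the squared residual standard deviation times the 96 residual degrees of freedom (1.724^2 * 96 ≈ 285.3). A short sketch, assuming mylm/mysum from above:

c(Multiple.R = sqrt(mysum$r.squared),
  p.value    = 1 - pf(mysum$fstatistic[1], mysum$fstatistic[2], mysum$fstatistic[3]),
  Sigma      = mysum$sigma,
  SSR        = sum(resid(mylm)^2))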


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	18	16.88	1.117
2	19	17.58	1.415
3	18	17.5	0.5026
4	19	17.67	1.328
5	20	17.32	2.678
6	14	17.67	-3.673
7	18	17.67	0.3277
8	19	17.32	1.679
9	16	17.23	-1.234
10	18	17.23	0.7655
11	19	17.32	1.678
12	18	17.58	0.4153
13	19	17.5	1.503
14	15	17.32	-2.321
15	17	17.58	-0.5847
16	17	17.41	-0.4094
17	17	17.5	-0.4967
18	19	17.76	1.24
19	19	17.32	1.679
20	20	17.41	2.59
21	17	17.15	-0.1459
22	16	17.32	-1.321
23	16	17.5	-1.498
24	16	17.5	-1.497
25	17	17.67	-0.6723
26	18	17.5	0.5033
27	16	16.45	-0.4453
28	16	17.06	-1.058
29	19	17.59	1.415
30	19	16.97	2.03
31	17	17.41	-0.4088
32	17	17.06	-0.05797
33	16	17.15	-1.147
34	16	17.32	-1.322
35	17	17.06	-0.05931
36	18	17.41	0.5906
37	18	17.41	0.5906
38	18	17.41	0.5906
39	19	17.23	1.766
40	14	16.97	-2.97
41	13	17.23	-4.235
42	18	17.15	0.8541
43	16	16.88	-0.8831
44	15	17.32	-2.322
45	17	17.06	-0.05864
46	19	17.58	1.415
47	19	17.32	1.679
48	20	17.06	2.941
49	19	17.59	1.415
50	16	16.71	-0.7065
51	15	17.5	-2.497
52	16	17.41	-1.411
53	16	17.67	-1.673
54	20	17.67	2.327
55	18	17.32	0.6778
56	15	17.06	-2.058
57	16	17.76	-1.76
58	18	17.67	0.3274
59	20	17.5	2.503
60	18	17.41	0.5899
61	20	17.85	2.152
62	14	17.41	-3.409
63	20	17.5	2.503
64	14	17.41	-3.408
65	17	17.5	-0.4971
66	15	17.06	-2.058
67	16	17.5	-1.497
68	20	17.41	2.591
69	20	17.67	2.328
70	18	17.15	0.8541
71	17	17.06	-0.05864
72	19	17.06	1.942
73	19	17.58	1.416
74	18	17.32	0.6785
75	17	17.41	-0.4094
76	17	17.06	-0.0573
77	16	17.85	-1.848
78	14	16.88	-2.882
79	13	17.67	-4.673
80	17	17.15	-0.1463
81	18	17.76	0.2391
82	16	17.76	-1.76
83	19	16.97	2.029
84	17	16.97	0.02965
85	16	17.5	-1.497
86	17	17.32	-0.3212
87	17	17.94	-0.9362
88	17	17.23	-0.2335
89	20	17.32	2.678
90	19	16.88	2.116
91	16	17.15	-1.146
92	19	17.15	1.855
93	19	17.41	1.59
94	19	17.67	1.327
95	16	17.15	-1.146
96	18	17.59	0.4146
97	16	17.5	-1.497
98	17	17.32	-0.3218
99	18	17.15	0.8534
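
The Interpolation (Forecast) column is the fitted value of the regression, i.e. the actual minus the residual; the R code at the end of this page computes it as x[i] - mysum$resid[i]. Equivalently, assuming the mylm object from above:

cbind(fitted = fitted(mylm), residual = resid(mylm))  # reproduces the last two columns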


Goldfeld-Quandt test for Heteroskedasticity
p-values	Alternative Hypothesis
breakpoint index	greater	2-sided	less
6 0.9226 0.1547 0.07736
7 0.8582 0.2836 0.1418
8 0.7892 0.4215 0.2108
9 0.8004 0.3991 0.1996
10 0.7163 0.5675 0.2837
11 0.656 0.688 0.344
12 0.5569 0.8862 0.4431
13 0.4948 0.9895 0.5052
14 0.6668 0.6664 0.3332
15 0.5961 0.8078 0.4039
16 0.5245 0.9509 0.4755
17 0.4517 0.9034 0.5483
18 0.4186 0.8372 0.5814
19 0.3784 0.7568 0.6216
20 0.4335 0.867 0.5665
21 0.3768 0.7537 0.6232
22 0.3698 0.7396 0.6302
23 0.3818 0.7635 0.6182
24 0.3746 0.7491 0.6254
25 0.318 0.6361 0.682
26 0.2607 0.5215 0.7393
27 0.225 0.45 0.775
28 0.2062 0.4125 0.7938
29 0.1878 0.3756 0.8122
30 0.1878 0.3756 0.8122
31 0.1537 0.3073 0.8463
32 0.1201 0.2402 0.8799
33 0.1034 0.2067 0.8966
34 0.09388 0.1878 0.9061
35 0.06994 0.1399 0.9301
36 0.05273 0.1055 0.9473
37 0.03911 0.07823 0.9609
38 0.02855 0.05709 0.9715
39 0.02846 0.05692 0.9715
40 0.0667 0.1334 0.9333
41 0.2325 0.465 0.7675
42 0.1984 0.3969 0.8016
43 0.169 0.3381 0.831
44 0.2011 0.4022 0.7989
45 0.1629 0.3258 0.8371
46 0.151 0.302 0.849
47 0.148 0.2961 0.852
48 0.2223 0.4446 0.7777
49 0.2083 0.4165 0.7917
50 0.1764 0.3528 0.8236
51 0.2216 0.4432 0.7784
52 0.2216 0.4431 0.7784
53 0.2177 0.4353 0.7823
54 0.253 0.506 0.747
55 0.2138 0.4277 0.7862
56 0.2374 0.4748 0.7626
57 0.2351 0.4702 0.7649
58 0.1939 0.3878 0.8061
59 0.243 0.486 0.757
60 0.2026 0.4052 0.7974
61 0.2403 0.4807 0.7597
62 0.3946 0.7892 0.6054
63 0.4967 0.9933 0.5033
64 0.618 0.7641 0.382
65 0.5602 0.8797 0.4398
66 0.5836 0.8328 0.4164
67 0.563 0.8741 0.437
68 0.6403 0.7195 0.3597
69 0.7469 0.5062 0.2531
70 0.7029 0.5941 0.2971
71 0.6523 0.6955 0.3477
72 0.6511 0.6979 0.3489
73 0.7079 0.5842 0.2921
74 0.665 0.67 0.335
75 0.5992 0.8016 0.4008
76 0.5435 0.913 0.4565
77 0.5015 0.9969 0.4985
78 0.6475 0.7051 0.3525
79 0.9461 0.1078 0.05388
80 0.9262 0.1475 0.07376
81 0.8949 0.2103 0.1051
82 0.8743 0.2514 0.1257
83 0.8604 0.2792 0.1396
84 0.8077 0.3845 0.1923
85 0.8154 0.3693 0.1846
86 0.7438 0.5124 0.2562
87 0.6826 0.6347 0.3174
88 0.5889 0.8222 0.4111
89 0.6703 0.6595 0.3297
90 0.6615 0.677 0.3385
91 0.6138 0.7724 0.3862
92 0.9533 0.09331 0.04665
93 0.9402 0.1195 0.05976


Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description	# significant tests	% significant tests	OK/NOK
1% type I error level	0	0	OK
5% type I error level	0	0	OK
10% type I error level	4	0.0454545	OK
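
These counts follow directly from the table above: the breakpoint index runs from 6 to 93, so 88 Goldfeld-Quandt tests were carried out. None of the 2-sided p-values falls below 0.01 or 0.05, and exactly four of them (breakpoints 37, 38, 39 and 92) fall below 0.10, giving 4/88 ≈ 0.0454545; since this proportion is still below the 10% level, the test battery is marked OK.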


Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 0.27, df1 = 2, df2 = 94, p-value = 0.764
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.66262, df1 = 4, df2 = 92, p-value = 0.6195
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.012166, df1 = 2, df2 = 94, p-value = 0.9879
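
The three blocks above are the raw console output of resettest() from the lmtest package, applied with powers 2 and 3 of, respectively, the fitted values, the regressors, and the principal components of the regressors; the corresponding calls appear verbatim in the R code at the end of this page:

reset_test_fitted <- resettest(mylm, power=2:3, type='fitted')
reset_test_regressors <- resettest(mylm, power=2:3, type='regressor')
reset_test_principal_components <- resettest(mylm, power=2:3, type='princomp')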


Variance Inflation Factors (Multicollinearity)
> vif
  ITHSUM  TVDCSUM 
1.011615 1.011615 
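
With only two regressors the variance inflation factors are necessarily equal, and both reduce to 1/(1 - r^2) with r the sample correlation between ITHSUM and TVDCSUM; a VIF of 1.011615 therefore corresponds to |r| of roughly 0.11, i.e. virtually no multicollinearity. The values are produced by vif() from the car package (see vif(mylm) in the R code below).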


Parameters (Session):
par1 = 2 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 2 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
par5 <- '0'
par4 <- '0'
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '1'
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
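# y is the data matrix handed over by the module framework (series presumably stored
# row-wise, hence the transpose); na.omit drops incomplete observations.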
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
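# When n > 25, run a Goldfeld-Quandt test at every admissible breakpoint;
# the 2-sided p-values are counted later for the meta-analysis table.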
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')