Free Statistics

Author: (The author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 07 Dec 2016 14:57:58 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/07/t14811191210g69zpk52pkxira.htm/, Retrieved Tue, 07 May 2024 10:25:48 +0000
Alternatively: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=298119, Retrieved Tue, 07 May 2024 10:25:48 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 64
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Multiple Regressi...] [2016-12-07 13:57:58] [55eb8f21ed24cda91766c505eb72bb6f] [Current]
Dataseries X (columns: EP3, EP4, TVDSUM):
4	1	13
2	5	16
3	1	17
2	2	NA
2	1	NA
3	4	16
3	1	NA
2	1	NA
2	1	NA
4	2	17
2	1	17
2	4	15
3	1	16
2	5	14
3	2	16
2	1	17
2	NA	NA
NA	NA	NA
3	2	NA
2	1	NA
2	4	16
3	1	NA
1	2	16
2	3	NA
3	1	NA
2	4	NA
2	2	16
3	3	15
5	1	16
2	4	16
5	1	13
2	1	15
2	1	17
4	1	NA
1	3	13
2	4	17
2	2	NA
3	4	14
2	2	14
3	2	18
2	1	NA
3	1	17
4	1	13
4	5	16
3	1	15
2	1	15
2	1	NA
1	NA	15
4	1	13
4	1	NA
3	2	17
2	2	NA
2	2	NA
2	2	11
2	3	14
3	1	13
2	NA	NA
2	2	17
3	1	16
3	3	NA
2	3	17
2	1	16
4	1	16
4	2	16
3	1	15
4	3	12
4	3	17
4	NA	14
4	4	14
5	4	16
3	2	NA
4	1	NA
4	1	NA
2	1	NA
2	3	NA
3	2	15
4	1	16
2	1	14
5	1	15
1	1	17
3	1	NA
3	1	10
2	3	NA
2	1	17
1	4	NA
2	2	20
1	4	17
2	1	18
2	1	NA
2	1	17
2	2	14
3	5	NA
2	3	17
1	4	NA
1	1	17
3	4	NA
2	4	16
3	2	18
1	3	18
2	1	16
2	1	NA
3	4	NA
2	4	15
2	1	13
3	3	NA
1	1	NA
4	5	NA
3	2	NA
2	2	NA
3	1	16
3	3	NA
3	1	NA
4	1	NA
3	3	12
2	1	NA
3	4	16
2	1	16
1	5	NA
1	1	16
2	1	14
4	4	15
3	2	14
2	1	NA
2	4	15
3	1	NA
3	1	15
3	2	16
2	2	NA
2	1	NA
2	1	NA
3	1	11
4	4	NA
2	4	18
2	3	NA
1	4	11
3	1	NA
3	5	18
4	3	NA
3	2	15
1	3	19
1	3	17
2	1	NA
1	4	14
5	1	NA
3	1	13
2	3	17
2	3	14
4	2	19
2	4	14
4	4	NA
4	2	NA
3	4	16
4	4	16
3	2	15
4	1	12
3	1	NA
4	1	17
2	3	NA
4	3	NA
1	4	18
4	1	15
3	NA	18
2	3	15
2	3	NA
2	NA	NA
4	1	NA
3	2	16
3	2	NA
2	4	16
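Cells marked NA are treated as missing: observations with any missing value are dropped listwise before the model is fitted, which is how the data series above reduces to the 100 complete cases used in the regression (the residual table has 100 rows, and the F-test denominator df is 97 = 100 - 3 estimated parameters). A minimal sketch of that filtering, using a few rows copied from the series with `None` standing in for NA:

```python
# A few rows copied from the data series above (EP3, EP4, TVDSUM);
# None stands in for NA.
rows = [
    (4, 1, 13),
    (2, 2, None),
    (2, None, None),
    (None, None, None),
    (2, 5, 16),
]

# Listwise deletion: keep only rows with no missing value.
complete = [r for r in rows if None not in r]
print(len(complete))  # 2 complete cases kept
```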




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 7 seconds
R Server: Big Analytics Cloud Computing Center









Multiple Linear Regression - Estimated Regression Equation
TVDSUM[t] = 16.0276 - 0.289541 EP3[t] + 0.0886389 EP4[t] + e[t]
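Given the estimated coefficients, each fitted value and residual in the tables that follow is a direct plug-in. A quick check in Python, using the coefficients from the equation above and the first complete observation (EP3 = 4, EP4 = 1, TVDSUM = 13):

```python
# Estimated coefficients from the fitted equation above
b0, b_ep3, b_ep4 = 16.0276, -0.289541, 0.0886389

def fitted(ep3, ep4):
    """Predicted TVDSUM for given EP3 and EP4 values."""
    return b0 + b_ep3 * ep3 + b_ep4 * ep4

# First complete observation in the data series: EP3=4, EP4=1, TVDSUM=13
pred = fitted(4, 1)
resid = 13 - pred
print(round(pred, 2), round(resid, 3))  # 14.96 -1.958, matching row 1 of the residual table
```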









Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	+16.03	0.6647	+2.4110e+01	9.387e-43	4.693e-43
EP3	-0.2895	0.1848	-1.5670e+00	0.1204	0.06018
EP4	+0.08864	0.1459	+6.0770e-01	0.5448	0.2724
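The columns of the OLS table are linked by standard identities: T-STAT is the parameter estimate divided by its standard deviation, and the 1-tail p-value is half the 2-tail value. A sketch of that check, with the (rounded) estimates copied from the table:

```python
# (parameter, standard deviation) pairs copied from the OLS table above
rows = {
    "(Intercept)": (16.03, 0.6647),
    "EP3": (-0.2895, 0.1848),
    "EP4": (0.08864, 0.1459),
}

for name, (b, sd) in rows.items():
    t = b / sd  # T-STAT for H0: parameter = 0
    # Table shows +2.4110e+01, -1.5670e+00, +6.0770e-01; tiny differences
    # in the last digit come from the rounded inputs used here.
    print(f"{name}: t = {t:.3f}")
```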









Multiple Linear Regression - Regression Statistics
Multiple R: 0.1806
R-squared: 0.03262
Adjusted R-squared: 0.01268
F-TEST (value): 1.636
F-TEST (DF numerator): 2
F-TEST (DF denominator): 97
p-value: 0.2001
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1.865
Sum Squared Residuals: 337.5
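The summary statistics are internally consistent: with n = 100 complete cases and p = 2 regressors, the adjusted R-squared, residual standard deviation, and F statistic all follow from R-squared and the sum of squared residuals. A check under those assumptions:

```python
import math

n, p = 100, 2          # complete cases used in the fit; number of regressors
r2 = 0.03262           # R-squared from the table
ssr = 337.5            # sum of squared residuals from the table
df = n - p - 1         # residual degrees of freedom (97)

adj_r2 = 1 - (1 - r2) * (n - 1) / df
resid_sd = math.sqrt(ssr / df)
f_value = (r2 / p) / ((1 - r2) / df)

# Table values: 0.01268, 1.865, 1.636; last-digit differences come from
# feeding in the rounded R-squared.
print(round(adj_r2, 5), round(resid_sd, 3), round(f_value, 3))
```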









Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	13	14.96	-1.958
2	16	15.89	0.1083
3	17	15.25	1.752
4	16	15.51	0.4864
5	17	15.05	1.953
6	17	15.54	1.463
7	15	15.8	-0.8031
8	16	15.25	0.7524
9	14	15.89	-1.892
10	16	15.34	0.6637
11	17	15.54	1.463
12	16	15.8	0.1969
13	16	15.92	0.08464
14	16	15.63	0.3742
15	15	15.42	-0.4249
16	16	14.67	1.331
17	16	15.8	0.1969
18	13	14.67	-1.669
19	15	15.54	-0.5372
20	17	15.54	1.463
21	13	16	-3.004
22	17	15.8	1.197
23	14	15.51	-1.514
24	14	15.63	-1.626
25	18	15.34	2.664
26	17	15.25	1.752
27	13	14.96	-1.958
28	16	15.31	0.6873
29	15	15.25	-0.2476
30	15	15.54	-0.5372
31	13	14.96	-1.958
32	17	15.34	1.664
33	11	15.63	-4.626
34	14	15.71	-1.714
35	13	15.25	-2.248
36	17	15.63	1.374
37	16	15.25	0.7524
38	17	15.71	1.286
39	16	15.54	0.4628
40	16	14.96	1.042
41	16	15.05	0.9533
42	15	15.25	-0.2476
43	12	15.14	-3.135
44	17	15.14	1.865
45	14	15.22	-1.224
46	16	14.93	1.066
47	15	15.34	-0.3363
48	16	14.96	1.042
49	14	15.54	-1.537
50	15	14.67	0.3314
51	17	15.83	1.173
52	10	15.25	-5.248
53	17	15.54	1.463
54	20	15.63	4.374
55	17	16.09	0.9074
56	18	15.54	2.463
57	17	15.54	1.463
58	14	15.63	-1.626
59	17	15.71	1.286
60	17	15.83	1.173
61	16	15.8	0.1969
62	18	15.34	2.664
63	18	16	1.996
64	16	15.54	0.4628
65	15	15.8	-0.8031
66	13	15.54	-2.537
67	16	15.25	0.7524
68	12	15.42	-3.425
69	16	15.51	0.4864
70	16	15.54	0.4628
71	16	15.83	0.1733
72	14	15.54	-1.537
73	15	15.22	-0.224
74	14	15.34	-1.336
75	15	15.8	-0.8031
76	15	15.25	-0.2476
77	16	15.34	0.6637
78	11	15.25	-4.248
79	18	15.8	2.197
80	11	16.09	-5.093
81	18	15.6	2.398
82	15	15.34	-0.3363
83	19	16	2.996
84	17	16	0.996
85	14	16.09	-2.093
86	13	15.25	-2.248
87	17	15.71	1.286
88	14	15.71	-1.714
89	19	15.05	3.953
90	14	15.8	-1.803
91	16	15.51	0.4864
92	16	15.22	0.776
93	15	15.34	-0.3363
94	12	14.96	-2.958
95	17	14.96	2.042
96	18	16.09	1.907
97	15	14.96	0.0419
98	15	15.71	-0.7145
99	16	15.34	0.6637
100	16	15.8	0.1969









Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis
breakpoint index	greater	2-sided	less
6 0.556 0.888 0.444
7 0.4712 0.9425 0.5288
8 0.3255 0.651 0.6745
9 0.3068 0.6135 0.6932
10 0.2059 0.4117 0.7941
11 0.1345 0.269 0.8655
12 0.08252 0.165 0.9175
13 0.05502 0.11 0.945
14 0.03132 0.06263 0.9687
15 0.01814 0.03628 0.9819
16 0.01083 0.02167 0.9892
17 0.005858 0.01172 0.9941
18 0.01174 0.02348 0.9883
19 0.01037 0.02073 0.9896
20 0.006696 0.01339 0.9933
21 0.03263 0.06527 0.9674
22 0.03029 0.06058 0.9697
23 0.02481 0.04962 0.9752
24 0.02697 0.05394 0.973
25 0.04507 0.09014 0.9549
26 0.03844 0.07689 0.9616
27 0.05503 0.1101 0.945
28 0.04494 0.08988 0.9551
29 0.0321 0.06421 0.9679
30 0.02342 0.04684 0.9766
31 0.02934 0.05868 0.9707
32 0.02782 0.05564 0.9722
33 0.1629 0.3258 0.8371
34 0.1524 0.3048 0.8476
35 0.1713 0.3426 0.8287
36 0.1587 0.3174 0.8413
37 0.1292 0.2584 0.8708
38 0.1164 0.2328 0.8836
39 0.09052 0.181 0.9095
40 0.07384 0.1477 0.9262
41 0.05897 0.1179 0.941
42 0.04349 0.08698 0.9565
43 0.07611 0.1522 0.9239
44 0.07829 0.1566 0.9217
45 0.06567 0.1313 0.9343
46 0.05435 0.1087 0.9456
47 0.04017 0.08034 0.9598
48 0.03185 0.0637 0.9682
49 0.02841 0.05682 0.9716
50 0.02042 0.04085 0.9796
51 0.01679 0.03359 0.9832
52 0.1325 0.2649 0.8675
53 0.1211 0.2421 0.8789
54 0.2989 0.5978 0.7011
55 0.2609 0.5218 0.7391
56 0.2974 0.5948 0.7026
57 0.2817 0.5634 0.7183
58 0.2658 0.5316 0.7342
59 0.2404 0.4809 0.7596
60 0.2217 0.4435 0.7783
61 0.1801 0.3603 0.8199
62 0.225 0.45 0.775
63 0.2403 0.4806 0.7597
64 0.2077 0.4154 0.7923
65 0.1737 0.3474 0.8263
66 0.188 0.376 0.812
67 0.1605 0.3209 0.8395
68 0.2603 0.5205 0.7397
69 0.2135 0.427 0.7865
70 0.184 0.3679 0.816
71 0.1654 0.3307 0.8346
72 0.1384 0.2767 0.8616
73 0.1189 0.2379 0.8811
74 0.0995 0.199 0.9005
75 0.07961 0.1592 0.9204
76 0.05879 0.1176 0.9412
77 0.04415 0.0883 0.9559
78 0.1067 0.2134 0.8933
79 0.1088 0.2177 0.8912
80 0.4179 0.8357 0.5821
81 0.4002 0.8003 0.5998
82 0.3293 0.6586 0.6707
83 0.4748 0.9496 0.5252
84 0.4702 0.9403 0.5298
85 0.446 0.8921 0.554
86 0.4465 0.893 0.5535
87 0.4027 0.8054 0.5973
88 0.3731 0.7463 0.6269
89 0.6709 0.6582 0.3291
90 0.733 0.534 0.267
91 0.6211 0.7578 0.3789
92 0.5058 0.9883 0.4942
93 0.3703 0.7405 0.6297
94 0.6687 0.6627 0.3313
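The three p-value columns per breakpoint are redundant by construction: "less" is the complement of "greater", and the 2-sided value doubles the smaller tail. A consistency check on a few rows copied from the table:

```python
# (breakpoint, p_greater, p_2sided, p_less) rows copied from the table above
rows = [
    (6, 0.556, 0.888, 0.444),
    (17, 0.005858, 0.01172, 0.9941),
    (89, 0.6709, 0.6582, 0.3291),
]

for bp, p_gt, p_2s, p_lt in rows:
    assert abs(p_gt + p_lt - 1) < 1e-3             # tails are complements
    assert abs(p_2s - 2 * min(p_gt, p_lt)) < 1e-3  # 2-sided doubles the smaller tail
print("consistent")
```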


[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]6[/C][C] 0.556[/C][C] 0.888[/C][C] 0.444[/C][/ROW]
[ROW][C]7[/C][C] 0.4712[/C][C] 0.9425[/C][C] 0.5288[/C][/ROW]
[ROW][C]8[/C][C] 0.3255[/C][C] 0.651[/C][C] 0.6745[/C][/ROW]
[ROW][C]9[/C][C] 0.3068[/C][C] 0.6135[/C][C] 0.6932[/C][/ROW]
[ROW][C]10[/C][C] 0.2059[/C][C] 0.4117[/C][C] 0.7941[/C][/ROW]
[ROW][C]11[/C][C] 0.1345[/C][C] 0.269[/C][C] 0.8655[/C][/ROW]
[ROW][C]12[/C][C] 0.08252[/C][C] 0.165[/C][C] 0.9175[/C][/ROW]
[ROW][C]13[/C][C] 0.05502[/C][C] 0.11[/C][C] 0.945[/C][/ROW]
[ROW][C]14[/C][C] 0.03132[/C][C] 0.06263[/C][C] 0.9687[/C][/ROW]
[ROW][C]15[/C][C] 0.01814[/C][C] 0.03628[/C][C] 0.9819[/C][/ROW]
[ROW][C]16[/C][C] 0.01083[/C][C] 0.02167[/C][C] 0.9892[/C][/ROW]
[ROW][C]17[/C][C] 0.005858[/C][C] 0.01172[/C][C] 0.9941[/C][/ROW]
[ROW][C]18[/C][C] 0.01174[/C][C] 0.02348[/C][C] 0.9883[/C][/ROW]
[ROW][C]19[/C][C] 0.01037[/C][C] 0.02073[/C][C] 0.9896[/C][/ROW]
[ROW][C]20[/C][C] 0.006696[/C][C] 0.01339[/C][C] 0.9933[/C][/ROW]
[ROW][C]21[/C][C] 0.03263[/C][C] 0.06527[/C][C] 0.9674[/C][/ROW]
[ROW][C]22[/C][C] 0.03029[/C][C] 0.06058[/C][C] 0.9697[/C][/ROW]
[ROW][C]23[/C][C] 0.02481[/C][C] 0.04962[/C][C] 0.9752[/C][/ROW]
[ROW][C]24[/C][C] 0.02697[/C][C] 0.05394[/C][C] 0.973[/C][/ROW]
[ROW][C]25[/C][C] 0.04507[/C][C] 0.09014[/C][C] 0.9549[/C][/ROW]
[ROW][C]26[/C][C] 0.03844[/C][C] 0.07689[/C][C] 0.9616[/C][/ROW]
[ROW][C]27[/C][C] 0.05503[/C][C] 0.1101[/C][C] 0.945[/C][/ROW]
[ROW][C]28[/C][C] 0.04494[/C][C] 0.08988[/C][C] 0.9551[/C][/ROW]
[ROW][C]29[/C][C] 0.0321[/C][C] 0.06421[/C][C] 0.9679[/C][/ROW]
[ROW][C]30[/C][C] 0.02342[/C][C] 0.04684[/C][C] 0.9766[/C][/ROW]
[ROW][C]31[/C][C] 0.02934[/C][C] 0.05868[/C][C] 0.9707[/C][/ROW]
[ROW][C]32[/C][C] 0.02782[/C][C] 0.05564[/C][C] 0.9722[/C][/ROW]
[ROW][C]33[/C][C] 0.1629[/C][C] 0.3258[/C][C] 0.8371[/C][/ROW]
[ROW][C]34[/C][C] 0.1524[/C][C] 0.3048[/C][C] 0.8476[/C][/ROW]
[ROW][C]35[/C][C] 0.1713[/C][C] 0.3426[/C][C] 0.8287[/C][/ROW]
[ROW][C]36[/C][C] 0.1587[/C][C] 0.3174[/C][C] 0.8413[/C][/ROW]
[ROW][C]37[/C][C] 0.1292[/C][C] 0.2584[/C][C] 0.8708[/C][/ROW]
[ROW][C]38[/C][C] 0.1164[/C][C] 0.2328[/C][C] 0.8836[/C][/ROW]
[ROW][C]39[/C][C] 0.09052[/C][C] 0.181[/C][C] 0.9095[/C][/ROW]
[ROW][C]40[/C][C] 0.07384[/C][C] 0.1477[/C][C] 0.9262[/C][/ROW]
[ROW][C]41[/C][C] 0.05897[/C][C] 0.1179[/C][C] 0.941[/C][/ROW]
[ROW][C]42[/C][C] 0.04349[/C][C] 0.08698[/C][C] 0.9565[/C][/ROW]
[ROW][C]43[/C][C] 0.07611[/C][C] 0.1522[/C][C] 0.9239[/C][/ROW]
[ROW][C]44[/C][C] 0.07829[/C][C] 0.1566[/C][C] 0.9217[/C][/ROW]
[ROW][C]45[/C][C] 0.06567[/C][C] 0.1313[/C][C] 0.9343[/C][/ROW]
[ROW][C]46[/C][C] 0.05435[/C][C] 0.1087[/C][C] 0.9456[/C][/ROW]
[ROW][C]47[/C][C] 0.04017[/C][C] 0.08034[/C][C] 0.9598[/C][/ROW]
[ROW][C]48[/C][C] 0.03185[/C][C] 0.0637[/C][C] 0.9682[/C][/ROW]
[ROW][C]49[/C][C] 0.02841[/C][C] 0.05682[/C][C] 0.9716[/C][/ROW]
[ROW][C]50[/C][C] 0.02042[/C][C] 0.04085[/C][C] 0.9796[/C][/ROW]
[ROW][C]51[/C][C] 0.01679[/C][C] 0.03359[/C][C] 0.9832[/C][/ROW]
[ROW][C]52[/C][C] 0.1325[/C][C] 0.2649[/C][C] 0.8675[/C][/ROW]
[ROW][C]53[/C][C] 0.1211[/C][C] 0.2421[/C][C] 0.8789[/C][/ROW]
[ROW][C]54[/C][C] 0.2989[/C][C] 0.5978[/C][C] 0.7011[/C][/ROW]
[ROW][C]55[/C][C] 0.2609[/C][C] 0.5218[/C][C] 0.7391[/C][/ROW]
[ROW][C]56[/C][C] 0.2974[/C][C] 0.5948[/C][C] 0.7026[/C][/ROW]
[ROW][C]57[/C][C] 0.2817[/C][C] 0.5634[/C][C] 0.7183[/C][/ROW]
[ROW][C]58[/C][C] 0.2658[/C][C] 0.5316[/C][C] 0.7342[/C][/ROW]
[ROW][C]59[/C][C] 0.2404[/C][C] 0.4809[/C][C] 0.7596[/C][/ROW]
[ROW][C]60[/C][C] 0.2217[/C][C] 0.4435[/C][C] 0.7783[/C][/ROW]
[ROW][C]61[/C][C] 0.1801[/C][C] 0.3603[/C][C] 0.8199[/C][/ROW]
[ROW][C]62[/C][C] 0.225[/C][C] 0.45[/C][C] 0.775[/C][/ROW]
[ROW][C]63[/C][C] 0.2403[/C][C] 0.4806[/C][C] 0.7597[/C][/ROW]
[ROW][C]64[/C][C] 0.2077[/C][C] 0.4154[/C][C] 0.7923[/C][/ROW]
[ROW][C]65[/C][C] 0.1737[/C][C] 0.3474[/C][C] 0.8263[/C][/ROW]
[ROW][C]66[/C][C] 0.188[/C][C] 0.376[/C][C] 0.812[/C][/ROW]
[ROW][C]67[/C][C] 0.1605[/C][C] 0.3209[/C][C] 0.8395[/C][/ROW]
[ROW][C]68[/C][C] 0.2603[/C][C] 0.5205[/C][C] 0.7397[/C][/ROW]
[ROW][C]69[/C][C] 0.2135[/C][C] 0.427[/C][C] 0.7865[/C][/ROW]
[ROW][C]70[/C][C] 0.184[/C][C] 0.3679[/C][C] 0.816[/C][/ROW]
[ROW][C]71[/C][C] 0.1654[/C][C] 0.3307[/C][C] 0.8346[/C][/ROW]
[ROW][C]72[/C][C] 0.1384[/C][C] 0.2767[/C][C] 0.8616[/C][/ROW]
[ROW][C]73[/C][C] 0.1189[/C][C] 0.2379[/C][C] 0.8811[/C][/ROW]
[ROW][C]74[/C][C] 0.0995[/C][C] 0.199[/C][C] 0.9005[/C][/ROW]
[ROW][C]75[/C][C] 0.07961[/C][C] 0.1592[/C][C] 0.9204[/C][/ROW]
[ROW][C]76[/C][C] 0.05879[/C][C] 0.1176[/C][C] 0.9412[/C][/ROW]
[ROW][C]77[/C][C] 0.04415[/C][C] 0.0883[/C][C] 0.9559[/C][/ROW]
[ROW][C]78[/C][C] 0.1067[/C][C] 0.2134[/C][C] 0.8933[/C][/ROW]
[ROW][C]79[/C][C] 0.1088[/C][C] 0.2177[/C][C] 0.8912[/C][/ROW]
[ROW][C]80[/C][C] 0.4179[/C][C] 0.8357[/C][C] 0.5821[/C][/ROW]
[ROW][C]81[/C][C] 0.4002[/C][C] 0.8003[/C][C] 0.5998[/C][/ROW]
[ROW][C]82[/C][C] 0.3293[/C][C] 0.6586[/C][C] 0.6707[/C][/ROW]
[ROW][C]83[/C][C] 0.4748[/C][C] 0.9496[/C][C] 0.5252[/C][/ROW]
[ROW][C]84[/C][C] 0.4702[/C][C] 0.9403[/C][C] 0.5298[/C][/ROW]
[ROW][C]85[/C][C] 0.446[/C][C] 0.8921[/C][C] 0.554[/C][/ROW]
[ROW][C]86[/C][C] 0.4465[/C][C] 0.893[/C][C] 0.5535[/C][/ROW]
[ROW][C]87[/C][C] 0.4027[/C][C] 0.8054[/C][C] 0.5973[/C][/ROW]
[ROW][C]88[/C][C] 0.3731[/C][C] 0.7463[/C][C] 0.6269[/C][/ROW]
[ROW][C]89[/C][C] 0.6709[/C][C] 0.6582[/C][C] 0.3291[/C][/ROW]
[ROW][C]90[/C][C] 0.733[/C][C] 0.534[/C][C] 0.267[/C][/ROW]
[ROW][C]91[/C][C] 0.6211[/C][C] 0.7578[/C][C] 0.3789[/C][/ROW]
[ROW][C]92[/C][C] 0.5058[/C][C] 0.9883[/C][C] 0.4942[/C][/ROW]
[ROW][C]93[/C][C] 0.3703[/C][C] 0.7405[/C][C] 0.6297[/C][/ROW]
[ROW][C]94[/C][C] 0.6687[/C][C] 0.6627[/C][C] 0.3313[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=298119&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298119&T=5









Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description                # significant tests    % significant tests    OK/NOK
1% type I error level      0                      0                      OK
5% type I error level      10                     0.11236                NOK
10% type I error level     25                     0.280899               NOK
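The breakpoint sweep and meta analysis above can be reproduced outside the module with `lmtest::gqtest`, which the module itself calls. A minimal sketch on simulated data (not this computation's series):

```r
# Goldfeld-Quandt breakpoint sweep, as in the module's R code.
# The data frame below is simulated for illustration only.
library(lmtest)

set.seed(42)
df <- data.frame(y = rnorm(60), x1 = rnorm(60), x2 = rnorm(60))
fit <- lm(y ~ x1 + x2, data = df)

k <- length(coef(fit))              # number of estimated parameters
n <- nrow(df)
breakpoints <- (k + 3):(n - k - 3)  # same breakpoint range the module scans
pvals <- sapply(breakpoints, function(bp)
  gqtest(fit, point = bp, alternative = "two.sided")$p.value)

# Meta analysis: share of breakpoints significant at each type I error level
sapply(c(0.01, 0.05, 0.10), function(a) mean(pvals < a))
```

Note that the "% significant tests" column reports this share as a proportion (e.g. 0.11236, i.e. about 11%), not a percentage.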

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & \# significant tests & \% significant tests & OK/NOK \tabularnewline
1% type I error level & 0 &  0 & OK \tabularnewline
5% type I error level & 10 & 0.11236 & NOK \tabularnewline
10% type I error level & 25 & 0.280899 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298119&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C] 0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]10[/C][C]0.11236[/C][C]NOK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]25[/C][C]0.280899[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=298119&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298119&T=6









Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 0.31424, df1 = 2, df2 = 95, p-value = 0.7311
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.75525, df1 = 4, df2 = 93, p-value = 0.557
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.1573, df1 = 2, df2 = 95, p-value = 0.3187
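The three RESET variants reported above come from `lmtest::resettest`, as in the module's R code. A minimal reproduction on simulated data (not this computation's series):

```r
# Ramsey RESET F-tests for powers 2 and 3, three variants.
# Simulated data for illustration only.
library(lmtest)

set.seed(42)
df <- data.frame(y = rnorm(60), x1 = rnorm(60), x2 = rnorm(60))
fit <- lm(y ~ x1 + x2, data = df)

resettest(fit, power = 2:3, type = "fitted")     # powers of the fitted values
resettest(fit, power = 2:3, type = "regressor")  # powers of the regressors
resettest(fit, power = 2:3, type = "princomp")   # powers of the first principal component
```

A large p-value (as in all three tests above) gives no evidence that the linear specification omits relevant nonlinear terms.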

\begin{tabular}{lllllllll}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset_test_fitted
	RESET test
data:  mylm
RESET = 0.31424, df1 = 2, df2 = 95, p-value = 0.7311
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.75525, df1 = 4, df2 = 93, p-value = 0.557
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.1573, df1 = 2, df2 = 95, p-value = 0.3187
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=298119&T=7

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 0.31424, df1 = 2, df2 = 95, p-value = 0.7311
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.75525, df1 = 4, df2 = 93, p-value = 0.557
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.1573, df1 = 2, df2 = 95, p-value = 0.3187
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=298119&T=7

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298119&T=7









Variance Inflation Factors (Multicollinearity)
> vif
     EP3      EP4 
1.033159 1.033159 
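The VIF table above comes from `car::vif` applied to the fitted model, as in the module's R code. A minimal sketch on simulated data (not this computation's series):

```r
# Variance inflation factors for a fitted linear model.
# Simulated data for illustration only.
library(car)

set.seed(42)
df <- data.frame(y = rnorm(60), x1 = rnorm(60), x2 = rnorm(60))
fit <- lm(y ~ x1 + x2, data = df)
vif(fit)  # with exactly two regressors both VIFs coincide, as in the table
```

Since VIF_j = 1/(1 - R_j^2), values close to 1 indicate little multicollinearity; the reported value of 1.033159 for both EP3 and EP4 means the two regressors are nearly uncorrelated.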

\begin{tabular}{lllllllll}
\hline
Variance Inflation Factors (Multicollinearity) \tabularnewline
> vif
     EP3      EP4 
1.033159 1.033159 
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=298119&T=8

[TABLE]
[ROW][C]Variance Inflation Factors (Multicollinearity)[/C][/ROW]
[ROW][C]
> vif
     EP3      EP4 
1.033159 1.033159 
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=298119&T=8

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298119&T=8





Parameters (Session):
par1 = 2 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 3 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y))   # transpose the uploaded series and drop rows with missing values
k <- length(x[1,])   # number of variables (columns)
n <- length(x[,1])   # number of observations (rows)
x1 <- cbind(x[,par1], x[,1:k!=par1])   # move the endogenous series to column 1
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) { # run the Goldfeld-Quandt sweep only if there are enough observations
kp3 <- k + 3       # first breakpoint to test
nmkm3 <- n - k - 3 # last breakpoint to test
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3)) # p-values: greater / two.sided / less
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')