Free Statistics

of Irreproducible Research!

Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 14 Dec 2016 15:50:07 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/14/t14817270939qqa2h8f8iz9k44.htm/, Retrieved Fri, 03 May 2024 22:50:34 +0000
Alternative citation: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=299522, Retrieved Fri, 03 May 2024 22:50:34 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 83
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [] [2016-12-14 14:50:07] [c6ea875f0603e0876d03f43aca979571] [Current]
Dataseries X (five columns: TVDCsum, GW1, GW2, GW3, GW4):
13	2	3	3	3
16	1	2	2	4
17	2	3	3	4
16	2	3	3	4
17	2	3	3	3
17	2	3	3	4
15	3	3	3	3
16	2	4	4	5
14	2	4	3	4
16	2	3	3	4
17	3	3	2	3
16	2	3	3	3
16	2	4	3	3
16	2	3	3	4
15	2	3	3	4
16	3	5	4	2
13	3	3	3	3
15	2	2	2	3
17	2	4	3	4
13	2	4	3	4
17	2	3	3	4
14	2	4	3	3
14	2	2	4	4
18	3	3	3	3
17	3	3	3	3
13	2	3	3	3
16	3	3	3	4
15	2	4	3	4
15	3	4	3	3
13	2	3	2	3
17	3	4	3	3
11	1	1	1	2
13	2	1	3	4
17	2	5	3	5
16	3	4	3	3
17	3	3	3	3
16	2	5	2	4
16	3	4	3	3
16	2	3	3	3
17	2	3	3	4
14	2	4	3	3
14	2	3	3	5
16	2	5	3	4
15	2	3	3	4
16	2	3	3	4
14	2	2	2	4
15	2	3	3	4
17	3	3	3	3
17	2	4	3	4
20	2	5	3	4
17	3	1	3	3
18	3	3	3	3
17	2	4	3	4
14	3	2	3	3
17	3	3	3	3
17	3	3	3	3
16	3	3	3	3
18	2	2	2	3
16	3	3	3	3
13	2	2	3	4
16	3	3	3	3
12	1	3	4	4
16	2	2	2	3
16	2	4	3	4
16	3	1	3	3
14	2	5	3	4
15	2	2	3	3
14	3	3	3	3
15	2	3	3	3
15	3	4	3	4
16	4	3	3	3
11	2	2	3	3
18	3	3	3	3
11	2	3	3	4
18	2	4	4	5
19	1	5	2	4
17	3	3	3	3
14	3	3	3	3
17	2	4	3	3
14	2	3	3	3
19	2	5	3	3
16	2	4	3	4
16	3	2	3	3
15	2	3	3	2
12	2	3	2	2
17	3	3	3	3
18	2	4	3	4
15	2	3	3	2
18	2	4	3	4
16	2	4	3	3
16	3	2	3	4




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 9 seconds
R Server: Big Analytics Cloud Computing Center

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code) \tabularnewline
Raw Output & view raw output of R engine \tabularnewline
Computing time & 9 seconds \tabularnewline
R Server & Big Analytics Cloud Computing Center \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=299522&T=0









Multiple Linear Regression - Estimated Regression Equation
TVDCsum[t] = 10.4407 + 1.09485 GW1[t] + 0.681168 GW2[t] - 0.450823 GW3[t] + 0.524614 GW4[t] + e[t]
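The fitted (interpolated) value for any observation follows directly from the estimated equation. A minimal sketch in Python, reusing the reported coefficients; observation 1 of the data series has GW1=2, GW2=3, GW3=3, GW4=3 and TVDCsum=13:

```python
# OLS coefficients as reported in the estimated regression equation
intercept = 10.4407
betas = {"GW1": 1.09485, "GW2": 0.681168, "GW3": -0.450823, "GW4": 0.524614}

# Predictor values for the first observation of the data series
x = {"GW1": 2, "GW2": 3, "GW3": 3, "GW4": 3}

yhat = intercept + sum(betas[k] * x[k] for k in betas)
residual = 13 - yhat  # actual TVDCsum for observation 1 is 13

# yhat is about 14.895 and the residual about -1.895, matching row 1 of
# the "Actuals, Interpolation, and Residuals" table.
```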

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
TVDCsum[t] = 10.4407 + 1.09485 GW1[t] + 0.681168 GW2[t] - 0.450823 GW3[t] + 0.524614 GW4[t] + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=299522&T=1









Multiple Linear Regression - Ordinary Least Squares
Variable     Parameter  S.D.    T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)  +10.44     1.551   +6.730                      1.795e-09       8.973e-10
GW1          +1.095     0.3463  +3.162                      0.002166        0.001083
GW2          +0.6812    0.1966  +3.464                      0.0008322       0.0004161
GW3          -0.4508    0.4318  -1.044                      0.2994          0.1497
GW4          +0.5246    0.301   +1.743                      0.08491         0.04245
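Each T-STAT in the table is the parameter estimate divided by its standard deviation; a quick sanity check of the reported rows (all values transcribed from the table above, which rounds to four significant digits):

```python
# (name, estimate, standard deviation, reported t-statistic)
rows = [
    ("(Intercept)", 10.44, 1.551, 6.730),
    ("GW1", 1.095, 0.3463, 3.162),
    ("GW2", 0.6812, 0.1966, 3.464),
    ("GW3", -0.4508, 0.4318, -1.044),
    ("GW4", 0.5246, 0.301, 1.743),
]
for name, est, sd, t in rows:
    # small tolerance absorbs the rounding of the printed values
    assert abs(est / sd - t) < 0.005, name
```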

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & +10.44 &  1.551 & +6.7300e+00 &  1.795e-09 &  8.973e-10 \tabularnewline
GW1 & +1.095 &  0.3463 & +3.1620e+00 &  0.002166 &  0.001083 \tabularnewline
GW2 & +0.6812 &  0.1966 & +3.4640e+00 &  0.0008322 &  0.0004161 \tabularnewline
GW3 & -0.4508 &  0.4318 & -1.0440e+00 &  0.2994 &  0.1497 \tabularnewline
GW4 & +0.5246 &  0.301 & +1.7430e+00 &  0.08491 &  0.04245 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=299522&T=2









Multiple Linear Regression - Regression Statistics
Multiple R: 0.458
R-squared: 0.2098
Adjusted R-squared: 0.173
F-TEST (value): 5.708
F-TEST (DF numerator): 4
F-TEST (DF denominator): 86
p-value: 0.0004016
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1.647
Sum Squared Residuals: 233.3
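These statistics are mutually consistent, which makes a useful sanity check: the residual standard deviation is sqrt(SSR/(n-k)), the adjusted R-squared shrinks R-squared by (n-1)/(n-k), and the F statistic follows from R-squared and the two degrees of freedom. A short check using only the reported numbers (n = 91 observations, k = 5 parameters including the intercept):

```python
import math

r2 = 0.2098     # R-squared
ssr = 233.3     # Sum Squared Residuals
n, k = 91, 5    # observations; parameters (intercept + GW1..GW4)
df_num, df_den = k - 1, n - k   # 4 and 86, as reported for the F-TEST

resid_sd = math.sqrt(ssr / df_den)              # ~1.647
adj_r2 = 1 - (1 - r2) * (n - 1) / df_den        # ~0.173
f_value = (r2 / df_num) / ((1 - r2) / df_den)   # ~5.708
```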

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R &  0.458 \tabularnewline
R-squared &  0.2098 \tabularnewline
Adjusted R-squared &  0.173 \tabularnewline
F-TEST (value) &  5.708 \tabularnewline
F-TEST (DF numerator) & 4 \tabularnewline
F-TEST (DF denominator) & 86 \tabularnewline
p-value &  0.0004016 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation &  1.647 \tabularnewline
Sum Squared Residuals &  233.3 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=299522&T=3









Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index  Actuals  Interpolation (Forecast)  Residuals (Prediction Error)
1  13  14.9  -1.895
2  16  14.09  1.905
3  17  15.42  1.58
4  16  15.42  0.5801
5  17  14.9  2.105
6  17  15.42  1.58
7  15  15.99  -0.9901
8  16  16.17  -0.1748
9  14  16.1  -2.101
10  16  15.42  0.5801
11  17  16.44  0.5591
12  16  14.9  1.105
13  16  15.58  0.4236
14  16  15.42  0.5801
15  15  15.42  -0.4199
16  16  16.38  -0.377
17  13  15.99  -2.99
18  15  14.66  0.3351
19  17  16.1  0.899
20  13  16.1  -3.101
21  17  15.42  1.58
22  14  15.58  -1.576
23  14  14.29  -0.2879
24  18  15.99  2.01
25  17  15.99  1.01
26  13  14.9  -1.895
27  16  16.51  -0.5147
28  15  16.1  -1.101
29  15  16.67  -1.671
30  13  15.35  -2.346
31  17  16.67  0.3287
32  11  12.82  -1.815
33  13  14.06  -1.058
34  17  17.31  -0.3068
35  16  16.67  -0.6713
36  17  15.99  1.01
37  16  17.23  -1.233
38  16  16.67  -0.6713
39  16  14.9  1.105
40  17  15.42  1.58
41  14  15.58  -1.576
42  14  15.94  -1.944
43  16  16.78  -0.7822
44  15  15.42  -0.4199
45  16  15.42  0.5801
46  14  15.19  -1.19
47  15  15.42  -0.4199
48  17  15.99  1.01
49  17  16.1  0.899
50  20  16.78  3.218
51  17  14.63  2.372
52  18  15.99  2.01
53  17  16.1  0.899
54  14  15.31  -1.309
55  17  15.99  1.01
56  17  15.99  1.01
57  16  15.99  0.009901
58  18  14.66  3.335
59  16  15.99  0.009901
60  13  14.74  -1.739
61  16  15.99  0.009901
62  12  13.87  -1.874
63  16  14.66  1.335
64  16  16.1  -0.101
65  16  14.63  1.372
66  14  16.78  -2.782
67  15  14.21  0.7859
68  14  15.99  -1.99
69  15  14.9  0.1048
70  15  17.2  -2.196
71  16  17.09  -1.085
72  11  14.21  -3.214
73  18  15.99  2.01
74  11  15.42  -4.42
75  18  16.17  1.825
76  19  16.14  2.862
77  17  15.99  1.01
78  14  15.99  -1.99
79  17  15.58  1.424
80  14  14.9  -0.8952
81  19  16.26  2.742
82  16  16.1  -0.101
83  16  15.31  0.6911
84  15  14.37  0.6294
85  12  14.82  -2.821
86  17  15.99  1.01
87  18  16.1  1.899
88  15  14.37  0.6294
89  18  16.1  1.899
90  16  15.58  0.4236
91  16  15.83  0.1665
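Each residual in the table is the actual value minus the interpolated (fitted) value; a spot-check of a few rows, with values transcribed from the table (fitted values are rounded to four significant digits, hence the tolerance):

```python
# (index, actual, interpolation, residual) for a few rows of the table
spot = [
    (1, 13, 14.9, -1.895),
    (50, 20, 16.78, 3.218),
    (74, 11, 15.42, -4.42),
]
for idx, actual, fitted, resid in spot:
    # rounding of the printed fitted value accounts for the tolerance
    assert abs((actual - fitted) - resid) < 0.01, idx
```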

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 &  13 &  14.9 & -1.895 \tabularnewline
2 &  16 &  14.09 &  1.905 \tabularnewline
3 &  17 &  15.42 &  1.58 \tabularnewline
4 &  16 &  15.42 &  0.5801 \tabularnewline
5 &  17 &  14.9 &  2.105 \tabularnewline
6 &  17 &  15.42 &  1.58 \tabularnewline
7 &  15 &  15.99 & -0.9901 \tabularnewline
8 &  16 &  16.17 & -0.1748 \tabularnewline
9 &  14 &  16.1 & -2.101 \tabularnewline
10 &  16 &  15.42 &  0.5801 \tabularnewline
11 &  17 &  16.44 &  0.5591 \tabularnewline
12 &  16 &  14.9 &  1.105 \tabularnewline
13 &  16 &  15.58 &  0.4236 \tabularnewline
14 &  16 &  15.42 &  0.5801 \tabularnewline
15 &  15 &  15.42 & -0.4199 \tabularnewline
16 &  16 &  16.38 & -0.377 \tabularnewline
17 &  13 &  15.99 & -2.99 \tabularnewline
18 &  15 &  14.66 &  0.3351 \tabularnewline
19 &  17 &  16.1 &  0.899 \tabularnewline
20 &  13 &  16.1 & -3.101 \tabularnewline
21 &  17 &  15.42 &  1.58 \tabularnewline
22 &  14 &  15.58 & -1.576 \tabularnewline
23 &  14 &  14.29 & -0.2879 \tabularnewline
24 &  18 &  15.99 &  2.01 \tabularnewline
25 &  17 &  15.99 &  1.01 \tabularnewline
26 &  13 &  14.9 & -1.895 \tabularnewline
27 &  16 &  16.51 & -0.5147 \tabularnewline
28 &  15 &  16.1 & -1.101 \tabularnewline
29 &  15 &  16.67 & -1.671 \tabularnewline
30 &  13 &  15.35 & -2.346 \tabularnewline
31 &  17 &  16.67 &  0.3287 \tabularnewline
32 &  11 &  12.82 & -1.815 \tabularnewline
33 &  13 &  14.06 & -1.058 \tabularnewline
34 &  17 &  17.31 & -0.3068 \tabularnewline
35 &  16 &  16.67 & -0.6713 \tabularnewline
36 &  17 &  15.99 &  1.01 \tabularnewline
37 &  16 &  17.23 & -1.233 \tabularnewline
38 &  16 &  16.67 & -0.6713 \tabularnewline
39 &  16 &  14.9 &  1.105 \tabularnewline
40 &  17 &  15.42 &  1.58 \tabularnewline
41 &  14 &  15.58 & -1.576 \tabularnewline
42 &  14 &  15.94 & -1.944 \tabularnewline
43 &  16 &  16.78 & -0.7822 \tabularnewline
44 &  15 &  15.42 & -0.4199 \tabularnewline
45 &  16 &  15.42 &  0.5801 \tabularnewline
46 &  14 &  15.19 & -1.19 \tabularnewline
47 &  15 &  15.42 & -0.4199 \tabularnewline
48 &  17 &  15.99 &  1.01 \tabularnewline
49 &  17 &  16.1 &  0.899 \tabularnewline
50 &  20 &  16.78 &  3.218 \tabularnewline
51 &  17 &  14.63 &  2.372 \tabularnewline
52 &  18 &  15.99 &  2.01 \tabularnewline
53 &  17 &  16.1 &  0.899 \tabularnewline
54 &  14 &  15.31 & -1.309 \tabularnewline
55 &  17 &  15.99 &  1.01 \tabularnewline
56 &  17 &  15.99 &  1.01 \tabularnewline
57 &  16 &  15.99 &  0.009901 \tabularnewline
58 &  18 &  14.66 &  3.335 \tabularnewline
59 &  16 &  15.99 &  0.009901 \tabularnewline
60 &  13 &  14.74 & -1.739 \tabularnewline
61 &  16 &  15.99 &  0.009901 \tabularnewline
62 &  12 &  13.87 & -1.874 \tabularnewline
63 &  16 &  14.66 &  1.335 \tabularnewline
64 &  16 &  16.1 & -0.101 \tabularnewline
65 &  16 &  14.63 &  1.372 \tabularnewline
66 &  14 &  16.78 & -2.782 \tabularnewline
67 &  15 &  14.21 &  0.7859 \tabularnewline
68 &  14 &  15.99 & -1.99 \tabularnewline
69 &  15 &  14.9 &  0.1048 \tabularnewline
70 &  15 &  17.2 & -2.196 \tabularnewline
71 &  16 &  17.09 & -1.085 \tabularnewline
72 &  11 &  14.21 & -3.214 \tabularnewline
73 &  18 &  15.99 &  2.01 \tabularnewline
74 &  11 &  15.42 & -4.42 \tabularnewline
75 &  18 &  16.17 &  1.825 \tabularnewline
76 &  19 &  16.14 &  2.862 \tabularnewline
77 &  17 &  15.99 &  1.01 \tabularnewline
78 &  14 &  15.99 & -1.99 \tabularnewline
79 &  17 &  15.58 &  1.424 \tabularnewline
80 &  14 &  14.9 & -0.8952 \tabularnewline
81 &  19 &  16.26 &  2.742 \tabularnewline
82 &  16 &  16.1 & -0.101 \tabularnewline
83 &  16 &  15.31 &  0.6911 \tabularnewline
84 &  15 &  14.37 &  0.6294 \tabularnewline
85 &  12 &  14.82 & -2.821 \tabularnewline
86 &  17 &  15.99 &  1.01 \tabularnewline
87 &  18 &  16.1 &  1.899 \tabularnewline
88 &  15 &  14.37 &  0.6294 \tabularnewline
89 &  18 &  16.1 &  1.899 \tabularnewline
90 &  16 &  15.58 &  0.4236 \tabularnewline
91 &  16 &  15.83 &  0.1665 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=299522&T=4









Goldfeld-Quandt test for Heteroskedasticity
(p-values under each Alternative Hypothesis)
breakpoint index  greater  2-sided  less
8 0.6977 0.6047 0.3023
9 0.5506 0.8988 0.4494
10 0.4043 0.8087 0.5957
11 0.3614 0.7227 0.6386
12 0.3025 0.605 0.6975
13 0.2977 0.5954 0.7023
14 0.2099 0.4199 0.7901
15 0.1681 0.3362 0.8319
16 0.1364 0.2728 0.8636
17 0.2671 0.5342 0.7329
18 0.205 0.4101 0.795
19 0.1587 0.3173 0.8413
20 0.3394 0.6788 0.6606
21 0.3208 0.6416 0.6792
22 0.3054 0.6109 0.6946
23 0.2781 0.5562 0.7219
24 0.3574 0.7148 0.6426
25 0.318 0.6359 0.682
26 0.3596 0.7192 0.6404
27 0.3002 0.6004 0.6998
28 0.2545 0.509 0.7455
29 0.2314 0.4627 0.7686
30 0.2901 0.5803 0.7099
31 0.2566 0.5132 0.7434
32 0.2746 0.5491 0.7254
33 0.278 0.556 0.722
34 0.225 0.4499 0.775
35 0.1833 0.3666 0.8167
36 0.1601 0.3203 0.8399
37 0.1383 0.2767 0.8617
38 0.1103 0.2206 0.8897
39 0.09894 0.1979 0.9011
40 0.09733 0.1947 0.9027
41 0.0908 0.1816 0.9092
42 0.1072 0.2144 0.8928
43 0.08754 0.1751 0.9125
44 0.06595 0.1319 0.9341
45 0.05006 0.1001 0.9499
46 0.04421 0.08841 0.9558
47 0.03207 0.06415 0.9679
48 0.02562 0.05125 0.9744
49 0.02068 0.04137 0.9793
50 0.06033 0.1207 0.9397
51 0.08126 0.1625 0.9187
52 0.0901 0.1802 0.9099
53 0.07322 0.1464 0.9268
54 0.06641 0.1328 0.9336
55 0.05371 0.1074 0.9463
56 0.04305 0.0861 0.957
57 0.03023 0.06047 0.9698
58 0.07587 0.1517 0.9241
59 0.05519 0.1104 0.9448
60 0.05318 0.1064 0.9468
61 0.03753 0.07506 0.9625
62 0.04011 0.08023 0.9599
63 0.04079 0.08158 0.9592
64 0.0282 0.05641 0.9718
65 0.03456 0.06912 0.9654
66 0.1146 0.2292 0.8854
67 0.1203 0.2407 0.8797
68 0.1323 0.2646 0.8677
69 0.09761 0.1952 0.9024
70 0.2116 0.4232 0.7884
71 0.282 0.564 0.718
72 0.2839 0.5679 0.7161
73 0.2704 0.5409 0.7296
74 0.7941 0.4118 0.2059
75 0.7746 0.4507 0.2254
76 0.8943 0.2114 0.1057
77 0.8488 0.3023 0.1512
78 0.9839 0.03213 0.01606
79 0.9664 0.06721 0.03361
80 0.9626 0.07482 0.03741
81 0.9465 0.107 0.05348
82 0.9746 0.05075 0.02538
83 0.9337 0.1325 0.06627
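The Goldfeld-Quandt test splits the sample at each breakpoint, fits the regression on each segment, and compares residual variances through the F-ratio (RSS2/df2)/(RSS1/df1). A minimal sketch of the statistic itself; the residual vectors below are hypothetical, not the ones from this regression:

```python
def goldfeld_quandt_f(resid_before, resid_after, n_params):
    """F-ratio of residual variances for a given breakpoint.

    Each segment loses n_params degrees of freedom (the model above,
    intercept + GW1..GW4, would give n_params = 5).
    """
    rss1 = sum(r * r for r in resid_before)
    rss2 = sum(r * r for r in resid_after)
    df1 = len(resid_before) - n_params
    df2 = len(resid_after) - n_params
    return (rss2 / df2) / (rss1 / df1)

# Hypothetical example: second-segment residuals are twice as large,
# so the variance ratio is 4
f_stat = goldfeld_quandt_f([1.0] * 7, [2.0] * 7, n_params=5)
```

The p-values tabulated above are tail probabilities of this statistic under the F distribution with the corresponding segment degrees of freedom.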

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
8 &  0.6977 &  0.6047 &  0.3023 \tabularnewline
9 &  0.5506 &  0.8988 &  0.4494 \tabularnewline
10 &  0.4043 &  0.8087 &  0.5957 \tabularnewline
11 &  0.3614 &  0.7227 &  0.6386 \tabularnewline
12 &  0.3025 &  0.605 &  0.6975 \tabularnewline
13 &  0.2977 &  0.5954 &  0.7023 \tabularnewline
14 &  0.2099 &  0.4199 &  0.7901 \tabularnewline
15 &  0.1681 &  0.3362 &  0.8319 \tabularnewline
16 &  0.1364 &  0.2728 &  0.8636 \tabularnewline
17 &  0.2671 &  0.5342 &  0.7329 \tabularnewline
18 &  0.205 &  0.4101 &  0.795 \tabularnewline
19 &  0.1587 &  0.3173 &  0.8413 \tabularnewline
20 &  0.3394 &  0.6788 &  0.6606 \tabularnewline
21 &  0.3208 &  0.6416 &  0.6792 \tabularnewline
22 &  0.3054 &  0.6109 &  0.6946 \tabularnewline
23 &  0.2781 &  0.5562 &  0.7219 \tabularnewline
24 &  0.3574 &  0.7148 &  0.6426 \tabularnewline
25 &  0.318 &  0.6359 &  0.682 \tabularnewline
26 &  0.3596 &  0.7192 &  0.6404 \tabularnewline
27 &  0.3002 &  0.6004 &  0.6998 \tabularnewline
28 &  0.2545 &  0.509 &  0.7455 \tabularnewline
29 &  0.2314 &  0.4627 &  0.7686 \tabularnewline
30 &  0.2901 &  0.5803 &  0.7099 \tabularnewline
31 &  0.2566 &  0.5132 &  0.7434 \tabularnewline
32 &  0.2746 &  0.5491 &  0.7254 \tabularnewline
33 &  0.278 &  0.556 &  0.722 \tabularnewline
34 &  0.225 &  0.4499 &  0.775 \tabularnewline
35 &  0.1833 &  0.3666 &  0.8167 \tabularnewline
36 &  0.1601 &  0.3203 &  0.8399 \tabularnewline
37 &  0.1383 &  0.2767 &  0.8617 \tabularnewline
38 &  0.1103 &  0.2206 &  0.8897 \tabularnewline
39 &  0.09894 &  0.1979 &  0.9011 \tabularnewline
40 &  0.09733 &  0.1947 &  0.9027 \tabularnewline
41 &  0.0908 &  0.1816 &  0.9092 \tabularnewline
42 &  0.1072 &  0.2144 &  0.8928 \tabularnewline
43 &  0.08754 &  0.1751 &  0.9125 \tabularnewline
44 &  0.06595 &  0.1319 &  0.9341 \tabularnewline
45 &  0.05006 &  0.1001 &  0.9499 \tabularnewline
46 &  0.04421 &  0.08841 &  0.9558 \tabularnewline
47 &  0.03207 &  0.06415 &  0.9679 \tabularnewline
48 &  0.02562 &  0.05125 &  0.9744 \tabularnewline
49 &  0.02068 &  0.04137 &  0.9793 \tabularnewline
50 &  0.06033 &  0.1207 &  0.9397 \tabularnewline
51 &  0.08126 &  0.1625 &  0.9187 \tabularnewline
52 &  0.0901 &  0.1802 &  0.9099 \tabularnewline
53 &  0.07322 &  0.1464 &  0.9268 \tabularnewline
54 &  0.06641 &  0.1328 &  0.9336 \tabularnewline
55 &  0.05371 &  0.1074 &  0.9463 \tabularnewline
56 &  0.04305 &  0.0861 &  0.957 \tabularnewline
57 &  0.03023 &  0.06047 &  0.9698 \tabularnewline
58 &  0.07587 &  0.1517 &  0.9241 \tabularnewline
59 &  0.05519 &  0.1104 &  0.9448 \tabularnewline
60 &  0.05318 &  0.1064 &  0.9468 \tabularnewline
61 &  0.03753 &  0.07506 &  0.9625 \tabularnewline
62 &  0.04011 &  0.08023 &  0.9599 \tabularnewline
63 &  0.04079 &  0.08158 &  0.9592 \tabularnewline
64 &  0.0282 &  0.05641 &  0.9718 \tabularnewline
65 &  0.03456 &  0.06912 &  0.9654 \tabularnewline
66 &  0.1146 &  0.2292 &  0.8854 \tabularnewline
67 &  0.1203 &  0.2407 &  0.8797 \tabularnewline
68 &  0.1323 &  0.2646 &  0.8677 \tabularnewline
69 &  0.09761 &  0.1952 &  0.9024 \tabularnewline
70 &  0.2116 &  0.4232 &  0.7884 \tabularnewline
71 &  0.282 &  0.564 &  0.718 \tabularnewline
72 &  0.2839 &  0.5679 &  0.7161 \tabularnewline
73 &  0.2704 &  0.5409 &  0.7296 \tabularnewline
74 &  0.7941 &  0.4118 &  0.2059 \tabularnewline
75 &  0.7746 &  0.4507 &  0.2254 \tabularnewline
76 &  0.8943 &  0.2114 &  0.1057 \tabularnewline
77 &  0.8488 &  0.3023 &  0.1512 \tabularnewline
78 &  0.9839 &  0.03213 &  0.01606 \tabularnewline
79 &  0.9664 &  0.06721 &  0.03361 \tabularnewline
80 &  0.9626 &  0.07482 &  0.03741 \tabularnewline
81 &  0.9465 &  0.107 &  0.05348 \tabularnewline
82 &  0.9746 &  0.05075 &  0.02538 \tabularnewline
83 &  0.9337 &  0.1325 &  0.06627 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=299522&T=5


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=299522&T=5


The GUIDs for individual cells are displayed in the table below:

Goldfeld-Quandt test for Heteroskedasticity
p-values under each Alternative Hypothesis
breakpoint index   greater   2-sided   less
8 0.6977 0.6047 0.3023
9 0.5506 0.8988 0.4494
10 0.4043 0.8087 0.5957
11 0.3614 0.7227 0.6386
12 0.3025 0.605 0.6975
13 0.2977 0.5954 0.7023
14 0.2099 0.4199 0.7901
15 0.1681 0.3362 0.8319
16 0.1364 0.2728 0.8636
17 0.2671 0.5342 0.7329
18 0.205 0.4101 0.795
19 0.1587 0.3173 0.8413
20 0.3394 0.6788 0.6606
21 0.3208 0.6416 0.6792
22 0.3054 0.6109 0.6946
23 0.2781 0.5562 0.7219
24 0.3574 0.7148 0.6426
25 0.318 0.6359 0.682
26 0.3596 0.7192 0.6404
27 0.3002 0.6004 0.6998
28 0.2545 0.509 0.7455
29 0.2314 0.4627 0.7686
30 0.2901 0.5803 0.7099
31 0.2566 0.5132 0.7434
32 0.2746 0.5491 0.7254
33 0.278 0.556 0.722
34 0.225 0.4499 0.775
35 0.1833 0.3666 0.8167
36 0.1601 0.3203 0.8399
37 0.1383 0.2767 0.8617
38 0.1103 0.2206 0.8897
39 0.09894 0.1979 0.9011
40 0.09733 0.1947 0.9027
41 0.0908 0.1816 0.9092
42 0.1072 0.2144 0.8928
43 0.08754 0.1751 0.9125
44 0.06595 0.1319 0.9341
45 0.05006 0.1001 0.9499
46 0.04421 0.08841 0.9558
47 0.03207 0.06415 0.9679
48 0.02562 0.05125 0.9744
49 0.02068 0.04137 0.9793
50 0.06033 0.1207 0.9397
51 0.08126 0.1625 0.9187
52 0.0901 0.1802 0.9099
53 0.07322 0.1464 0.9268
54 0.06641 0.1328 0.9336
55 0.05371 0.1074 0.9463
56 0.04305 0.0861 0.957
57 0.03023 0.06047 0.9698
58 0.07587 0.1517 0.9241
59 0.05519 0.1104 0.9448
60 0.05318 0.1064 0.9468
61 0.03753 0.07506 0.9625
62 0.04011 0.08023 0.9599
63 0.04079 0.08158 0.9592
64 0.0282 0.05641 0.9718
65 0.03456 0.06912 0.9654
66 0.1146 0.2292 0.8854
67 0.1203 0.2407 0.8797
68 0.1323 0.2646 0.8677
69 0.09761 0.1952 0.9024
70 0.2116 0.4232 0.7884
71 0.282 0.564 0.718
72 0.2839 0.5679 0.7161
73 0.2704 0.5409 0.7296
74 0.7941 0.4118 0.2059
75 0.7746 0.4507 0.2254
76 0.8943 0.2114 0.1057
77 0.8488 0.3023 0.1512
78 0.9839 0.03213 0.01606
79 0.9664 0.06721 0.03361
80 0.9626 0.07482 0.03741
81 0.9465 0.107 0.05348
82 0.9746 0.05075 0.02538
83 0.9337 0.1325 0.06627
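The breakpoint p-values above come from `lmtest::gqtest`, as in the module's R code. A minimal sketch on simulated data (the variable names are illustrative; the original series is listed in the Dataseries X block):

```r
# Sketch, not the original computation: one row of the Goldfeld-Quandt table.
# gqtest() splits the sample at 'point', fits OLS on each half, and compares
# the two residual variances with an F-test.
library(lmtest)

set.seed(42)
sim <- data.frame(y = rnorm(84), x1 = rnorm(84), x2 = rnorm(84))
fit <- lm(y ~ x1 + x2, data = sim)

# p-values under the three alternatives, matching the three table columns
pvals <- sapply(c("greater", "two.sided", "less"),
                function(alt) gqtest(fit, point = 40, alternative = alt)$p.value)
print(pvals)
```

Note that in every table row the "greater" and "less" p-values sum to 1, and the 2-sided value is twice the smaller of the two.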







Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description   # significant tests   % significant tests   OK/NOK
1% type I error level   0   0   OK
5% type I error level   2   0.0263158   OK
10% type I error level   15   0.197368   NOK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 &  0 & OK \tabularnewline
5% type I error level & 2 & 0.0263158 & OK \tabularnewline
10% type I error level & 15 & 0.197368 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=299522&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C] 0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]2[/C][C]0.0263158[/C][C]OK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]15[/C][C]0.197368[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=299522&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=299522&T=6








Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.0048, df1 = 2, df2 = 84, p-value = 0.1411
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.63066, df1 = 8, df2 = 78, p-value = 0.7496
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.53219, df1 = 2, df2 = 84, p-value = 0.5893
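The three variants above are `lmtest::resettest` calls on the fitted model (`mylm` in this output). A minimal sketch on simulated data, not the original series:

```r
# Sketch: the three RESET variants reported above. The test augments the model
# with powers (here 2 and 3) of either the fitted values, the regressors, or
# the first principal component, and F-tests their joint significance.
library(lmtest)

set.seed(1)
sim <- data.frame(y = rnorm(84), x1 = rnorm(84), x2 = rnorm(84))
fit <- lm(y ~ x1 + x2, data = sim)

print(resettest(fit, power = 2:3, type = "fitted"))     # powers of fitted values
print(resettest(fit, power = 2:3, type = "regressor"))  # powers of each regressor
print(resettest(fit, power = 2:3, type = "princomp"))   # powers of 1st principal component
```

With two regressors the "regressor" variant has df1 = 4 (two powers per regressor); in the output above, with four regressors, df1 = 8.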

\begin{tabular}{lllllllll}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.0048, df1 = 2, df2 = 84, p-value = 0.1411
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.63066, df1 = 8, df2 = 78, p-value = 0.7496
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.53219, df1 = 2, df2 = 84, p-value = 0.5893
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=299522&T=7

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.0048, df1 = 2, df2 = 84, p-value = 0.1411
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.63066, df1 = 8, df2 = 78, p-value = 0.7496
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.53219, df1 = 2, df2 = 84, p-value = 0.5893
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=299522&T=7

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=299522&T=7








Variance Inflation Factors (Multicollinearity)
> vif
     GW1      GW2      GW3      GW4 
1.298582 1.117901 1.268786 1.345029 
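Each VIF above equals 1/(1 - R²) from regressing that column on the other regressors; values this close to 1 indicate little multicollinearity. A base-R sketch of the identity on simulated data (names are illustrative):

```r
# Sketch: VIF by hand. For regressor x3, VIF = 1 / (1 - R^2), where R^2 comes
# from regressing x3 on the remaining regressors; car::vif() in the module's
# code computes the same quantity for every column at once.
set.seed(7)
sim <- data.frame(x1 = rnorm(50), x2 = rnorm(50))
sim$x3 <- 0.6 * sim$x1 + rnorm(50)   # mild collinearity with x1

r2 <- summary(lm(x3 ~ x1 + x2, data = sim))$r.squared
vif_x3 <- 1 / (1 - r2)

# Equivalent: diagonal of the inverse correlation matrix of the regressors
vif_all <- diag(solve(cor(sim)))
print(vif_x3)
print(vif_all)
```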

\begin{tabular}{lllllllll}
\hline
Variance Inflation Factors (Multicollinearity) \tabularnewline
> vif
     GW1      GW2      GW3      GW4 
1.298582 1.117901 1.268786 1.345029 
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=299522&T=8

[TABLE]
[ROW][C]Variance Inflation Factors (Multicollinearity)[/C][/ROW]
[ROW][C]
> vif
     GW1      GW2      GW3      GW4 
1.298582 1.117901 1.268786 1.345029 
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=299522&T=8

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=299522&T=8




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
# Note: 'y' (the data matrix), the table.* helpers, and RC.texteval are
# supplied by the FreeStatistics.org module framework; this script is not
# standalone.
par5 <- '0'
par4 <- '0'
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '1'
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')