Free Statistics

Author's title: (not provided)
Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Fri, 16 Dec 2016 18:01:36 +0100
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/16/t1481907707mnq80ijr279e9c2.htm/, Retrieved Thu, 02 May 2024 20:21:13 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=300458, Retrieved Thu, 02 May 2024 20:21:13 +0000
Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 75
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data):
- [Multiple Regression] [] [2016-12-16 17:01:36] [c6ea875f0603e0876d03f43aca979571] [Current]
Dataseries X:
9	3	4	3	3
8	4	5	4	3
10	5	4	5	3
10	3	5	5	3
9	4	5	5	3
10	4	5	5	3
9	4	4	4	3
13	4	4	4	4
11	3	4	4	3
10	4	4	4	4
8	4	5	5	3
9	4	5	4	3
10	4	5	4	3
10	4	4	4	4
10	4	4	4	3
11	4	4	5	3
9	4	3	3	3
7	3	4	5	3
11	4	5	4	4
11	2	4	4	3
10	4	5	5	3
10	3	3	4	4
10	4	2	4	4
9	4	5	5	4
9	4	5	5	3
9	3	4	3	3
10	4	4	5	3
11	4	4	4	3
10	4	4	4	3
8	4	3	3	3
10	4	5	5	3
4	3	2	3	3
8	4	2	4	3
13	5	5	4	3
10	4	4	4	4
9	4	5	5	3
11	4	5	4	3
10	5	4	4	3
9	4	5	4	3
10	4	5	5	3
10	4	3	4	3
11	4	2	4	4
12	4	5	4	3
10	4	4	4	3
10	3	5	5	3
8	4	3	4	3
10	4	2	4	5
9	4	5	5	3
11	4	5	4	4
12	5	5	5	5
7	4	4	5	4
9	4	5	5	4
11	4	5	4	4
8	4	5	2	3
9	5	4	5	3
9	5	4	5	3
9	4	4	4	4
7	5	4	4	5
9	4	5	4	3
9	4	2	4	3
9	4	4	4	4
11	3	3	3	3
7	4	4	4	4
11	4	5	4	3
7	4	5	4	3
12	4	3	4	3
8	4	4	4	3
9	4	3	4	3
9	4	4	4	3
11	4	4	4	3
9	4	5	4	3
8	3	2	3	3
9	5	4	4	5
10	3	2	3	3
13	4	4	5	5
11	5	5	5	4
9	5	4	5	3
9	4	3	4	3
10	5	4	5	3
9	4	2	4	4
11	5	5	5	4
11	4	4	4	4
8	4	5	4	3
8	4	4	4	3
7	4	2	3	3
9	4	5	5	3
11	4	5	5	4
8	4	4	4	3
11	5	5	5	3
10	4	4	4	4
9	4	3	4	5
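The R sketch below shows one way to load this data block for re-analysis. It is a minimal sketch under stated assumptions: the block is assumed to be saved as a whitespace-separated file named dataseries.txt, and its five columns are assumed to correspond to GWsum, TVDC2, TVDC1, TVDC3 and TVDC4 (in that order), matching the variable names that appear in the regression output below.

# Minimal loading sketch; the file name and the column mapping are assumptions.
df <- read.table('dataseries.txt',
                 col.names = c('GWsum', 'TVDC2', 'TVDC1', 'TVDC3', 'TVDC4'))
str(df)      # should report 91 observations of 5 variables
summary(df)  # quick sanity check of the value ranges before modelling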




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 9 seconds
R Server: Big Analytics Cloud Computing Center


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300458&T=0








Multiple Linear Regression - Estimated Regression Equation
GWsum[t] = 5.65892 + 0.0184675 TVDC2[t] + 0.291732 TVDC1[t] + 0.245944 TVDC3[t] + 0.474648 TVDC4[t] + e[t]


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300458&T=1
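As a quick check, the equation above can be reproduced with a plain lm() fit. This is a minimal sketch that assumes the data frame df from the loading sketch above.

mylm <- lm(GWsum ~ TVDC2 + TVDC1 + TVDC3 + TVDC4, data = df)  # ordinary least squares
signif(coef(mylm), 6)  # should match the coefficients in the equation above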








Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter   S.D.     T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +5.659      1.419    +3.9870e+00                  0.0001399        6.995e-05
TVDC2         +0.01847    0.3003   +6.1510e-02                  0.9511           0.4755
TVDC1         +0.2917     0.1804   +1.6170e+00                  0.1095           0.05475
TVDC3         +0.2459     0.2771   +8.8750e-01                  0.3773           0.1886
TVDC4         +0.4747     0.2599   +1.8260e+00                  0.07127          0.03564


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300458&T=2
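The columns of this table can be read straight off the summary of the fitted model. A minimal sketch (assuming mylm from the sketch above), mirroring how the module's R code at the end of this page builds the table:

mysum <- summary(mylm)
mysum$coefficients          # Estimate, Std. Error, t value, 2-tail Pr(>|t|)
mysum$coefficients[, 4] / 2 # 1-tail p-values, as reported in the last column above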








Multiple Linear Regression - Regression Statistics
Multiple R: 0.3228
R-squared: 0.1042
Adjusted R-squared: 0.06255
F-TEST (value): 2.501
F-TEST (DF numerator): 4
F-TEST (DF denominator): 86
p-value: 0.04827
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1.438
Sum Squared Residuals: 177.8


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300458&T=3
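All of these statistics are available from the same summary object. A minimal sketch (assuming mylm and mysum from the sketches above):

sqrt(mysum$r.squared)     # Multiple R
mysum$r.squared           # R-squared
mysum$adj.r.squared       # Adjusted R-squared
mysum$fstatistic          # F value with numerator and denominator DF (4 and 86)
1 - pf(mysum$fstatistic[1], mysum$fstatistic[2], mysum$fstatistic[3])  # F-test p-value
mysum$sigma               # Residual Standard Deviation
sum(residuals(mylm)^2)    # Sum Squared Residuals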








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 9 9.043-0.04302
2 8 9.599-1.599
3 10 9.572 0.4282
4 10 9.827 0.1734
5 9 9.845-0.8451
6 10 9.845 0.1549
7 9 9.307-0.3074
8 13 9.782 3.218
9 11 9.289 1.711
10 10 9.782 0.2179
11 8 9.845-1.845
12 9 9.599-0.5992
13 10 9.599 0.4008
14 10 9.782 0.2179
15 10 9.307 0.6926
16 11 9.553 1.447
17 9 8.77 0.2302
18 7 9.535-2.535
19 11 10.07 0.9262
20 11 9.271 1.73
21 10 9.845 0.1549
22 10 9.472 0.5281
23 10 9.199 0.8014
24 9 10.32-1.32
25 9 9.845-0.8451
26 9 9.043-0.04302
27 10 9.553 0.4466
28 11 9.307 1.693
29 10 9.307 0.6926
30 8 8.77-0.7698
31 10 9.845 0.1549
32 4 8.46-4.46
33 8 8.724-0.724
34 13 9.618 3.382
35 10 9.782 0.2179
36 9 9.845-0.8451
37 11 9.599 1.401
38 10 9.326 0.6741
39 9 9.599-0.5992
40 10 9.845 0.1549
41 10 9.016 0.9843
42 11 9.199 1.801
43 12 9.599 2.401
44 10 9.307 0.6926
45 10 9.827 0.1734
46 8 9.016-1.016
47 10 9.673 0.3267
48 9 9.845-0.8451
49 11 10.07 0.9262
50 12 10.81 1.187
51 7 10.03-3.028
52 9 10.32-1.32
53 11 10.07 0.9262
54 8 9.107-1.107
55 9 9.572-0.5718
56 9 9.572-0.5718
57 9 9.782-0.7821
58 7 10.28-3.275
59 9 9.599-0.5992
60 9 8.724 0.276
61 9 9.782-0.7821
62 11 8.751 2.249
63 7 9.782-2.782
64 11 9.599 1.401
65 7 9.599-2.599
66 12 9.016 2.984
67 8 9.307-1.307
68 9 9.016-0.0157
69 9 9.307-0.3074
70 11 9.307 1.693
71 9 9.599-0.5992
72 8 8.46-0.4596
73 9 10.28-1.275
74 10 8.46 1.54
75 13 10.5 2.497
76 11 10.34 0.6618
77 9 9.572-0.5718
78 9 9.016-0.0157
79 10 9.572 0.4282
80 9 9.199-0.1986
81 11 10.34 0.6618
82 11 9.782 1.218
83 8 9.599-1.599
84 8 9.307-1.307
85 7 8.478-1.478
86 9 9.845-0.8451
87 11 10.32 0.6802
88 8 9.307-1.307
89 11 9.864 1.136
90 10 9.782 0.2179
91 9 9.965-0.965


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300458&T=4
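The listing above pairs each observation with its fitted (interpolated) value and residual. A minimal sketch that rebuilds the same three columns (assuming df and mylm from the sketches above):

out <- data.frame(Index    = seq_len(nrow(df)),
                  Actuals  = df$GWsum,
                  Fitted   = fitted(mylm),      # "Interpolation / Forecast" column
                  Residual = residuals(mylm))   # "Residuals / Prediction Error" column
head(out, 10)  # compare with the first rows of the table above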








Goldfeld-Quandt test for Heteroskedasticity
p-values for each Alternative Hypothesis
breakpoint index   greater   2-sided   less
8 0.0488 0.09761 0.9512
9 0.03383 0.06767 0.9662
10 0.2533 0.5065 0.7467
11 0.2458 0.4916 0.7542
12 0.2099 0.4199 0.7901
13 0.2387 0.4774 0.7613
14 0.2217 0.4435 0.7783
15 0.1556 0.3112 0.8444
16 0.1148 0.2297 0.8852
17 0.08805 0.1761 0.912
18 0.3559 0.7118 0.6441
19 0.284 0.5681 0.716
20 0.2981 0.5962 0.7019
21 0.2425 0.485 0.7575
22 0.2226 0.4453 0.7774
23 0.1808 0.3617 0.8192
24 0.1898 0.3795 0.8102
25 0.1482 0.2963 0.8518
26 0.1151 0.2302 0.8849
27 0.08723 0.1745 0.9128
28 0.09593 0.1919 0.9041
29 0.07174 0.1435 0.9283
30 0.07414 0.1483 0.9259
31 0.05412 0.1082 0.9459
32 0.4935 0.9871 0.5065
33 0.4358 0.8716 0.5642
34 0.6739 0.6522 0.3261
35 0.6197 0.7605 0.3803
36 0.5826 0.8348 0.4174
37 0.5664 0.8672 0.4336
38 0.5203 0.9593 0.4797
39 0.4776 0.9553 0.5224
40 0.4144 0.8289 0.5856
41 0.3891 0.7782 0.6109
42 0.4029 0.8058 0.5971
43 0.5082 0.9836 0.4918
44 0.4611 0.9222 0.5389
45 0.4073 0.8146 0.5927
46 0.3733 0.7467 0.6267
47 0.3218 0.6436 0.6782
48 0.2934 0.5868 0.7066
49 0.2637 0.5275 0.7363
50 0.2681 0.5361 0.7319
51 0.5492 0.9015 0.4508
52 0.5875 0.8251 0.4125
53 0.5598 0.8804 0.4402
54 0.6445 0.7111 0.3555
55 0.5968 0.8064 0.4032
56 0.5488 0.9024 0.4512
57 0.504 0.992 0.496
58 0.6551 0.6899 0.3449
59 0.5974 0.8053 0.4026
60 0.5375 0.925 0.4625
61 0.4832 0.9664 0.5168
62 0.6142 0.7715 0.3858
63 0.7633 0.4734 0.2367
64 0.8283 0.3434 0.1717
65 0.8739 0.2521 0.1261
66 0.9703 0.05947 0.02974
67 0.9656 0.06871 0.03435
68 0.947 0.1059 0.05297
69 0.9211 0.1579 0.07894
70 0.9581 0.08384 0.04192
71 0.9361 0.1279 0.06394
72 0.9041 0.1918 0.09589
73 0.8761 0.2478 0.1239
74 0.9748 0.05031 0.02516
75 0.9829 0.03417 0.01708
76 0.9681 0.06389 0.03194
77 0.9625 0.07508 0.03754
78 0.9504 0.09921 0.0496
79 0.9083 0.1833 0.09166
80 0.8463 0.3074 0.1537
81 0.7963 0.4074 0.2037
82 0.8941 0.2119 0.1059
83 0.8657 0.2686 0.1343


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300458&T=5
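The breakpoint scan above can be reproduced with lmtest::gqtest. A minimal sketch (assuming df and mylm from the sketches above), using the same breakpoint range as the module, k + 3 through n - k - 3, i.e. 8 through 83 here:

library(lmtest)
n <- nrow(df)               # 91 observations
k <- length(coef(mylm))     # 5 estimated parameters
breakpoints <- (k + 3):(n - k - 3)
gq <- sapply(breakpoints, function(bp)
  gqtest(mylm, point = bp, alternative = 'two.sided')$p.value)
plot(breakpoints, gq, xlab = 'breakpoint', ylab = '2-sided p-value')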








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description              # significant tests   % significant tests   OK/NOK
1% type I error level    0                     0                     OK
5% type I error level    1                     0.0131579             OK
10% type I error level   10                    0.131579              NOK


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300458&T=6
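The meta analysis is simply a tally of the 2-sided p-values from the breakpoint table. A minimal sketch (assuming the vector gq from the Goldfeld-Quandt sketch above):

sum(gq < 0.05)                 # number of significant tests at the 5% level
c('1%'  = mean(gq < 0.01),     # fraction of significant tests per type I error level
  '5%'  = mean(gq < 0.05),
  '10%' = mean(gq < 0.10))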








Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 3.7443, df1 = 2, df2 = 84, p-value = 0.02769
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.83812, df1 = 8, df2 = 78, p-value = 0.572
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.9709, df1 = 2, df2 = 84, p-value = 0.1457


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300458&T=7
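These three tests come from lmtest::resettest, exactly as called in the module's R code at the end of this page (assuming mylm as before):

library(lmtest)
resettest(mylm, power = 2:3, type = 'fitted')     # powers of the fitted values
resettest(mylm, power = 2:3, type = 'regressor')  # powers of the regressors
resettest(mylm, power = 2:3, type = 'princomp')   # powers of the first principal component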








Variance Inflation Factors (Multicollinearity)
> vif
   TVDC2    TVDC1    TVDC3    TVDC4 
1.220638 1.333345 1.427455 1.087605 


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300458&T=8
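The variance inflation factors come from car::vif applied to the same fit, as in the module's R code below (assuming mylm as before):

library(car)
vif(mylm)  # values close to 1 indicate little multicollinearity among the regressors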




Parameters (Session):
par2 = grey ; par3 = FALSE ; par4 = 5-point Likert ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
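# Note: the script below is assumed to run inside the FreeStatistics.org module
# environment, which supplies the data matrix y (the dataseries columns), the
# parameters par1..par5 listed above, the RC.texteval() helper, and the table.*
# helpers loaded from the 'createtable' file.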
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
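# Goldfeld-Quandt test at every admissible breakpoint (only when there are more than
# 25 observations); the counts of significant 2-sided p-values feed the meta analysis table.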
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
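# Diagnostic plots written as PNG files: actuals and interpolation, residuals, studentized
# residual histogram, residual density, QQ plot, residual lag plot, residual (P)ACF,
# standard lm diagnostics, and (if computed) the Goldfeld-Quandt p-value scan.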
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')