Free Statistics

Author's title: 
Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 22 Dec 2016 19:16:18 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/22/t148243067887czgdugg08sho0.htm/, Retrieved Mon, 29 Apr 2024 04:20:52 +0200

Original text written by user: 
IsPrivate? No (this computation is public)
User-defined keywords: 
Estimated Impact: 0
Dataseries X (two columns: ITHSUM, TVDCSUM):
14	13
19	16
17	17
17	NA
15	NA
20	16
15	NA
19	NA
15	NA
15	17
19	17
NA	15
20	16
18	14
15	16
14	17
20	NA
NA	NA
16	NA
16	NA
16	16
10	NA
19	16
19	NA
16	NA
15	NA
18	16
17	15
19	16
17	16
NA	13
19	15
20	17
5	NA
19	13
16	17
15	NA
16	14
18	14
16	18
15	NA
17	17
NA	13
20	16
19	15
7	15
13	NA
16	15
16	13
NA	NA
18	17
18	NA
16	NA
17	11
19	14
16	13
19	NA
13	17
16	16
13	NA
12	17
17	16
17	16
17	16
16	15
16	12
14	17
16	14
13	14
16	16
14	NA
20	NA
12	NA
13	NA
18	NA
14	15
19	16
18	14
14	15
18	17
19	NA
15	10
14	NA
17	17
19	NA
13	20
19	17
18	18
20	NA
15	17
15	14
15	NA
20	17
15	NA
19	17
18	NA
18	16
15	18
20	18
17	16
12	NA
18	NA
19	15
20	13
NA	NA
17	NA
15	NA
16	NA
18	NA
18	16
14	NA
15	NA
12	NA
17	12
14	NA
18	16
17	16
17	NA
20	16
16	14
14	15
15	14
18	NA
20	15
17	NA
17	15
17	16
17	NA
15	NA
17	NA
18	11
17	NA
20	18
15	NA
16	11
15	NA
18	18
11	NA
15	15
18	19
20	17
19	NA
14	14
16	NA
15	13
17	17
18	14
20	19
17	14
18	NA
15	NA
16	16
11	16
15	15
18	12
17	NA
16	17
12	NA
19	NA
18	18
15	15
17	18
19	15
18	NA
19	NA
16	NA
16	16
16	NA
14	16
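
The two columns are read pairwise and rows with an NA in either column are dropped before the model is fitted (listwise deletion), leaving 100 complete observations, consistent with the 98 residual degrees of freedom (100 - 2) reported below. A minimal sketch of that preprocessing step, assuming y holds the pasted two-column block exactly as in the module code at the end of this page (the column names are taken from the estimated regression equation):

x <- na.omit(t(y))                      # drop rows with any missing value (listwise deletion)
colnames(x) <- c('ITHSUM', 'TVDCSUM')   # names as used in the estimated equation below
df <- as.data.frame(x)
nrow(df)                                # 100 complete cases remain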




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 4 seconds
R Server: Big Analytics Cloud Computing Center
R Engine error message:
Error in vif.default(mylm) : model contains fewer than 2 terms
Calls: vif -> vif.default
Execution halted
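
The error comes from the very last step of the module: car::vif() computes variance inflation factors, which are only defined when the model contains at least two regressors. This run has a single regressor (TVDCSUM), so vif() aborts after all of the tables below were already produced. A hedged sketch of a guard that would skip that step for single-regressor models (this guard is not part of the original module code):

library(car)
# vif() needs at least 2 non-intercept terms; skip it otherwise (illustrative guard only)
if (length(coef(mylm)) > 2) {
  print(vif(mylm))
} else {
  cat('VIF skipped: model contains fewer than 2 regressors\n')
}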


Multiple Linear Regression - Estimated Regression Equation
ITHSUM[t] = 14.7695 + 0.132034 TVDCSUM[t] + e[t]
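
Given the cleaned data frame from the preprocessing sketch above, the fit can be reproduced with a single lm() call (a hedged sketch; df is the illustrative data frame name used earlier):

mylm <- lm(ITHSUM ~ TVDCSUM, data = df)   # OLS fit of ITHSUM on TVDCSUM
coef(mylm)                                # approx. (Intercept) 14.7695, TVDCSUM 0.132034
summary(mylm)                             # reproduces the coefficient and fit tables below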


Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter   S.D.     T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +14.77      1.919    +7.6970e+00                  1.117e-11        5.585e-12
TVDCSUM       +0.132      0.1227   +1.0760e+00                  0.2845           0.1422
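
The columns are related in the usual way: the t-statistic is the parameter divided by its standard deviation, the 2-tail p-value comes from a t distribution with 98 residual degrees of freedom, and the 1-tail p-value is half the 2-tail value. A quick check of the TVDCSUM row:

0.132034 / 0.1227         # t-statistic, approx. 1.076
2 * pt(-1.076, df = 98)   # 2-tail p-value, approx. 0.2845
pt(-1.076, df = 98)       # 1-tail p-value, approx. 0.1422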


Multiple Linear Regression - Regression Statistics
Multiple R: 0.1081
R-squared: 0.01168
Adjusted R-squared: 0.001596
F-TEST (value): 1.158
F-TEST (DF numerator): 1
F-TEST (DF denominator): 98
p-value: 0.2845
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 2.278
Sum Squared Residuals: 508.7
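
With a single regressor the overall F-test is equivalent to the coefficient t-test (F = t^2), which is why its p-value equals the 2-tail p-value above; it can be recomputed directly from the F distribution:

1.076^2                            # F = t^2, approx. 1.158
1 - pf(1.158, df1 = 1, df2 = 98)   # overall F-test p-value, approx. 0.2845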


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 14 16.49-2.486
2 19 16.88 2.118
3 17 17.01-0.01409
4 20 16.88 3.118
5 15 17.01-2.014
6 19 17.01 1.986
7 20 16.88 3.118
8 18 16.62 1.382
9 15 16.88-1.882
10 14 17.01-3.014
11 16 16.88-0.8821
12 19 16.88 2.118
13 18 16.88 1.118
14 17 16.75 0.25
15 19 16.88 2.118
16 17 16.88 0.1179
17 19 16.75 2.25
18 20 17.01 2.986
19 19 16.49 2.514
20 16 17.01-1.014
21 16 16.62-0.618
22 18 16.62 1.382
23 16 17.15-1.146
24 17 17.01-0.01409
25 20 16.88 3.118
26 19 16.75 2.25
27 7 16.75-9.75
28 16 16.75-0.75
29 16 16.49-0.486
30 18 17.01 0.9859
31 17 16.22 0.7781
32 19 16.62 2.382
33 16 16.49-0.486
34 13 17.01-4.014
35 16 16.88-0.8821
36 12 17.01-5.014
37 17 16.88 0.1179
38 17 16.88 0.1179
39 17 16.88 0.1179
40 16 16.75-0.75
41 16 16.35-0.3539
42 14 17.01-3.014
43 16 16.62-0.618
44 13 16.62-3.618
45 16 16.88-0.8821
46 14 16.75-2.75
47 19 16.88 2.118
48 18 16.62 1.382
49 14 16.75-2.75
50 18 17.01 0.9859
51 15 16.09-1.09
52 17 17.01-0.01409
53 13 17.41-4.41
54 19 17.01 1.986
55 18 17.15 0.8539
56 15 17.01-2.014
57 15 16.62-1.618
58 20 17.01 2.986
59 19 17.01 1.986
60 18 16.88 1.118
61 15 17.15-2.146
62 20 17.15 2.854
63 17 16.88 0.1179
64 19 16.75 2.25
65 20 16.49 3.514
66 18 16.88 1.118
67 17 16.35 0.6461
68 18 16.88 1.118
69 17 16.88 0.1179
70 20 16.88 3.118
71 16 16.62-0.618
72 14 16.75-2.75
73 15 16.62-1.618
74 20 16.75 3.25
75 17 16.75 0.25
76 17 16.88 0.1179
77 18 16.22 1.778
78 20 17.15 2.854
79 16 16.22-0.2219
80 18 17.15 0.8539
81 15 16.75-1.75
82 18 17.28 0.7218
83 20 17.01 2.986
84 14 16.62-2.618
85 15 16.49-1.486
86 17 17.01-0.01409
87 18 16.62 1.382
88 20 17.28 2.722
89 17 16.62 0.382
90 16 16.88-0.8821
91 11 16.88-5.882
92 15 16.75-1.75
93 18 16.35 1.646
94 16 17.01-1.014
95 18 17.15 0.8539
96 15 16.75-1.75
97 17 17.15-0.1461
98 19 16.75 2.25
99 16 16.88-0.8821
100 14 16.88-2.882
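
The interpolation column holds the fitted values and each residual is the actual minus the fitted value, so the whole table can be rebuilt from the fitted model (sketch, reusing the illustrative mylm and df from above):

data.frame(Actuals       = df$ITHSUM,
           Interpolation = fitted(mylm),
           Residuals     = residuals(mylm))   # reproduces the rows above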


Goldfeld-Quandt test for Heteroskedasticity
p-values for the Alternative Hypothesis:
breakpoint index   greater   2-sided   less
5 0.7444 0.5111 0.2556
6 0.6299 0.7402 0.3701
7 0.6397 0.7207 0.3603
8 0.5621 0.8759 0.4379
9 0.5946 0.8108 0.4054
10 0.7209 0.5581 0.2791
11 0.6469 0.7062 0.3531
12 0.6142 0.7716 0.3858
13 0.5332 0.9336 0.4668
14 0.442 0.8839 0.558
15 0.4068 0.8135 0.5932
16 0.3271 0.6543 0.6729
17 0.3066 0.6132 0.6934
18 0.3177 0.6354 0.6823
19 0.3072 0.6144 0.6928
20 0.2734 0.5468 0.7266
21 0.2345 0.469 0.7655
22 0.1883 0.3767 0.8117
23 0.1613 0.3225 0.8387
24 0.1216 0.2432 0.8784
25 0.1433 0.2867 0.8567
26 0.1279 0.2558 0.8721
27 0.9711 0.05781 0.0289
28 0.9609 0.07816 0.03908
29 0.9467 0.1067 0.05333
30 0.9301 0.1397 0.06987
31 0.9093 0.1815 0.09074
32 0.9068 0.1865 0.09324
33 0.8812 0.2376 0.1188
34 0.9316 0.1367 0.06837
35 0.9132 0.1735 0.08677
36 0.97 0.0601 0.03005
37 0.9583 0.08333 0.04167
38 0.9434 0.1133 0.05663
39 0.9245 0.1509 0.07547
40 0.9042 0.1916 0.09582
41 0.879 0.2421 0.121
42 0.8959 0.2081 0.1041
43 0.8697 0.2605 0.1303
44 0.9106 0.1789 0.08944
45 0.8886 0.2228 0.1114
46 0.9003 0.1995 0.09975
47 0.8968 0.2064 0.1032
48 0.8786 0.2428 0.1214
49 0.8917 0.2166 0.1083
50 0.8685 0.263 0.1315
51 0.8444 0.3112 0.1556
52 0.8073 0.3855 0.1927
53 0.9024 0.1951 0.09757
54 0.8956 0.2089 0.1044
55 0.8709 0.2582 0.1291
56 0.8691 0.2619 0.1309
57 0.8548 0.2904 0.1452
58 0.8743 0.2514 0.1257
59 0.8642 0.2716 0.1358
60 0.8365 0.3271 0.1635
61 0.8409 0.3182 0.1591
62 0.8548 0.2904 0.1452
63 0.8173 0.3654 0.1827
64 0.8141 0.3717 0.1859
65 0.8718 0.2564 0.1282
66 0.8444 0.3112 0.1556
67 0.8116 0.3769 0.1884
68 0.7767 0.4465 0.2233
69 0.7261 0.5479 0.2739
70 0.7741 0.4519 0.2259
71 0.7241 0.5518 0.2759
72 0.7459 0.5082 0.2541
73 0.7154 0.5691 0.2846
74 0.7804 0.4392 0.2196
75 0.7274 0.5453 0.2726
76 0.6664 0.6672 0.3336
77 0.6843 0.6315 0.3157
78 0.7151 0.5699 0.2849
79 0.6631 0.6739 0.3369
80 0.6003 0.7994 0.3997
81 0.5513 0.8975 0.4487
82 0.4777 0.9553 0.5223
83 0.5569 0.8861 0.4431
84 0.5446 0.9107 0.4554
85 0.4797 0.9594 0.5203
86 0.3962 0.7923 0.6038
87 0.3611 0.7222 0.6389
88 0.4957 0.9915 0.5043
89 0.4081 0.8162 0.5919
90 0.3135 0.6269 0.6865
91 0.7706 0.4588 0.2294
92 0.7301 0.5397 0.2698
93 0.6771 0.6459 0.3229
94 0.5442 0.9116 0.4558
95 0.4277 0.8554 0.5723
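
Each row is a Goldfeld-Quandt test that splits the sample at the given breakpoint and compares the residual variances of the two sub-sample regressions. A single breakpoint can be reproduced with lmtest::gqtest, exactly as in the module code at the end of this page (the values shown are for breakpoint 27, the smallest 2-sided p-value in the table):

library(lmtest)
gqtest(mylm, point = 27, alternative = 'two.sided')$p.value   # approx. 0.0578
gqtest(mylm, point = 27, alternative = 'greater')$p.value     # approx. 0.9711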



Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   Fraction of significant tests   OK/NOK
1% type I error level     0                     0                               OK
5% type I error level     0                     0                               OK
10% type I error level    4                     0.043956                        OK
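
The fraction is the number of breakpoints whose 2-sided p-value falls below the level, divided by the 91 breakpoints tested (indices 5 through 95); the battery is marked OK when that fraction stays below the nominal type I error level:

4 / 91   # 0.043956..., below 0.10, hence 'OK' at the 10% level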



Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 0.42125, df1 = 2, df2 = 96, p-value = 0.6574
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.42125, df1 = 2, df2 = 96, p-value = 0.6574
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.42125, df1 = 2, df2 = 96, p-value = 0.6574
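
All three variants return the same statistic because, with a single regressor, the powers of the fitted values, of the regressor, and of its first principal component span the same columns; the p-value of 0.6574 gives no indication of neglected nonlinearity. The tests can be reproduced with lmtest::resettest, using the same calls as in the module code below:

library(lmtest)
resettest(mylm, power = 2:3, type = 'fitted')      # RESET = 0.42125, p-value = 0.6574
resettest(mylm, power = 2:3, type = 'regressor')   # identical here (single regressor)
resettest(mylm, power = 2:3, type = 'princomp')    # identical here (single regressor)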




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
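# 'y' holds the user-supplied data block; transpose it and drop rows with missing values (listwise deletion)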
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
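# fit the multiple linear regression: column 1 (the endogenous series) regressed on all remaining columns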
(mylm <- lm(df))
(mysum <- summary(mylm))
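# Goldfeld-Quandt test at every admissible breakpoint (only when more than 25 observations are available)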
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
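# Ramsey RESET tests for powers 2 and 3 of the fitted values, the regressors, and their principal components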
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
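# Variance Inflation Factors: car::vif() needs at least two regressors, so with a single regressor this final step aborts (see the R engine error message near the top of this page)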
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')