Free Statistics of Irreproducible Research!

Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Fri, 18 Dec 2015 16:43:41 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2015/Dec/18/t1450457071g387lepmt5l8mri.htm/, Retrieved Thu, 16 May 2024 23:19:08 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=286914, Retrieved Thu, 16 May 2024 23:19:08 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 73
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
- [Multiple Regression] [Multiple regressi...] [2015-12-18 16:43:41] [9174db105c4509bef9d2565aabf3c4bd] [Current]
Dataseries X:
78 8
68 9.300000191
70 7.5
96 8.899999619
74 10.19999981
111 8.300000191
77 8.800000191
168 8.800000191
82 10.69999981
89 11.69999981
149 8.5
60 8.300000191
96 8.199999809
83 7.900000095
130 10.30000019
145 7.400000095
112 9.600000381
131 9.300000191
80 10.60000038
130 9.699999809
140 11.60000038
154 8.100000381
118 9.800000191
94 7.400000095
119 9.399999619
153 11.19999981
116 9.100000381
97 10.5
176 11.89999962
75 8.399999619
134 5
161 9.800000191
111 9.800000191
114 10.80000019
142 10.10000038
238 10.89999962
78 9.199999809
196 8.300000191
125 7.300000191
82 9.399999619
125 9.399999619
129 9.800000191
84 3.599999905
183 8.399999619
119 10.80000019
180 10.10000038
82 9
71 10
118 11.30000019
121 11.30000019
68 12.80000019
112 10
109 6.699999809
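
For reference, the two columns of this series correspond to 'Doctor' and 'death_rate', the names used in the regression output below; the upload itself is unlabeled. A minimal R sketch for reading the data (the file name 'dataseries.txt' is hypothetical):

# Read the two-column series above into a data frame.
# Assumption: the series is saved as 'dataseries.txt'; the column names are
# taken from the regression output further down this page.
df <- read.table('dataseries.txt', col.names = c('Doctor', 'death_rate'))
str(df)   # 53 observations of 2 numeric variables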




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Sir Maurice George Kendall' @ kendall.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 5 seconds \tabularnewline
R Server & 'Sir Maurice George Kendall' @ kendall.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=286914&T=0








Multiple Linear Regression - Estimated Regression Equation
Doctor[t] = +91.5449 + 2.63811 death_rate[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
Doctor[t] = +91.5449 + 2.63811 death\_rate[t] + e[t] \tabularnewline
 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=286914&T=1
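
This equation can be reproduced with a plain lm() call; a minimal sketch, assuming the data frame df from the reading sketch above:

# Fit the same simple regression and inspect the coefficients.
fit <- lm(Doctor ~ death_rate, data = df)
signif(coef(fit), 6)   # reported above as 91.5449 (intercept) and 2.63811 (slope)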








Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter   S.D.    T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +91.55      29.95   +3.0560e+00                  0.003562         0.001781
death_rate    +2.638      3.17    +8.3230e-01                  0.4091           0.2046

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & +91.55 &  29.95 & +3.0560e+00 &  0.003562 &  0.001781 \tabularnewline
death\_rate & +2.638 &  3.17 & +8.3230e-01 &  0.4091 &  0.2046 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=286914&T=2
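
The columns of this table map directly onto summary(lm) output; in particular, the module reports the 1-tail p-value as half of the 2-tail p-value (see the R code at the end of this page). A sketch, assuming fit from the previous sketch:

# Relate the OLS table columns to summary(lm).
cf <- summary(fit)$coefficients                    # Estimate, Std. Error, t value, Pr(>|t|)
tstat  <- cf[, 'Estimate'] / cf[, 'Std. Error']    # T-STAT under H0: parameter = 0
p2tail <- cf[, 'Pr(>|t|)']                         # 2-tail p-value
p1tail <- p2tail / 2                               # 1-tail p-value (module convention)
signif(cbind(tstat, p2tail, p1tail), 4)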








Multiple Linear Regression - Regression Statistics
Multiple R: 0.1158
R-squared: 0.0134
Adjusted R-squared: -0.005944
F-TEST (value): 0.6928
F-TEST (DF numerator): 1
F-TEST (DF denominator): 51
p-value: 0.4091

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 38
Sum Squared Residuals: 7.364e+04

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R &  0.1158 \tabularnewline
R-squared &  0.0134 \tabularnewline
Adjusted R-squared & -0.005944 \tabularnewline
F-TEST (value) &  0.6928 \tabularnewline
F-TEST (DF numerator) & 1 \tabularnewline
F-TEST (DF denominator) & 51 \tabularnewline
p-value &  0.4091 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation &  38 \tabularnewline
Sum Squared Residuals &  7.364e+04 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=286914&T=3
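
The statistics above can be recomputed from the fitted model; these are the same quantities the module code at the end of this page extracts. A sketch, assuming fit from the earlier sketch:

# Recompute the regression and residual statistics reported above.
s <- summary(fit)
sqrt(s$r.squared)                                          # Multiple R
s$r.squared                                                # R-squared
s$adj.r.squared                                            # Adjusted R-squared
s$fstatistic                                               # F value, DF numerator, DF denominator
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])  # p-value of the F-test
s$sigma                                                    # Residual Standard Deviation
sum(resid(fit)^2)                                          # Sum Squared Residuals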








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
 1    78   112.7    -34.65
 2    68   116.1    -48.08
 3    70   111.3    -41.33
 4    96   115      -19.02
 5    74   118.5    -44.45
 6   111   113.4     -2.441
 7    77   114.8    -37.76
 8   168   114.8     53.24
 9    82   119.8    -37.77
10    89   122.4    -33.41
11   149   114       35.03
12    60   113.4    -53.44
13    96   113.2    -17.18
14    83   112.4    -29.39
15   130   118.7     11.28
16   145   111.1     33.93
17   112   116.9     -4.871
18   131   116.1     14.92
19    80   119.5    -39.51
20   130   117.1     12.87
21   140   122.1     17.85
22   154   112.9     41.09
23   118   117.4      0.6015
24    94   111.1    -17.07
25   119   116.3      2.657
26   153   121.1     31.91
27   116   115.6      0.4482
28    97   119.2    -22.25
29   176   122.9     53.06
30    75   113.7    -38.71
31   134   104.7     29.26
32   161   117.4     43.6
33   111   117.4     -6.398
34   114   120       -6.037
35   142   118.2     23.81
36   238   120.3    117.7
37    78   115.8    -37.82
38   196   113.4     82.56
39   125   110.8     14.2
40    82   116.3    -34.34
41   125   116.3      8.657
42   129   117.4     11.6
43    84   101      -17.04
44   183   113.7     69.29
45   119   120       -1.037
46   180   118.2     61.81
47    82   115.3    -33.29
48    71   117.9    -46.93
49   118   121.4     -3.356
50   121   121.4     -0.3556
51    68   125.3    -57.31
52   112   117.9     -5.926
53   109   109.2     -0.2203

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 &  78 &  112.7 & -34.65 \tabularnewline
2 &  68 &  116.1 & -48.08 \tabularnewline
3 &  70 &  111.3 & -41.33 \tabularnewline
4 &  96 &  115 & -19.02 \tabularnewline
5 &  74 &  118.5 & -44.45 \tabularnewline
6 &  111 &  113.4 & -2.441 \tabularnewline
7 &  77 &  114.8 & -37.76 \tabularnewline
8 &  168 &  114.8 &  53.24 \tabularnewline
9 &  82 &  119.8 & -37.77 \tabularnewline
10 &  89 &  122.4 & -33.41 \tabularnewline
11 &  149 &  114 &  35.03 \tabularnewline
12 &  60 &  113.4 & -53.44 \tabularnewline
13 &  96 &  113.2 & -17.18 \tabularnewline
14 &  83 &  112.4 & -29.39 \tabularnewline
15 &  130 &  118.7 &  11.28 \tabularnewline
16 &  145 &  111.1 &  33.93 \tabularnewline
17 &  112 &  116.9 & -4.871 \tabularnewline
18 &  131 &  116.1 &  14.92 \tabularnewline
19 &  80 &  119.5 & -39.51 \tabularnewline
20 &  130 &  117.1 &  12.87 \tabularnewline
21 &  140 &  122.1 &  17.85 \tabularnewline
22 &  154 &  112.9 &  41.09 \tabularnewline
23 &  118 &  117.4 &  0.6015 \tabularnewline
24 &  94 &  111.1 & -17.07 \tabularnewline
25 &  119 &  116.3 &  2.657 \tabularnewline
26 &  153 &  121.1 &  31.91 \tabularnewline
27 &  116 &  115.6 &  0.4482 \tabularnewline
28 &  97 &  119.2 & -22.25 \tabularnewline
29 &  176 &  122.9 &  53.06 \tabularnewline
30 &  75 &  113.7 & -38.71 \tabularnewline
31 &  134 &  104.7 &  29.26 \tabularnewline
32 &  161 &  117.4 &  43.6 \tabularnewline
33 &  111 &  117.4 & -6.398 \tabularnewline
34 &  114 &  120 & -6.037 \tabularnewline
35 &  142 &  118.2 &  23.81 \tabularnewline
36 &  238 &  120.3 &  117.7 \tabularnewline
37 &  78 &  115.8 & -37.82 \tabularnewline
38 &  196 &  113.4 &  82.56 \tabularnewline
39 &  125 &  110.8 &  14.2 \tabularnewline
40 &  82 &  116.3 & -34.34 \tabularnewline
41 &  125 &  116.3 &  8.657 \tabularnewline
42 &  129 &  117.4 &  11.6 \tabularnewline
43 &  84 &  101 & -17.04 \tabularnewline
44 &  183 &  113.7 &  69.29 \tabularnewline
45 &  119 &  120 & -1.037 \tabularnewline
46 &  180 &  118.2 &  61.81 \tabularnewline
47 &  82 &  115.3 & -33.29 \tabularnewline
48 &  71 &  117.9 & -46.93 \tabularnewline
49 &  118 &  121.4 & -3.356 \tabularnewline
50 &  121 &  121.4 & -0.3556 \tabularnewline
51 &  68 &  125.3 & -57.31 \tabularnewline
52 &  112 &  117.9 & -5.926 \tabularnewline
53 &  109 &  109.2 & -0.2203 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=286914&T=4
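
In this table the interpolation column is the fitted value and the residual column is the actual value minus the fitted value; a sketch, assuming fit and df from the earlier sketches:

# Rebuild the actuals/interpolation/residuals table from the fit.
tab <- data.frame(Index = seq_len(nrow(df)),
                  Actuals = df$Doctor,
                  Interpolation = fitted(fit),
                  Residuals = resid(fit))
head(signif(as.matrix(tab), 4))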








Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis
breakpoint index   greater   2-sided   less
5 0.05062 0.1012 0.9494
6 0.08901 0.178 0.911
7 0.03971 0.07941 0.9603
8 0.5184 0.9633 0.4816
9 0.4135 0.8271 0.5865
10 0.3176 0.6352 0.6824
11 0.4169 0.8338 0.5831
12 0.4497 0.8995 0.5503
13 0.3583 0.7167 0.6417
14 0.2963 0.5926 0.7037
15 0.2835 0.5671 0.7165
16 0.317 0.6339 0.683
17 0.2508 0.5016 0.7492
18 0.2216 0.4433 0.7784
19 0.1968 0.3936 0.8032
20 0.1704 0.3408 0.8296
21 0.1617 0.3235 0.8383
22 0.1915 0.383 0.8085
23 0.1433 0.2866 0.8567
24 0.1091 0.2183 0.8909
25 0.07767 0.1553 0.9223
26 0.07859 0.1572 0.9214
27 0.05344 0.1069 0.9466
28 0.04 0.08 0.96
29 0.06166 0.1233 0.9383
30 0.0621 0.1242 0.9379
31 0.05694 0.1139 0.9431
32 0.06343 0.1269 0.9366
33 0.04236 0.08473 0.9576
34 0.02722 0.05443 0.9728
35 0.01958 0.03917 0.9804
36 0.3391 0.6783 0.6609
37 0.3365 0.6729 0.6635
38 0.6359 0.7283 0.3641
39 0.5542 0.8915 0.4458
40 0.5268 0.9464 0.4732
41 0.4319 0.8638 0.5681
42 0.3443 0.6886 0.6557
43 0.3443 0.6886 0.6557
44 0.5337 0.9327 0.4663
45 0.4199 0.8398 0.5801
46 0.8434 0.3132 0.1566
47 0.7782 0.4436 0.2218
48 0.7957 0.4087 0.2043

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
5 &  0.05062 &  0.1012 &  0.9494 \tabularnewline
6 &  0.08901 &  0.178 &  0.911 \tabularnewline
7 &  0.03971 &  0.07941 &  0.9603 \tabularnewline
8 &  0.5184 &  0.9633 &  0.4816 \tabularnewline
9 &  0.4135 &  0.8271 &  0.5865 \tabularnewline
10 &  0.3176 &  0.6352 &  0.6824 \tabularnewline
11 &  0.4169 &  0.8338 &  0.5831 \tabularnewline
12 &  0.4497 &  0.8995 &  0.5503 \tabularnewline
13 &  0.3583 &  0.7167 &  0.6417 \tabularnewline
14 &  0.2963 &  0.5926 &  0.7037 \tabularnewline
15 &  0.2835 &  0.5671 &  0.7165 \tabularnewline
16 &  0.317 &  0.6339 &  0.683 \tabularnewline
17 &  0.2508 &  0.5016 &  0.7492 \tabularnewline
18 &  0.2216 &  0.4433 &  0.7784 \tabularnewline
19 &  0.1968 &  0.3936 &  0.8032 \tabularnewline
20 &  0.1704 &  0.3408 &  0.8296 \tabularnewline
21 &  0.1617 &  0.3235 &  0.8383 \tabularnewline
22 &  0.1915 &  0.383 &  0.8085 \tabularnewline
23 &  0.1433 &  0.2866 &  0.8567 \tabularnewline
24 &  0.1091 &  0.2183 &  0.8909 \tabularnewline
25 &  0.07767 &  0.1553 &  0.9223 \tabularnewline
26 &  0.07859 &  0.1572 &  0.9214 \tabularnewline
27 &  0.05344 &  0.1069 &  0.9466 \tabularnewline
28 &  0.04 &  0.08 &  0.96 \tabularnewline
29 &  0.06166 &  0.1233 &  0.9383 \tabularnewline
30 &  0.0621 &  0.1242 &  0.9379 \tabularnewline
31 &  0.05694 &  0.1139 &  0.9431 \tabularnewline
32 &  0.06343 &  0.1269 &  0.9366 \tabularnewline
33 &  0.04236 &  0.08473 &  0.9576 \tabularnewline
34 &  0.02722 &  0.05443 &  0.9728 \tabularnewline
35 &  0.01958 &  0.03917 &  0.9804 \tabularnewline
36 &  0.3391 &  0.6783 &  0.6609 \tabularnewline
37 &  0.3365 &  0.6729 &  0.6635 \tabularnewline
38 &  0.6359 &  0.7283 &  0.3641 \tabularnewline
39 &  0.5542 &  0.8915 &  0.4458 \tabularnewline
40 &  0.5268 &  0.9464 &  0.4732 \tabularnewline
41 &  0.4319 &  0.8638 &  0.5681 \tabularnewline
42 &  0.3443 &  0.6886 &  0.6557 \tabularnewline
43 &  0.3443 &  0.6886 &  0.6557 \tabularnewline
44 &  0.5337 &  0.9327 &  0.4663 \tabularnewline
45 &  0.4199 &  0.8398 &  0.5801 \tabularnewline
46 &  0.8434 &  0.3132 &  0.1566 \tabularnewline
47 &  0.7782 &  0.4436 &  0.2218 \tabularnewline
48 &  0.7957 &  0.4087 &  0.2043 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=286914&T=5
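
The table above scans candidate breakpoints and reports Goldfeld-Quandt p-values under three alternative hypotheses. A sketch of the same scan with gqtest() from the lmtest package (the function the module code below uses), assuming fit from the earlier sketch:

# Goldfeld-Quandt scan over breakpoints 5..48, one test per alternative.
library(lmtest)
breakpoints <- 5:48
gq <- t(sapply(breakpoints, function(bp)
  sapply(c('greater', 'two.sided', 'less'),
         function(alt) gqtest(fit, point = bp, alternative = alt)$p.value)))
rownames(gq) <- breakpoints
head(signif(gq, 4))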








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description              # significant tests   % significant tests   OK/NOK
1% type I error level    0                     0                     OK
5% type I error level    1                     0.0227273             OK
10% type I error level   5                     0.113636              NOK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & \# significant tests & \% significant tests & OK/NOK \tabularnewline
1\% type I error level & 0 &  0 & OK \tabularnewline
5\% type I error level & 1 & 0.0227273 & OK \tabularnewline
10\% type I error level & 5 & 0.113636 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=286914&T=6
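
The meta-analysis counts how many of the 44 breakpoint tests have a 2-sided p-value below each type I error level and flags NOK when the observed fraction exceeds that level; note that the '% significant tests' column is reported as a fraction, not a percentage. A sketch, assuming gq from the previous sketch:

# Count significant Goldfeld-Quandt tests at the 1%, 5% and 10% levels.
p2 <- gq[, 'two.sided']
for (alpha in c(0.01, 0.05, 0.10)) {
  nsig <- sum(p2 < alpha)
  frac <- nsig / length(p2)
  cat(sprintf('%2.0f%% level: %d significant (fraction %.6g) -> %s\n',
              100 * alpha, nsig, frac, if (frac < alpha) 'OK' else 'NOK'))
}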




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ;
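
Before the script below runs, the server environment supplies the data matrix y and the parameters par1 through par5 listed above. A rough stand-in, purely as a sketch (the file name and the exact form in which the server passes y are assumptions):

# Hypothetical stand-in for the server-supplied objects used by the script below.
dat <- read.table('dataseries.txt', col.names = c('Doctor', 'death_rate'))
y <- t(as.matrix(dat))      # one series per row; the script transposes it back
par1 <- '1'                                 # column number of the endogenous series
par2 <- 'Do not include Seasonal Dummies'
par3 <- 'No Linear Trend'
par4 <- ''                                  # no lags of the endogenous series
par5 <- ''                                  # no seasonal lags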
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}