Free Statistics

Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 09 Dec 2015 20:48:24 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2015/Dec/09/t1449694123dtf6jrffby7w2vv.htm/, Retrieved Thu, 16 May 2024 16:59:29 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=285772, Retrieved Thu, 16 May 2024 16:59:29 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 56
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [computing x5 x1] [2015-12-09 20:48:24] [a6adb6e41ce68c761989548559553e3d] [Current]

Dataseries X (first column = X5, the endogenous series; second column = X1):
11 478
11 494
18 643
11 341
9 773
8 603
12 484
13 546
7 424
9 548
13 506
4 819
9 541
11 491
12 514
10 371
12 457
7 437
15 570
15 432
22 619
14 357
20 623
26 547
12 792
9 799
19 439
17 867
21 912
18 462
19 859
14 805
19 652
19 776
16 919
13 732
13 657
14 1419
9 989
13 821
22 1740
17 815
34 760
26 936
23 863
23 783
18 715
15 1504
22 1324
26 940
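
The regression reported below can be reproduced outside this web module with a few lines of plain R. This is only a minimal sketch under the assumption that the first column is named X5 (the endogenous variable) and the second X1, as in the output tables; the module's own, more general wrapper code is listed in full at the bottom of this page.

# Read the two-column series shown above (first column = endogenous variable X5)
mydata <- read.table(text = "
11 478
11 494
18 643
", col.names = c("X5", "X1"))   # paste all 50 rows here in place of the first three
mylm <- lm(X5 ~ X1, data = mydata)
summary(mylm)                   # reproduces the coefficient and fit statistics below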




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Sir Maurice George Kendall' @ kendall.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 5 seconds \tabularnewline
R Server & 'Sir Maurice George Kendall' @ kendall.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=285772&T=0


Multiple Linear Regression - Estimated Regression Equation
X5[t] = 10.6547 + 0.00660946 X1[t] + e[t]
Warning: you did not specify the column number of the endogenous series! The first column was selected by default.

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
X5[t] =  +  10.6547 +  0.00660946X1[t]  + e[t] \tabularnewline
Warning: you did not specify the column number of the endogenous series! The first column was selected by default. \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=285772&T=1
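
The intercept (10.6547) and slope (0.00660946) in the equation above are the OLS coefficient estimates rounded to six significant digits. A minimal sketch of how they can be pulled from a fitted model in R, assuming a model object mylm as produced by the wrapper code at the bottom of this page:

signif(coef(mylm), 6)   # (Intercept) = 10.6547, X1 = 0.00660946, as in the equation above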


Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter   S.D.     T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +10.65      2.169    +4.9120e+00                  1.087e-05        5.433e-06
X1            +0.006609   0.0028   +2.3610e+00                  0.02235          0.01118

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & +10.65 &  2.169 & +4.9120e+00 &  1.087e-05 &  5.433e-06 \tabularnewline
X1 & +0.006609 &  0.0028 & +2.3610e+00 &  0.02235 &  0.01118 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=285772&T=2
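
The columns of this table come straight from summary.lm(): the parameter estimates, their standard deviations (standard errors), the t statistics for H0: parameter = 0, and the 2-tail p-values; the 1-tail p-value is simply half of the 2-tail p-value, which is exactly how the module computes it (see the R code below). A sketch, assuming the fitted object mylm:

mysum <- summary(mylm)
mysum$coefficients            # Estimate, Std. Error, t value, Pr(>|t|) for (Intercept) and X1
mysum$coefficients[, 4] / 2   # 1-tail p-values, last column of the table above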


Multiple Linear Regression - Regression Statistics
Multiple R: 0.3225
R-squared: 0.104
Adjusted R-squared: 0.08535
F-TEST (value): 5.573
F-TEST (DF numerator): 1
F-TEST (DF denominator): 48
p-value: 0.02235
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 5.761
Sum Squared Residuals: 1593

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R &  0.3225 \tabularnewline
R-squared &  0.104 \tabularnewline
Adjusted R-squared &  0.08535 \tabularnewline
F-TEST (value) &  5.573 \tabularnewline
F-TEST (DF numerator) & 1 \tabularnewline
F-TEST (DF denominator) & 48 \tabularnewline
p-value &  0.02235 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation &  5.761 \tabularnewline
Sum Squared Residuals &  1593 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=285772&T=3
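
Every entry in this table can be read off the model summary; the overall p-value is recomputed from the F statistic and its degrees of freedom, as in the module code below. A sketch, assuming mysum <- summary(mylm):

sqrt(mysum$r.squared)     # Multiple R                  (0.3225)
mysum$r.squared           # R-squared                   (0.104)
mysum$adj.r.squared       # Adjusted R-squared          (0.08535)
mysum$fstatistic          # F value, DF numerator, DF denominator (5.573, 1, 48)
1 - pf(mysum$fstatistic[1], mysum$fstatistic[2], mysum$fstatistic[3])   # p-value (0.02235)
mysum$sigma               # Residual Standard Deviation (5.761)
sum(mysum$residuals^2)    # Sum Squared Residuals       (1593)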


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
 1   11   13.81    -2.814
 2   11   13.92    -2.92
 3   18   14.9      3.095
 4   11   12.91    -1.909
 5    9   15.76    -6.764
 6    8   14.64    -6.64
 7   12   13.85    -1.854
 8   13   14.26    -1.263
 9    7   13.46    -6.457
10    9   14.28    -5.277
11   13   14       -0.9991
12    4   16.07   -12.07
13    9   14.23    -5.23
14   11   13.9     -2.9
15   12   14.05    -2.052
16   10   13.11    -3.107
17   12   13.68    -1.675
18    7   13.54    -6.543
19   15   14.42     0.5779
20   15   13.51     1.49
21   22   14.75     7.254
22   14   13.01     0.9858
23   20   14.77     5.228
24   26   14.27    11.73
25   12   15.89    -3.889
26    9   15.94    -6.936
27   19   13.56     5.444
28   17   16.39     0.6149
29   21   16.68     4.317
30   18   13.71     4.292
31   19   16.33     2.668
32   14   15.98    -1.975
33   19   14.96     4.036
34   19   15.78     3.216
35   16   16.73    -0.7288
36   13   15.49    -2.493
37   13   15       -1.997
38   14   20.03    -6.033
39    9   17.19    -8.191
40   13   16.08    -3.081
41   22   22.16    -0.1551
42   17   16.04     0.9586
43   34   15.68    18.32
44   26   16.84     9.159
45   23   16.36     6.641
46   23   15.83     7.17
47   18   15.38     2.62
48   15   20.6     -5.595
49   22   19.41     2.594
50   26   16.87     9.132

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 &  11 &  13.81 & -2.814 \tabularnewline
2 &  11 &  13.92 & -2.92 \tabularnewline
3 &  18 &  14.9 &  3.095 \tabularnewline
4 &  11 &  12.91 & -1.909 \tabularnewline
5 &  9 &  15.76 & -6.764 \tabularnewline
6 &  8 &  14.64 & -6.64 \tabularnewline
7 &  12 &  13.85 & -1.854 \tabularnewline
8 &  13 &  14.26 & -1.263 \tabularnewline
9 &  7 &  13.46 & -6.457 \tabularnewline
10 &  9 &  14.28 & -5.277 \tabularnewline
11 &  13 &  14 & -0.9991 \tabularnewline
12 &  4 &  16.07 & -12.07 \tabularnewline
13 &  9 &  14.23 & -5.23 \tabularnewline
14 &  11 &  13.9 & -2.9 \tabularnewline
15 &  12 &  14.05 & -2.052 \tabularnewline
16 &  10 &  13.11 & -3.107 \tabularnewline
17 &  12 &  13.68 & -1.675 \tabularnewline
18 &  7 &  13.54 & -6.543 \tabularnewline
19 &  15 &  14.42 &  0.5779 \tabularnewline
20 &  15 &  13.51 &  1.49 \tabularnewline
21 &  22 &  14.75 &  7.254 \tabularnewline
22 &  14 &  13.01 &  0.9858 \tabularnewline
23 &  20 &  14.77 &  5.228 \tabularnewline
24 &  26 &  14.27 &  11.73 \tabularnewline
25 &  12 &  15.89 & -3.889 \tabularnewline
26 &  9 &  15.94 & -6.936 \tabularnewline
27 &  19 &  13.56 &  5.444 \tabularnewline
28 &  17 &  16.39 &  0.6149 \tabularnewline
29 &  21 &  16.68 &  4.317 \tabularnewline
30 &  18 &  13.71 &  4.292 \tabularnewline
31 &  19 &  16.33 &  2.668 \tabularnewline
32 &  14 &  15.98 & -1.975 \tabularnewline
33 &  19 &  14.96 &  4.036 \tabularnewline
34 &  19 &  15.78 &  3.216 \tabularnewline
35 &  16 &  16.73 & -0.7288 \tabularnewline
36 &  13 &  15.49 & -2.493 \tabularnewline
37 &  13 &  15 & -1.997 \tabularnewline
38 &  14 &  20.03 & -6.033 \tabularnewline
39 &  9 &  17.19 & -8.191 \tabularnewline
40 &  13 &  16.08 & -3.081 \tabularnewline
41 &  22 &  22.16 & -0.1551 \tabularnewline
42 &  17 &  16.04 &  0.9586 \tabularnewline
43 &  34 &  15.68 &  18.32 \tabularnewline
44 &  26 &  16.84 &  9.159 \tabularnewline
45 &  23 &  16.36 &  6.641 \tabularnewline
46 &  23 &  15.83 &  7.17 \tabularnewline
47 &  18 &  15.38 &  2.62 \tabularnewline
48 &  15 &  20.6 & -5.595 \tabularnewline
49 &  22 &  19.41 &  2.594 \tabularnewline
50 &  26 &  16.87 &  9.132 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=285772&T=4
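
Each 'Interpolation (Forecast)' value is the fitted value of the regression at that index, and each 'Residual (Prediction Error)' is the actual minus the fitted value. A sketch of how the rows can be reconstructed in R, assuming the fitted object mylm:

round(data.frame(Actuals       = fitted(mylm) + resid(mylm),
                 Interpolation = fitted(mylm),
                 Residuals     = resid(mylm)), 4)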


Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis
breakpoint index   greater   2-sided   less
5 0.2814 0.5628 0.7186
6 0.2208 0.4416 0.7792
7 0.1188 0.2376 0.8812
8 0.06312 0.1262 0.9369
9 0.06086 0.1217 0.9391
10 0.03788 0.07576 0.9621
11 0.0217 0.0434 0.9783
12 0.05517 0.1103 0.9448
13 0.03763 0.07526 0.9624
14 0.02208 0.04415 0.9779
15 0.01299 0.02598 0.987
16 0.008531 0.01706 0.9915
17 0.004897 0.009794 0.9951
18 0.007949 0.0159 0.9921
19 0.008959 0.01792 0.991
20 0.007846 0.01569 0.9922
21 0.06305 0.1261 0.9369
22 0.04777 0.09555 0.9522
23 0.07725 0.1545 0.9227
24 0.2939 0.5878 0.7061
25 0.2618 0.5236 0.7382
26 0.3061 0.6122 0.6939
27 0.2864 0.5728 0.7136
28 0.2484 0.4968 0.7516
29 0.2462 0.4925 0.7538
30 0.2097 0.4195 0.7903
31 0.1685 0.337 0.8315
32 0.138 0.2759 0.862
33 0.1096 0.2191 0.8904
34 0.08091 0.1618 0.9191
35 0.05613 0.1123 0.9439
36 0.05118 0.1024 0.9488
37 0.05529 0.1106 0.9447
38 0.04583 0.09167 0.9542
39 0.1432 0.2865 0.8568
40 0.26 0.52 0.74
41 0.2248 0.4496 0.7752
42 0.2941 0.5883 0.7059
43 0.7651 0.4698 0.2349
44 0.7474 0.5052 0.2526
45 0.6025 0.795 0.3975

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
5 &  0.2814 &  0.5628 &  0.7186 \tabularnewline
6 &  0.2208 &  0.4416 &  0.7792 \tabularnewline
7 &  0.1188 &  0.2376 &  0.8812 \tabularnewline
8 &  0.06312 &  0.1262 &  0.9369 \tabularnewline
9 &  0.06086 &  0.1217 &  0.9391 \tabularnewline
10 &  0.03788 &  0.07576 &  0.9621 \tabularnewline
11 &  0.0217 &  0.0434 &  0.9783 \tabularnewline
12 &  0.05517 &  0.1103 &  0.9448 \tabularnewline
13 &  0.03763 &  0.07526 &  0.9624 \tabularnewline
14 &  0.02208 &  0.04415 &  0.9779 \tabularnewline
15 &  0.01299 &  0.02598 &  0.987 \tabularnewline
16 &  0.008531 &  0.01706 &  0.9915 \tabularnewline
17 &  0.004897 &  0.009794 &  0.9951 \tabularnewline
18 &  0.007949 &  0.0159 &  0.9921 \tabularnewline
19 &  0.008959 &  0.01792 &  0.991 \tabularnewline
20 &  0.007846 &  0.01569 &  0.9922 \tabularnewline
21 &  0.06305 &  0.1261 &  0.9369 \tabularnewline
22 &  0.04777 &  0.09555 &  0.9522 \tabularnewline
23 &  0.07725 &  0.1545 &  0.9227 \tabularnewline
24 &  0.2939 &  0.5878 &  0.7061 \tabularnewline
25 &  0.2618 &  0.5236 &  0.7382 \tabularnewline
26 &  0.3061 &  0.6122 &  0.6939 \tabularnewline
27 &  0.2864 &  0.5728 &  0.7136 \tabularnewline
28 &  0.2484 &  0.4968 &  0.7516 \tabularnewline
29 &  0.2462 &  0.4925 &  0.7538 \tabularnewline
30 &  0.2097 &  0.4195 &  0.7903 \tabularnewline
31 &  0.1685 &  0.337 &  0.8315 \tabularnewline
32 &  0.138 &  0.2759 &  0.862 \tabularnewline
33 &  0.1096 &  0.2191 &  0.8904 \tabularnewline
34 &  0.08091 &  0.1618 &  0.9191 \tabularnewline
35 &  0.05613 &  0.1123 &  0.9439 \tabularnewline
36 &  0.05118 &  0.1024 &  0.9488 \tabularnewline
37 &  0.05529 &  0.1106 &  0.9447 \tabularnewline
38 &  0.04583 &  0.09167 &  0.9542 \tabularnewline
39 &  0.1432 &  0.2865 &  0.8568 \tabularnewline
40 &  0.26 &  0.52 &  0.74 \tabularnewline
41 &  0.2248 &  0.4496 &  0.7752 \tabularnewline
42 &  0.2941 &  0.5883 &  0.7059 \tabularnewline
43 &  0.7651 &  0.4698 &  0.2349 \tabularnewline
44 &  0.7474 &  0.5052 &  0.2526 \tabularnewline
45 &  0.6025 &  0.795 &  0.3975 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=285772&T=5
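
Each row reports the p-value of a Goldfeld-Quandt test with the sample split at the given breakpoint index, under each of the three possible alternative hypotheses. The module uses gqtest() from the lmtest package (see the R code below); a sketch for a single breakpoint, assuming the fitted object mylm:

library(lmtest)
sapply(c("greater", "two.sided", "less"),
       function(alt) gqtest(mylm, point = 11, alternative = alt)$p.value)
# breakpoint 11 should give roughly 0.0217, 0.0434 and 0.9783, as in the table above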


Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     1                     0.02439               NOK
5% type I error level     8                     0.195122              NOK
10% type I error level    12                    0.292683              NOK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & \# significant tests & \% significant tests & OK/NOK \tabularnewline
1\% type I error level & 1 &  0.02439 & NOK \tabularnewline
5\% type I error level & 8 & 0.195122 & NOK \tabularnewline
10\% type I error level & 12 & 0.292683 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=285772&T=6
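
The meta analysis counts how many of the 41 two-sided Goldfeld-Quandt p-values above fall below each type I error level; the result is flagged OK only if that proportion stays below the level itself. A sketch, assuming the p-value matrix gqarr from the module code below (its second column holds the 2-sided p-values):

gq2 <- gqarr[, 2]
for (a in c(0.01, 0.05, 0.10))
  cat(sprintf("%.0f%% level: %d significant (%.6f) -> %s\n",
              100 * a, sum(gq2 < a), mean(gq2 < a),
              if (mean(gq2 < a) < a) "OK" else "NOK"))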


Parameters (Session):
par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
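# Parse the user parameters: par1 = column number of the endogenous series (defaults to the first column),
# par4 = number of non-seasonal lags, par5 = number of seasonal (s=12) lags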
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
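# Optional transformation of all columns: first differences, seasonal differences (s=12), or both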
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
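# Append par4 non-seasonal and par5 seasonal (s=12) lags of the endogenous variable as extra regressors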
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
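# Optionally add monthly (M1..M11) or quarterly (Q1..Q3) seasonal dummy variables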
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
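# Goldfeld-Quandt test at every admissible breakpoint (only computed when n > 25);
# the 2-sided p-values are also counted at the 1%, 5% and 10% levels for the meta analysis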
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
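# Diagnostic plots: actuals vs. interpolation, residuals, histogram, density, normal Q-Q,
# lag plot, ACF, PACF and the standard lm() diagnostics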
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
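# Assemble the output tables shown above (estimated equation, OLS estimates, regression statistics,
# actuals/residuals, Goldfeld-Quandt results and meta analysis)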
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}