Free Statistics Software (FreeStatistics.org)

Author: (verified by FreeStatistics.org)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Fri, 18 Dec 2015 16:32:00 +0000

Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2015/Dec/18/t1450456401xltg4t6ia4mufcn.htm/, Retrieved Thu, 16 May 2024 07:46:31 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=286908, Retrieved Thu, 16 May 2024 07:46:31 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 95
History (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data):
- [Multiple Regression] [Multiple regressi...] [2015-12-18 16:32:00] [9174db105c4509bef9d2565aabf3c4bd] [Current]
Dataseries X:
8 78 284 9.100000381 109 0.274647887
9.300000191 68 433 8.699999809 144 0.15704388
7.5 70 739 7.199999809 113 0.094722598
8.899999619 96 1792 8.899999619 97 0.053571429
10.19999981 74 477 8.300000191 206 0.155136268
8.300000191 111 362 10.89999962 124 0.306629834
8.800000191 77 671 10 152 0.114754098
8.800000191 168 636 9.100000381 162 0.264150943
10.69999981 82 329 8.699999809 150 0.249240122
11.69999981 89 634 7.599999905 134 0.140378549
8.5 149 631 10.80000019 292 0.236133122
8.300000191 60 257 9.5 108 0.233463035
8.199999809 96 284 8.800000191 111 0.338028169
7.900000095 83 603 9.5 182 0.137645108
10.30000019 130 686 8.699999809 129 0.189504373
7.400000095 145 345 11.19999981 158 0.420289855
9.600000381 112 1357 9.699999809 186 0.082535004
9.300000191 131 544 9.600000381 177 0.240808824
10.60000038 80 205 9.100000381 127 0.390243902
9.699999809 130 1264 9.199999809 179 0.102848101
11.60000038 140 688 8.300000191 80 0.203488372
8.100000381 154 354 8.399999619 103 0.435028249
9.800000191 118 1632 9.399999619 101 0.072303922
7.400000095 94 348 9.800000191 117 0.270114943
9.399999619 119 370 10.39999962 88 0.321621622
11.19999981 153 648 9.899999619 78 0.236111111
9.100000381 116 366 9.199999809 102 0.316939891
10.5 97 540 10.30000019 95 0.17962963
11.89999962 176 680 8.899999619 80 0.258823529
8.399999619 75 345 9.600000381 92 0.217391304
5 134 525 10.30000019 126 0.255238095
9.800000191 161 870 10.39999962 108 0.185057471
9.800000191 111 669 9.699999809 77 0.165919283
10.80000019 114 452 9.600000381 60 0.252212389
10.10000038 142 430 10.69999981 71 0.330232558
10.89999962 238 822 10.30000019 86 0.289537713
9.199999809 78 190 10.69999981 93 0.410526316
8.300000191 196 867 9.600000381 106 0.226066897
7.300000191 125 969 10.5 162 0.128998968
9.399999619 82 499 7.699999809 95 0.164328657
9.399999619 125 925 10.19999981 91 0.135135135
9.800000191 129 353 9.899999619 52 0.365439093
3.599999905 84 288 8.399999619 110 0.291666667
8.399999619 183 718 10.39999962 69 0.254874652
10.80000019 119 540 9.199999809 57 0.22037037
10.10000038 180 668 13 106 0.269461078
9 82 347 8.800000191 40 0.236311239
10 71 345 9.199999809 50 0.205797101
11.30000019 118 463 7.800000191 35 0.254859611
11.30000019 121 728 8.199999809 86 0.166208791
12.80000019 68 383 7.400000095 57 0.177545692
10 112 316 10.39999962 57 0.35443038
6.699999809 109 388 8.899999619 94 0.280927835
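The six whitespace-separated columns of the "Dataseries X" block correspond to the variables named in the regression output further down: death_rate, Doctor, Hospital, income, Population, and `ratio_d/z`. A minimal parsing sketch (illustrative Python, not part of the original R module; only the first two rows are shown):

```python
# Parse the whitespace-separated dataset into named records.
# Column order follows the variable names used in the regression output.
raw = """8 78 284 9.100000381 109 0.274647887
9.300000191 68 433 8.699999809 144 0.15704388"""

names = ["death_rate", "Doctor", "Hospital", "income", "Population", "ratio_d/z"]
rows = [dict(zip(names, map(float, line.split()))) for line in raw.splitlines()]
print(rows[0]["death_rate"], rows[1]["Hospital"])  # 8.0 433.0
```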




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'George Udny Yule' @ yule.wessa.net

Source: https://freestatistics.org/blog/index.php?pk=286908&T=0








Multiple Linear Regression - Estimated Regression Equation
death_rate[t] = 13.1812 + 0.0132914 Doctor[t] - 0.00082302 Hospital[t] - 0.244041 income[t] - 0.01043 Population[t] - 6.3547 `ratio_d/z`[t] + e[t]

Source: https://freestatistics.org/blog/index.php?pk=286908&T=1

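Plugging the first data row into the estimated equation reproduces the first interpolated value (8.881) reported in the Actuals table below. A quick arithmetic check (illustrative Python, not part of the original R module):

```python
# Re-compute the fitted value for observation 1 from the reported coefficients.
b = {"(Intercept)": 13.1812, "Doctor": 0.0132914, "Hospital": -0.00082302,
     "income": -0.244041, "Population": -0.01043, "ratio_d/z": -6.3547}
row1 = {"Doctor": 78, "Hospital": 284, "income": 9.100000381,
        "Population": 109, "ratio_d/z": 0.274647887}

fit1 = b["(Intercept)"] + sum(b[k] * row1[k] for k in row1)
print(round(fit1, 3))  # 8.881, matching the first Interpolation value
```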







Multiple Linear Regression - Ordinary Least Squares
Variable       Parameter   S.D.       T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)    +13.18      2.108      +6.2530                      1.112e-07        5.562e-08
Doctor         +0.01329    0.008087   +1.6440                      0.1069           0.05347
Hospital       -0.000823   0.001244   -0.66160                     0.5114           0.2557
income         -0.244      0.2406     -1.0140                      0.3156           0.1578
Population     -0.01043    0.004891   -2.1320                      0.03824          0.01912
`ratio_d/z`    -6.355      4.598      -1.3820                      0.1735           0.08673

Source: https://freestatistics.org/blog/index.php?pk=286908&T=2

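Each T-STAT in the OLS table is the parameter estimate divided by its standard deviation, and each 1-tail p-value is half the 2-tail p-value. A spot-check from the rounded table entries (illustrative Python; small discrepancies come from rounding):

```python
# T-STAT = Parameter / S.D.; verify two rows of the OLS table.
t_intercept = 13.18 / 2.108          # table reports +6.2530
t_population = -0.01043 / 0.004891   # table reports -2.1320
one_tail_population = 0.03824 / 2    # table reports 0.01912
print(round(t_intercept, 3), round(t_population, 3), one_tail_population)
```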







Multiple Linear Regression - Regression Statistics
Multiple R:              0.4209
R-squared:               0.1771
Adjusted R-squared:      0.08961
F-TEST (value):          2.024
F-TEST (DF numerator):   5
F-TEST (DF denominator): 47
p-value:                 0.09247

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1.586
Sum Squared Residuals:       118.3

Source: https://freestatistics.org/blog/index.php?pk=286908&T=3

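The regression statistics are internally consistent: with n = 53 observations and k = 5 regressors (hence 47 denominator degrees of freedom), Multiple R, Adjusted R-squared, and the F statistic all follow from R-squared. An illustrative re-computation in Python (tiny differences come from the rounded R-squared):

```python
import math

# Derive the other summary statistics from R-squared, n, and k.
n, k = 53, 5
r2 = 0.1771

multiple_r = math.sqrt(r2)                         # table: 0.4209
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)      # table: 0.08961
f_value = (r2 / k) / ((1 - r2) / (n - k - 1))      # table: 2.024
print(round(multiple_r, 4), round(adj_r2, 4), round(f_value, 3))
```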







Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1    8      8.881   -0.8813
2    9.3    9.106    0.1944
3    7.5    9.966   -2.466
4    8.9    9.458   -0.5583
5   10.2    8.612    1.588
6    8.3    8.457   -0.1567
7    8.8    8.897   -0.09743
8    8.8    9.302   -0.5017
9   10.7    8.729    1.971
10  11.7    9.698    2.002
11   8.5    7.461    1.039
12   8.3    8.839   -0.5388
13   8.2    8.77    -0.5701
14   7.9    8.697   -0.7968
15  10.3    9.672    0.6283
16   7.4    7.773   -0.3725
17   9.6    8.721    0.8786
18   9.3    8.756    0.5445
19  10.6    8.051    2.549
20   9.7    9.103    0.5969
21  11.6   10.32     1.277
22   8.1    9.6      0.2004
23   9.8    9.6      0.2004
24   7.4    8.816   -1.416
25   9.4    8.959    0.4413
26  11.2    9.952    1.248
27   9.1    9.099    0.001285
28  10.5    9.38     1.12
29  11.9   10.31     1.59
30   8.4    9.21    -0.8103
31   5      9.08    -4.08
32   9.8    9.765    0.03533
33   9.8    9.881   -0.0813
34  10.8    9.753    1.047
35  10.1    9.264    0.8356
36  10.9   10.42     0.4825
37   9.2    7.872    1.328
38   8.3   10.19    -1.888
39   7.3    8.973   -1.673
40   9.4    9.946   -0.5462
41   9.4    9.784   -0.3843
42   9.8    9.325    0.4753
43   3.6    9.01    -5.41
44   8.4   10.15    -1.745
45  10.8   10.08     0.7216
46  10.1    9.033    1.067
47   9      9.919   -0.9191
48  10      9.767    0.2335
49  11.3   10.48     0.8196
50  11.3   10.24     1.064
51  12.8   10.24     2.559
52  10      9.025    0.975
53   6.7    9.373   -2.673

Source: https://freestatistics.org/blog/index.php?pk=286908&T=4

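Each Residual in the table above is simply Actual minus Interpolation. A check of the first and last rows (illustrative Python; the 0.0003 discrepancy in row 1 reflects the rounding of the interpolated value):

```python
# Residual = Actual - Interpolation; verify the first and last table rows.
checks = [(8.0, 8.881, -0.8813),   # row 1
          (6.7, 9.373, -2.673)]    # row 53
for actual, fit, resid in checks:
    print(round(actual - fit, 3), resid)
```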







Goldfeld-Quandt test for Heteroskedasticity
p-values for each Alternative Hypothesis
breakpoint index   greater   2-sided   less
9 0.1697 0.3393 0.8303
10 0.5591 0.8819 0.4409
11 0.5445 0.9109 0.4555
12 0.4173 0.8345 0.5827
13 0.3246 0.6492 0.6754
14 0.2684 0.5368 0.7316
15 0.228 0.456 0.772
16 0.1557 0.3114 0.8443
17 0.113 0.2261 0.887
18 0.08816 0.1763 0.9118
19 0.1636 0.3273 0.8364
20 0.1659 0.3319 0.8341
21 0.2246 0.4493 0.7754
22 0.2361 0.4723 0.7639
23 0.191 0.3819 0.809
24 0.1615 0.3229 0.8385
25 0.1368 0.2736 0.8632
26 0.1395 0.279 0.8605
27 0.1186 0.2372 0.8814
28 0.1169 0.2337 0.8831
29 0.1393 0.2786 0.8607
30 0.1031 0.2062 0.8969
31 0.3378 0.6757 0.6622
32 0.2624 0.5249 0.7376
33 0.1983 0.3967 0.8017
34 0.1584 0.3169 0.8416
35 0.12 0.2401 0.88
36 0.112 0.2241 0.888
37 0.1594 0.3189 0.8406
38 0.1348 0.2696 0.8652
39 0.1048 0.2097 0.8952
40 0.07277 0.1455 0.9272
41 0.09223 0.1845 0.9078
42 0.0625 0.125 0.9375
43 0.2114 0.4229 0.7886
44 0.2173 0.4346 0.7827

Source: https://freestatistics.org/blog/index.php?pk=286908&T=5








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     0                     0                     OK
5% type I error level     0                     0                     OK
10% type I error level    0                     0                     OK

Source: https://freestatistics.org/blog/index.php?pk=286908&T=6

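The meta analysis simply counts how many of the 2-sided Goldfeld-Quandt p-values fall below each type I error level; since the smallest 2-sided p-value in the table is 0.125, every count is zero and heteroskedasticity is not flagged. An illustrative re-count in Python using the 2-sided column from the table above:

```python
# Count 2-sided Goldfeld-Quandt p-values below each type I error level.
p2 = [0.3393, 0.8819, 0.9109, 0.8345, 0.6492, 0.5368, 0.456, 0.3114, 0.2261,
      0.1763, 0.3273, 0.3319, 0.4493, 0.4723, 0.3819, 0.3229, 0.2736, 0.279,
      0.2372, 0.2337, 0.2786, 0.2062, 0.6757, 0.5249, 0.3967, 0.3169, 0.2401,
      0.2241, 0.3189, 0.2696, 0.2097, 0.1455, 0.1845, 0.125, 0.4229, 0.4346]

counts = {level: sum(p < level for p in p2) for level in (0.01, 0.05, 0.10)}
print(counts)  # all zero, matching the OK verdicts in the meta analysis
```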



Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3],lower.tail=FALSE),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
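# Added annotation: the two residual statistics above are linked by
# RSS = sigma^2 * df.residual, i.e. sum(myerror^2) should equal
# mysum$sigma^2 * mylm$df.residual up to rounding.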
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i,1]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,formatC(signif(numsignificant5/numgqtests,6),format='g',flag=' '))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,formatC(signif(numsignificant10/numgqtests,6),format='g',flag=' '))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}