Free Statistics

of Irreproducible Research!

Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sun, 14 Dec 2014 08:22:26 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/14/t1418546556kx0r1yt3l560ezk.htm/, Retrieved Thu, 16 May 2024 18:26:02 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=267332, Retrieved Thu, 16 May 2024 18:26:02 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 114
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [Competence to learn] [2010-11-17 07:43:53] [b98453cac15ba1066b407e146608df68]
- RMPD    [Multiple Regression] [multiple regression] [2014-12-14 08:22:26] [ba449e08135e498de67ac1fe8477f1b8] [Current]
Dataseries X:
13 12 13 21 12.9
8 8 13 22 12.2
14 11 11 22 12.8
16 13 14 18 7.4
14 11 15 23 6.7
13 10 14 12 12.6
15 7 11 20 14.8
13 10 13 22 13.3
20 15 16 21 11.1
17 12 14 19 8.2
15 12 14 22 11.4
16 10 15 15 6.4
12 10 15 20 10.6
17 14 13 19 12
11 6 14 18 6.3
16 12 11 15 11.3
16 14 12 20 11.9
15 11 14 21 9.3
13 8 13 21 9.6
14 12 12 15 10
19 15 15 16 6.4
16 13 15 23 13.8
17 11 14 21 10.8
10 12 14 18 13.8
15 7 12 25 11.7
14 11 12 9 10.9
14 7 12 30 16.1
16 12 15 20 13.4
15 12 14 23 9.9
17 13 16 16 11.5
14 9 12 16 8.3
16 11 12 19 11.7
15 12 14 25 9
16 15 16 18 9.7
16 12 15 23 10.8
10 6 12 21 10.3
8 5 14 10 10.4
17 13 13 14 12.7
14 11 14 22 9.3
10 6 16 26 11.8
14 12 12 23 5.9
12 10 14 23 11.4
16 6 15 24 13
16 12 13 24 10.8
16 11 16 18 12.3
8 6 16 23 11.3
16 12 12 15 11.8
15 12 12 19 7.9
8 8 16 16 12.7
13 10 12 25 12.3
14 11 15 23 11.6
13 7 12 17 6.7
16 12 13 19 10.9
19 13 12 21 12.1
19 14 14 18 13.3
14 12 14 27 10.1
15 6 11 21 5.7
13 14 10 13 14.3
10 10 12 8 8
16 12 11 29 13.3
15 11 16 28 9.3
11 10 14 23 12.5
9 7 14 21 7.6
16 12 15 19 15.9
12 7 15 19 9.2
12 12 14 20 9.1
14 12 13 18 11.1
14 10 11 19 13
13 10 16 17 14.5
15 12 12 19 12.2
17 12 15 25 12.3
14 12 14 19 11.4
11 8 15 22 8.8
9 10 14 23 14.6
7 5 13 14 12.6
15 10 12 16 13
12 12 12 24 12.6
15 11 14 20 13.2
14 9 14 12 9.9
16 12 15 24 7.7
14 11 11 22 10.5
13 10 13 12 13.4
16 12 14 22 10.9
13 10 16 20 4.3
16 9 13 10 10.3
16 11 14 23 11.8
16 12 16 17 11.2
10 7 11 22 11.4
12 11 13 24 8.6
12 12 13 18 13.2
12 6 15 21 12.6
12 9 12 20 5.6
19 15 13 20 9.9
14 10 12 22 8.8
13 11 14 19 7.7
16 12 14 20 9
15 12 16 26 7.3
12 12 15 23 11.4
8 11 14 24 13.6
10 9 13 21 7.9
16 11 14 21 10.7
16 12 15 19 10.3
10 12 14 8 8.3
18 14 12 17 9.6
12 8 7 20 14.2
16 10 12 11 8.5
10 9 15 8 13.5
14 10 12 15 4.9
12 9 13 18 6.4
11 10 11 18 9.6
15 12 14 19 11.6
7 11 13 19 11.1
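
The five columns of Dataseries X are not labelled on this page. Judging from the estimated regression equation further down (and from par1 = 5, which moves the fifth column into the dependent-variable position), the columns appear to be CONFSTATTOT, CONFSOFTTOT, STRESSTOT, NUMERACYTOT and TOT, in that order; this mapping is an assumption of the sketch below, not something stated explicitly in the original output. A minimal way to get the series into R:

# Minimal sketch for loading Dataseries X; the column names are assumed
# (fifth column = TOT, the dependent variable of the regression below).
dat <- read.table(text = "
13 12 13 21 12.9
8 8 13 22 12.2
14 11 11 22 12.8
", col.names = c("CONFSTATTOT", "CONFSOFTTOT", "STRESSTOT", "NUMERACYTOT", "TOT"))
# ...paste the remaining rows of the series into the text argument above.
str(dat)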




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: 'Herman Ole Andreas Wold' @ wold.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 6 seconds \tabularnewline
R Server & 'Herman Ole Andreas Wold' @ wold.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=267332&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]6 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Herman Ole Andreas Wold' @ wold.wessa.net[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=267332&T=0








Multiple Linear Regression - Estimated Regression Equation
TOT[t] = 11.5055 - 0.0974805 CONFSTATTOT[t] + 0.125981 CONFSOFTTOT[t] - 0.148422 STRESSTOT[t] + 0.0611593 NUMERACYTOT[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
TOT[t] =  +  11.5055 -0.0974805CONFSTATTOT[t] +  0.125981CONFSOFTTOT[t] -0.148422STRESSTOT[t] +  0.0611593NUMERACYTOT[t]  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=267332&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]TOT[t] =  +  11.5055 -0.0974805CONFSTATTOT[t] +  0.125981CONFSOFTTOT[t] -0.148422STRESSTOT[t] +  0.0611593NUMERACYTOT[t]  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=267332&T=1
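
The same equation can be reproduced with base R's lm(). A rough sketch, assuming the data frame dat (with the guessed column names) from the sketch near the top of the page:

# Sketch: refit the model behind the equation above (column names are assumed).
fit <- lm(TOT ~ CONFSTATTOT + CONFSOFTTOT + STRESSTOT + NUMERACYTOT, data = dat)
signif(coef(fit), 6)  # with the full 112 rows this should reproduce 11.5055, -0.0974805, ...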








Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter    S.D.        T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   11.5055      2.4074      4.779                        5.63707e-06      2.81853e-06
CONFSTATTOT   -0.0974805   0.108437    -0.899                       0.370694         0.185347
CONFSOFTTOT   0.125981     0.131279    0.9596                       0.3394           0.1697
STRESSTOT     -0.148422    0.147659    -1.005                       0.317084         0.158542
NUMERACYTOT   0.0611593    0.0543454   1.125                        0.262945         0.131473

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 11.5055 & 2.4074 & 4.779 & 5.63707e-06 & 2.81853e-06 \tabularnewline
CONFSTATTOT & -0.0974805 & 0.108437 & -0.899 & 0.370694 & 0.185347 \tabularnewline
CONFSOFTTOT & 0.125981 & 0.131279 & 0.9596 & 0.3394 & 0.1697 \tabularnewline
STRESSTOT & -0.148422 & 0.147659 & -1.005 & 0.317084 & 0.158542 \tabularnewline
NUMERACYTOT & 0.0611593 & 0.0543454 & 1.125 & 0.262945 & 0.131473 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=267332&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]11.5055[/C][C]2.4074[/C][C]4.779[/C][C]5.63707e-06[/C][C]2.81853e-06[/C][/ROW]
[ROW][C]CONFSTATTOT[/C][C]-0.0974805[/C][C]0.108437[/C][C]-0.899[/C][C]0.370694[/C][C]0.185347[/C][/ROW]
[ROW][C]CONFSOFTTOT[/C][C]0.125981[/C][C]0.131279[/C][C]0.9596[/C][C]0.3394[/C][C]0.1697[/C][/ROW]
[ROW][C]STRESSTOT[/C][C]-0.148422[/C][C]0.147659[/C][C]-1.005[/C][C]0.317084[/C][C]0.158542[/C][/ROW]
[ROW][C]NUMERACYTOT[/C][C]0.0611593[/C][C]0.0543454[/C][C]1.125[/C][C]0.262945[/C][C]0.131473[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=267332&T=2
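
The columns of this table map directly onto summary(lm(...)): Parameter and S.D. are the estimate and its standard error, T-STAT is the t value, and the 1-tail p-value is half of the 2-tail p-value (which is exactly how the module code at the bottom of this page computes it). A short sketch, assuming the fit object from the sketch above:

s <- summary(fit)
s$coefficients           # Estimate, Std. Error, t value, Pr(>|t|) = Parameter, S.D., T-STAT, 2-tail p-value
s$coefficients[, 4] / 2  # 1-tail p-values, as in the last column of the table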








Multiple Linear Regression - Regression Statistics
Multiple R: 0.159556
R-squared: 0.0254582
Adjusted R-squared: -0.0109733
F-TEST (value): 0.698796
F-TEST (DF numerator): 4
F-TEST (DF denominator): 107
p-value: 0.59444
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 2.48486
Sum Squared Residuals: 660.676

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.159556 \tabularnewline
R-squared & 0.0254582 \tabularnewline
Adjusted R-squared & -0.0109733 \tabularnewline
F-TEST (value) & 0.698796 \tabularnewline
F-TEST (DF numerator) & 4 \tabularnewline
F-TEST (DF denominator) & 107 \tabularnewline
p-value & 0.59444 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 2.48486 \tabularnewline
Sum Squared Residuals & 660.676 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=267332&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.159556[/C][/ROW]
[ROW][C]R-squared[/C][C]0.0254582[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]-0.0109733[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]0.698796[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]4[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]107[/C][/ROW]
[ROW][C]p-value[/C][C]0.59444[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]2.48486[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]660.676[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=267332&T=3
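
All of these summary statistics can be pulled from the same summary object; the sketch below (again assuming fit from above) mirrors how the module code at the bottom of this page computes them:

s <- summary(fit)
signif(sqrt(s$r.squared), 6)                                          # Multiple R
signif(s$r.squared, 6)                                                # R-squared
signif(s$adj.r.squared, 6)                                            # Adjusted R-squared
signif(s$fstatistic, 6)                                               # F value, DF numerator, DF denominator
signif(1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3]), 6)  # p-value of the F test
signif(s$sigma, 6)                                                    # Residual Standard Deviation
signif(sum(residuals(fit)^2), 6)                                      # Sum Squared Residuals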








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1   12.9   11.1049   1.79509
2   12.2   11.1495   1.05045
3   12.8   11.2395   1.56055
4   7.4   10.6065   -3.20655
5   6.7   10.7069   -4.00692
6   12.6   10.1541   2.44591
7   14.8   10.5157   4.28427
8   13.3   10.9141   2.38589
9   11.1   10.3552   0.744778
10   8.2   10.4442   -2.24425
11   11.4   10.8227   0.577315
12   6.4   9.89671   -3.49671
13   10.6   10.5924   0.00757606
14   12   10.8446   1.15537
15   6.3   10.2121   -3.91208
16   11.3   10.7424   0.557645
17   11.9   11.1517   0.748308
18   9.3   10.6355   -1.33554
19   9.6   10.601   -1.00098
20   10   10.7889   -0.788894
21   6.4   10.2953   -3.89533
22   13.8   10.7639   3.03608
23   10.8   10.4406   0.359416
24   13.8   11.0655   2.73455
25   11.7   10.6731   1.0269
26   10.9   10.296   0.604043
27   16.1   11.0764   5.02362
28   13.4   10.4545   2.94554
29   9.9   10.8838   -0.983844
30   11.5   10.0899   1.4101
31   8.3   10.4721   -2.17211
32   11.7   10.7126   0.98741
33   9   11.0062   -2.00616
34   9.7   10.5617   -0.861666
35   10.8   10.6379   0.162058
36   10.3   10.7899   -0.489887
37   10.4   9.88927   0.51073
38   12.7   10.4129   2.28715
39   9.3   10.7942   -1.49418
40   11.8   10.502   1.298
41   5.9   11.2782   -5.37817
42   11.4   10.9243   0.475676
43   13   9.94322   3.05678
44   10.8   10.9959   -0.195945
45   12.3   10.0577   2.24226
46   11.3   10.5135   0.786522
47   11.8   10.5939   1.20607
48   7.9   10.9361   -3.03605
49   12.7   10.3373   2.36268
50   12.3   11.246   1.05399
51   11.6   10.7069   0.893078
52   6.7   10.3788   -3.67879
53   10.9   10.6901   0.209852
54   12.1   10.7944   1.30557
55   13.3   10.4401   2.85991
56   10.1   11.226   -1.12596
57   5.7   10.4509   -4.75091
58   14.3   11.3129   2.98714
59   8   10.4987   -2.49874
60   13.3   11.5986   1.70141
61   9.3   10.7668   -1.46682
62   12.5   11.0218   1.4782
63   7.6   10.7165   -3.1165
64   15.9   10.3933   5.5067
65   9.2   10.1533   -0.953322
66   9.1   10.9928   -1.89281
67   11.1   10.824   0.27605
68   13   10.93   2.07001
69   14.5   10.163   4.33696
70   12.2   10.9361   1.26395
71   12.3   10.6628   1.63722
72   11.4   10.7367   0.663313
73   8.8   10.5603   -1.76026
74   14.6   11.2168   3.38323
75   12.6   10.3798   2.22019
76   13   10.5006   2.49939
77   12.6   11.5343   1.06571
78   13.2   10.5744   2.62561
79   9.9   9.93063   -0.0306292
80   7.7   10.6991   -2.9991
81   10.5   11.2395   -0.739451
82   13.4   10.3025   3.09749
83   10.9   10.7252   0.174796
84   4.3   10.3465   -6.04652
85   10.3   9.76177   0.538228
86   11.8   10.6604   1.13962
87   11.2   10.1226   1.07744
88   11.4   11.1254   0.274551
89   8.6   11.2599   -2.65989
90   13.2   11.0189   2.18109
91   12.6   10.1497   2.45034
92   5.6   10.9117   -5.31171
93   9.9   10.8368   -0.936809
94   8.8   10.965   -2.16505
95   7.7   10.7082   -3.00819
96   9   10.6029   -1.60289
97   7.3   10.7705   -3.47048
98   11.4   11.0279   0.372136
99   13.6   11.5014   2.09861
100   7.9   11.0194   -3.11941
101   10.7   10.5381   0.161936
102   10.3   10.3933   -0.0933043
103   8.3   10.4539   -2.15386
104   9.6   10.7733   -1.17325
105   14.2   11.5278   2.67216
106   8.5   10.0973   -1.59733
107   13.5   9.92749   3.57251
108   4.9   10.5369   -5.63693
109   6.4   10.641   -4.24097
110   9.6   11.1613   -1.56127
111   11.6   10.6392   0.960793
112   11.1   11.4415   -0.341492

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 12.9 & 11.1049 & 1.79509 \tabularnewline
2 & 12.2 & 11.1495 & 1.05045 \tabularnewline
3 & 12.8 & 11.2395 & 1.56055 \tabularnewline
4 & 7.4 & 10.6065 & -3.20655 \tabularnewline
5 & 6.7 & 10.7069 & -4.00692 \tabularnewline
6 & 12.6 & 10.1541 & 2.44591 \tabularnewline
7 & 14.8 & 10.5157 & 4.28427 \tabularnewline
8 & 13.3 & 10.9141 & 2.38589 \tabularnewline
9 & 11.1 & 10.3552 & 0.744778 \tabularnewline
10 & 8.2 & 10.4442 & -2.24425 \tabularnewline
11 & 11.4 & 10.8227 & 0.577315 \tabularnewline
12 & 6.4 & 9.89671 & -3.49671 \tabularnewline
13 & 10.6 & 10.5924 & 0.00757606 \tabularnewline
14 & 12 & 10.8446 & 1.15537 \tabularnewline
15 & 6.3 & 10.2121 & -3.91208 \tabularnewline
16 & 11.3 & 10.7424 & 0.557645 \tabularnewline
17 & 11.9 & 11.1517 & 0.748308 \tabularnewline
18 & 9.3 & 10.6355 & -1.33554 \tabularnewline
19 & 9.6 & 10.601 & -1.00098 \tabularnewline
20 & 10 & 10.7889 & -0.788894 \tabularnewline
21 & 6.4 & 10.2953 & -3.89533 \tabularnewline
22 & 13.8 & 10.7639 & 3.03608 \tabularnewline
23 & 10.8 & 10.4406 & 0.359416 \tabularnewline
24 & 13.8 & 11.0655 & 2.73455 \tabularnewline
25 & 11.7 & 10.6731 & 1.0269 \tabularnewline
26 & 10.9 & 10.296 & 0.604043 \tabularnewline
27 & 16.1 & 11.0764 & 5.02362 \tabularnewline
28 & 13.4 & 10.4545 & 2.94554 \tabularnewline
29 & 9.9 & 10.8838 & -0.983844 \tabularnewline
30 & 11.5 & 10.0899 & 1.4101 \tabularnewline
31 & 8.3 & 10.4721 & -2.17211 \tabularnewline
32 & 11.7 & 10.7126 & 0.98741 \tabularnewline
33 & 9 & 11.0062 & -2.00616 \tabularnewline
34 & 9.7 & 10.5617 & -0.861666 \tabularnewline
35 & 10.8 & 10.6379 & 0.162058 \tabularnewline
36 & 10.3 & 10.7899 & -0.489887 \tabularnewline
37 & 10.4 & 9.88927 & 0.51073 \tabularnewline
38 & 12.7 & 10.4129 & 2.28715 \tabularnewline
39 & 9.3 & 10.7942 & -1.49418 \tabularnewline
40 & 11.8 & 10.502 & 1.298 \tabularnewline
41 & 5.9 & 11.2782 & -5.37817 \tabularnewline
42 & 11.4 & 10.9243 & 0.475676 \tabularnewline
43 & 13 & 9.94322 & 3.05678 \tabularnewline
44 & 10.8 & 10.9959 & -0.195945 \tabularnewline
45 & 12.3 & 10.0577 & 2.24226 \tabularnewline
46 & 11.3 & 10.5135 & 0.786522 \tabularnewline
47 & 11.8 & 10.5939 & 1.20607 \tabularnewline
48 & 7.9 & 10.9361 & -3.03605 \tabularnewline
49 & 12.7 & 10.3373 & 2.36268 \tabularnewline
50 & 12.3 & 11.246 & 1.05399 \tabularnewline
51 & 11.6 & 10.7069 & 0.893078 \tabularnewline
52 & 6.7 & 10.3788 & -3.67879 \tabularnewline
53 & 10.9 & 10.6901 & 0.209852 \tabularnewline
54 & 12.1 & 10.7944 & 1.30557 \tabularnewline
55 & 13.3 & 10.4401 & 2.85991 \tabularnewline
56 & 10.1 & 11.226 & -1.12596 \tabularnewline
57 & 5.7 & 10.4509 & -4.75091 \tabularnewline
58 & 14.3 & 11.3129 & 2.98714 \tabularnewline
59 & 8 & 10.4987 & -2.49874 \tabularnewline
60 & 13.3 & 11.5986 & 1.70141 \tabularnewline
61 & 9.3 & 10.7668 & -1.46682 \tabularnewline
62 & 12.5 & 11.0218 & 1.4782 \tabularnewline
63 & 7.6 & 10.7165 & -3.1165 \tabularnewline
64 & 15.9 & 10.3933 & 5.5067 \tabularnewline
65 & 9.2 & 10.1533 & -0.953322 \tabularnewline
66 & 9.1 & 10.9928 & -1.89281 \tabularnewline
67 & 11.1 & 10.824 & 0.27605 \tabularnewline
68 & 13 & 10.93 & 2.07001 \tabularnewline
69 & 14.5 & 10.163 & 4.33696 \tabularnewline
70 & 12.2 & 10.9361 & 1.26395 \tabularnewline
71 & 12.3 & 10.6628 & 1.63722 \tabularnewline
72 & 11.4 & 10.7367 & 0.663313 \tabularnewline
73 & 8.8 & 10.5603 & -1.76026 \tabularnewline
74 & 14.6 & 11.2168 & 3.38323 \tabularnewline
75 & 12.6 & 10.3798 & 2.22019 \tabularnewline
76 & 13 & 10.5006 & 2.49939 \tabularnewline
77 & 12.6 & 11.5343 & 1.06571 \tabularnewline
78 & 13.2 & 10.5744 & 2.62561 \tabularnewline
79 & 9.9 & 9.93063 & -0.0306292 \tabularnewline
80 & 7.7 & 10.6991 & -2.9991 \tabularnewline
81 & 10.5 & 11.2395 & -0.739451 \tabularnewline
82 & 13.4 & 10.3025 & 3.09749 \tabularnewline
83 & 10.9 & 10.7252 & 0.174796 \tabularnewline
84 & 4.3 & 10.3465 & -6.04652 \tabularnewline
85 & 10.3 & 9.76177 & 0.538228 \tabularnewline
86 & 11.8 & 10.6604 & 1.13962 \tabularnewline
87 & 11.2 & 10.1226 & 1.07744 \tabularnewline
88 & 11.4 & 11.1254 & 0.274551 \tabularnewline
89 & 8.6 & 11.2599 & -2.65989 \tabularnewline
90 & 13.2 & 11.0189 & 2.18109 \tabularnewline
91 & 12.6 & 10.1497 & 2.45034 \tabularnewline
92 & 5.6 & 10.9117 & -5.31171 \tabularnewline
93 & 9.9 & 10.8368 & -0.936809 \tabularnewline
94 & 8.8 & 10.965 & -2.16505 \tabularnewline
95 & 7.7 & 10.7082 & -3.00819 \tabularnewline
96 & 9 & 10.6029 & -1.60289 \tabularnewline
97 & 7.3 & 10.7705 & -3.47048 \tabularnewline
98 & 11.4 & 11.0279 & 0.372136 \tabularnewline
99 & 13.6 & 11.5014 & 2.09861 \tabularnewline
100 & 7.9 & 11.0194 & -3.11941 \tabularnewline
101 & 10.7 & 10.5381 & 0.161936 \tabularnewline
102 & 10.3 & 10.3933 & -0.0933043 \tabularnewline
103 & 8.3 & 10.4539 & -2.15386 \tabularnewline
104 & 9.6 & 10.7733 & -1.17325 \tabularnewline
105 & 14.2 & 11.5278 & 2.67216 \tabularnewline
106 & 8.5 & 10.0973 & -1.59733 \tabularnewline
107 & 13.5 & 9.92749 & 3.57251 \tabularnewline
108 & 4.9 & 10.5369 & -5.63693 \tabularnewline
109 & 6.4 & 10.641 & -4.24097 \tabularnewline
110 & 9.6 & 11.1613 & -1.56127 \tabularnewline
111 & 11.6 & 10.6392 & 0.960793 \tabularnewline
112 & 11.1 & 11.4415 & -0.341492 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=267332&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]12.9[/C][C]11.1049[/C][C]1.79509[/C][/ROW]
[ROW][C]2[/C][C]12.2[/C][C]11.1495[/C][C]1.05045[/C][/ROW]
[ROW][C]3[/C][C]12.8[/C][C]11.2395[/C][C]1.56055[/C][/ROW]
[ROW][C]4[/C][C]7.4[/C][C]10.6065[/C][C]-3.20655[/C][/ROW]
[ROW][C]5[/C][C]6.7[/C][C]10.7069[/C][C]-4.00692[/C][/ROW]
[ROW][C]6[/C][C]12.6[/C][C]10.1541[/C][C]2.44591[/C][/ROW]
[ROW][C]7[/C][C]14.8[/C][C]10.5157[/C][C]4.28427[/C][/ROW]
[ROW][C]8[/C][C]13.3[/C][C]10.9141[/C][C]2.38589[/C][/ROW]
[ROW][C]9[/C][C]11.1[/C][C]10.3552[/C][C]0.744778[/C][/ROW]
[ROW][C]10[/C][C]8.2[/C][C]10.4442[/C][C]-2.24425[/C][/ROW]
[ROW][C]11[/C][C]11.4[/C][C]10.8227[/C][C]0.577315[/C][/ROW]
[ROW][C]12[/C][C]6.4[/C][C]9.89671[/C][C]-3.49671[/C][/ROW]
[ROW][C]13[/C][C]10.6[/C][C]10.5924[/C][C]0.00757606[/C][/ROW]
[ROW][C]14[/C][C]12[/C][C]10.8446[/C][C]1.15537[/C][/ROW]
[ROW][C]15[/C][C]6.3[/C][C]10.2121[/C][C]-3.91208[/C][/ROW]
[ROW][C]16[/C][C]11.3[/C][C]10.7424[/C][C]0.557645[/C][/ROW]
[ROW][C]17[/C][C]11.9[/C][C]11.1517[/C][C]0.748308[/C][/ROW]
[ROW][C]18[/C][C]9.3[/C][C]10.6355[/C][C]-1.33554[/C][/ROW]
[ROW][C]19[/C][C]9.6[/C][C]10.601[/C][C]-1.00098[/C][/ROW]
[ROW][C]20[/C][C]10[/C][C]10.7889[/C][C]-0.788894[/C][/ROW]
[ROW][C]21[/C][C]6.4[/C][C]10.2953[/C][C]-3.89533[/C][/ROW]
[ROW][C]22[/C][C]13.8[/C][C]10.7639[/C][C]3.03608[/C][/ROW]
[ROW][C]23[/C][C]10.8[/C][C]10.4406[/C][C]0.359416[/C][/ROW]
[ROW][C]24[/C][C]13.8[/C][C]11.0655[/C][C]2.73455[/C][/ROW]
[ROW][C]25[/C][C]11.7[/C][C]10.6731[/C][C]1.0269[/C][/ROW]
[ROW][C]26[/C][C]10.9[/C][C]10.296[/C][C]0.604043[/C][/ROW]
[ROW][C]27[/C][C]16.1[/C][C]11.0764[/C][C]5.02362[/C][/ROW]
[ROW][C]28[/C][C]13.4[/C][C]10.4545[/C][C]2.94554[/C][/ROW]
[ROW][C]29[/C][C]9.9[/C][C]10.8838[/C][C]-0.983844[/C][/ROW]
[ROW][C]30[/C][C]11.5[/C][C]10.0899[/C][C]1.4101[/C][/ROW]
[ROW][C]31[/C][C]8.3[/C][C]10.4721[/C][C]-2.17211[/C][/ROW]
[ROW][C]32[/C][C]11.7[/C][C]10.7126[/C][C]0.98741[/C][/ROW]
[ROW][C]33[/C][C]9[/C][C]11.0062[/C][C]-2.00616[/C][/ROW]
[ROW][C]34[/C][C]9.7[/C][C]10.5617[/C][C]-0.861666[/C][/ROW]
[ROW][C]35[/C][C]10.8[/C][C]10.6379[/C][C]0.162058[/C][/ROW]
[ROW][C]36[/C][C]10.3[/C][C]10.7899[/C][C]-0.489887[/C][/ROW]
[ROW][C]37[/C][C]10.4[/C][C]9.88927[/C][C]0.51073[/C][/ROW]
[ROW][C]38[/C][C]12.7[/C][C]10.4129[/C][C]2.28715[/C][/ROW]
[ROW][C]39[/C][C]9.3[/C][C]10.7942[/C][C]-1.49418[/C][/ROW]
[ROW][C]40[/C][C]11.8[/C][C]10.502[/C][C]1.298[/C][/ROW]
[ROW][C]41[/C][C]5.9[/C][C]11.2782[/C][C]-5.37817[/C][/ROW]
[ROW][C]42[/C][C]11.4[/C][C]10.9243[/C][C]0.475676[/C][/ROW]
[ROW][C]43[/C][C]13[/C][C]9.94322[/C][C]3.05678[/C][/ROW]
[ROW][C]44[/C][C]10.8[/C][C]10.9959[/C][C]-0.195945[/C][/ROW]
[ROW][C]45[/C][C]12.3[/C][C]10.0577[/C][C]2.24226[/C][/ROW]
[ROW][C]46[/C][C]11.3[/C][C]10.5135[/C][C]0.786522[/C][/ROW]
[ROW][C]47[/C][C]11.8[/C][C]10.5939[/C][C]1.20607[/C][/ROW]
[ROW][C]48[/C][C]7.9[/C][C]10.9361[/C][C]-3.03605[/C][/ROW]
[ROW][C]49[/C][C]12.7[/C][C]10.3373[/C][C]2.36268[/C][/ROW]
[ROW][C]50[/C][C]12.3[/C][C]11.246[/C][C]1.05399[/C][/ROW]
[ROW][C]51[/C][C]11.6[/C][C]10.7069[/C][C]0.893078[/C][/ROW]
[ROW][C]52[/C][C]6.7[/C][C]10.3788[/C][C]-3.67879[/C][/ROW]
[ROW][C]53[/C][C]10.9[/C][C]10.6901[/C][C]0.209852[/C][/ROW]
[ROW][C]54[/C][C]12.1[/C][C]10.7944[/C][C]1.30557[/C][/ROW]
[ROW][C]55[/C][C]13.3[/C][C]10.4401[/C][C]2.85991[/C][/ROW]
[ROW][C]56[/C][C]10.1[/C][C]11.226[/C][C]-1.12596[/C][/ROW]
[ROW][C]57[/C][C]5.7[/C][C]10.4509[/C][C]-4.75091[/C][/ROW]
[ROW][C]58[/C][C]14.3[/C][C]11.3129[/C][C]2.98714[/C][/ROW]
[ROW][C]59[/C][C]8[/C][C]10.4987[/C][C]-2.49874[/C][/ROW]
[ROW][C]60[/C][C]13.3[/C][C]11.5986[/C][C]1.70141[/C][/ROW]
[ROW][C]61[/C][C]9.3[/C][C]10.7668[/C][C]-1.46682[/C][/ROW]
[ROW][C]62[/C][C]12.5[/C][C]11.0218[/C][C]1.4782[/C][/ROW]
[ROW][C]63[/C][C]7.6[/C][C]10.7165[/C][C]-3.1165[/C][/ROW]
[ROW][C]64[/C][C]15.9[/C][C]10.3933[/C][C]5.5067[/C][/ROW]
[ROW][C]65[/C][C]9.2[/C][C]10.1533[/C][C]-0.953322[/C][/ROW]
[ROW][C]66[/C][C]9.1[/C][C]10.9928[/C][C]-1.89281[/C][/ROW]
[ROW][C]67[/C][C]11.1[/C][C]10.824[/C][C]0.27605[/C][/ROW]
[ROW][C]68[/C][C]13[/C][C]10.93[/C][C]2.07001[/C][/ROW]
[ROW][C]69[/C][C]14.5[/C][C]10.163[/C][C]4.33696[/C][/ROW]
[ROW][C]70[/C][C]12.2[/C][C]10.9361[/C][C]1.26395[/C][/ROW]
[ROW][C]71[/C][C]12.3[/C][C]10.6628[/C][C]1.63722[/C][/ROW]
[ROW][C]72[/C][C]11.4[/C][C]10.7367[/C][C]0.663313[/C][/ROW]
[ROW][C]73[/C][C]8.8[/C][C]10.5603[/C][C]-1.76026[/C][/ROW]
[ROW][C]74[/C][C]14.6[/C][C]11.2168[/C][C]3.38323[/C][/ROW]
[ROW][C]75[/C][C]12.6[/C][C]10.3798[/C][C]2.22019[/C][/ROW]
[ROW][C]76[/C][C]13[/C][C]10.5006[/C][C]2.49939[/C][/ROW]
[ROW][C]77[/C][C]12.6[/C][C]11.5343[/C][C]1.06571[/C][/ROW]
[ROW][C]78[/C][C]13.2[/C][C]10.5744[/C][C]2.62561[/C][/ROW]
[ROW][C]79[/C][C]9.9[/C][C]9.93063[/C][C]-0.0306292[/C][/ROW]
[ROW][C]80[/C][C]7.7[/C][C]10.6991[/C][C]-2.9991[/C][/ROW]
[ROW][C]81[/C][C]10.5[/C][C]11.2395[/C][C]-0.739451[/C][/ROW]
[ROW][C]82[/C][C]13.4[/C][C]10.3025[/C][C]3.09749[/C][/ROW]
[ROW][C]83[/C][C]10.9[/C][C]10.7252[/C][C]0.174796[/C][/ROW]
[ROW][C]84[/C][C]4.3[/C][C]10.3465[/C][C]-6.04652[/C][/ROW]
[ROW][C]85[/C][C]10.3[/C][C]9.76177[/C][C]0.538228[/C][/ROW]
[ROW][C]86[/C][C]11.8[/C][C]10.6604[/C][C]1.13962[/C][/ROW]
[ROW][C]87[/C][C]11.2[/C][C]10.1226[/C][C]1.07744[/C][/ROW]
[ROW][C]88[/C][C]11.4[/C][C]11.1254[/C][C]0.274551[/C][/ROW]
[ROW][C]89[/C][C]8.6[/C][C]11.2599[/C][C]-2.65989[/C][/ROW]
[ROW][C]90[/C][C]13.2[/C][C]11.0189[/C][C]2.18109[/C][/ROW]
[ROW][C]91[/C][C]12.6[/C][C]10.1497[/C][C]2.45034[/C][/ROW]
[ROW][C]92[/C][C]5.6[/C][C]10.9117[/C][C]-5.31171[/C][/ROW]
[ROW][C]93[/C][C]9.9[/C][C]10.8368[/C][C]-0.936809[/C][/ROW]
[ROW][C]94[/C][C]8.8[/C][C]10.965[/C][C]-2.16505[/C][/ROW]
[ROW][C]95[/C][C]7.7[/C][C]10.7082[/C][C]-3.00819[/C][/ROW]
[ROW][C]96[/C][C]9[/C][C]10.6029[/C][C]-1.60289[/C][/ROW]
[ROW][C]97[/C][C]7.3[/C][C]10.7705[/C][C]-3.47048[/C][/ROW]
[ROW][C]98[/C][C]11.4[/C][C]11.0279[/C][C]0.372136[/C][/ROW]
[ROW][C]99[/C][C]13.6[/C][C]11.5014[/C][C]2.09861[/C][/ROW]
[ROW][C]100[/C][C]7.9[/C][C]11.0194[/C][C]-3.11941[/C][/ROW]
[ROW][C]101[/C][C]10.7[/C][C]10.5381[/C][C]0.161936[/C][/ROW]
[ROW][C]102[/C][C]10.3[/C][C]10.3933[/C][C]-0.0933043[/C][/ROW]
[ROW][C]103[/C][C]8.3[/C][C]10.4539[/C][C]-2.15386[/C][/ROW]
[ROW][C]104[/C][C]9.6[/C][C]10.7733[/C][C]-1.17325[/C][/ROW]
[ROW][C]105[/C][C]14.2[/C][C]11.5278[/C][C]2.67216[/C][/ROW]
[ROW][C]106[/C][C]8.5[/C][C]10.0973[/C][C]-1.59733[/C][/ROW]
[ROW][C]107[/C][C]13.5[/C][C]9.92749[/C][C]3.57251[/C][/ROW]
[ROW][C]108[/C][C]4.9[/C][C]10.5369[/C][C]-5.63693[/C][/ROW]
[ROW][C]109[/C][C]6.4[/C][C]10.641[/C][C]-4.24097[/C][/ROW]
[ROW][C]110[/C][C]9.6[/C][C]11.1613[/C][C]-1.56127[/C][/ROW]
[ROW][C]111[/C][C]11.6[/C][C]10.6392[/C][C]0.960793[/C][/ROW]
[ROW][C]112[/C][C]11.1[/C][C]11.4415[/C][C]-0.341492[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=267332&T=4
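
The "Interpolation" column is simply the fitted value and the "Residuals" column the corresponding residual. A sketch to rebuild the table, assuming dat and fit from the sketches above:

interp <- fitted(fit)     # interpolation / forecast
res    <- residuals(fit)  # prediction error
head(data.frame(Index = seq_along(interp), Actuals = dat$TOT,
                Interpolation = signif(interp, 6), Residuals = signif(res, 6)))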








Goldfeld-Quandt test for Heteroskedasticity
p-values (by Alternative Hypothesis)
breakpoint index   greater   2-sided   less
8   0.463514   0.927028   0.536486
9   0.742412   0.515175   0.257588
10   0.730746   0.538508   0.269254
11   0.635655   0.72869   0.364345
12   0.668556   0.662888   0.331444
13   0.58684   0.82632   0.41316
14   0.486689   0.973379   0.513311
15   0.543548   0.912904   0.456452
16   0.525427   0.949147   0.474573
17   0.452266   0.904533   0.547734
18   0.377745   0.75549   0.622255
19   0.311358   0.622717   0.688642
20   0.278794   0.557588   0.721206
21   0.282766   0.565532   0.717234
22   0.384953   0.769907   0.615047
23   0.318942   0.637884   0.681058
24   0.332962   0.665923   0.667038
25   0.271019   0.542038   0.728981
26   0.222523   0.445045   0.777477
27   0.282109   0.564218   0.717891
28   0.369396   0.738792   0.630604
29   0.333197   0.666395   0.666803
30   0.372023   0.744047   0.627977
31   0.36693   0.73386   0.63307
32   0.310951   0.621901   0.689049
33   0.32869   0.65738   0.67131
34   0.278855   0.557709   0.721145
35   0.228661   0.457322   0.771339
36   0.194963   0.389926   0.805037
37   0.174578   0.349157   0.825422
38   0.172589   0.345178   0.827411
39   0.153177   0.306354   0.846823
40   0.134783   0.269566   0.865217
41   0.363691   0.727383   0.636309
42   0.311034   0.622067   0.688966
43   0.337641   0.675282   0.662359
44   0.287888   0.575776   0.712112
45   0.28428   0.568561   0.71572
46   0.244657   0.489313   0.755343
47   0.209785   0.41957   0.790215
48   0.236177   0.472354   0.763823
49   0.235899   0.471798   0.764101
50   0.201952   0.403903   0.798048
51   0.168973   0.337946   0.831027
52   0.218672   0.437343   0.781328
53   0.179409   0.358818   0.820591
54   0.153135   0.30627   0.846865
55   0.162587   0.325173   0.837413
56   0.137742   0.275484   0.862258
57   0.226045   0.45209   0.773955
58   0.239175   0.47835   0.760825
59   0.239966   0.479933   0.760034
60   0.220145   0.44029   0.779855
61   0.19387   0.387741   0.80613
62   0.170115   0.340231   0.829885
63   0.188157   0.376313   0.811843
64   0.369417   0.738835   0.630583
65   0.324113   0.648225   0.675887
66   0.299999   0.599998   0.700001
67   0.253256   0.506513   0.746744
68   0.241447   0.482895   0.758553
69   0.338665   0.67733   0.661335
70   0.304755   0.609511   0.695245
71   0.297969   0.595937   0.702031
72   0.25707   0.51414   0.74293
73   0.229407   0.458815   0.770593
74   0.275269   0.550537   0.724731
75   0.256296   0.512592   0.743704
76   0.26326   0.526519   0.73674
77   0.237095   0.47419   0.762905
78   0.268591   0.537182   0.731409
79   0.220101   0.440203   0.779899
80   0.211511   0.423023   0.788489
81   0.171581   0.343162   0.828419
82   0.200129   0.400257   0.799871
83   0.166696   0.333393   0.833304
84   0.369269   0.738537   0.630731
85   0.316412   0.632823   0.683588
86   0.29458   0.589161   0.70542
87   0.267997   0.535994   0.732003
88   0.219589   0.439178   0.780411
89   0.199744   0.399487   0.800256
90   0.211808   0.423616   0.788192
91   0.284743   0.569485   0.715257
92   0.410291   0.820583   0.589709
93   0.338543   0.677086   0.661457
94   0.281223   0.562445   0.718777
95   0.258754   0.517507   0.741246
96   0.19723   0.394461   0.80277
97   0.193675   0.38735   0.806325
98   0.137804   0.275608   0.862196
99   0.125109   0.250218   0.874891
100   0.116513   0.233026   0.883487
101   0.0768778   0.153756   0.923122
102   0.0520397   0.104079   0.94796
103   0.0493197   0.0986393   0.95068
104   0.0229166   0.0458333   0.977083

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
8 & 0.463514 & 0.927028 & 0.536486 \tabularnewline
9 & 0.742412 & 0.515175 & 0.257588 \tabularnewline
10 & 0.730746 & 0.538508 & 0.269254 \tabularnewline
11 & 0.635655 & 0.72869 & 0.364345 \tabularnewline
12 & 0.668556 & 0.662888 & 0.331444 \tabularnewline
13 & 0.58684 & 0.82632 & 0.41316 \tabularnewline
14 & 0.486689 & 0.973379 & 0.513311 \tabularnewline
15 & 0.543548 & 0.912904 & 0.456452 \tabularnewline
16 & 0.525427 & 0.949147 & 0.474573 \tabularnewline
17 & 0.452266 & 0.904533 & 0.547734 \tabularnewline
18 & 0.377745 & 0.75549 & 0.622255 \tabularnewline
19 & 0.311358 & 0.622717 & 0.688642 \tabularnewline
20 & 0.278794 & 0.557588 & 0.721206 \tabularnewline
21 & 0.282766 & 0.565532 & 0.717234 \tabularnewline
22 & 0.384953 & 0.769907 & 0.615047 \tabularnewline
23 & 0.318942 & 0.637884 & 0.681058 \tabularnewline
24 & 0.332962 & 0.665923 & 0.667038 \tabularnewline
25 & 0.271019 & 0.542038 & 0.728981 \tabularnewline
26 & 0.222523 & 0.445045 & 0.777477 \tabularnewline
27 & 0.282109 & 0.564218 & 0.717891 \tabularnewline
28 & 0.369396 & 0.738792 & 0.630604 \tabularnewline
29 & 0.333197 & 0.666395 & 0.666803 \tabularnewline
30 & 0.372023 & 0.744047 & 0.627977 \tabularnewline
31 & 0.36693 & 0.73386 & 0.63307 \tabularnewline
32 & 0.310951 & 0.621901 & 0.689049 \tabularnewline
33 & 0.32869 & 0.65738 & 0.67131 \tabularnewline
34 & 0.278855 & 0.557709 & 0.721145 \tabularnewline
35 & 0.228661 & 0.457322 & 0.771339 \tabularnewline
36 & 0.194963 & 0.389926 & 0.805037 \tabularnewline
37 & 0.174578 & 0.349157 & 0.825422 \tabularnewline
38 & 0.172589 & 0.345178 & 0.827411 \tabularnewline
39 & 0.153177 & 0.306354 & 0.846823 \tabularnewline
40 & 0.134783 & 0.269566 & 0.865217 \tabularnewline
41 & 0.363691 & 0.727383 & 0.636309 \tabularnewline
42 & 0.311034 & 0.622067 & 0.688966 \tabularnewline
43 & 0.337641 & 0.675282 & 0.662359 \tabularnewline
44 & 0.287888 & 0.575776 & 0.712112 \tabularnewline
45 & 0.28428 & 0.568561 & 0.71572 \tabularnewline
46 & 0.244657 & 0.489313 & 0.755343 \tabularnewline
47 & 0.209785 & 0.41957 & 0.790215 \tabularnewline
48 & 0.236177 & 0.472354 & 0.763823 \tabularnewline
49 & 0.235899 & 0.471798 & 0.764101 \tabularnewline
50 & 0.201952 & 0.403903 & 0.798048 \tabularnewline
51 & 0.168973 & 0.337946 & 0.831027 \tabularnewline
52 & 0.218672 & 0.437343 & 0.781328 \tabularnewline
53 & 0.179409 & 0.358818 & 0.820591 \tabularnewline
54 & 0.153135 & 0.30627 & 0.846865 \tabularnewline
55 & 0.162587 & 0.325173 & 0.837413 \tabularnewline
56 & 0.137742 & 0.275484 & 0.862258 \tabularnewline
57 & 0.226045 & 0.45209 & 0.773955 \tabularnewline
58 & 0.239175 & 0.47835 & 0.760825 \tabularnewline
59 & 0.239966 & 0.479933 & 0.760034 \tabularnewline
60 & 0.220145 & 0.44029 & 0.779855 \tabularnewline
61 & 0.19387 & 0.387741 & 0.80613 \tabularnewline
62 & 0.170115 & 0.340231 & 0.829885 \tabularnewline
63 & 0.188157 & 0.376313 & 0.811843 \tabularnewline
64 & 0.369417 & 0.738835 & 0.630583 \tabularnewline
65 & 0.324113 & 0.648225 & 0.675887 \tabularnewline
66 & 0.299999 & 0.599998 & 0.700001 \tabularnewline
67 & 0.253256 & 0.506513 & 0.746744 \tabularnewline
68 & 0.241447 & 0.482895 & 0.758553 \tabularnewline
69 & 0.338665 & 0.67733 & 0.661335 \tabularnewline
70 & 0.304755 & 0.609511 & 0.695245 \tabularnewline
71 & 0.297969 & 0.595937 & 0.702031 \tabularnewline
72 & 0.25707 & 0.51414 & 0.74293 \tabularnewline
73 & 0.229407 & 0.458815 & 0.770593 \tabularnewline
74 & 0.275269 & 0.550537 & 0.724731 \tabularnewline
75 & 0.256296 & 0.512592 & 0.743704 \tabularnewline
76 & 0.26326 & 0.526519 & 0.73674 \tabularnewline
77 & 0.237095 & 0.47419 & 0.762905 \tabularnewline
78 & 0.268591 & 0.537182 & 0.731409 \tabularnewline
79 & 0.220101 & 0.440203 & 0.779899 \tabularnewline
80 & 0.211511 & 0.423023 & 0.788489 \tabularnewline
81 & 0.171581 & 0.343162 & 0.828419 \tabularnewline
82 & 0.200129 & 0.400257 & 0.799871 \tabularnewline
83 & 0.166696 & 0.333393 & 0.833304 \tabularnewline
84 & 0.369269 & 0.738537 & 0.630731 \tabularnewline
85 & 0.316412 & 0.632823 & 0.683588 \tabularnewline
86 & 0.29458 & 0.589161 & 0.70542 \tabularnewline
87 & 0.267997 & 0.535994 & 0.732003 \tabularnewline
88 & 0.219589 & 0.439178 & 0.780411 \tabularnewline
89 & 0.199744 & 0.399487 & 0.800256 \tabularnewline
90 & 0.211808 & 0.423616 & 0.788192 \tabularnewline
91 & 0.284743 & 0.569485 & 0.715257 \tabularnewline
92 & 0.410291 & 0.820583 & 0.589709 \tabularnewline
93 & 0.338543 & 0.677086 & 0.661457 \tabularnewline
94 & 0.281223 & 0.562445 & 0.718777 \tabularnewline
95 & 0.258754 & 0.517507 & 0.741246 \tabularnewline
96 & 0.19723 & 0.394461 & 0.80277 \tabularnewline
97 & 0.193675 & 0.38735 & 0.806325 \tabularnewline
98 & 0.137804 & 0.275608 & 0.862196 \tabularnewline
99 & 0.125109 & 0.250218 & 0.874891 \tabularnewline
100 & 0.116513 & 0.233026 & 0.883487 \tabularnewline
101 & 0.0768778 & 0.153756 & 0.923122 \tabularnewline
102 & 0.0520397 & 0.104079 & 0.94796 \tabularnewline
103 & 0.0493197 & 0.0986393 & 0.95068 \tabularnewline
104 & 0.0229166 & 0.0458333 & 0.977083 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=267332&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]8[/C][C]0.463514[/C][C]0.927028[/C][C]0.536486[/C][/ROW]
[ROW][C]9[/C][C]0.742412[/C][C]0.515175[/C][C]0.257588[/C][/ROW]
[ROW][C]10[/C][C]0.730746[/C][C]0.538508[/C][C]0.269254[/C][/ROW]
[ROW][C]11[/C][C]0.635655[/C][C]0.72869[/C][C]0.364345[/C][/ROW]
[ROW][C]12[/C][C]0.668556[/C][C]0.662888[/C][C]0.331444[/C][/ROW]
[ROW][C]13[/C][C]0.58684[/C][C]0.82632[/C][C]0.41316[/C][/ROW]
[ROW][C]14[/C][C]0.486689[/C][C]0.973379[/C][C]0.513311[/C][/ROW]
[ROW][C]15[/C][C]0.543548[/C][C]0.912904[/C][C]0.456452[/C][/ROW]
[ROW][C]16[/C][C]0.525427[/C][C]0.949147[/C][C]0.474573[/C][/ROW]
[ROW][C]17[/C][C]0.452266[/C][C]0.904533[/C][C]0.547734[/C][/ROW]
[ROW][C]18[/C][C]0.377745[/C][C]0.75549[/C][C]0.622255[/C][/ROW]
[ROW][C]19[/C][C]0.311358[/C][C]0.622717[/C][C]0.688642[/C][/ROW]
[ROW][C]20[/C][C]0.278794[/C][C]0.557588[/C][C]0.721206[/C][/ROW]
[ROW][C]21[/C][C]0.282766[/C][C]0.565532[/C][C]0.717234[/C][/ROW]
[ROW][C]22[/C][C]0.384953[/C][C]0.769907[/C][C]0.615047[/C][/ROW]
[ROW][C]23[/C][C]0.318942[/C][C]0.637884[/C][C]0.681058[/C][/ROW]
[ROW][C]24[/C][C]0.332962[/C][C]0.665923[/C][C]0.667038[/C][/ROW]
[ROW][C]25[/C][C]0.271019[/C][C]0.542038[/C][C]0.728981[/C][/ROW]
[ROW][C]26[/C][C]0.222523[/C][C]0.445045[/C][C]0.777477[/C][/ROW]
[ROW][C]27[/C][C]0.282109[/C][C]0.564218[/C][C]0.717891[/C][/ROW]
[ROW][C]28[/C][C]0.369396[/C][C]0.738792[/C][C]0.630604[/C][/ROW]
[ROW][C]29[/C][C]0.333197[/C][C]0.666395[/C][C]0.666803[/C][/ROW]
[ROW][C]30[/C][C]0.372023[/C][C]0.744047[/C][C]0.627977[/C][/ROW]
[ROW][C]31[/C][C]0.36693[/C][C]0.73386[/C][C]0.63307[/C][/ROW]
[ROW][C]32[/C][C]0.310951[/C][C]0.621901[/C][C]0.689049[/C][/ROW]
[ROW][C]33[/C][C]0.32869[/C][C]0.65738[/C][C]0.67131[/C][/ROW]
[ROW][C]34[/C][C]0.278855[/C][C]0.557709[/C][C]0.721145[/C][/ROW]
[ROW][C]35[/C][C]0.228661[/C][C]0.457322[/C][C]0.771339[/C][/ROW]
[ROW][C]36[/C][C]0.194963[/C][C]0.389926[/C][C]0.805037[/C][/ROW]
[ROW][C]37[/C][C]0.174578[/C][C]0.349157[/C][C]0.825422[/C][/ROW]
[ROW][C]38[/C][C]0.172589[/C][C]0.345178[/C][C]0.827411[/C][/ROW]
[ROW][C]39[/C][C]0.153177[/C][C]0.306354[/C][C]0.846823[/C][/ROW]
[ROW][C]40[/C][C]0.134783[/C][C]0.269566[/C][C]0.865217[/C][/ROW]
[ROW][C]41[/C][C]0.363691[/C][C]0.727383[/C][C]0.636309[/C][/ROW]
[ROW][C]42[/C][C]0.311034[/C][C]0.622067[/C][C]0.688966[/C][/ROW]
[ROW][C]43[/C][C]0.337641[/C][C]0.675282[/C][C]0.662359[/C][/ROW]
[ROW][C]44[/C][C]0.287888[/C][C]0.575776[/C][C]0.712112[/C][/ROW]
[ROW][C]45[/C][C]0.28428[/C][C]0.568561[/C][C]0.71572[/C][/ROW]
[ROW][C]46[/C][C]0.244657[/C][C]0.489313[/C][C]0.755343[/C][/ROW]
[ROW][C]47[/C][C]0.209785[/C][C]0.41957[/C][C]0.790215[/C][/ROW]
[ROW][C]48[/C][C]0.236177[/C][C]0.472354[/C][C]0.763823[/C][/ROW]
[ROW][C]49[/C][C]0.235899[/C][C]0.471798[/C][C]0.764101[/C][/ROW]
[ROW][C]50[/C][C]0.201952[/C][C]0.403903[/C][C]0.798048[/C][/ROW]
[ROW][C]51[/C][C]0.168973[/C][C]0.337946[/C][C]0.831027[/C][/ROW]
[ROW][C]52[/C][C]0.218672[/C][C]0.437343[/C][C]0.781328[/C][/ROW]
[ROW][C]53[/C][C]0.179409[/C][C]0.358818[/C][C]0.820591[/C][/ROW]
[ROW][C]54[/C][C]0.153135[/C][C]0.30627[/C][C]0.846865[/C][/ROW]
[ROW][C]55[/C][C]0.162587[/C][C]0.325173[/C][C]0.837413[/C][/ROW]
[ROW][C]56[/C][C]0.137742[/C][C]0.275484[/C][C]0.862258[/C][/ROW]
[ROW][C]57[/C][C]0.226045[/C][C]0.45209[/C][C]0.773955[/C][/ROW]
[ROW][C]58[/C][C]0.239175[/C][C]0.47835[/C][C]0.760825[/C][/ROW]
[ROW][C]59[/C][C]0.239966[/C][C]0.479933[/C][C]0.760034[/C][/ROW]
[ROW][C]60[/C][C]0.220145[/C][C]0.44029[/C][C]0.779855[/C][/ROW]
[ROW][C]61[/C][C]0.19387[/C][C]0.387741[/C][C]0.80613[/C][/ROW]
[ROW][C]62[/C][C]0.170115[/C][C]0.340231[/C][C]0.829885[/C][/ROW]
[ROW][C]63[/C][C]0.188157[/C][C]0.376313[/C][C]0.811843[/C][/ROW]
[ROW][C]64[/C][C]0.369417[/C][C]0.738835[/C][C]0.630583[/C][/ROW]
[ROW][C]65[/C][C]0.324113[/C][C]0.648225[/C][C]0.675887[/C][/ROW]
[ROW][C]66[/C][C]0.299999[/C][C]0.599998[/C][C]0.700001[/C][/ROW]
[ROW][C]67[/C][C]0.253256[/C][C]0.506513[/C][C]0.746744[/C][/ROW]
[ROW][C]68[/C][C]0.241447[/C][C]0.482895[/C][C]0.758553[/C][/ROW]
[ROW][C]69[/C][C]0.338665[/C][C]0.67733[/C][C]0.661335[/C][/ROW]
[ROW][C]70[/C][C]0.304755[/C][C]0.609511[/C][C]0.695245[/C][/ROW]
[ROW][C]71[/C][C]0.297969[/C][C]0.595937[/C][C]0.702031[/C][/ROW]
[ROW][C]72[/C][C]0.25707[/C][C]0.51414[/C][C]0.74293[/C][/ROW]
[ROW][C]73[/C][C]0.229407[/C][C]0.458815[/C][C]0.770593[/C][/ROW]
[ROW][C]74[/C][C]0.275269[/C][C]0.550537[/C][C]0.724731[/C][/ROW]
[ROW][C]75[/C][C]0.256296[/C][C]0.512592[/C][C]0.743704[/C][/ROW]
[ROW][C]76[/C][C]0.26326[/C][C]0.526519[/C][C]0.73674[/C][/ROW]
[ROW][C]77[/C][C]0.237095[/C][C]0.47419[/C][C]0.762905[/C][/ROW]
[ROW][C]78[/C][C]0.268591[/C][C]0.537182[/C][C]0.731409[/C][/ROW]
[ROW][C]79[/C][C]0.220101[/C][C]0.440203[/C][C]0.779899[/C][/ROW]
[ROW][C]80[/C][C]0.211511[/C][C]0.423023[/C][C]0.788489[/C][/ROW]
[ROW][C]81[/C][C]0.171581[/C][C]0.343162[/C][C]0.828419[/C][/ROW]
[ROW][C]82[/C][C]0.200129[/C][C]0.400257[/C][C]0.799871[/C][/ROW]
[ROW][C]83[/C][C]0.166696[/C][C]0.333393[/C][C]0.833304[/C][/ROW]
[ROW][C]84[/C][C]0.369269[/C][C]0.738537[/C][C]0.630731[/C][/ROW]
[ROW][C]85[/C][C]0.316412[/C][C]0.632823[/C][C]0.683588[/C][/ROW]
[ROW][C]86[/C][C]0.29458[/C][C]0.589161[/C][C]0.70542[/C][/ROW]
[ROW][C]87[/C][C]0.267997[/C][C]0.535994[/C][C]0.732003[/C][/ROW]
[ROW][C]88[/C][C]0.219589[/C][C]0.439178[/C][C]0.780411[/C][/ROW]
[ROW][C]89[/C][C]0.199744[/C][C]0.399487[/C][C]0.800256[/C][/ROW]
[ROW][C]90[/C][C]0.211808[/C][C]0.423616[/C][C]0.788192[/C][/ROW]
[ROW][C]91[/C][C]0.284743[/C][C]0.569485[/C][C]0.715257[/C][/ROW]
[ROW][C]92[/C][C]0.410291[/C][C]0.820583[/C][C]0.589709[/C][/ROW]
[ROW][C]93[/C][C]0.338543[/C][C]0.677086[/C][C]0.661457[/C][/ROW]
[ROW][C]94[/C][C]0.281223[/C][C]0.562445[/C][C]0.718777[/C][/ROW]
[ROW][C]95[/C][C]0.258754[/C][C]0.517507[/C][C]0.741246[/C][/ROW]
[ROW][C]96[/C][C]0.19723[/C][C]0.394461[/C][C]0.80277[/C][/ROW]
[ROW][C]97[/C][C]0.193675[/C][C]0.38735[/C][C]0.806325[/C][/ROW]
[ROW][C]98[/C][C]0.137804[/C][C]0.275608[/C][C]0.862196[/C][/ROW]
[ROW][C]99[/C][C]0.125109[/C][C]0.250218[/C][C]0.874891[/C][/ROW]
[ROW][C]100[/C][C]0.116513[/C][C]0.233026[/C][C]0.883487[/C][/ROW]
[ROW][C]101[/C][C]0.0768778[/C][C]0.153756[/C][C]0.923122[/C][/ROW]
[ROW][C]102[/C][C]0.0520397[/C][C]0.104079[/C][C]0.94796[/C][/ROW]
[ROW][C]103[/C][C]0.0493197[/C][C]0.0986393[/C][C]0.95068[/C][/ROW]
[ROW][C]104[/C][C]0.0229166[/C][C]0.0458333[/C][C]0.977083[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=267332&T=5
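
Each row of this table is one call to gqtest() from the lmtest package at a different breakpoint, evaluated under the three alternatives 'greater', 'two.sided' and 'less' (see the loop in the module code below). A single-breakpoint sketch, assuming fit from the sketches above:

library(lmtest)
# One Goldfeld-Quandt test at breakpoint 8 with the two-sided alternative.
gqtest(fit, point = 8, alternative = "two.sided")$p.value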








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     0                     0                     OK
5% type I error level     1                     0.0103093             OK
10% type I error level    2                     0.0206186             OK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 & 0 & OK \tabularnewline
5% type I error level & 1 & 0.0103093 & OK \tabularnewline
10% type I error level & 2 & 0.0206186 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=267332&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]1[/C][C]0.0103093[/C][C]OK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]2[/C][C]0.0206186[/C][C]OK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=267332&T=6
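
The meta analysis simply counts how many of the 97 two-sided p-values fall below each significance level and flags NOK when that fraction exceeds the level itself (this is the rule used in the module code below). A sketch, where gq2 is assumed to hold the 97 values of the "2-sided" column above (for example gq2 <- gqarr[, 2] after running the module code):

for (a in c(0.01, 0.05, 0.10)) {
  cnt <- sum(gq2 < a)
  cat(a, cnt, signif(cnt / length(gq2), 6),
      if (cnt / length(gq2) < a) "OK" else "NOK", "\n")
}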




Parameters (Session):
par1 = 6 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 5 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
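# Reorder the columns so that the dependent variable (column par1) comes first;
# the remaining columns keep their original order.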
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
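# Optionally replace the series by first differences (if requested via par3).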
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { # parentheses needed: 1:n-1 would evaluate as (1:n)-1 and start at 0
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
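# Optionally append monthly or quarterly seasonal dummy variables (par2).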
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
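# Optionally append a linear trend column 't' (par3).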
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
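# Fit the regression of the first column (the dependent variable) on all remaining columns.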
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
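# Goldfeld-Quandt tests at every admissible breakpoint (only computed when n > 25).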
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
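# Diagnostic plots: actuals and interpolation, residuals, histogram, density,
# normal Q-Q plot, lag plot with lowess, ACF/PACF, and the standard lm diagnostics.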
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
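# Assemble the output tables shown above using the module's table helper functions.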
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}