Free Statistics

of Irreproducible Research!

Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Fri, 12 Dec 2014 14:23:40 +0000
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/12/t14183942583p5rws3lkev8mpo.htm/, Retrieved Thu, 16 May 2024 12:02:36 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=266737, Retrieved Thu, 16 May 2024 12:02:36 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 95
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
- [Multiple Regression] [4] [2014-12-12 14:23:40] [c98c6a6156d025200627852118e8b268] [Current]
Feedback Forum

Dataseries X (columns, in order: AMS.A, CONFSTATTOT, STRESSTOT, TOT):
4 13 13 12.9
4 8 13 12.2
5 14 11 12.8
4 16 14 7.4
4 14 15 6.7
9 13 14 12.6
8 15 11 14.8
11 13 13 13.3
4 20 16 11.1
4 17 14 8.2
6 15 14 11.4
4 16 15 6.4
8 12 15 10.6
4 17 13 12
4 11 14 6.3
11 16 11 11.3
4 16 12 11.9
4 15 14 9.3
6 13 13 9.6
6 14 12 10
4 19 15 6.4
8 16 15 13.8
5 17 14 10.8
4 10 14 13.8
9 15 12 11.7
4 14 12 10.9
7 14 12 16.1
10 16 15 13.4
4 15 14 9.9
4 17 16 11.5
7 14 12 8.3
12 16 12 11.7
7 15 14 9
5 16 16 9.7
8 16 15 10.8
5 10 12 10.3
4 8 14 10.4
9 17 13 12.7
7 14 14 9.3
4 10 16 11.8
4 14 12 5.9
4 12 14 11.4
4 16 15 13
4 16 13 10.8
7 16 16 12.3
4 8 16 11.3
7 16 12 11.8
4 15 12 7.9
4 8 16 12.7
4 13 12 12.3
4 14 15 11.6
8 13 12 6.7
4 16 13 10.9
4 19 12 12.1
4 19 14 13.3
4 14 14 10.1
7 15 11 5.7
12 13 10 14.3
4 10 12 8
4 16 11 13.3
4 15 16 9.3
5 11 14 12.5
15 9 14 7.6
5 16 15 15.9
10 12 15 9.2
9 12 14 9.1
8 14 13 11.1
4 14 11 13
5 13 16 14.5
4 15 12 12.2
9 17 15 12.3
4 14 14 11.4
10 11 15 8.8
4 9 14 14.6
4 7 13 12.6
7 15 12 13
5 12 12 12.6
4 15 14 13.2
4 14 14 9.9
4 16 15 7.7
4 14 11 10.5
4 13 13 13.4
4 16 14 10.9
6 13 16 4.3
10 16 13 10.3
7 16 14 11.8
4 16 16 11.2
4 10 11 11.4
7 12 13 8.6
4 12 13 13.2
8 12 15 12.6
11 12 12 5.6
6 19 13 9.9
14 14 12 8.8
5 13 14 7.7
4 16 14 9
8 15 16 7.3
9 12 15 11.4
4 8 14 13.6
4 10 13 7.9
5 16 14 10.7
4 16 15 10.3
5 10 14 8.3
4 18 12 9.6
4 12 7 14.2
7 16 12 8.5
10 10 15 13.5
4 14 12 4.9
5 12 13 6.4
4 11 11 9.6
4 15 14 11.6
4 7 13 11.1




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: 'Gwilym Jenkins' @ jenkins.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 6 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ jenkins.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=266737&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]6 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ jenkins.wessa.net[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=266737&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=266737&T=0









Multiple Linear Regression - Estimated Regression Equation
TOT[t] = 12.7008 - 0.00988833 AMS.A[t] - 0.0258875 CONFSTATTOT[t] - 0.118622 STRESSTOT[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
TOT[t] =  +  12.7008 -0.00988833AMS.A[t] -0.0258875CONFSTATTOT[t] -0.118622STRESSTOT[t]  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=266737&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]TOT[t] =  +  12.7008 -0.00988833AMS.A[t] -0.0258875CONFSTATTOT[t] -0.118622STRESSTOT[t]  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=266737&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=266737&T=1
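
For readers who want to check this equation outside the module, a minimal R sketch is given below. It assumes that the four columns of 'Dataseries X' above are, in that order, AMS.A, CONFSTATTOT, STRESSTOT and TOT (the dependent variable selected by par1 = 4); only the first rows are shown inline, and all 112 rows are needed to reproduce the published estimates.

# Hedged sketch: refit the reported equation with lm().
# Assumed column order of 'Dataseries X': AMS.A, CONFSTATTOT, STRESSTOT, TOT.
df <- read.table(text = "
4 13 13 12.9
4 8 13 12.2
5 14 11 12.8
", col.names = c('AMS.A', 'CONFSTATTOT', 'STRESSTOT', 'TOT'))  # paste all 112 data rows here
fit <- lm(TOT ~ AMS.A + CONFSTATTOT + STRESSTOT, data = df)
coef(fit)  # on the full data this should return 12.7008, -0.00988833, -0.0258875, -0.118622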









Multiple Linear Regression - Ordinary Least Squares
Variable       Parameter     S.D.        T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)    12.7008       2.37008      5.359                       4.78429e-07      2.39214e-07
AMS.A          -0.00988833   0.0939185   -0.1053                      0.916344         0.458172
CONFSTATTOT    -0.0258875    0.0848518   -0.3051                      0.760884         0.380442
STRESSTOT      -0.118622     0.147079    -0.8065                      0.421714         0.210857

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STATH0: parameter = 0 & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 12.7008 & 2.37008 & 5.359 & 4.78429e-07 & 2.39214e-07 \tabularnewline
AMS.A & -0.00988833 & 0.0939185 & -0.1053 & 0.916344 & 0.458172 \tabularnewline
CONFSTATTOT & -0.0258875 & 0.0848518 & -0.3051 & 0.760884 & 0.380442 \tabularnewline
STRESSTOT & -0.118622 & 0.147079 & -0.8065 & 0.421714 & 0.210857 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=266737&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STATH0: parameter = 0[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]12.7008[/C][C]2.37008[/C][C]5.359[/C][C]4.78429e-07[/C][C]2.39214e-07[/C][/ROW]
[ROW][C]AMS.A[/C][C]-0.00988833[/C][C]0.0939185[/C][C]-0.1053[/C][C]0.916344[/C][C]0.458172[/C][/ROW]
[ROW][C]CONFSTATTOT[/C][C]-0.0258875[/C][C]0.0848518[/C][C]-0.3051[/C][C]0.760884[/C][C]0.380442[/C][/ROW]
[ROW][C]STRESSTOT[/C][C]-0.118622[/C][C]0.147079[/C][C]-0.8065[/C][C]0.421714[/C][C]0.210857[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=266737&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=266737&T=2
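
As a hedged aside (not part of the module output), the columns of this table are related in the usual OLS way: T-STAT = Parameter / S.D., the 2-tail p-value comes from a t distribution with n - k = 112 - 4 = 108 degrees of freedom, and the 1-tail p-value is half of it. For example, for the STRESSTOT row:

tstat  <- -0.118622 / 0.147079                              # about -0.8065
p2tail <- 2 * pt(abs(tstat), df = 108, lower.tail = FALSE)  # about 0.421714
p1tail <- p2tail / 2                                        # about 0.210857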









Multiple Linear Regression - Regression Statistics
Multiple R: 0.0839455
R-squared: 0.00704684
Adjusted R-squared: -0.0205352
F-TEST (value): 0.255487
F-TEST (DF numerator): 3
F-TEST (DF denominator): 108
p-value: 0.857278
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 2.49659
Sum Squared Residuals: 673.158

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.0839455 \tabularnewline
R-squared & 0.00704684 \tabularnewline
Adjusted R-squared & -0.0205352 \tabularnewline
F-TEST (value) & 0.255487 \tabularnewline
F-TEST (DF numerator) & 3 \tabularnewline
F-TEST (DF denominator) & 108 \tabularnewline
p-value & 0.857278 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 2.49659 \tabularnewline
Sum Squared Residuals & 673.158 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=266737&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.0839455[/C][/ROW]
[ROW][C]R-squared[/C][C]0.00704684[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]-0.0205352[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]0.255487[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]3[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]108[/C][/ROW]
[ROW][C]p-value[/C][C]0.857278[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]2.49659[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]673.158[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=266737&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=266737&T=3
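
A hedged numerical check of this table: the overall F statistic has 3 and 108 degrees of freedom, its p-value follows from the F distribution, and the adjusted R-squared and Multiple R follow directly from R-squared with n = 112 observations and 3 regressors.

1 - pf(0.255487, df1 = 3, df2 = 108)              # p-value, about 0.857278
1 - (1 - 0.00704684) * (112 - 1) / (112 - 3 - 1)  # adjusted R-squared, about -0.0205352
sqrt(0.00704684)                                  # Multiple R, about 0.0839455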










\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & InterpolationForecast & ResidualsPrediction Error \tabularnewline
1 & 12.9 & 10.7826 & 2.1174 \tabularnewline
2 & 12.2 & 10.912 & 1.28796 \tabularnewline
3 & 12.8 & 10.9841 & 1.81593 \tabularnewline
4 & 7.4 & 10.5863 & -3.18632 \tabularnewline
5 & 6.7 & 10.5195 & -3.81947 \tabularnewline
6 & 12.6 & 10.6145 & 1.98546 \tabularnewline
7 & 14.8 & 10.9285 & 3.87148 \tabularnewline
8 & 13.3 & 10.7134 & 2.58662 \tabularnewline
9 & 11.1 & 10.2455 & 0.854479 \tabularnewline
10 & 8.2 & 10.5604 & -2.36043 \tabularnewline
11 & 11.4 & 10.5924 & 0.807574 \tabularnewline
12 & 6.4 & 10.4677 & -4.06769 \tabularnewline
13 & 10.6 & 10.5317 & 0.0683104 \tabularnewline
14 & 12 & 10.6791 & 1.32095 \tabularnewline
15 & 6.3 & 10.7158 & -4.41575 \tabularnewline
16 & 11.3 & 10.873 & 0.427036 \tabularnewline
17 & 11.9 & 10.8236 & 1.07644 \tabularnewline
18 & 9.3 & 10.6122 & -1.3122 \tabularnewline
19 & 9.6 & 10.7628 & -1.16282 \tabularnewline
20 & 10 & 10.8556 & -0.855559 \tabularnewline
21 & 6.4 & 10.39 & -3.99003 \tabularnewline
22 & 13.8 & 10.4281 & 3.37186 \tabularnewline
23 & 10.8 & 10.5505 & 0.24946 \tabularnewline
24 & 13.8 & 10.7416 & 3.05836 \tabularnewline
25 & 11.7 & 10.8 & 0.899994 \tabularnewline
26 & 10.9 & 10.8753 & 0.0246647 \tabularnewline
27 & 16.1 & 10.8457 & 5.25433 \tabularnewline
28 & 13.4 & 10.4084 & 2.99164 \tabularnewline
29 & 9.9 & 10.6122 & -0.712203 \tabularnewline
30 & 11.5 & 10.3232 & 1.17682 \tabularnewline
31 & 8.3 & 10.8457 & -2.54567 \tabularnewline
32 & 11.7 & 10.7445 & 0.955546 \tabularnewline
33 & 9 & 10.5825 & -1.58254 \tabularnewline
34 & 9.7 & 10.3392 & -0.639182 \tabularnewline
35 & 10.8 & 10.4281 & 0.37186 \tabularnewline
36 & 10.3 & 10.969 & -0.668997 \tabularnewline
37 & 10.4 & 10.7934 & -0.393415 \tabularnewline
38 & 12.7 & 10.6296 & 2.07039 \tabularnewline
39 & 9.3 & 10.6084 & -1.30843 \tabularnewline
40 & 11.8 & 10.5044 & 1.2956 \tabularnewline
41 & 5.9 & 10.8753 & -4.97534 \tabularnewline
42 & 11.4 & 10.6899 & 0.710135 \tabularnewline
43 & 13 & 10.4677 & 2.53231 \tabularnewline
44 & 10.8 & 10.7049 & 0.0950621 \tabularnewline
45 & 12.3 & 10.3194 & 1.98059 \tabularnewline
46 & 11.3 & 10.5562 & 0.74383 \tabularnewline
47 & 11.8 & 10.7939 & 1.0061 \tabularnewline
48 & 7.9 & 10.8494 & -2.94945 \tabularnewline
49 & 12.7 & 10.5562 & 2.14383 \tabularnewline
50 & 12.3 & 10.9012 & 1.39878 \tabularnewline
51 & 11.6 & 10.5195 & 1.08053 \tabularnewline
52 & 6.7 & 10.8617 & -4.16167 \tabularnewline
53 & 10.9 & 10.7049 & 0.195062 \tabularnewline
54 & 12.1 & 10.7459 & 1.3541 \tabularnewline
55 & 13.3 & 10.5087 & 2.79135 \tabularnewline
56 & 10.1 & 10.6381 & -0.53809 \tabularnewline
57 & 5.7 & 10.9384 & -5.23841 \tabularnewline
58 & 14.3 & 11.0594 & 3.24064 \tabularnewline
59 & 8 & 10.9789 & -2.97889 \tabularnewline
60 & 13.3 & 10.9422 & 2.35782 \tabularnewline
61 & 9.3 & 10.375 & -1.07496 \tabularnewline
62 & 12.5 & 10.7059 & 1.79414 \tabularnewline
63 & 7.6 & 10.6588 & -3.05876 \tabularnewline
64 & 15.9 & 10.4578 & 5.4422 \tabularnewline
65 & 9.2 & 10.5119 & -1.31191 \tabularnewline
66 & 9.1 & 10.6404 & -1.54042 \tabularnewline
67 & 11.1 & 10.7172 & 0.38284 \tabularnewline
68 & 13 & 10.994 & 2.00604 \tabularnewline
69 & 14.5 & 10.4168 & 4.08316 \tabularnewline
70 & 12.2 & 10.8494 & 1.35055 \tabularnewline
71 & 12.3 & 10.3924 & 1.90764 \tabularnewline
72 & 11.4 & 10.6381 & 0.76191 \tabularnewline
73 & 8.8 & 10.5378 & -1.7378 \tabularnewline
74 & 14.6 & 10.7675 & 3.83247 \tabularnewline
75 & 12.6 & 10.9379 & 1.66207 \tabularnewline
76 & 13 & 10.8198 & 2.18022 \tabularnewline
77 & 12.6 & 10.9172 & 1.68278 \tabularnewline
78 & 13.2 & 10.6122 & 2.5878 \tabularnewline
79 & 9.9 & 10.6381 & -0.73809 \tabularnewline
80 & 7.7 & 10.4677 & -2.76769 \tabularnewline
81 & 10.5 & 10.994 & -0.493958 \tabularnewline
82 & 13.4 & 10.7826 & 2.6174 \tabularnewline
83 & 10.9 & 10.5863 & 0.313685 \tabularnewline
84 & 4.3 & 10.407 & -6.10696 \tabularnewline
85 & 10.3 & 10.6456 & -0.345608 \tabularnewline
86 & 11.8 & 10.5567 & 1.24335 \tabularnewline
87 & 11.2 & 10.3491 & 0.85093 \tabularnewline
88 & 11.4 & 11.0975 & 0.302492 \tabularnewline
89 & 8.6 & 10.7788 & -2.17882 \tabularnewline
90 & 13.2 & 10.8085 & 2.39151 \tabularnewline
91 & 12.6 & 10.5317 & 2.06831 \tabularnewline
92 & 5.6 & 10.8579 & -5.25789 \tabularnewline
93 & 9.9 & 10.6075 & -0.707499 \tabularnewline
94 & 8.8 & 10.7765 & -1.97645 \tabularnewline
95 & 7.7 & 10.6541 & -2.95409 \tabularnewline
96 & 9 & 10.5863 & -1.58632 \tabularnewline
97 & 7.3 & 10.3354 & -3.0354 \tabularnewline
98 & 11.4 & 10.5218 & 0.878199 \tabularnewline
99 & 13.6 & 10.7934 & 2.80658 \tabularnewline
100 & 7.9 & 10.8603 & -2.96026 \tabularnewline
101 & 10.7 & 10.5764 & 0.123573 \tabularnewline
102 & 10.3 & 10.4677 & -0.167693 \tabularnewline
103 & 8.3 & 10.7318 & -2.43175 \tabularnewline
104 & 9.6 & 10.7718 & -1.17179 \tabularnewline
105 & 14.2 & 11.5202 & 2.67978 \tabularnewline
106 & 8.5 & 10.7939 & -2.2939 \tabularnewline
107 & 13.5 & 10.5637 & 2.93631 \tabularnewline
108 & 4.9 & 10.8753 & -5.97534 \tabularnewline
109 & 6.4 & 10.7986 & -4.3986 \tabularnewline
110 & 9.6 & 11.0716 & -1.47162 \tabularnewline
111 & 11.6 & 10.6122 & 0.987797 \tabularnewline
112 & 11.1 & 10.9379 & 0.162075 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=266737&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]InterpolationForecast[/C][C]ResidualsPrediction Error[/C][/ROW]
[ROW][C]1[/C][C]12.9[/C][C]10.7826[/C][C]2.1174[/C][/ROW]
[ROW][C]2[/C][C]12.2[/C][C]10.912[/C][C]1.28796[/C][/ROW]
[ROW][C]3[/C][C]12.8[/C][C]10.9841[/C][C]1.81593[/C][/ROW]
[ROW][C]4[/C][C]7.4[/C][C]10.5863[/C][C]-3.18632[/C][/ROW]
[ROW][C]5[/C][C]6.7[/C][C]10.5195[/C][C]-3.81947[/C][/ROW]
[ROW][C]6[/C][C]12.6[/C][C]10.6145[/C][C]1.98546[/C][/ROW]
[ROW][C]7[/C][C]14.8[/C][C]10.9285[/C][C]3.87148[/C][/ROW]
[ROW][C]8[/C][C]13.3[/C][C]10.7134[/C][C]2.58662[/C][/ROW]
[ROW][C]9[/C][C]11.1[/C][C]10.2455[/C][C]0.854479[/C][/ROW]
[ROW][C]10[/C][C]8.2[/C][C]10.5604[/C][C]-2.36043[/C][/ROW]
[ROW][C]11[/C][C]11.4[/C][C]10.5924[/C][C]0.807574[/C][/ROW]
[ROW][C]12[/C][C]6.4[/C][C]10.4677[/C][C]-4.06769[/C][/ROW]
[ROW][C]13[/C][C]10.6[/C][C]10.5317[/C][C]0.0683104[/C][/ROW]
[ROW][C]14[/C][C]12[/C][C]10.6791[/C][C]1.32095[/C][/ROW]
[ROW][C]15[/C][C]6.3[/C][C]10.7158[/C][C]-4.41575[/C][/ROW]
[ROW][C]16[/C][C]11.3[/C][C]10.873[/C][C]0.427036[/C][/ROW]
[ROW][C]17[/C][C]11.9[/C][C]10.8236[/C][C]1.07644[/C][/ROW]
[ROW][C]18[/C][C]9.3[/C][C]10.6122[/C][C]-1.3122[/C][/ROW]
[ROW][C]19[/C][C]9.6[/C][C]10.7628[/C][C]-1.16282[/C][/ROW]
[ROW][C]20[/C][C]10[/C][C]10.8556[/C][C]-0.855559[/C][/ROW]
[ROW][C]21[/C][C]6.4[/C][C]10.39[/C][C]-3.99003[/C][/ROW]
[ROW][C]22[/C][C]13.8[/C][C]10.4281[/C][C]3.37186[/C][/ROW]
[ROW][C]23[/C][C]10.8[/C][C]10.5505[/C][C]0.24946[/C][/ROW]
[ROW][C]24[/C][C]13.8[/C][C]10.7416[/C][C]3.05836[/C][/ROW]
[ROW][C]25[/C][C]11.7[/C][C]10.8[/C][C]0.899994[/C][/ROW]
[ROW][C]26[/C][C]10.9[/C][C]10.8753[/C][C]0.0246647[/C][/ROW]
[ROW][C]27[/C][C]16.1[/C][C]10.8457[/C][C]5.25433[/C][/ROW]
[ROW][C]28[/C][C]13.4[/C][C]10.4084[/C][C]2.99164[/C][/ROW]
[ROW][C]29[/C][C]9.9[/C][C]10.6122[/C][C]-0.712203[/C][/ROW]
[ROW][C]30[/C][C]11.5[/C][C]10.3232[/C][C]1.17682[/C][/ROW]
[ROW][C]31[/C][C]8.3[/C][C]10.8457[/C][C]-2.54567[/C][/ROW]
[ROW][C]32[/C][C]11.7[/C][C]10.7445[/C][C]0.955546[/C][/ROW]
[ROW][C]33[/C][C]9[/C][C]10.5825[/C][C]-1.58254[/C][/ROW]
[ROW][C]34[/C][C]9.7[/C][C]10.3392[/C][C]-0.639182[/C][/ROW]
[ROW][C]35[/C][C]10.8[/C][C]10.4281[/C][C]0.37186[/C][/ROW]
[ROW][C]36[/C][C]10.3[/C][C]10.969[/C][C]-0.668997[/C][/ROW]
[ROW][C]37[/C][C]10.4[/C][C]10.7934[/C][C]-0.393415[/C][/ROW]
[ROW][C]38[/C][C]12.7[/C][C]10.6296[/C][C]2.07039[/C][/ROW]
[ROW][C]39[/C][C]9.3[/C][C]10.6084[/C][C]-1.30843[/C][/ROW]
[ROW][C]40[/C][C]11.8[/C][C]10.5044[/C][C]1.2956[/C][/ROW]
[ROW][C]41[/C][C]5.9[/C][C]10.8753[/C][C]-4.97534[/C][/ROW]
[ROW][C]42[/C][C]11.4[/C][C]10.6899[/C][C]0.710135[/C][/ROW]
[ROW][C]43[/C][C]13[/C][C]10.4677[/C][C]2.53231[/C][/ROW]
[ROW][C]44[/C][C]10.8[/C][C]10.7049[/C][C]0.0950621[/C][/ROW]
[ROW][C]45[/C][C]12.3[/C][C]10.3194[/C][C]1.98059[/C][/ROW]
[ROW][C]46[/C][C]11.3[/C][C]10.5562[/C][C]0.74383[/C][/ROW]
[ROW][C]47[/C][C]11.8[/C][C]10.7939[/C][C]1.0061[/C][/ROW]
[ROW][C]48[/C][C]7.9[/C][C]10.8494[/C][C]-2.94945[/C][/ROW]
[ROW][C]49[/C][C]12.7[/C][C]10.5562[/C][C]2.14383[/C][/ROW]
[ROW][C]50[/C][C]12.3[/C][C]10.9012[/C][C]1.39878[/C][/ROW]
[ROW][C]51[/C][C]11.6[/C][C]10.5195[/C][C]1.08053[/C][/ROW]
[ROW][C]52[/C][C]6.7[/C][C]10.8617[/C][C]-4.16167[/C][/ROW]
[ROW][C]53[/C][C]10.9[/C][C]10.7049[/C][C]0.195062[/C][/ROW]
[ROW][C]54[/C][C]12.1[/C][C]10.7459[/C][C]1.3541[/C][/ROW]
[ROW][C]55[/C][C]13.3[/C][C]10.5087[/C][C]2.79135[/C][/ROW]
[ROW][C]56[/C][C]10.1[/C][C]10.6381[/C][C]-0.53809[/C][/ROW]
[ROW][C]57[/C][C]5.7[/C][C]10.9384[/C][C]-5.23841[/C][/ROW]
[ROW][C]58[/C][C]14.3[/C][C]11.0594[/C][C]3.24064[/C][/ROW]
[ROW][C]59[/C][C]8[/C][C]10.9789[/C][C]-2.97889[/C][/ROW]
[ROW][C]60[/C][C]13.3[/C][C]10.9422[/C][C]2.35782[/C][/ROW]
[ROW][C]61[/C][C]9.3[/C][C]10.375[/C][C]-1.07496[/C][/ROW]
[ROW][C]62[/C][C]12.5[/C][C]10.7059[/C][C]1.79414[/C][/ROW]
[ROW][C]63[/C][C]7.6[/C][C]10.6588[/C][C]-3.05876[/C][/ROW]
[ROW][C]64[/C][C]15.9[/C][C]10.4578[/C][C]5.4422[/C][/ROW]
[ROW][C]65[/C][C]9.2[/C][C]10.5119[/C][C]-1.31191[/C][/ROW]
[ROW][C]66[/C][C]9.1[/C][C]10.6404[/C][C]-1.54042[/C][/ROW]
[ROW][C]67[/C][C]11.1[/C][C]10.7172[/C][C]0.38284[/C][/ROW]
[ROW][C]68[/C][C]13[/C][C]10.994[/C][C]2.00604[/C][/ROW]
[ROW][C]69[/C][C]14.5[/C][C]10.4168[/C][C]4.08316[/C][/ROW]
[ROW][C]70[/C][C]12.2[/C][C]10.8494[/C][C]1.35055[/C][/ROW]
[ROW][C]71[/C][C]12.3[/C][C]10.3924[/C][C]1.90764[/C][/ROW]
[ROW][C]72[/C][C]11.4[/C][C]10.6381[/C][C]0.76191[/C][/ROW]
[ROW][C]73[/C][C]8.8[/C][C]10.5378[/C][C]-1.7378[/C][/ROW]
[ROW][C]74[/C][C]14.6[/C][C]10.7675[/C][C]3.83247[/C][/ROW]
[ROW][C]75[/C][C]12.6[/C][C]10.9379[/C][C]1.66207[/C][/ROW]
[ROW][C]76[/C][C]13[/C][C]10.8198[/C][C]2.18022[/C][/ROW]
[ROW][C]77[/C][C]12.6[/C][C]10.9172[/C][C]1.68278[/C][/ROW]
[ROW][C]78[/C][C]13.2[/C][C]10.6122[/C][C]2.5878[/C][/ROW]
[ROW][C]79[/C][C]9.9[/C][C]10.6381[/C][C]-0.73809[/C][/ROW]
[ROW][C]80[/C][C]7.7[/C][C]10.4677[/C][C]-2.76769[/C][/ROW]
[ROW][C]81[/C][C]10.5[/C][C]10.994[/C][C]-0.493958[/C][/ROW]
[ROW][C]82[/C][C]13.4[/C][C]10.7826[/C][C]2.6174[/C][/ROW]
[ROW][C]83[/C][C]10.9[/C][C]10.5863[/C][C]0.313685[/C][/ROW]
[ROW][C]84[/C][C]4.3[/C][C]10.407[/C][C]-6.10696[/C][/ROW]
[ROW][C]85[/C][C]10.3[/C][C]10.6456[/C][C]-0.345608[/C][/ROW]
[ROW][C]86[/C][C]11.8[/C][C]10.5567[/C][C]1.24335[/C][/ROW]
[ROW][C]87[/C][C]11.2[/C][C]10.3491[/C][C]0.85093[/C][/ROW]
[ROW][C]88[/C][C]11.4[/C][C]11.0975[/C][C]0.302492[/C][/ROW]
[ROW][C]89[/C][C]8.6[/C][C]10.7788[/C][C]-2.17882[/C][/ROW]
[ROW][C]90[/C][C]13.2[/C][C]10.8085[/C][C]2.39151[/C][/ROW]
[ROW][C]91[/C][C]12.6[/C][C]10.5317[/C][C]2.06831[/C][/ROW]
[ROW][C]92[/C][C]5.6[/C][C]10.8579[/C][C]-5.25789[/C][/ROW]
[ROW][C]93[/C][C]9.9[/C][C]10.6075[/C][C]-0.707499[/C][/ROW]
[ROW][C]94[/C][C]8.8[/C][C]10.7765[/C][C]-1.97645[/C][/ROW]
[ROW][C]95[/C][C]7.7[/C][C]10.6541[/C][C]-2.95409[/C][/ROW]
[ROW][C]96[/C][C]9[/C][C]10.5863[/C][C]-1.58632[/C][/ROW]
[ROW][C]97[/C][C]7.3[/C][C]10.3354[/C][C]-3.0354[/C][/ROW]
[ROW][C]98[/C][C]11.4[/C][C]10.5218[/C][C]0.878199[/C][/ROW]
[ROW][C]99[/C][C]13.6[/C][C]10.7934[/C][C]2.80658[/C][/ROW]
[ROW][C]100[/C][C]7.9[/C][C]10.8603[/C][C]-2.96026[/C][/ROW]
[ROW][C]101[/C][C]10.7[/C][C]10.5764[/C][C]0.123573[/C][/ROW]
[ROW][C]102[/C][C]10.3[/C][C]10.4677[/C][C]-0.167693[/C][/ROW]
[ROW][C]103[/C][C]8.3[/C][C]10.7318[/C][C]-2.43175[/C][/ROW]
[ROW][C]104[/C][C]9.6[/C][C]10.7718[/C][C]-1.17179[/C][/ROW]
[ROW][C]105[/C][C]14.2[/C][C]11.5202[/C][C]2.67978[/C][/ROW]
[ROW][C]106[/C][C]8.5[/C][C]10.7939[/C][C]-2.2939[/C][/ROW]
[ROW][C]107[/C][C]13.5[/C][C]10.5637[/C][C]2.93631[/C][/ROW]
[ROW][C]108[/C][C]4.9[/C][C]10.8753[/C][C]-5.97534[/C][/ROW]
[ROW][C]109[/C][C]6.4[/C][C]10.7986[/C][C]-4.3986[/C][/ROW]
[ROW][C]110[/C][C]9.6[/C][C]11.0716[/C][C]-1.47162[/C][/ROW]
[ROW][C]111[/C][C]11.6[/C][C]10.6122[/C][C]0.987797[/C][/ROW]
[ROW][C]112[/C][C]11.1[/C][C]10.9379[/C][C]0.162075[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=266737&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=266737&T=4
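
In terms of the earlier sketch (assumed data frame df and fit object fit, estimated on all 112 rows), the 'Interpolation / Forecast' column of this table is simply fitted(fit) and the 'Residuals / Prediction Error' column is resid(fit):

head(data.frame(Actuals = df$TOT,
                Interpolation = fitted(fit),
                Residuals = resid(fit)))  # e.g. row 1 should read 12.9, 10.7826, 2.1174 on the full data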









\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
7 & 0.234861 & 0.469723 & 0.765139 \tabularnewline
8 & 0.147801 & 0.295603 & 0.852199 \tabularnewline
9 & 0.503071 & 0.993857 & 0.496929 \tabularnewline
10 & 0.449104 & 0.898208 & 0.550896 \tabularnewline
11 & 0.337227 & 0.674455 & 0.662773 \tabularnewline
12 & 0.360362 & 0.720723 & 0.639638 \tabularnewline
13 & 0.263853 & 0.527706 & 0.736147 \tabularnewline
14 & 0.209595 & 0.41919 & 0.790405 \tabularnewline
15 & 0.283349 & 0.566697 & 0.716651 \tabularnewline
16 & 0.435255 & 0.870509 & 0.564745 \tabularnewline
17 & 0.355446 & 0.710891 & 0.644554 \tabularnewline
18 & 0.282895 & 0.56579 & 0.717105 \tabularnewline
19 & 0.239184 & 0.478368 & 0.760816 \tabularnewline
20 & 0.210455 & 0.420909 & 0.789545 \tabularnewline
21 & 0.224059 & 0.448117 & 0.775941 \tabularnewline
22 & 0.311769 & 0.623538 & 0.688231 \tabularnewline
23 & 0.256446 & 0.512893 & 0.743554 \tabularnewline
24 & 0.354852 & 0.709705 & 0.645148 \tabularnewline
25 & 0.303375 & 0.60675 & 0.696625 \tabularnewline
26 & 0.244365 & 0.48873 & 0.755635 \tabularnewline
27 & 0.380181 & 0.760362 & 0.619819 \tabularnewline
28 & 0.373251 & 0.746502 & 0.626749 \tabularnewline
29 & 0.31412 & 0.62824 & 0.68588 \tabularnewline
30 & 0.325351 & 0.650702 & 0.674649 \tabularnewline
31 & 0.396077 & 0.792155 & 0.603923 \tabularnewline
32 & 0.376652 & 0.753304 & 0.623348 \tabularnewline
33 & 0.352571 & 0.705142 & 0.647429 \tabularnewline
34 & 0.299938 & 0.599877 & 0.700062 \tabularnewline
35 & 0.248573 & 0.497146 & 0.751427 \tabularnewline
36 & 0.210918 & 0.421836 & 0.789082 \tabularnewline
37 & 0.169922 & 0.339844 & 0.830078 \tabularnewline
38 & 0.149431 & 0.298863 & 0.850569 \tabularnewline
39 & 0.131017 & 0.262034 & 0.868983 \tabularnewline
40 & 0.118675 & 0.237349 & 0.881325 \tabularnewline
41 & 0.227296 & 0.454593 & 0.772704 \tabularnewline
42 & 0.193098 & 0.386197 & 0.806902 \tabularnewline
43 & 0.219517 & 0.439034 & 0.780483 \tabularnewline
44 & 0.18204 & 0.364081 & 0.81796 \tabularnewline
45 & 0.164127 & 0.328255 & 0.835873 \tabularnewline
46 & 0.132539 & 0.265079 & 0.867461 \tabularnewline
47 & 0.108568 & 0.217136 & 0.891432 \tabularnewline
48 & 0.112353 & 0.224705 & 0.887647 \tabularnewline
49 & 0.100608 & 0.201217 & 0.899392 \tabularnewline
50 & 0.0892261 & 0.178452 & 0.910774 \tabularnewline
51 & 0.0736533 & 0.147307 & 0.926347 \tabularnewline
52 & 0.142663 & 0.285325 & 0.857337 \tabularnewline
53 & 0.116008 & 0.232017 & 0.883992 \tabularnewline
54 & 0.10749 & 0.214981 & 0.89251 \tabularnewline
55 & 0.12392 & 0.24784 & 0.87608 \tabularnewline
56 & 0.0984696 & 0.196939 & 0.90153 \tabularnewline
57 & 0.207621 & 0.415241 & 0.792379 \tabularnewline
58 & 0.240502 & 0.481003 & 0.759498 \tabularnewline
59 & 0.257208 & 0.514417 & 0.742792 \tabularnewline
60 & 0.271848 & 0.543697 & 0.728152 \tabularnewline
61 & 0.239147 & 0.478294 & 0.760853 \tabularnewline
62 & 0.215618 & 0.431236 & 0.784382 \tabularnewline
63 & 0.272047 & 0.544094 & 0.727953 \tabularnewline
64 & 0.462972 & 0.925944 & 0.537028 \tabularnewline
65 & 0.424383 & 0.848766 & 0.575617 \tabularnewline
66 & 0.387503 & 0.775005 & 0.612497 \tabularnewline
67 & 0.339348 & 0.678696 & 0.660652 \tabularnewline
68 & 0.324667 & 0.649333 & 0.675333 \tabularnewline
69 & 0.411479 & 0.822958 & 0.588521 \tabularnewline
70 & 0.376133 & 0.752266 & 0.623867 \tabularnewline
71 & 0.385508 & 0.771017 & 0.614492 \tabularnewline
72 & 0.338589 & 0.677178 & 0.661411 \tabularnewline
73 & 0.304683 & 0.609365 & 0.695317 \tabularnewline
74 & 0.368362 & 0.736724 & 0.631638 \tabularnewline
75 & 0.333687 & 0.667374 & 0.666313 \tabularnewline
76 & 0.341072 & 0.682144 & 0.658928 \tabularnewline
77 & 0.318811 & 0.637623 & 0.681189 \tabularnewline
78 & 0.344383 & 0.688766 & 0.655617 \tabularnewline
79 & 0.292335 & 0.584669 & 0.707665 \tabularnewline
80 & 0.28313 & 0.566261 & 0.71687 \tabularnewline
81 & 0.233723 & 0.467446 & 0.766277 \tabularnewline
82 & 0.25326 & 0.506521 & 0.74674 \tabularnewline
83 & 0.213589 & 0.427177 & 0.786411 \tabularnewline
84 & 0.447778 & 0.895555 & 0.552222 \tabularnewline
85 & 0.396069 & 0.792139 & 0.603931 \tabularnewline
86 & 0.382321 & 0.764643 & 0.617679 \tabularnewline
87 & 0.340862 & 0.681724 & 0.659138 \tabularnewline
88 & 0.283641 & 0.567282 & 0.716359 \tabularnewline
89 & 0.248553 & 0.497106 & 0.751447 \tabularnewline
90 & 0.264889 & 0.529777 & 0.735111 \tabularnewline
91 & 0.279728 & 0.559456 & 0.720272 \tabularnewline
92 & 0.429708 & 0.859416 & 0.570292 \tabularnewline
93 & 0.376223 & 0.752445 & 0.623777 \tabularnewline
94 & 0.355944 & 0.711888 & 0.644056 \tabularnewline
95 & 0.327237 & 0.654475 & 0.672763 \tabularnewline
96 & 0.260916 & 0.521832 & 0.739084 \tabularnewline
97 & 0.252565 & 0.505131 & 0.747435 \tabularnewline
98 & 0.187106 & 0.374213 & 0.812894 \tabularnewline
99 & 0.252199 & 0.504399 & 0.747801 \tabularnewline
100 & 0.209482 & 0.418963 & 0.790518 \tabularnewline
101 & 0.16189 & 0.323781 & 0.83811 \tabularnewline
102 & 0.157859 & 0.315719 & 0.842141 \tabularnewline
103 & 0.107365 & 0.21473 & 0.892635 \tabularnewline
104 & 0.0804954 & 0.160991 & 0.919505 \tabularnewline
105 & 0.212645 & 0.425291 & 0.787355 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=266737&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]7[/C][C]0.234861[/C][C]0.469723[/C][C]0.765139[/C][/ROW]
[ROW][C]8[/C][C]0.147801[/C][C]0.295603[/C][C]0.852199[/C][/ROW]
[ROW][C]9[/C][C]0.503071[/C][C]0.993857[/C][C]0.496929[/C][/ROW]
[ROW][C]10[/C][C]0.449104[/C][C]0.898208[/C][C]0.550896[/C][/ROW]
[ROW][C]11[/C][C]0.337227[/C][C]0.674455[/C][C]0.662773[/C][/ROW]
[ROW][C]12[/C][C]0.360362[/C][C]0.720723[/C][C]0.639638[/C][/ROW]
[ROW][C]13[/C][C]0.263853[/C][C]0.527706[/C][C]0.736147[/C][/ROW]
[ROW][C]14[/C][C]0.209595[/C][C]0.41919[/C][C]0.790405[/C][/ROW]
[ROW][C]15[/C][C]0.283349[/C][C]0.566697[/C][C]0.716651[/C][/ROW]
[ROW][C]16[/C][C]0.435255[/C][C]0.870509[/C][C]0.564745[/C][/ROW]
[ROW][C]17[/C][C]0.355446[/C][C]0.710891[/C][C]0.644554[/C][/ROW]
[ROW][C]18[/C][C]0.282895[/C][C]0.56579[/C][C]0.717105[/C][/ROW]
[ROW][C]19[/C][C]0.239184[/C][C]0.478368[/C][C]0.760816[/C][/ROW]
[ROW][C]20[/C][C]0.210455[/C][C]0.420909[/C][C]0.789545[/C][/ROW]
[ROW][C]21[/C][C]0.224059[/C][C]0.448117[/C][C]0.775941[/C][/ROW]
[ROW][C]22[/C][C]0.311769[/C][C]0.623538[/C][C]0.688231[/C][/ROW]
[ROW][C]23[/C][C]0.256446[/C][C]0.512893[/C][C]0.743554[/C][/ROW]
[ROW][C]24[/C][C]0.354852[/C][C]0.709705[/C][C]0.645148[/C][/ROW]
[ROW][C]25[/C][C]0.303375[/C][C]0.60675[/C][C]0.696625[/C][/ROW]
[ROW][C]26[/C][C]0.244365[/C][C]0.48873[/C][C]0.755635[/C][/ROW]
[ROW][C]27[/C][C]0.380181[/C][C]0.760362[/C][C]0.619819[/C][/ROW]
[ROW][C]28[/C][C]0.373251[/C][C]0.746502[/C][C]0.626749[/C][/ROW]
[ROW][C]29[/C][C]0.31412[/C][C]0.62824[/C][C]0.68588[/C][/ROW]
[ROW][C]30[/C][C]0.325351[/C][C]0.650702[/C][C]0.674649[/C][/ROW]
[ROW][C]31[/C][C]0.396077[/C][C]0.792155[/C][C]0.603923[/C][/ROW]
[ROW][C]32[/C][C]0.376652[/C][C]0.753304[/C][C]0.623348[/C][/ROW]
[ROW][C]33[/C][C]0.352571[/C][C]0.705142[/C][C]0.647429[/C][/ROW]
[ROW][C]34[/C][C]0.299938[/C][C]0.599877[/C][C]0.700062[/C][/ROW]
[ROW][C]35[/C][C]0.248573[/C][C]0.497146[/C][C]0.751427[/C][/ROW]
[ROW][C]36[/C][C]0.210918[/C][C]0.421836[/C][C]0.789082[/C][/ROW]
[ROW][C]37[/C][C]0.169922[/C][C]0.339844[/C][C]0.830078[/C][/ROW]
[ROW][C]38[/C][C]0.149431[/C][C]0.298863[/C][C]0.850569[/C][/ROW]
[ROW][C]39[/C][C]0.131017[/C][C]0.262034[/C][C]0.868983[/C][/ROW]
[ROW][C]40[/C][C]0.118675[/C][C]0.237349[/C][C]0.881325[/C][/ROW]
[ROW][C]41[/C][C]0.227296[/C][C]0.454593[/C][C]0.772704[/C][/ROW]
[ROW][C]42[/C][C]0.193098[/C][C]0.386197[/C][C]0.806902[/C][/ROW]
[ROW][C]43[/C][C]0.219517[/C][C]0.439034[/C][C]0.780483[/C][/ROW]
[ROW][C]44[/C][C]0.18204[/C][C]0.364081[/C][C]0.81796[/C][/ROW]
[ROW][C]45[/C][C]0.164127[/C][C]0.328255[/C][C]0.835873[/C][/ROW]
[ROW][C]46[/C][C]0.132539[/C][C]0.265079[/C][C]0.867461[/C][/ROW]
[ROW][C]47[/C][C]0.108568[/C][C]0.217136[/C][C]0.891432[/C][/ROW]
[ROW][C]48[/C][C]0.112353[/C][C]0.224705[/C][C]0.887647[/C][/ROW]
[ROW][C]49[/C][C]0.100608[/C][C]0.201217[/C][C]0.899392[/C][/ROW]
[ROW][C]50[/C][C]0.0892261[/C][C]0.178452[/C][C]0.910774[/C][/ROW]
[ROW][C]51[/C][C]0.0736533[/C][C]0.147307[/C][C]0.926347[/C][/ROW]
[ROW][C]52[/C][C]0.142663[/C][C]0.285325[/C][C]0.857337[/C][/ROW]
[ROW][C]53[/C][C]0.116008[/C][C]0.232017[/C][C]0.883992[/C][/ROW]
[ROW][C]54[/C][C]0.10749[/C][C]0.214981[/C][C]0.89251[/C][/ROW]
[ROW][C]55[/C][C]0.12392[/C][C]0.24784[/C][C]0.87608[/C][/ROW]
[ROW][C]56[/C][C]0.0984696[/C][C]0.196939[/C][C]0.90153[/C][/ROW]
[ROW][C]57[/C][C]0.207621[/C][C]0.415241[/C][C]0.792379[/C][/ROW]
[ROW][C]58[/C][C]0.240502[/C][C]0.481003[/C][C]0.759498[/C][/ROW]
[ROW][C]59[/C][C]0.257208[/C][C]0.514417[/C][C]0.742792[/C][/ROW]
[ROW][C]60[/C][C]0.271848[/C][C]0.543697[/C][C]0.728152[/C][/ROW]
[ROW][C]61[/C][C]0.239147[/C][C]0.478294[/C][C]0.760853[/C][/ROW]
[ROW][C]62[/C][C]0.215618[/C][C]0.431236[/C][C]0.784382[/C][/ROW]
[ROW][C]63[/C][C]0.272047[/C][C]0.544094[/C][C]0.727953[/C][/ROW]
[ROW][C]64[/C][C]0.462972[/C][C]0.925944[/C][C]0.537028[/C][/ROW]
[ROW][C]65[/C][C]0.424383[/C][C]0.848766[/C][C]0.575617[/C][/ROW]
[ROW][C]66[/C][C]0.387503[/C][C]0.775005[/C][C]0.612497[/C][/ROW]
[ROW][C]67[/C][C]0.339348[/C][C]0.678696[/C][C]0.660652[/C][/ROW]
[ROW][C]68[/C][C]0.324667[/C][C]0.649333[/C][C]0.675333[/C][/ROW]
[ROW][C]69[/C][C]0.411479[/C][C]0.822958[/C][C]0.588521[/C][/ROW]
[ROW][C]70[/C][C]0.376133[/C][C]0.752266[/C][C]0.623867[/C][/ROW]
[ROW][C]71[/C][C]0.385508[/C][C]0.771017[/C][C]0.614492[/C][/ROW]
[ROW][C]72[/C][C]0.338589[/C][C]0.677178[/C][C]0.661411[/C][/ROW]
[ROW][C]73[/C][C]0.304683[/C][C]0.609365[/C][C]0.695317[/C][/ROW]
[ROW][C]74[/C][C]0.368362[/C][C]0.736724[/C][C]0.631638[/C][/ROW]
[ROW][C]75[/C][C]0.333687[/C][C]0.667374[/C][C]0.666313[/C][/ROW]
[ROW][C]76[/C][C]0.341072[/C][C]0.682144[/C][C]0.658928[/C][/ROW]
[ROW][C]77[/C][C]0.318811[/C][C]0.637623[/C][C]0.681189[/C][/ROW]
[ROW][C]78[/C][C]0.344383[/C][C]0.688766[/C][C]0.655617[/C][/ROW]
[ROW][C]79[/C][C]0.292335[/C][C]0.584669[/C][C]0.707665[/C][/ROW]
[ROW][C]80[/C][C]0.28313[/C][C]0.566261[/C][C]0.71687[/C][/ROW]
[ROW][C]81[/C][C]0.233723[/C][C]0.467446[/C][C]0.766277[/C][/ROW]
[ROW][C]82[/C][C]0.25326[/C][C]0.506521[/C][C]0.74674[/C][/ROW]
[ROW][C]83[/C][C]0.213589[/C][C]0.427177[/C][C]0.786411[/C][/ROW]
[ROW][C]84[/C][C]0.447778[/C][C]0.895555[/C][C]0.552222[/C][/ROW]
[ROW][C]85[/C][C]0.396069[/C][C]0.792139[/C][C]0.603931[/C][/ROW]
[ROW][C]86[/C][C]0.382321[/C][C]0.764643[/C][C]0.617679[/C][/ROW]
[ROW][C]87[/C][C]0.340862[/C][C]0.681724[/C][C]0.659138[/C][/ROW]
[ROW][C]88[/C][C]0.283641[/C][C]0.567282[/C][C]0.716359[/C][/ROW]
[ROW][C]89[/C][C]0.248553[/C][C]0.497106[/C][C]0.751447[/C][/ROW]
[ROW][C]90[/C][C]0.264889[/C][C]0.529777[/C][C]0.735111[/C][/ROW]
[ROW][C]91[/C][C]0.279728[/C][C]0.559456[/C][C]0.720272[/C][/ROW]
[ROW][C]92[/C][C]0.429708[/C][C]0.859416[/C][C]0.570292[/C][/ROW]
[ROW][C]93[/C][C]0.376223[/C][C]0.752445[/C][C]0.623777[/C][/ROW]
[ROW][C]94[/C][C]0.355944[/C][C]0.711888[/C][C]0.644056[/C][/ROW]
[ROW][C]95[/C][C]0.327237[/C][C]0.654475[/C][C]0.672763[/C][/ROW]
[ROW][C]96[/C][C]0.260916[/C][C]0.521832[/C][C]0.739084[/C][/ROW]
[ROW][C]97[/C][C]0.252565[/C][C]0.505131[/C][C]0.747435[/C][/ROW]
[ROW][C]98[/C][C]0.187106[/C][C]0.374213[/C][C]0.812894[/C][/ROW]
[ROW][C]99[/C][C]0.252199[/C][C]0.504399[/C][C]0.747801[/C][/ROW]
[ROW][C]100[/C][C]0.209482[/C][C]0.418963[/C][C]0.790518[/C][/ROW]
[ROW][C]101[/C][C]0.16189[/C][C]0.323781[/C][C]0.83811[/C][/ROW]
[ROW][C]102[/C][C]0.157859[/C][C]0.315719[/C][C]0.842141[/C][/ROW]
[ROW][C]103[/C][C]0.107365[/C][C]0.21473[/C][C]0.892635[/C][/ROW]
[ROW][C]104[/C][C]0.0804954[/C][C]0.160991[/C][C]0.919505[/C][/ROW]
[ROW][C]105[/C][C]0.212645[/C][C]0.425291[/C][C]0.787355[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=266737&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=266737&T=5
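
Each row of this table can be reproduced with lmtest::gqtest(), exactly as in the module code further below; a hedged sketch, again assuming the fit object from the earlier example on all 112 rows, is:

library(lmtest)
gqtest(fit, point = 7,  alternative = "two.sided")$p.value  # about 0.469723 (first row of the table)
gqtest(fit, point = 51, alternative = "greater")$p.value    # about 0.0736533 (smallest 'greater' p-value)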









Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     0                     0                     OK
5% type I error level     0                     0                     OK
10% type I error level    0                     0                     OK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 & 0 & OK \tabularnewline
5% type I error level & 0 & 0 & OK \tabularnewline
10% type I error level & 0 & 0 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=266737&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=266737&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=266737&T=6
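
This meta analysis simply counts how many of the 99 two-sided Goldfeld-Quandt p-values (breakpoints 7 through 105) fall below each type I error level; a hedged sketch building on the earlier fit and library(lmtest):

pvals <- sapply(7:105, function(p) gqtest(fit, point = p, alternative = "two.sided")$p.value)
c(sum(pvals < 0.01), sum(pvals < 0.05), sum(pvals < 0.10))  # 0 0 0, hence 'OK' at all three levels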





Parameters (Session):
par1 = 4 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 4 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)   # index of the dependent variable (here par1 = 4)
x <- t(y)                  # 'y' (data matrix) and 'par1'-'par3' are supplied by the calling environment; they are not defined in this listing
k <- length(x[1,])         # number of variables (columns)
n <- length(x[,1])         # number of observations (rows)
x1 <- cbind(x[,par1], x[,1:k!=par1])   # move the dependent variable to the first column
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))           # regress the first column (the dependent variable) on all remaining columns
(mysum <- summary(mylm))
if (n > n25) {       # Goldfeld-Quandt tests are only computed when there are more than 25 observations
kp3 <- k + 3         # first admissible breakpoint
nmkm3 <- n - k - 3   # last admissible breakpoint
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))   # one row per breakpoint, one column per alternative ('greater', 'two.sided', 'less')
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}