Statistical Computations at FreeStatistics.org

Author: verified (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 17 Dec 2014 12:43:41 +0000
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/17/t1418820254qtrz4vmdvlum99d.htm/, Retrieved Thu, 16 May 2024 18:39:21 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=270134, Retrieved Thu, 16 May 2024 18:39:21 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 67
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Exponential Smoothing] [ES] [2014-12-17 12:09:36] [40df8d8b5657a9599acc6ccced535535]
- RMP     [Multiple Regression] [MR] [2014-12-17 12:43:41] [eeaae55b7499419163eef5a1870a44a7] [Current]
Dataseries X:
12.9
12.2
12.8
7.4
6.7
12.6
14.8
13.3
11.1
8.2
11.4
6.4
10.6
12
6.3
11.3
11.9
9.3
9.6
10
6.4
13.8
10.8
13.8
11.7
10.9
16.1
13.4
9.9
11.5
8.3
11.7
9
9.7
10.8
10.3
10.4
12.7
9.3
11.8
5.9
11.4
13
10.8
12.3
11.3
11.8
7.9
12.7
12.3
11.6
6.7
10.9
12.1
13.3
10.1
5.7
14.3
8
13.3
9.3
12.5
7.6
15.9
9.2
9.1
11.1
13
14.5
12.2
12.3
11.4
8.8
14.6
12.6
13
12.6
13.2
9.9
7.7
10.5
13.4
10.9
4.3
10.3
11.8
11.2
11.4
8.6
13.2
12.6
5.6
9.9
8.8
7.7
9
7.3
11.4
13.6
7.9
10.7
10.3
8.3
9.6
14.2
8.5
13.5
4.9
6.4
9.6
11.6
11.1




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Gwilym Jenkins' @ jenkins.wessa.net


Source: https://freestatistics.org/blog/index.php?pk=270134&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270134&T=0








Multiple Linear Regression - Estimated Regression Equation
TOT[t] = 11.148 - 0.00813502 t + e[t]


Source: https://freestatistics.org/blog/index.php?pk=270134&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270134&T=1

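The fitted equation above is an ordinary least squares regression of the series on a linear time trend. A minimal R sketch (not the module's generated script, which is listed at the end of this page) that reproduces this kind of fit, assuming the 112 values of Dataseries X have been saved to a hypothetical text file dataseriesX.txt:

x <- scan('dataseriesX.txt')   #hypothetical file holding the Dataseries X values listed above
tt <- seq_along(x)             #time index t = 1, ..., 112
fit <- lm(x ~ tt)              #OLS fit of the series on a linear trend
coef(fit)                      #should reproduce the intercept 11.148 and slope -0.00813502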







Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter     S.D.          T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   11.148        0.46961       23.74                        4.31552e-45      2.15776e-45
t             -0.00813502   0.00721411    -1.128                       0.26192          0.13096


Source: https://freestatistics.org/blog/index.php?pk=270134&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270134&T=2

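The test statistics in the OLS table follow directly from each estimate and its standard deviation: T-STAT is their ratio, the 2-tail p-value is taken from a t distribution with 110 residual degrees of freedom (112 observations minus 2 estimated parameters), and the 1-tail p-value is half of the 2-tail value. A short check in R for the slope row:

b <- -0.00813502; s <- 0.00721411; dof <- 110   #slope estimate, its S.D., residual degrees of freedom
b / s                       #T-STAT, approximately -1.128
2 * pt(-abs(b / s), dof)    #2-tail p-value, approximately 0.26192
pt(-abs(b / s), dof)        #1-tail p-value, approximately 0.13096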







Multiple Linear Regression - Regression Statistics
Multiple R: 0.106901
R-squared: 0.0114279
Adjusted R-squared: 0.0024409
F-TEST (value): 1.2716
F-TEST (DF numerator): 1
F-TEST (DF denominator): 110
p-value: 0.26192

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 2.46832
Sum Squared Residuals: 670.188


Source: https://freestatistics.org/blog/index.php?pk=270134&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270134&T=3

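These summary statistics are related to one another in the usual way: Multiple R is the square root of R-squared, the adjusted R-squared corrects for the number of estimated parameters, the residual standard deviation is the square root of the sum of squared residuals divided by the residual degrees of freedom, and the F-test p-value follows from the F distribution. A short check in R using the rounded values from the tables (small rounding differences are expected):

r2 <- 0.0114279; ssr <- 670.188; n <- 112; k <- 2   #k = intercept + trend coefficient
sqrt(r2)                           #Multiple R, approximately 0.106901
1 - (1 - r2) * (n - 1) / (n - k)   #Adjusted R-squared, approximately 0.0024409
sqrt(ssr / (n - k))                #Residual Standard Deviation, approximately 2.46832
1 - pf(1.2716, 1, 110)             #F-test p-value, approximately 0.26192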








\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 12.9 & 11.1399 & 1.76011 \tabularnewline
2 & 12.2 & 11.1318 & 1.06825 \tabularnewline
3 & 12.8 & 11.1236 & 1.67638 \tabularnewline
4 & 7.4 & 11.1155 & -3.71548 \tabularnewline
5 & 6.7 & 11.1073 & -4.40735 \tabularnewline
6 & 12.6 & 11.0992 & 1.50079 \tabularnewline
7 & 14.8 & 11.0911 & 3.70892 \tabularnewline
8 & 13.3 & 11.0829 & 2.21706 \tabularnewline
9 & 11.1 & 11.0748 & 0.0251939 \tabularnewline
10 & 8.2 & 11.0667 & -2.86667 \tabularnewline
11 & 11.4 & 11.0585 & 0.341464 \tabularnewline
12 & 6.4 & 11.0504 & -4.6504 \tabularnewline
13 & 10.6 & 11.0423 & -0.442266 \tabularnewline
14 & 12 & 11.0341 & 0.965869 \tabularnewline
15 & 6.3 & 11.026 & -4.726 \tabularnewline
16 & 11.3 & 11.0179 & 0.282139 \tabularnewline
17 & 11.9 & 11.0097 & 0.890274 \tabularnewline
18 & 9.3 & 11.0016 & -1.70159 \tabularnewline
19 & 9.6 & 10.9935 & -1.39346 \tabularnewline
20 & 10 & 10.9853 & -0.985321 \tabularnewline
21 & 6.4 & 10.9772 & -4.57719 \tabularnewline
22 & 13.8 & 10.9691 & 2.83095 \tabularnewline
23 & 10.8 & 10.9609 & -0.160916 \tabularnewline
24 & 13.8 & 10.9528 & 2.84722 \tabularnewline
25 & 11.7 & 10.9446 & 0.755354 \tabularnewline
26 & 10.9 & 10.9365 & -0.0365108 \tabularnewline
27 & 16.1 & 10.9284 & 5.17162 \tabularnewline
28 & 13.4 & 10.9202 & 2.47976 \tabularnewline
29 & 9.9 & 10.9121 & -1.01211 \tabularnewline
30 & 11.5 & 10.904 & 0.596029 \tabularnewline
31 & 8.3 & 10.8958 & -2.59584 \tabularnewline
32 & 11.7 & 10.8877 & 0.812299 \tabularnewline
33 & 9 & 10.8796 & -1.87957 \tabularnewline
34 & 9.7 & 10.8714 & -1.17143 \tabularnewline
35 & 10.8 & 10.8633 & -0.0632957 \tabularnewline
36 & 10.3 & 10.8552 & -0.555161 \tabularnewline
37 & 10.4 & 10.847 & -0.447026 \tabularnewline
38 & 12.7 & 10.8389 & 1.86111 \tabularnewline
39 & 9.3 & 10.8308 & -1.53076 \tabularnewline
40 & 11.8 & 10.8226 & 0.977379 \tabularnewline
41 & 5.9 & 10.8145 & -4.91449 \tabularnewline
42 & 11.4 & 10.8064 & 0.593649 \tabularnewline
43 & 13 & 10.7982 & 2.20178 \tabularnewline
44 & 10.8 & 10.7901 & 0.00991945 \tabularnewline
45 & 12.3 & 10.7819 & 1.51805 \tabularnewline
46 & 11.3 & 10.7738 & 0.526189 \tabularnewline
47 & 11.8 & 10.7657 & 1.03432 \tabularnewline
48 & 7.9 & 10.7575 & -2.85754 \tabularnewline
49 & 12.7 & 10.7494 & 1.95059 \tabularnewline
50 & 12.3 & 10.7413 & 1.55873 \tabularnewline
51 & 11.6 & 10.7331 & 0.866865 \tabularnewline
52 & 6.7 & 10.725 & -4.025 \tabularnewline
53 & 10.9 & 10.7169 & 0.183135 \tabularnewline
54 & 12.1 & 10.7087 & 1.39127 \tabularnewline
55 & 13.3 & 10.7006 & 2.5994 \tabularnewline
56 & 10.1 & 10.6925 & -0.59246 \tabularnewline
57 & 5.7 & 10.6843 & -4.98433 \tabularnewline
58 & 14.3 & 10.6762 & 3.62381 \tabularnewline
59 & 8 & 10.6681 & -2.66806 \tabularnewline
60 & 13.3 & 10.6599 & 2.64008 \tabularnewline
61 & 9.3 & 10.6518 & -1.35179 \tabularnewline
62 & 12.5 & 10.6437 & 1.85635 \tabularnewline
63 & 7.6 & 10.6355 & -3.03552 \tabularnewline
64 & 15.9 & 10.6274 & 5.27262 \tabularnewline
65 & 9.2 & 10.6192 & -1.41925 \tabularnewline
66 & 9.1 & 10.6111 & -1.51111 \tabularnewline
67 & 11.1 & 10.603 & 0.497025 \tabularnewline
68 & 13 & 10.5948 & 2.40516 \tabularnewline
69 & 14.5 & 10.5867 & 3.91329 \tabularnewline
70 & 12.2 & 10.5786 & 1.62143 \tabularnewline
71 & 12.3 & 10.5704 & 1.72956 \tabularnewline
72 & 11.4 & 10.5623 & 0.8377 \tabularnewline
73 & 8.8 & 10.5542 & -1.75417 \tabularnewline
74 & 14.6 & 10.546 & 4.05397 \tabularnewline
75 & 12.6 & 10.5379 & 2.0621 \tabularnewline
76 & 13 & 10.5298 & 2.47024 \tabularnewline
77 & 12.6 & 10.5216 & 2.07837 \tabularnewline
78 & 13.2 & 10.5135 & 2.68651 \tabularnewline
79 & 9.9 & 10.5054 & -0.605355 \tabularnewline
80 & 7.7 & 10.4972 & -2.79722 \tabularnewline
81 & 10.5 & 10.4891 & 0.010915 \tabularnewline
82 & 13.4 & 10.4809 & 2.91905 \tabularnewline
83 & 10.9 & 10.4728 & 0.427185 \tabularnewline
84 & 4.3 & 10.4647 & -6.16468 \tabularnewline
85 & 10.3 & 10.4565 & -0.156545 \tabularnewline
86 & 11.8 & 10.4484 & 1.35159 \tabularnewline
87 & 11.2 & 10.4403 & 0.759725 \tabularnewline
88 & 11.4 & 10.4321 & 0.96786 \tabularnewline
89 & 8.6 & 10.424 & -1.824 \tabularnewline
90 & 13.2 & 10.4159 & 2.78413 \tabularnewline
91 & 12.6 & 10.4077 & 2.19227 \tabularnewline
92 & 5.6 & 10.3996 & -4.7996 \tabularnewline
93 & 9.9 & 10.3915 & -0.491465 \tabularnewline
94 & 8.8 & 10.3833 & -1.58333 \tabularnewline
95 & 7.7 & 10.3752 & -2.67519 \tabularnewline
96 & 9 & 10.3671 & -1.36706 \tabularnewline
97 & 7.3 & 10.3589 & -3.05892 \tabularnewline
98 & 11.4 & 10.3508 & 1.04921 \tabularnewline
99 & 13.6 & 10.3427 & 3.25735 \tabularnewline
100 & 7.9 & 10.3345 & -2.43452 \tabularnewline
101 & 10.7 & 10.3264 & 0.373615 \tabularnewline
102 & 10.3 & 10.3182 & -0.0182496 \tabularnewline
103 & 8.3 & 10.3101 & -2.01011 \tabularnewline
104 & 9.6 & 10.302 & -0.70198 \tabularnewline
105 & 14.2 & 10.2938 & 3.90616 \tabularnewline
106 & 8.5 & 10.2857 & -1.78571 \tabularnewline
107 & 13.5 & 10.2776 & 3.22243 \tabularnewline
108 & 4.9 & 10.2694 & -5.36944 \tabularnewline
109 & 6.4 & 10.2613 & -3.8613 \tabularnewline
110 & 9.6 & 10.2532 & -0.65317 \tabularnewline
111 & 11.6 & 10.245 & 1.35497 \tabularnewline
112 & 11.1 & 10.2369 & 0.863101 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270134&T=4

Source: https://freestatistics.org/blog/index.php?pk=270134&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270134&T=4

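Each Interpolation (Forecast) value in the table above is the estimated regression equation evaluated at the corresponding time index, and each Residual (Prediction Error) is the actual value minus that fitted value. Row 1 as a worked example in R (the coefficients shown are rounded to six significant digits, so the last digits can differ slightly):

b0 <- 11.148; b1 <- -0.00813502   #estimated intercept and trend coefficient
fit1 <- b0 + b1 * 1               #interpolation at t = 1, approximately 11.1399
12.9 - fit1                       #residual at t = 1, approximately 1.7601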








\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
5 & 0.292419 & 0.584838 & 0.707581 \tabularnewline
6 & 0.765836 & 0.468328 & 0.234164 \tabularnewline
7 & 0.895185 & 0.20963 & 0.104815 \tabularnewline
8 & 0.851191 & 0.297618 & 0.148809 \tabularnewline
9 & 0.786262 & 0.427476 & 0.213738 \tabularnewline
10 & 0.803474 & 0.393052 & 0.196526 \tabularnewline
11 & 0.732069 & 0.535863 & 0.267931 \tabularnewline
12 & 0.804342 & 0.391317 & 0.195658 \tabularnewline
13 & 0.748745 & 0.502511 & 0.251255 \tabularnewline
14 & 0.723521 & 0.552959 & 0.276479 \tabularnewline
15 & 0.774804 & 0.450392 & 0.225196 \tabularnewline
16 & 0.748348 & 0.503304 & 0.251652 \tabularnewline
17 & 0.728952 & 0.542096 & 0.271048 \tabularnewline
18 & 0.669528 & 0.660944 & 0.330472 \tabularnewline
19 & 0.604183 & 0.791634 & 0.395817 \tabularnewline
20 & 0.537721 & 0.924558 & 0.462279 \tabularnewline
21 & 0.592445 & 0.81511 & 0.407555 \tabularnewline
22 & 0.714564 & 0.570872 & 0.285436 \tabularnewline
23 & 0.665202 & 0.669596 & 0.334798 \tabularnewline
24 & 0.721084 & 0.557832 & 0.278916 \tabularnewline
25 & 0.675013 & 0.649974 & 0.324987 \tabularnewline
26 & 0.615354 & 0.769293 & 0.384646 \tabularnewline
27 & 0.781297 & 0.437406 & 0.218703 \tabularnewline
28 & 0.763305 & 0.473391 & 0.236695 \tabularnewline
29 & 0.730798 & 0.538404 & 0.269202 \tabularnewline
30 & 0.677369 & 0.645262 & 0.322631 \tabularnewline
31 & 0.696471 & 0.607058 & 0.303529 \tabularnewline
32 & 0.643642 & 0.712716 & 0.356358 \tabularnewline
33 & 0.626671 & 0.746657 & 0.373329 \tabularnewline
34 & 0.584765 & 0.830469 & 0.415235 \tabularnewline
35 & 0.526582 & 0.946836 & 0.473418 \tabularnewline
36 & 0.472147 & 0.944294 & 0.527853 \tabularnewline
37 & 0.417409 & 0.834818 & 0.582591 \tabularnewline
38 & 0.388378 & 0.776756 & 0.611622 \tabularnewline
39 & 0.359803 & 0.719606 & 0.640197 \tabularnewline
40 & 0.313184 & 0.626367 & 0.686816 \tabularnewline
41 & 0.487108 & 0.974216 & 0.512892 \tabularnewline
42 & 0.43634 & 0.872681 & 0.56366 \tabularnewline
43 & 0.421419 & 0.842839 & 0.578581 \tabularnewline
44 & 0.368817 & 0.737634 & 0.631183 \tabularnewline
45 & 0.331439 & 0.662879 & 0.668561 \tabularnewline
46 & 0.283215 & 0.566429 & 0.716785 \tabularnewline
47 & 0.241723 & 0.483445 & 0.758277 \tabularnewline
48 & 0.272817 & 0.545634 & 0.727183 \tabularnewline
49 & 0.248268 & 0.496536 & 0.751732 \tabularnewline
50 & 0.215981 & 0.431962 & 0.784019 \tabularnewline
51 & 0.179025 & 0.358051 & 0.820975 \tabularnewline
52 & 0.271927 & 0.543855 & 0.728073 \tabularnewline
53 & 0.230079 & 0.460157 & 0.769921 \tabularnewline
54 & 0.197388 & 0.394775 & 0.802612 \tabularnewline
55 & 0.188299 & 0.376598 & 0.811701 \tabularnewline
56 & 0.160323 & 0.320646 & 0.839677 \tabularnewline
57 & 0.328962 & 0.657925 & 0.671038 \tabularnewline
58 & 0.358797 & 0.717594 & 0.641203 \tabularnewline
59 & 0.400758 & 0.801515 & 0.599242 \tabularnewline
60 & 0.385389 & 0.770777 & 0.614611 \tabularnewline
61 & 0.37072 & 0.74144 & 0.62928 \tabularnewline
62 & 0.333399 & 0.666798 & 0.666601 \tabularnewline
63 & 0.40774 & 0.81548 & 0.59226 \tabularnewline
64 & 0.537494 & 0.925013 & 0.462506 \tabularnewline
65 & 0.529256 & 0.941487 & 0.470744 \tabularnewline
66 & 0.529864 & 0.940272 & 0.470136 \tabularnewline
67 & 0.479944 & 0.959887 & 0.520056 \tabularnewline
68 & 0.44712 & 0.894241 & 0.55288 \tabularnewline
69 & 0.477018 & 0.954036 & 0.522982 \tabularnewline
70 & 0.427295 & 0.854591 & 0.572705 \tabularnewline
71 & 0.38056 & 0.761119 & 0.61944 \tabularnewline
72 & 0.32766 & 0.655319 & 0.67234 \tabularnewline
73 & 0.329515 & 0.659029 & 0.670485 \tabularnewline
74 & 0.371084 & 0.742169 & 0.628916 \tabularnewline
75 & 0.336047 & 0.672095 & 0.663953 \tabularnewline
76 & 0.318217 & 0.636434 & 0.681783 \tabularnewline
77 & 0.294285 & 0.588569 & 0.705715 \tabularnewline
78 & 0.300791 & 0.601582 & 0.699209 \tabularnewline
79 & 0.258456 & 0.516911 & 0.741544 \tabularnewline
80 & 0.27753 & 0.555059 & 0.72247 \tabularnewline
81 & 0.231015 & 0.46203 & 0.768985 \tabularnewline
82 & 0.252906 & 0.505811 & 0.747094 \tabularnewline
83 & 0.213794 & 0.427589 & 0.786206 \tabularnewline
84 & 0.494589 & 0.989179 & 0.505411 \tabularnewline
85 & 0.432065 & 0.864129 & 0.567935 \tabularnewline
86 & 0.387784 & 0.775568 & 0.612216 \tabularnewline
87 & 0.335911 & 0.671823 & 0.664089 \tabularnewline
88 & 0.294599 & 0.589199 & 0.705401 \tabularnewline
89 & 0.259823 & 0.519645 & 0.740177 \tabularnewline
90 & 0.303069 & 0.606138 & 0.696931 \tabularnewline
91 & 0.351481 & 0.702962 & 0.648519 \tabularnewline
92 & 0.448942 & 0.897883 & 0.551058 \tabularnewline
93 & 0.381715 & 0.76343 & 0.618285 \tabularnewline
94 & 0.322061 & 0.644121 & 0.677939 \tabularnewline
95 & 0.302989 & 0.605978 & 0.697011 \tabularnewline
96 & 0.249538 & 0.499077 & 0.750462 \tabularnewline
97 & 0.284347 & 0.568693 & 0.715653 \tabularnewline
98 & 0.221446 & 0.442892 & 0.778554 \tabularnewline
99 & 0.269093 & 0.538187 & 0.730907 \tabularnewline
100 & 0.240418 & 0.480835 & 0.759582 \tabularnewline
101 & 0.176294 & 0.352588 & 0.823706 \tabularnewline
102 & 0.121286 & 0.242572 & 0.878714 \tabularnewline
103 & 0.0936568 & 0.187314 & 0.906343 \tabularnewline
104 & 0.0595205 & 0.119041 & 0.940479 \tabularnewline
105 & 0.127987 & 0.255974 & 0.872013 \tabularnewline
106 & 0.0735011 & 0.147002 & 0.926499 \tabularnewline
107 & 0.889098 & 0.221804 & 0.110902 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270134&T=5

Source: https://freestatistics.org/blog/index.php?pk=270134&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270134&T=5

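Each row of the Goldfeld-Quandt table reports, for one candidate breakpoint, the p-values of the test against the three alternative hypotheses. A single entry can be reproduced with the lmtest package (the same function the module's R code calls below), assuming fit is the linear-trend lm() object from the earlier sketch:

library(lmtest)
gqtest(fit, point = 50, alternative = 'two.sided')$p.value   #should be close to 0.431962, the 2-sided value at breakpoint 50
gqtest(fit, point = 50, alternative = 'greater')$p.value     #should be close to 0.215981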







Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     0                     0                     OK
5% type I error level     0                     0                     OK
10% type I error level    0                     0                     OK


Source: https://freestatistics.org/blog/index.php?pk=270134&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270134&T=6

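The meta analysis counts how many of the 103 two-sided Goldfeld-Quandt p-values fall below each type I error level and labels the series OK when that share stays below the level (the rule used in the R code below). A sketch, assuming gq2 is a numeric vector holding the two-sided p-values listed above:

for (alpha in c(0.01, 0.05, 0.10)) {
  nsig <- sum(gq2 < alpha)   #number of significant tests at this level
  cat(alpha, nsig, nsig / length(gq2), ifelse(nsig / length(gq2) < alpha, 'OK', 'NOK'), '\n')
}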



Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
par3 <- 'Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '1'
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
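#y holds the submitted data (one series per row) and is made available by the computational engine before this script runs; the transpose puts each series in a column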
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
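#optional preprocessing selected through par2/par3: first differences, seasonal (monthly or quarterly) dummies, and/or a linear trend column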
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
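#Goldfeld-Quandt test at every admissible breakpoint; count how often the two-sided p-value is significant at the 1%, 5% and 10% levels for the meta analysis table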
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
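#diagnostic plots: actuals with interpolation, residuals, residual histogram, density plot, normal Q-Q plot, lag plot, ACF, PACF, and the standard lm diagnostics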
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
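#the 'createtable' workspace provides the table.start, table.row.start, table.element, table.row.end, table.end and table.save helpers used to build the output tables below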
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}