Author's title:
Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sat, 06 Dec 2014 13:01:32 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/06/t1417871334hbsyglwxt58a6mv.htm/, Retrieved Thu, 16 May 2024 23:38:57 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=263606, Retrieved Thu, 16 May 2024 23:38:57 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 115
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [paper mr conf] [2014-12-06 13:01:32] [ec1b40d1a9751af99658fe8fca4f9eca] [Current]
Dataseries X (three columns per row: gendercode, CONFSTATTOT, CONFSOFTTOT, as used in the regression below):
1 13 12
0 8 8
1 14 11
0 16 13
0 14 11
0 13 10
1 15 7
0 13 10
0 20 15
0 17 12
0 15 12
0 16 10
0 12 10
1 17 14
1 11 6
1 16 12
0 16 14
1 15 11
0 13 8
1 14 12
0 19 15
0 16 13
1 17 11
0 10 12
0 15 7
0 14 11
0 14 7
1 16 12
0 15 12
1 17 13
1 14 9
1 16 11
0 15 12
0 16 15
0 16 12
0 10 6
1 8 5
0 17 13
0 14 11
1 10 6
0 14 12
0 12 10
0 16 6
0 16 12
0 16 11
1 8 6
0 16 12
0 15 12
1 8 8
0 13 10
0 14 11
0 13 7
0 16 12
0 19 13
0 19 14
0 14 12
1 15 6
0 13 14
1 10 10
0 16 12
0 15 11
1 11 10
1 9 7
0 16 12
1 12 7
0 12 12
1 14 12
0 14 10
0 13 10
1 15 12
1 17 12
1 14 12
1 11 8
0 9 10
1 7 5
0 13 10
1 15 10
0 12 12
1 15 11
1 14 9
0 16 12
1 14 11
1 13 10
1 16 12
0 13 10
1 16 9
0 16 11
0 16 12
1 10 7
1 12 11
1 12 12
0 12 6
0 12 9
0 19 15
1 14 10
0 13 11
1 16 12
0 15 12
0 12 12
0 8 11
0 10 9
0 16 11
1 16 12
0 10 12
0 18 14
0 12 8
1 16 10
1 10 9
1 14 10
1 12 9
1 11 10
1 15 12
0 7 11
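
To rerun the analysis locally, the series above can be read into R as follows. This is a minimal sketch: the file name mydata.txt and the object name mydata are placeholders, and the column names are taken from the regression output further down the page.
# Read the whitespace-separated series shown above (saved to a local text file) into a data frame.
# Column order matches the output below: the dependent variable comes first.
mydata <- read.table('mydata.txt',
                     col.names = c('gendercode', 'CONFSTATTOT', 'CONFSOFTTOT'))
str(mydata)       # 113 observations of 3 variables
colMeans(mydata)  # quick sanity check of the three series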





\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 5 seconds \tabularnewline
R Server & 'Sir Ronald Aylmer Fisher' @ fisher.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263606&T=0


\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
gendercode[t] = 1.00291 + 0.00331044 CONFSTATTOT[t] - 0.0598271 CONFSOFTTOT[t] + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263606&T=1
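
A quick worked example of how the estimated equation above is applied: plugging the predictor values of an observation into the reported coefficients yields its interpolation (fitted value). The sketch below only restates the coefficients from the table; it does not re-estimate them.
# Coefficients copied from the estimated regression equation above
b0 <- 1.00291       # intercept
b1 <- 0.00331044    # CONFSTATTOT
b2 <- -0.0598271    # CONFSOFTTOT
# Fitted value for the first observation of the data series (CONFSTATTOT = 13, CONFSOFTTOT = 12)
b0 + b1*13 + b2*12  # about 0.328, matching row 1 of the interpolation table further below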


\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 1.00291 & 0.24622 & 4.073 & 8.77351e-05 & 4.38675e-05 \tabularnewline
CONFSTATTOT & 0.00331044 & 0.0208793 & 0.1586 & 0.874313 & 0.437157 \tabularnewline
CONFSOFTTOT & -0.0598271 & 0.0252437 & -2.37 & 0.0195307 & 0.00976535 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263606&T=2
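
The coefficient table above can be reproduced with base R. A minimal sketch, assuming a data frame mydata with the three columns named as in the data section; the object name fit is a placeholder. The 1-tail p-value column is simply half the 2-tail p-value, as in the module's R code at the end of this page.
# Ordinary least squares of gendercode on the two confidence scores
fit <- lm(gendercode ~ CONFSTATTOT + CONFSOFTTOT, data = mydata)
coefs <- summary(fit)$coefficients        # estimate, standard error, t-statistic, 2-tail p-value
cbind(coefs, '1-tail p' = coefs[, 4]/2)   # append the 1-tail p-value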


\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.267021 \tabularnewline
R-squared & 0.0713003 \tabularnewline
Adjusted R-squared & 0.0544148 \tabularnewline
F-TEST (value) & 4.22259 \tabularnewline
F-TEST (DF numerator) & 2 \tabularnewline
F-TEST (DF denominator) & 110 \tabularnewline
p-value & 0.0171057 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.481419 \tabularnewline
Sum Squared Residuals & 25.494 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263606&T=3
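
The entries in the table above are linked by standard OLS identities; the sketch below re-derives several of them directly from the reported numbers (113 observations, 2 regressors plus an intercept).
n  <- 113        # number of observations
p  <- 2          # number of regressors, excluding the intercept
R2 <- 0.0713003  # R-squared from the table
sqrt(R2)                              # Multiple R, about 0.267021
1 - (1 - R2)*(n - 1)/(n - p - 1)      # Adjusted R-squared, about 0.0544148
1 - pf(4.22259, 2, 110)               # p-value of the F-test, about 0.0171057
sqrt(25.494/(n - p - 1))              # Residual Standard Deviation from the Sum Squared Residuals, about 0.481419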


\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 1 & 0.328023 & 0.671977 \tabularnewline
2 & 0 & 0.550779 & -0.550779 \tabularnewline
3 & 1 & 0.39116 & 0.60884 \tabularnewline
4 & 0 & 0.278127 & -0.278127 \tabularnewline
5 & 0 & 0.39116 & -0.39116 \tabularnewline
6 & 0 & 0.447677 & -0.447677 \tabularnewline
7 & 1 & 0.633779 & 0.366221 \tabularnewline
8 & 0 & 0.447677 & -0.447677 \tabularnewline
9 & 0 & 0.171715 & -0.171715 \tabularnewline
10 & 0 & 0.341265 & -0.341265 \tabularnewline
11 & 0 & 0.334644 & -0.334644 \tabularnewline
12 & 0 & 0.457608 & -0.457608 \tabularnewline
13 & 0 & 0.444367 & -0.444367 \tabularnewline
14 & 1 & 0.22161 & 0.77839 \tabularnewline
15 & 1 & 0.680365 & 0.319635 \tabularnewline
16 & 1 & 0.337954 & 0.662046 \tabularnewline
17 & 0 & 0.2183 & -0.2183 \tabularnewline
18 & 1 & 0.394471 & 0.605529 \tabularnewline
19 & 0 & 0.567331 & -0.567331 \tabularnewline
20 & 1 & 0.331333 & 0.668667 \tabularnewline
21 & 0 & 0.168404 & -0.168404 \tabularnewline
22 & 0 & 0.278127 & -0.278127 \tabularnewline
23 & 1 & 0.401092 & 0.598908 \tabularnewline
24 & 0 & 0.318092 & -0.318092 \tabularnewline
25 & 0 & 0.633779 & -0.633779 \tabularnewline
26 & 0 & 0.39116 & -0.39116 \tabularnewline
27 & 0 & 0.630469 & -0.630469 \tabularnewline
28 & 1 & 0.337954 & 0.662046 \tabularnewline
29 & 0 & 0.334644 & -0.334644 \tabularnewline
30 & 1 & 0.281437 & 0.718563 \tabularnewline
31 & 1 & 0.510815 & 0.489185 \tabularnewline
32 & 1 & 0.397781 & 0.602219 \tabularnewline
33 & 0 & 0.334644 & -0.334644 \tabularnewline
34 & 0 & 0.158473 & -0.158473 \tabularnewline
35 & 0 & 0.337954 & -0.337954 \tabularnewline
36 & 0 & 0.677054 & -0.677054 \tabularnewline
37 & 1 & 0.730261 & 0.269739 \tabularnewline
38 & 0 & 0.281437 & -0.281437 \tabularnewline
39 & 0 & 0.39116 & -0.39116 \tabularnewline
40 & 1 & 0.677054 & 0.322946 \tabularnewline
41 & 0 & 0.331333 & -0.331333 \tabularnewline
42 & 0 & 0.444367 & -0.444367 \tabularnewline
43 & 0 & 0.696917 & -0.696917 \tabularnewline
44 & 0 & 0.337954 & -0.337954 \tabularnewline
45 & 0 & 0.397781 & -0.397781 \tabularnewline
46 & 1 & 0.670433 & 0.329567 \tabularnewline
47 & 0 & 0.337954 & -0.337954 \tabularnewline
48 & 0 & 0.334644 & -0.334644 \tabularnewline
49 & 1 & 0.550779 & 0.449221 \tabularnewline
50 & 0 & 0.447677 & -0.447677 \tabularnewline
51 & 0 & 0.39116 & -0.39116 \tabularnewline
52 & 0 & 0.627159 & -0.627159 \tabularnewline
53 & 0 & 0.337954 & -0.337954 \tabularnewline
54 & 0 & 0.288058 & -0.288058 \tabularnewline
55 & 0 & 0.228231 & -0.228231 \tabularnewline
56 & 0 & 0.331333 & -0.331333 \tabularnewline
57 & 1 & 0.693607 & 0.306393 \tabularnewline
58 & 0 & 0.208369 & -0.208369 \tabularnewline
59 & 1 & 0.437746 & 0.562254 \tabularnewline
60 & 0 & 0.337954 & -0.337954 \tabularnewline
61 & 0 & 0.394471 & -0.394471 \tabularnewline
62 & 1 & 0.441056 & 0.558944 \tabularnewline
63 & 1 & 0.613917 & 0.386083 \tabularnewline
64 & 0 & 0.337954 & -0.337954 \tabularnewline
65 & 1 & 0.623848 & 0.376152 \tabularnewline
66 & 0 & 0.324712 & -0.324712 \tabularnewline
67 & 1 & 0.331333 & 0.668667 \tabularnewline
68 & 0 & 0.450988 & -0.450988 \tabularnewline
69 & 0 & 0.447677 & -0.447677 \tabularnewline
70 & 1 & 0.334644 & 0.665356 \tabularnewline
71 & 1 & 0.341265 & 0.658735 \tabularnewline
72 & 1 & 0.331333 & 0.668667 \tabularnewline
73 & 1 & 0.560711 & 0.439289 \tabularnewline
74 & 0 & 0.434435 & -0.434435 \tabularnewline
75 & 1 & 0.72695 & 0.27305 \tabularnewline
76 & 0 & 0.447677 & -0.447677 \tabularnewline
77 & 1 & 0.454298 & 0.545702 \tabularnewline
78 & 0 & 0.324712 & -0.324712 \tabularnewline
79 & 1 & 0.394471 & 0.605529 \tabularnewline
80 & 1 & 0.510815 & 0.489185 \tabularnewline
81 & 0 & 0.337954 & -0.337954 \tabularnewline
82 & 1 & 0.39116 & 0.60884 \tabularnewline
83 & 1 & 0.447677 & 0.552323 \tabularnewline
84 & 1 & 0.337954 & 0.662046 \tabularnewline
85 & 0 & 0.447677 & -0.447677 \tabularnewline
86 & 1 & 0.517436 & 0.482564 \tabularnewline
87 & 0 & 0.397781 & -0.397781 \tabularnewline
88 & 0 & 0.337954 & -0.337954 \tabularnewline
89 & 1 & 0.617227 & 0.382773 \tabularnewline
90 & 1 & 0.38454 & 0.61546 \tabularnewline
91 & 1 & 0.324712 & 0.675288 \tabularnewline
92 & 0 & 0.683675 & -0.683675 \tabularnewline
93 & 0 & 0.504194 & -0.504194 \tabularnewline
94 & 0 & 0.168404 & -0.168404 \tabularnewline
95 & 1 & 0.450988 & 0.549012 \tabularnewline
96 & 0 & 0.38785 & -0.38785 \tabularnewline
97 & 1 & 0.337954 & 0.662046 \tabularnewline
98 & 0 & 0.334644 & -0.334644 \tabularnewline
99 & 0 & 0.324712 & -0.324712 \tabularnewline
100 & 0 & 0.371298 & -0.371298 \tabularnewline
101 & 0 & 0.497573 & -0.497573 \tabularnewline
102 & 0 & 0.397781 & -0.397781 \tabularnewline
103 & 1 & 0.337954 & 0.662046 \tabularnewline
104 & 0 & 0.318092 & -0.318092 \tabularnewline
105 & 0 & 0.224921 & -0.224921 \tabularnewline
106 & 0 & 0.564021 & -0.564021 \tabularnewline
107 & 1 & 0.457608 & 0.542392 \tabularnewline
108 & 1 & 0.497573 & 0.502427 \tabularnewline
109 & 1 & 0.450988 & 0.549012 \tabularnewline
110 & 1 & 0.504194 & 0.495806 \tabularnewline
111 & 1 & 0.441056 & 0.558944 \tabularnewline
112 & 1 & 0.334644 & 0.665356 \tabularnewline
113 & 0 & 0.367987 & -0.367987 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263606&T=4
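
Every row of the table above satisfies actual = interpolation + residual. With a fitted lm object the two columns can be extracted directly; the sketch below assumes the objects fit and mydata from the earlier sketches.
interpolation    <- fitted(fit)   # the 'Interpolation (Forecast)' column
prediction_error <- resid(fit)    # the 'Residuals (Prediction Error)' column
# The identity actual = interpolation + residual holds row by row
all.equal(unname(interpolation + prediction_error), mydata$gendercode)
head(cbind(actual = mydata$gendercode, interpolation, prediction_error))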


\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
6 & 0.800362 & 0.399276 & 0.199638 \tabularnewline
7 & 0.819445 & 0.361111 & 0.180555 \tabularnewline
8 & 0.774479 & 0.451041 & 0.225521 \tabularnewline
9 & 0.721979 & 0.556043 & 0.278021 \tabularnewline
10 & 0.662082 & 0.675837 & 0.337918 \tabularnewline
11 & 0.578496 & 0.843008 & 0.421504 \tabularnewline
12 & 0.540077 & 0.919847 & 0.459923 \tabularnewline
13 & 0.471895 & 0.943791 & 0.528105 \tabularnewline
14 & 0.634869 & 0.730262 & 0.365131 \tabularnewline
15 & 0.639186 & 0.721628 & 0.360814 \tabularnewline
16 & 0.696481 & 0.607038 & 0.303519 \tabularnewline
17 & 0.633694 & 0.732612 & 0.366306 \tabularnewline
18 & 0.666814 & 0.666371 & 0.333186 \tabularnewline
19 & 0.674434 & 0.651133 & 0.325566 \tabularnewline
20 & 0.729072 & 0.541856 & 0.270928 \tabularnewline
21 & 0.68655 & 0.626901 & 0.31345 \tabularnewline
22 & 0.644005 & 0.71199 & 0.355995 \tabularnewline
23 & 0.659516 & 0.680969 & 0.340484 \tabularnewline
24 & 0.602959 & 0.794081 & 0.397041 \tabularnewline
25 & 0.641887 & 0.716227 & 0.358113 \tabularnewline
26 & 0.610295 & 0.779409 & 0.389705 \tabularnewline
27 & 0.617875 & 0.76425 & 0.382125 \tabularnewline
28 & 0.664494 & 0.671012 & 0.335506 \tabularnewline
29 & 0.631866 & 0.736267 & 0.368134 \tabularnewline
30 & 0.67984 & 0.640319 & 0.32016 \tabularnewline
31 & 0.695544 & 0.608912 & 0.304456 \tabularnewline
32 & 0.715492 & 0.569016 & 0.284508 \tabularnewline
33 & 0.689719 & 0.620562 & 0.310281 \tabularnewline
34 & 0.647092 & 0.705816 & 0.352908 \tabularnewline
35 & 0.621805 & 0.756389 & 0.378195 \tabularnewline
36 & 0.634121 & 0.731758 & 0.365879 \tabularnewline
37 & 0.642524 & 0.714951 & 0.357476 \tabularnewline
38 & 0.609948 & 0.780104 & 0.390052 \tabularnewline
39 & 0.584308 & 0.831384 & 0.415692 \tabularnewline
40 & 0.572243 & 0.855515 & 0.427757 \tabularnewline
41 & 0.536979 & 0.926043 & 0.463021 \tabularnewline
42 & 0.51686 & 0.96628 & 0.48314 \tabularnewline
43 & 0.580811 & 0.838378 & 0.419189 \tabularnewline
44 & 0.550786 & 0.898428 & 0.449214 \tabularnewline
45 & 0.531185 & 0.937631 & 0.468815 \tabularnewline
46 & 0.518082 & 0.963836 & 0.481918 \tabularnewline
47 & 0.487511 & 0.975022 & 0.512489 \tabularnewline
48 & 0.456206 & 0.912411 & 0.543794 \tabularnewline
49 & 0.454264 & 0.908528 & 0.545736 \tabularnewline
50 & 0.444981 & 0.889962 & 0.555019 \tabularnewline
51 & 0.425027 & 0.850054 & 0.574973 \tabularnewline
52 & 0.470315 & 0.94063 & 0.529685 \tabularnewline
53 & 0.443154 & 0.886307 & 0.556846 \tabularnewline
54 & 0.412126 & 0.824253 & 0.587874 \tabularnewline
55 & 0.37469 & 0.749379 & 0.62531 \tabularnewline
56 & 0.349154 & 0.698309 & 0.650846 \tabularnewline
57 & 0.331323 & 0.662646 & 0.668677 \tabularnewline
58 & 0.288896 & 0.577793 & 0.711104 \tabularnewline
59 & 0.313418 & 0.626836 & 0.686582 \tabularnewline
60 & 0.296424 & 0.592848 & 0.703576 \tabularnewline
61 & 0.291069 & 0.582139 & 0.708931 \tabularnewline
62 & 0.310507 & 0.621013 & 0.689493 \tabularnewline
63 & 0.292997 & 0.585994 & 0.707003 \tabularnewline
64 & 0.28232 & 0.564641 & 0.71768 \tabularnewline
65 & 0.261754 & 0.523508 & 0.738246 \tabularnewline
66 & 0.238759 & 0.477518 & 0.761241 \tabularnewline
67 & 0.274423 & 0.548846 & 0.725577 \tabularnewline
68 & 0.283595 & 0.56719 & 0.716405 \tabularnewline
69 & 0.289505 & 0.579011 & 0.710495 \tabularnewline
70 & 0.322521 & 0.645043 & 0.677479 \tabularnewline
71 & 0.348045 & 0.69609 & 0.651955 \tabularnewline
72 & 0.388296 & 0.776592 & 0.611704 \tabularnewline
73 & 0.370067 & 0.740134 & 0.629933 \tabularnewline
74 & 0.354593 & 0.709185 & 0.645407 \tabularnewline
75 & 0.31774 & 0.63548 & 0.68226 \tabularnewline
76 & 0.32097 & 0.64194 & 0.67903 \tabularnewline
77 & 0.316714 & 0.633429 & 0.683286 \tabularnewline
78 & 0.288911 & 0.577821 & 0.711089 \tabularnewline
79 & 0.29844 & 0.59688 & 0.70156 \tabularnewline
80 & 0.285279 & 0.570559 & 0.714721 \tabularnewline
81 & 0.274734 & 0.549468 & 0.725266 \tabularnewline
82 & 0.286986 & 0.573973 & 0.713014 \tabularnewline
83 & 0.291347 & 0.582695 & 0.708653 \tabularnewline
84 & 0.313533 & 0.627067 & 0.686467 \tabularnewline
85 & 0.309214 & 0.618428 & 0.690786 \tabularnewline
86 & 0.290417 & 0.580834 & 0.709583 \tabularnewline
87 & 0.286137 & 0.572275 & 0.713863 \tabularnewline
88 & 0.275995 & 0.55199 & 0.724005 \tabularnewline
89 & 0.260978 & 0.521957 & 0.739022 \tabularnewline
90 & 0.287437 & 0.574875 & 0.712563 \tabularnewline
91 & 0.345416 & 0.690831 & 0.654584 \tabularnewline
92 & 0.465788 & 0.931575 & 0.534212 \tabularnewline
93 & 0.521003 & 0.957993 & 0.478997 \tabularnewline
94 & 0.464458 & 0.928917 & 0.535542 \tabularnewline
95 & 0.434259 & 0.868519 & 0.565741 \tabularnewline
96 & 0.419488 & 0.838977 & 0.580512 \tabularnewline
97 & 0.42828 & 0.85656 & 0.57172 \tabularnewline
98 & 0.408073 & 0.816145 & 0.591927 \tabularnewline
99 & 0.351675 & 0.703349 & 0.648325 \tabularnewline
100 & 0.284628 & 0.569256 & 0.715372 \tabularnewline
101 & 0.308908 & 0.617817 & 0.691092 \tabularnewline
102 & 0.412507 & 0.825014 & 0.587493 \tabularnewline
103 & 0.37358 & 0.747161 & 0.62642 \tabularnewline
104 & 0.296909 & 0.593817 & 0.703091 \tabularnewline
105 & 0.489713 & 0.979427 & 0.510287 \tabularnewline
106 & 0.910359 & 0.179282 & 0.0896411 \tabularnewline
107 & 0.912594 & 0.174812 & 0.0874059 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263606&T=5
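
Each row of the Goldfeld-Quandt table corresponds to one call of gqtest() from the lmtest package at a different breakpoint, exactly as in the module code at the end of this page. A minimal sketch for a single breakpoint, using the placeholder fit object from the earlier sketch:
library(lmtest)
# p-values at one candidate breakpoint under the three alternative hypotheses
point <- 60
sapply(c('greater', 'two.sided', 'less'),
       function(alt) gqtest(fit, point = point, alternative = alt)$p.value)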


\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 & 0 & OK \tabularnewline
5% type I error level & 0 & 0 & OK \tabularnewline
10% type I error level & 0 & 0 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263606&T=6
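
The meta analysis above simply counts how many of the 2-sided Goldfeld-Quandt p-values fall below each type I error level. A sketch, assuming gq2 holds the vector of 102 two-sided p-values from the table above:
count_significant <- function(gq2, levels = c(0.01, 0.05, 0.10)) {
  data.frame(level           = levels,
             n_significant   = sapply(levels, function(a) sum(gq2 < a)),
             pct_significant = 100*sapply(levels, function(a) mean(gq2 < a)))
}
# Here the smallest 2-sided p-value is about 0.175, so all counts are 0 and every level is marked OK.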


Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
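
The three parameter strings above are the inputs expected by the module code below. A minimal sketch of defining them before running that code locally; the data matrix y, which the server normally supplies with one variable per row, is constructed here from the placeholder data frame mydata of the first sketch.
par1 <- '1'                                # column number of the dependent variable (gendercode)
par2 <- 'Do not include Seasonal Dummies'  # no monthly or quarterly dummies
par3 <- 'No Linear Trend'                  # no deterministic trend term
y <- t(as.matrix(mydata))                  # the module transposes y back with x <- t(y)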
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y) # 'y' holds the user-supplied series, one variable per row; transpose so that columns are variables
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1]) # move the column selected by par1 (the dependent variable) to the front
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { # parentheses added: '1:n-1' evaluates to 0:(n-1) in R
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) { # the Goldfeld-Quandt test is only computed when there are more than 25 observations
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) { # loop over all admissible breakpoints, storing the p-value for each alternative
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
# diagnostic plots: actuals vs. interpolation, residuals, histogram, density, Q-Q plot, lag plot, ACF and PACF
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable') # provides the table.start/table.row.start/table.element/... helpers used to build the output tables
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){ # assemble the estimated regression equation as a text string, term by term
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6)) # 1-tail p-value = half the 2-tail p-value
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) { # x[i] walks down the first column of x, i.e. the actuals of the dependent variable
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}