Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Tue, 09 Dec 2014 11:47:48 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/09/t1418125680u3lbujk6utksc8o.htm/, Retrieved Tue, 14 May 2024 23:24:13 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=264482, Retrieved Tue, 14 May 2024 23:24:13 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 95
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [] [2013-11-04 07:24:05] [0307e7a6407eb638caabc417e3a6b260]
- RM D    [Multiple Regression] [] [2014-12-09 11:47:48] [dab7ed139043e35d640785ec44e1a81a] [Current]
Dataseries X (column 1 = Accidents, column 2 = Belt):
1	1
0	0
0	1
1	1
1	0
1	0
1	1
1	1
1	1
1	1
1	0
1	1
1	0
0	1
1	1
1	0
1	1
1	1
1	0
0	0
1	0
0	0
1	1
1	1
0	1
1	1
1	1
1	1
0	1
1	0
1	1
1	0
1	1
0	0
1	0
0	0
1	1
0	1
1	1
1	1
0	0
1	1
0	0
0	1
1	1
1	1
1	1
1	1
0	1
0	0
0	1
0	1
1	1
0	0
1	1
1	1
1	1
1	1
0	0
1	1
0	0
0	1
1	0
1	1
1	0
0	1
1	0
0	0
0	1
0	1
1	1
0	1
1	1
0	1
0	0
0	1
0	0
1	1
0	1
1	1
1	1
1	1
0	0
1	0
1	1
0	1
1	1
1	0
1	1
0	1
0	1
1	1
0	0
0	1
0	1
1	0
0	0
1	0
0	0
1	0
0	0
0	1
0	1
0	0
0	1
0	1
0	0
1	0
0	1
0	0
0	1
0	0
1	0
0	0
1	1
0	1
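
A minimal R sketch of how this two-column series can be loaded and summarised before fitting; the file name dataseries.txt is only illustrative, and the column names Accidents and Belt are taken from the regression output below.

dat <- read.table('dataseries.txt', header = FALSE, col.names = c('Accidents', 'Belt'))
nrow(dat)                              # 116 observations
table(dat$Belt)                        # group sizes for Belt = 0 and Belt = 1
tapply(dat$Accidents, dat$Belt, mean)  # group means 0.454545 and 0.597222, matching the fitted values below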




Summary of computational transaction

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 5 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ jenkins.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=264482&T=0

Multiple Linear Regression - Estimated Regression Equation

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
Accidents[t] = 0.454545 + 0.142677 Belt[t] + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=264482&T=1
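
Because Belt is a 0/1 regressor, the estimated equation reduces to the two group means of Accidents: the intercept 0.454545 is the mean of Accidents when Belt = 0, and 0.454545 + 0.142677 = 0.597222 is the mean when Belt = 1. A minimal base R sketch that reproduces the fit, assuming the data frame dat from the sketch above (the archived module code at the end of this page performs the equivalent lm() call):

fit <- lm(Accidents ~ Belt, data = dat)   # ordinary least squares
coef(fit)                                 # (Intercept) 0.454545, Belt 0.142677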

Multiple Linear Regression - Ordinary Least Squares

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 0.454545 & 0.075018 & 6.059 & 1.80794e-08 & 9.03971e-09 \tabularnewline
Belt & 0.142677 & 0.09522 & 1.498 & 0.136796 & 0.0683979 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=264482&T=2
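
In this table the T-STAT is Parameter divided by S.D., and the p-values come from a t distribution with n - k = 116 - 2 = 114 degrees of freedom; the 1-tail p-value is simply half the 2-tail value. A quick check of the Belt row in base R:

t_belt  <- 0.142677 / 0.09522             # ≈ 1.498 (T-STAT)
p_2tail <- 2 * pt(-abs(t_belt), df = 114) # ≈ 0.136796 (2-tail p-value)
p_1tail <- p_2tail / 2                    # ≈ 0.0683979 (1-tail p-value)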

Multiple Linear Regression - Regression Statistics

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.138975 \tabularnewline
R-squared & 0.0193141 \tabularnewline
Adjusted R-squared & 0.0107116 \tabularnewline
F-TEST (value) & 2.24518 \tabularnewline
F-TEST (DF numerator) & 1 \tabularnewline
F-TEST (DF denominator) & 114 \tabularnewline
p-value & 0.136796 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.497613 \tabularnewline
Sum Squared Residuals & 28.2285 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=264482&T=3
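
With a single regressor these statistics are linked: Multiple R is the absolute correlation between Accidents and Belt, R-squared is its square, the F statistic equals the squared T-STAT of Belt, and the F-test p-value equals the slope's 2-tail p-value. A few arithmetic checks in base R using the figures printed above:

0.138975^2                          # ≈ 0.0193141 (R-squared)
1 - (1 - 0.0193141) * 115 / 114     # ≈ 0.0107116 (Adjusted R-squared)
1 - pf(2.24518, df1 = 1, df2 = 114) # ≈ 0.136796  (F-test p-value)
sqrt(28.2285 / 114)                 # ≈ 0.497613  (Residual Standard Deviation)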

Multiple Linear Regression - Actuals, Interpolation, and Residuals

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 1 & 0.597222 & 0.402778 \tabularnewline
2 & 0 & 0.454545 & -0.454545 \tabularnewline
3 & 0 & 0.597222 & -0.597222 \tabularnewline
4 & 1 & 0.597222 & 0.402778 \tabularnewline
5 & 1 & 0.454545 & 0.545455 \tabularnewline
6 & 1 & 0.454545 & 0.545455 \tabularnewline
7 & 1 & 0.597222 & 0.402778 \tabularnewline
8 & 1 & 0.597222 & 0.402778 \tabularnewline
9 & 1 & 0.597222 & 0.402778 \tabularnewline
10 & 1 & 0.597222 & 0.402778 \tabularnewline
11 & 1 & 0.454545 & 0.545455 \tabularnewline
12 & 1 & 0.597222 & 0.402778 \tabularnewline
13 & 1 & 0.454545 & 0.545455 \tabularnewline
14 & 0 & 0.597222 & -0.597222 \tabularnewline
15 & 1 & 0.597222 & 0.402778 \tabularnewline
16 & 1 & 0.454545 & 0.545455 \tabularnewline
17 & 1 & 0.597222 & 0.402778 \tabularnewline
18 & 1 & 0.597222 & 0.402778 \tabularnewline
19 & 1 & 0.454545 & 0.545455 \tabularnewline
20 & 0 & 0.454545 & -0.454545 \tabularnewline
21 & 1 & 0.454545 & 0.545455 \tabularnewline
22 & 0 & 0.454545 & -0.454545 \tabularnewline
23 & 1 & 0.597222 & 0.402778 \tabularnewline
24 & 1 & 0.597222 & 0.402778 \tabularnewline
25 & 0 & 0.597222 & -0.597222 \tabularnewline
26 & 1 & 0.597222 & 0.402778 \tabularnewline
27 & 1 & 0.597222 & 0.402778 \tabularnewline
28 & 1 & 0.597222 & 0.402778 \tabularnewline
29 & 0 & 0.597222 & -0.597222 \tabularnewline
30 & 1 & 0.454545 & 0.545455 \tabularnewline
31 & 1 & 0.597222 & 0.402778 \tabularnewline
32 & 1 & 0.454545 & 0.545455 \tabularnewline
33 & 1 & 0.597222 & 0.402778 \tabularnewline
34 & 0 & 0.454545 & -0.454545 \tabularnewline
35 & 1 & 0.454545 & 0.545455 \tabularnewline
36 & 0 & 0.454545 & -0.454545 \tabularnewline
37 & 1 & 0.597222 & 0.402778 \tabularnewline
38 & 0 & 0.597222 & -0.597222 \tabularnewline
39 & 1 & 0.597222 & 0.402778 \tabularnewline
40 & 1 & 0.597222 & 0.402778 \tabularnewline
41 & 0 & 0.454545 & -0.454545 \tabularnewline
42 & 1 & 0.597222 & 0.402778 \tabularnewline
43 & 0 & 0.454545 & -0.454545 \tabularnewline
44 & 0 & 0.597222 & -0.597222 \tabularnewline
45 & 1 & 0.597222 & 0.402778 \tabularnewline
46 & 1 & 0.597222 & 0.402778 \tabularnewline
47 & 1 & 0.597222 & 0.402778 \tabularnewline
48 & 1 & 0.597222 & 0.402778 \tabularnewline
49 & 0 & 0.597222 & -0.597222 \tabularnewline
50 & 0 & 0.454545 & -0.454545 \tabularnewline
51 & 0 & 0.597222 & -0.597222 \tabularnewline
52 & 0 & 0.597222 & -0.597222 \tabularnewline
53 & 1 & 0.597222 & 0.402778 \tabularnewline
54 & 0 & 0.454545 & -0.454545 \tabularnewline
55 & 1 & 0.597222 & 0.402778 \tabularnewline
56 & 1 & 0.597222 & 0.402778 \tabularnewline
57 & 1 & 0.597222 & 0.402778 \tabularnewline
58 & 1 & 0.597222 & 0.402778 \tabularnewline
59 & 0 & 0.454545 & -0.454545 \tabularnewline
60 & 1 & 0.597222 & 0.402778 \tabularnewline
61 & 0 & 0.454545 & -0.454545 \tabularnewline
62 & 0 & 0.597222 & -0.597222 \tabularnewline
63 & 1 & 0.454545 & 0.545455 \tabularnewline
64 & 1 & 0.597222 & 0.402778 \tabularnewline
65 & 1 & 0.454545 & 0.545455 \tabularnewline
66 & 0 & 0.597222 & -0.597222 \tabularnewline
67 & 1 & 0.454545 & 0.545455 \tabularnewline
68 & 0 & 0.454545 & -0.454545 \tabularnewline
69 & 0 & 0.597222 & -0.597222 \tabularnewline
70 & 0 & 0.597222 & -0.597222 \tabularnewline
71 & 1 & 0.597222 & 0.402778 \tabularnewline
72 & 0 & 0.597222 & -0.597222 \tabularnewline
73 & 1 & 0.597222 & 0.402778 \tabularnewline
74 & 0 & 0.597222 & -0.597222 \tabularnewline
75 & 0 & 0.454545 & -0.454545 \tabularnewline
76 & 0 & 0.597222 & -0.597222 \tabularnewline
77 & 0 & 0.454545 & -0.454545 \tabularnewline
78 & 1 & 0.597222 & 0.402778 \tabularnewline
79 & 0 & 0.597222 & -0.597222 \tabularnewline
80 & 1 & 0.597222 & 0.402778 \tabularnewline
81 & 1 & 0.597222 & 0.402778 \tabularnewline
82 & 1 & 0.597222 & 0.402778 \tabularnewline
83 & 0 & 0.454545 & -0.454545 \tabularnewline
84 & 1 & 0.454545 & 0.545455 \tabularnewline
85 & 1 & 0.597222 & 0.402778 \tabularnewline
86 & 0 & 0.597222 & -0.597222 \tabularnewline
87 & 1 & 0.597222 & 0.402778 \tabularnewline
88 & 1 & 0.454545 & 0.545455 \tabularnewline
89 & 1 & 0.597222 & 0.402778 \tabularnewline
90 & 0 & 0.597222 & -0.597222 \tabularnewline
91 & 0 & 0.597222 & -0.597222 \tabularnewline
92 & 1 & 0.597222 & 0.402778 \tabularnewline
93 & 0 & 0.454545 & -0.454545 \tabularnewline
94 & 0 & 0.597222 & -0.597222 \tabularnewline
95 & 0 & 0.597222 & -0.597222 \tabularnewline
96 & 1 & 0.454545 & 0.545455 \tabularnewline
97 & 0 & 0.454545 & -0.454545 \tabularnewline
98 & 1 & 0.454545 & 0.545455 \tabularnewline
99 & 0 & 0.454545 & -0.454545 \tabularnewline
100 & 1 & 0.454545 & 0.545455 \tabularnewline
101 & 0 & 0.454545 & -0.454545 \tabularnewline
102 & 0 & 0.597222 & -0.597222 \tabularnewline
103 & 0 & 0.597222 & -0.597222 \tabularnewline
104 & 0 & 0.454545 & -0.454545 \tabularnewline
105 & 0 & 0.597222 & -0.597222 \tabularnewline
106 & 0 & 0.597222 & -0.597222 \tabularnewline
107 & 0 & 0.454545 & -0.454545 \tabularnewline
108 & 1 & 0.454545 & 0.545455 \tabularnewline
109 & 0 & 0.597222 & -0.597222 \tabularnewline
110 & 0 & 0.454545 & -0.454545 \tabularnewline
111 & 0 & 0.597222 & -0.597222 \tabularnewline
112 & 0 & 0.454545 & -0.454545 \tabularnewline
113 & 1 & 0.454545 & 0.545455 \tabularnewline
114 & 0 & 0.454545 & -0.454545 \tabularnewline
115 & 1 & 0.597222 & 0.402778 \tabularnewline
116 & 0 & 0.597222 & -0.597222 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=264482&T=4

Goldfeld-Quandt test for Heteroskedasticity

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
breakpoint index & \multicolumn{3}{c}{p-values by alternative hypothesis} \tabularnewline
 & greater & 2-sided & less \tabularnewline
5 & 0.798578 & 0.402843 & 0.201422 \tabularnewline
6 & 0.742534 & 0.514932 & 0.257466 \tabularnewline
7 & 0.656706 & 0.686589 & 0.343294 \tabularnewline
8 & 0.558529 & 0.882941 & 0.441471 \tabularnewline
9 & 0.457804 & 0.915608 & 0.542196 \tabularnewline
10 & 0.362159 & 0.724318 & 0.637841 \tabularnewline
11 & 0.306222 & 0.612445 & 0.693778 \tabularnewline
12 & 0.231831 & 0.463662 & 0.768169 \tabularnewline
13 & 0.184044 & 0.368088 & 0.815956 \tabularnewline
14 & 0.339867 & 0.679735 & 0.660133 \tabularnewline
15 & 0.28082 & 0.561641 & 0.71918 \tabularnewline
16 & 0.23092 & 0.461839 & 0.76908 \tabularnewline
17 & 0.184837 & 0.369674 & 0.815163 \tabularnewline
18 & 0.145073 & 0.290146 & 0.854927 \tabularnewline
19 & 0.114718 & 0.229435 & 0.885282 \tabularnewline
20 & 0.203446 & 0.406891 & 0.796554 \tabularnewline
21 & 0.175266 & 0.350531 & 0.824734 \tabularnewline
22 & 0.243986 & 0.487973 & 0.756014 \tabularnewline
23 & 0.202733 & 0.405466 & 0.797267 \tabularnewline
24 & 0.166338 & 0.332676 & 0.833662 \tabularnewline
25 & 0.25993 & 0.51986 & 0.74007 \tabularnewline
26 & 0.222859 & 0.445718 & 0.777141 \tabularnewline
27 & 0.189261 & 0.378522 & 0.810739 \tabularnewline
28 & 0.159318 & 0.318635 & 0.840682 \tabularnewline
29 & 0.235127 & 0.470254 & 0.764873 \tabularnewline
30 & 0.218242 & 0.436484 & 0.781758 \tabularnewline
31 & 0.189749 & 0.379498 & 0.810251 \tabularnewline
32 & 0.175452 & 0.350904 & 0.824548 \tabularnewline
33 & 0.151617 & 0.303234 & 0.848383 \tabularnewline
34 & 0.190728 & 0.381456 & 0.809272 \tabularnewline
35 & 0.181678 & 0.363356 & 0.818322 \tabularnewline
36 & 0.213923 & 0.427845 & 0.786077 \tabularnewline
37 & 0.188965 & 0.377931 & 0.811035 \tabularnewline
38 & 0.250455 & 0.500911 & 0.749545 \tabularnewline
39 & 0.225693 & 0.451387 & 0.774307 \tabularnewline
40 & 0.202941 & 0.405883 & 0.797059 \tabularnewline
41 & 0.223179 & 0.446359 & 0.776821 \tabularnewline
42 & 0.201601 & 0.403203 & 0.798399 \tabularnewline
43 & 0.214493 & 0.428986 & 0.785507 \tabularnewline
44 & 0.266499 & 0.532997 & 0.733501 \tabularnewline
45 & 0.245205 & 0.49041 & 0.754795 \tabularnewline
46 & 0.225739 & 0.451478 & 0.774261 \tabularnewline
47 & 0.208151 & 0.416301 & 0.791849 \tabularnewline
48 & 0.192463 & 0.384926 & 0.807537 \tabularnewline
49 & 0.2367 & 0.4734 & 0.7633 \tabularnewline
50 & 0.241978 & 0.483957 & 0.758022 \tabularnewline
51 & 0.282006 & 0.564012 & 0.717994 \tabularnewline
52 & 0.318458 & 0.636916 & 0.681542 \tabularnewline
53 & 0.302463 & 0.604926 & 0.697537 \tabularnewline
54 & 0.301638 & 0.603277 & 0.698362 \tabularnewline
55 & 0.287679 & 0.575358 & 0.712321 \tabularnewline
56 & 0.275686 & 0.551372 & 0.724314 \tabularnewline
57 & 0.265835 & 0.531669 & 0.734165 \tabularnewline
58 & 0.25833 & 0.51666 & 0.74167 \tabularnewline
59 & 0.256159 & 0.512317 & 0.743841 \tabularnewline
60 & 0.251604 & 0.503208 & 0.748396 \tabularnewline
61 & 0.248291 & 0.496581 & 0.751709 \tabularnewline
62 & 0.274664 & 0.549328 & 0.725336 \tabularnewline
63 & 0.287267 & 0.574535 & 0.712733 \tabularnewline
64 & 0.286311 & 0.572623 & 0.713689 \tabularnewline
65 & 0.301528 & 0.603056 & 0.698472 \tabularnewline
66 & 0.323532 & 0.647063 & 0.676468 \tabularnewline
67 & 0.341873 & 0.683746 & 0.658127 \tabularnewline
68 & 0.334044 & 0.668088 & 0.665956 \tabularnewline
69 & 0.351663 & 0.703326 & 0.648337 \tabularnewline
70 & 0.366842 & 0.733684 & 0.633158 \tabularnewline
71 & 0.368831 & 0.737663 & 0.631169 \tabularnewline
72 & 0.380491 & 0.760983 & 0.619509 \tabularnewline
73 & 0.385503 & 0.771006 & 0.614497 \tabularnewline
74 & 0.393609 & 0.787217 & 0.606391 \tabularnewline
75 & 0.381064 & 0.762127 & 0.618936 \tabularnewline
76 & 0.387067 & 0.774135 & 0.612933 \tabularnewline
77 & 0.374178 & 0.748357 & 0.625822 \tabularnewline
78 & 0.380294 & 0.760588 & 0.619706 \tabularnewline
79 & 0.382196 & 0.764392 & 0.617804 \tabularnewline
80 & 0.391844 & 0.783688 & 0.608156 \tabularnewline
81 & 0.409642 & 0.819284 & 0.590358 \tabularnewline
82 & 0.438157 & 0.876314 & 0.561843 \tabularnewline
83 & 0.426771 & 0.853542 & 0.573229 \tabularnewline
84 & 0.44719 & 0.894381 & 0.55281 \tabularnewline
85 & 0.493461 & 0.986922 & 0.506539 \tabularnewline
86 & 0.476615 & 0.953231 & 0.523385 \tabularnewline
87 & 0.537743 & 0.924514 & 0.462257 \tabularnewline
88 & 0.573469 & 0.853063 & 0.426531 \tabularnewline
89 & 0.664493 & 0.671015 & 0.335507 \tabularnewline
90 & 0.634838 & 0.730324 & 0.365162 \tabularnewline
91 & 0.602778 & 0.794445 & 0.397222 \tabularnewline
92 & 0.718281 & 0.563437 & 0.281719 \tabularnewline
93 & 0.70155 & 0.5969 & 0.29845 \tabularnewline
94 & 0.661138 & 0.677723 & 0.338862 \tabularnewline
95 & 0.616954 & 0.766092 & 0.383046 \tabularnewline
96 & 0.667744 & 0.664511 & 0.332256 \tabularnewline
97 & 0.640795 & 0.718411 & 0.359205 \tabularnewline
98 & 0.706049 & 0.587903 & 0.293951 \tabularnewline
99 & 0.67272 & 0.65456 & 0.32728 \tabularnewline
100 & 0.760269 & 0.479462 & 0.239731 \tabularnewline
101 & 0.719005 & 0.56199 & 0.280995 \tabularnewline
102 & 0.659395 & 0.681209 & 0.340605 \tabularnewline
103 & 0.5939 & 0.812201 & 0.4061 \tabularnewline
104 & 0.542454 & 0.915093 & 0.457546 \tabularnewline
105 & 0.468449 & 0.936899 & 0.531551 \tabularnewline
106 & 0.396386 & 0.792771 & 0.603614 \tabularnewline
107 & 0.346501 & 0.693002 & 0.653499 \tabularnewline
108 & 0.417746 & 0.835493 & 0.582254 \tabularnewline
109 & 0.339809 & 0.679618 & 0.660191 \tabularnewline
110 & 0.255558 & 0.511116 & 0.744442 \tabularnewline
111 & 0.200748 & 0.401497 & 0.799252 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=264482&T=5
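
Each row above is a separate Goldfeld-Quandt test: the sample is split at the given breakpoint, the model is refitted on both sub-samples, and the ratio of their residual variances is compared against an F distribution under the stated alternative. The archived module (see the R code at the end of this page) obtains these p-values by looping gqtest() from the lmtest package over all admissible breakpoints; a single-breakpoint illustration, assuming the fitted object fit from the earlier sketch:

library(lmtest)
gqtest(fit, point = 5, alternative = 'greater')$p.value    # ≈ 0.798578
gqtest(fit, point = 5, alternative = 'two.sided')$p.value  # ≈ 0.402843
gqtest(fit, point = 5, alternative = 'less')$p.value       # ≈ 0.201422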

Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & \# significant tests & \% significant tests & OK/NOK \tabularnewline
1\% type I error level & 0 & 0 & OK \tabularnewline
5\% type I error level & 0 & 0 & OK \tabularnewline
10\% type I error level & 0 & 0 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=264482&T=6
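
The meta analysis simply counts, over the 107 breakpoints in the previous table, how many 2-sided p-values fall below each type I error level, and reports NOK when that fraction is not below the level itself; here no test is significant, so there is no evidence of heteroskedasticity at any of the three levels. A sketch of the counting step, assuming gq2 is the vector of 2-sided p-values from the table above:

c('1%' = mean(gq2 < 0.01), '5%' = mean(gq2 < 0.05), '10%' = mean(gq2 < 0.10))  # all three are 0 here, hence OK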

Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
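# Optional transformations controlled by the parameters: first differences (par3),
# seasonal dummies (par2) and a linear trend (par3); none of these apply in this run.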
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:n-1) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
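# Goldfeld-Quandt test at every admissible breakpoint (only computed when n > 25);
# the 2-sided p-values are later counted for the meta analysis table.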
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
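# Diagnostic plots: actuals and interpolation, residuals, histogram, density plot,
# normal Q-Q plot, lag plot, ACF/PACF, standard lm diagnostics and GQ p-values.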
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
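# Build the output tables shown above: estimated equation, OLS estimates, regression
# statistics, actuals/interpolation/residuals, and the Goldfeld-Quandt results.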
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}