Statistical Computations at FreeStatistics.org

Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sat, 06 Dec 2014 13:17:27 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/06/t1417871867csc9x0bcawnuyhl.htm/, Retrieved Thu, 31 Oct 2024 23:14:45 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=263616, Retrieved Thu, 31 Oct 2024 23:14:45 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 136
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Paper mr stresstest] [2014-12-06 13:17:27] [ec1b40d1a9751af99658fe8fca4f9eca] [Current]

Dataseries X (columns: gendercode, STRESSTOT):
1 13
0 13
1 11
0 14
0 15
0 14
1 11
0 13
0 16
0 14
0 14
0 15
0 15
1 13
1 14
1 11
0 12
1 14
0 13
1 12
0 15
0 15
1 14
0 14
0 12
0 12
0 12
1 15
0 14
1 16
1 12
1 12
0 14
0 16
0 15
0 12
1 14
0 13
0 14
1 16
0 12
0 14
0 15
0 13
0 16
1 16
0 12
0 12
1 16
0 12
0 15
0 12
0 13
0 12
0 14
0 14
1 11
0 10
1 12
0 11
0 16
1 14
1 14
0 15
1 15
0 14
1 13
0 11
0 16
1 12
1 15
1 14
1 15
0 14
1 13
0 6
1 12
0 12
1 14
1 14
0 15
1 11
1 13
1 14
0 16
1 13
0 14
0 16
1 11
1 13
1 13
0 15
0 12
0 13
1 12
0 14
1 14
0 16
0 15
0 14
0 13
0 14
1 15
0 14
0 12
0 7
1 12
1 15
1 12
1 13
1 11
1 14
0 13




Summary of computational transaction
Raw Input        view raw input (R code)
Raw Output       view raw output of R engine
Computing time   6 seconds
R Server         'Gwilym Jenkins' @ jenkins.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 6 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ jenkins.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263616&T=0


Multiple Linear Regression - Estimated Regression Equation
gendercode[t] = 0.637893 - 0.0165667 STRESSTOT[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
gendercode[t] = 0.637893 - 0.0165667 STRESSTOT[t] + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263616&T=1


Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter    S.D.        T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   0.637893     0.361094    1.767                        0.0800515        0.0400257
STRESSTOT     -0.0165667   0.0267245   -0.6199                      0.536591         0.268296

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 0.637893 & 0.361094 & 1.767 & 0.0800515 & 0.0400257 \tabularnewline
STRESSTOT & -0.0165667 & 0.0267245 & -0.6199 & 0.536591 & 0.268296 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263616&T=2
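The coefficient table above can be reproduced outside the module with a plain lm() call in base R. The sketch below is illustrative only: it assumes the two columns of Dataseries X are gendercode and STRESSTOT (the names used in the output above) and, for brevity, types in just the first five observations, so its printed estimates will not match the full 113-observation results.

# Minimal reproduction sketch (assumption: columns are gendercode and STRESSTOT).
df <- read.table(text = "
1 13
0 13
1 11
0 14
0 15
", col.names = c("gendercode", "STRESSTOT"))   # paste the full Dataseries X here
fit <- lm(gendercode ~ STRESSTOT, data = df)   # OLS fit, analogous to the module's lm(df)
summary(fit)                                   # coefficients, t-statistics, R-squared, F-test
sum(residuals(fit)^2)                          # sum of squared residuals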


Multiple Linear Regression - Regression Statistics
Multiple R                    0.0587371
R-squared                     0.00345005
Adjusted R-squared            -0.00552788
F-TEST (value)                0.384281
F-TEST (DF numerator)         1
F-TEST (DF denominator)       111
p-value                       0.536591

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation   0.496443
Sum Squared Residuals         27.3566

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.0587371 \tabularnewline
R-squared & 0.00345005 \tabularnewline
Adjusted R-squared & -0.00552788 \tabularnewline
F-TEST (value) & 0.384281 \tabularnewline
F-TEST (DF numerator) & 1 \tabularnewline
F-TEST (DF denominator) & 111 \tabularnewline
p-value & 0.536591 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.496443 \tabularnewline
Sum Squared Residuals & 27.3566 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263616&T=3


Multiple Linear Regression - Actuals, Interpolation, and Residuals

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 1 & 0.422527 & 0.577473 \tabularnewline
2 & 0 & 0.422527 & -0.422527 \tabularnewline
3 & 1 & 0.45566 & 0.54434 \tabularnewline
4 & 0 & 0.40596 & -0.40596 \tabularnewline
5 & 0 & 0.389393 & -0.389393 \tabularnewline
6 & 0 & 0.40596 & -0.40596 \tabularnewline
7 & 1 & 0.45566 & 0.54434 \tabularnewline
8 & 0 & 0.422527 & -0.422527 \tabularnewline
9 & 0 & 0.372827 & -0.372827 \tabularnewline
10 & 0 & 0.40596 & -0.40596 \tabularnewline
11 & 0 & 0.40596 & -0.40596 \tabularnewline
12 & 0 & 0.389393 & -0.389393 \tabularnewline
13 & 0 & 0.389393 & -0.389393 \tabularnewline
14 & 1 & 0.422527 & 0.577473 \tabularnewline
15 & 1 & 0.40596 & 0.59404 \tabularnewline
16 & 1 & 0.45566 & 0.54434 \tabularnewline
17 & 0 & 0.439093 & -0.439093 \tabularnewline
18 & 1 & 0.40596 & 0.59404 \tabularnewline
19 & 0 & 0.422527 & -0.422527 \tabularnewline
20 & 1 & 0.439093 & 0.560907 \tabularnewline
21 & 0 & 0.389393 & -0.389393 \tabularnewline
22 & 0 & 0.389393 & -0.389393 \tabularnewline
23 & 1 & 0.40596 & 0.59404 \tabularnewline
24 & 0 & 0.40596 & -0.40596 \tabularnewline
25 & 0 & 0.439093 & -0.439093 \tabularnewline
26 & 0 & 0.439093 & -0.439093 \tabularnewline
27 & 0 & 0.439093 & -0.439093 \tabularnewline
28 & 1 & 0.389393 & 0.610607 \tabularnewline
29 & 0 & 0.40596 & -0.40596 \tabularnewline
30 & 1 & 0.372827 & 0.627173 \tabularnewline
31 & 1 & 0.439093 & 0.560907 \tabularnewline
32 & 1 & 0.439093 & 0.560907 \tabularnewline
33 & 0 & 0.40596 & -0.40596 \tabularnewline
34 & 0 & 0.372827 & -0.372827 \tabularnewline
35 & 0 & 0.389393 & -0.389393 \tabularnewline
36 & 0 & 0.439093 & -0.439093 \tabularnewline
37 & 1 & 0.40596 & 0.59404 \tabularnewline
38 & 0 & 0.422527 & -0.422527 \tabularnewline
39 & 0 & 0.40596 & -0.40596 \tabularnewline
40 & 1 & 0.372827 & 0.627173 \tabularnewline
41 & 0 & 0.439093 & -0.439093 \tabularnewline
42 & 0 & 0.40596 & -0.40596 \tabularnewline
43 & 0 & 0.389393 & -0.389393 \tabularnewline
44 & 0 & 0.422527 & -0.422527 \tabularnewline
45 & 0 & 0.372827 & -0.372827 \tabularnewline
46 & 1 & 0.372827 & 0.627173 \tabularnewline
47 & 0 & 0.439093 & -0.439093 \tabularnewline
48 & 0 & 0.439093 & -0.439093 \tabularnewline
49 & 1 & 0.372827 & 0.627173 \tabularnewline
50 & 0 & 0.439093 & -0.439093 \tabularnewline
51 & 0 & 0.389393 & -0.389393 \tabularnewline
52 & 0 & 0.439093 & -0.439093 \tabularnewline
53 & 0 & 0.422527 & -0.422527 \tabularnewline
54 & 0 & 0.439093 & -0.439093 \tabularnewline
55 & 0 & 0.40596 & -0.40596 \tabularnewline
56 & 0 & 0.40596 & -0.40596 \tabularnewline
57 & 1 & 0.45566 & 0.54434 \tabularnewline
58 & 0 & 0.472226 & -0.472226 \tabularnewline
59 & 1 & 0.439093 & 0.560907 \tabularnewline
60 & 0 & 0.45566 & -0.45566 \tabularnewline
61 & 0 & 0.372827 & -0.372827 \tabularnewline
62 & 1 & 0.40596 & 0.59404 \tabularnewline
63 & 1 & 0.40596 & 0.59404 \tabularnewline
64 & 0 & 0.389393 & -0.389393 \tabularnewline
65 & 1 & 0.389393 & 0.610607 \tabularnewline
66 & 0 & 0.40596 & -0.40596 \tabularnewline
67 & 1 & 0.422527 & 0.577473 \tabularnewline
68 & 0 & 0.45566 & -0.45566 \tabularnewline
69 & 0 & 0.372827 & -0.372827 \tabularnewline
70 & 1 & 0.439093 & 0.560907 \tabularnewline
71 & 1 & 0.389393 & 0.610607 \tabularnewline
72 & 1 & 0.40596 & 0.59404 \tabularnewline
73 & 1 & 0.389393 & 0.610607 \tabularnewline
74 & 0 & 0.40596 & -0.40596 \tabularnewline
75 & 1 & 0.422527 & 0.577473 \tabularnewline
76 & 0 & 0.538493 & -0.538493 \tabularnewline
77 & 1 & 0.439093 & 0.560907 \tabularnewline
78 & 0 & 0.439093 & -0.439093 \tabularnewline
79 & 1 & 0.40596 & 0.59404 \tabularnewline
80 & 1 & 0.40596 & 0.59404 \tabularnewline
81 & 0 & 0.389393 & -0.389393 \tabularnewline
82 & 1 & 0.45566 & 0.54434 \tabularnewline
83 & 1 & 0.422527 & 0.577473 \tabularnewline
84 & 1 & 0.40596 & 0.59404 \tabularnewline
85 & 0 & 0.372827 & -0.372827 \tabularnewline
86 & 1 & 0.422527 & 0.577473 \tabularnewline
87 & 0 & 0.40596 & -0.40596 \tabularnewline
88 & 0 & 0.372827 & -0.372827 \tabularnewline
89 & 1 & 0.45566 & 0.54434 \tabularnewline
90 & 1 & 0.422527 & 0.577473 \tabularnewline
91 & 1 & 0.422527 & 0.577473 \tabularnewline
92 & 0 & 0.389393 & -0.389393 \tabularnewline
93 & 0 & 0.439093 & -0.439093 \tabularnewline
94 & 0 & 0.422527 & -0.422527 \tabularnewline
95 & 1 & 0.439093 & 0.560907 \tabularnewline
96 & 0 & 0.40596 & -0.40596 \tabularnewline
97 & 1 & 0.40596 & 0.59404 \tabularnewline
98 & 0 & 0.372827 & -0.372827 \tabularnewline
99 & 0 & 0.389393 & -0.389393 \tabularnewline
100 & 0 & 0.40596 & -0.40596 \tabularnewline
101 & 0 & 0.422527 & -0.422527 \tabularnewline
102 & 0 & 0.40596 & -0.40596 \tabularnewline
103 & 1 & 0.389393 & 0.610607 \tabularnewline
104 & 0 & 0.40596 & -0.40596 \tabularnewline
105 & 0 & 0.439093 & -0.439093 \tabularnewline
106 & 0 & 0.521926 & -0.521926 \tabularnewline
107 & 1 & 0.439093 & 0.560907 \tabularnewline
108 & 1 & 0.389393 & 0.610607 \tabularnewline
109 & 1 & 0.439093 & 0.560907 \tabularnewline
110 & 1 & 0.422527 & 0.577473 \tabularnewline
111 & 1 & 0.45566 & 0.54434 \tabularnewline
112 & 1 & 0.40596 & 0.59404 \tabularnewline
113 & 0 & 0.422527 & -0.422527 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263616&T=4


Goldfeld-Quandt test for Heteroskedasticity

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
5 & 0.466668 & 0.933337 & 0.533332 \tabularnewline
6 & 0.319273 & 0.638547 & 0.680727 \tabularnewline
7 & 0.196288 & 0.392576 & 0.803712 \tabularnewline
8 & 0.186879 & 0.373758 & 0.813121 \tabularnewline
9 & 0.164306 & 0.328612 & 0.835694 \tabularnewline
10 & 0.110277 & 0.220553 & 0.889723 \tabularnewline
11 & 0.0711486 & 0.142297 & 0.928851 \tabularnewline
12 & 0.0423812 & 0.0847624 & 0.957619 \tabularnewline
13 & 0.0242992 & 0.0485984 & 0.975701 \tabularnewline
14 & 0.0486695 & 0.097339 & 0.95133 \tabularnewline
15 & 0.130908 & 0.261817 & 0.869092 \tabularnewline
16 & 0.0940628 & 0.188126 & 0.905937 \tabularnewline
17 & 0.151376 & 0.302753 & 0.848624 \tabularnewline
18 & 0.238577 & 0.477154 & 0.761423 \tabularnewline
19 & 0.236341 & 0.472681 & 0.763659 \tabularnewline
20 & 0.21722 & 0.43444 & 0.78278 \tabularnewline
21 & 0.171558 & 0.343115 & 0.828442 \tabularnewline
22 & 0.133365 & 0.266731 & 0.866635 \tabularnewline
23 & 0.194674 & 0.389348 & 0.805326 \tabularnewline
24 & 0.169389 & 0.338777 & 0.830611 \tabularnewline
25 & 0.207627 & 0.415253 & 0.792373 \tabularnewline
26 & 0.232516 & 0.465032 & 0.767484 \tabularnewline
27 & 0.24641 & 0.492819 & 0.75359 \tabularnewline
28 & 0.351313 & 0.702625 & 0.648687 \tabularnewline
29 & 0.318398 & 0.636796 & 0.681602 \tabularnewline
30 & 0.431039 & 0.862078 & 0.568961 \tabularnewline
31 & 0.434639 & 0.869278 & 0.565361 \tabularnewline
32 & 0.433755 & 0.867509 & 0.566245 \tabularnewline
33 & 0.406532 & 0.813064 & 0.593468 \tabularnewline
34 & 0.36346 & 0.726919 & 0.63654 \tabularnewline
35 & 0.328236 & 0.656473 & 0.671764 \tabularnewline
36 & 0.332873 & 0.665746 & 0.667127 \tabularnewline
37 & 0.372914 & 0.745829 & 0.627086 \tabularnewline
38 & 0.359594 & 0.719188 & 0.640406 \tabularnewline
39 & 0.335132 & 0.670265 & 0.664868 \tabularnewline
40 & 0.409609 & 0.819218 & 0.590391 \tabularnewline
41 & 0.402842 & 0.805684 & 0.597158 \tabularnewline
42 & 0.379561 & 0.759121 & 0.620439 \tabularnewline
43 & 0.351549 & 0.703098 & 0.648451 \tabularnewline
44 & 0.335981 & 0.671962 & 0.664019 \tabularnewline
45 & 0.307127 & 0.614254 & 0.692873 \tabularnewline
46 & 0.361819 & 0.723637 & 0.638181 \tabularnewline
47 & 0.350807 & 0.701614 & 0.649193 \tabularnewline
48 & 0.338698 & 0.677395 & 0.661302 \tabularnewline
49 & 0.378306 & 0.756611 & 0.621694 \tabularnewline
50 & 0.363782 & 0.727564 & 0.636218 \tabularnewline
51 & 0.343112 & 0.686224 & 0.656888 \tabularnewline
52 & 0.329474 & 0.658948 & 0.670526 \tabularnewline
53 & 0.3144 & 0.628799 & 0.6856 \tabularnewline
54 & 0.301877 & 0.603753 & 0.698123 \tabularnewline
55 & 0.286973 & 0.573946 & 0.713027 \tabularnewline
56 & 0.273318 & 0.546636 & 0.726682 \tabularnewline
57 & 0.288899 & 0.577799 & 0.711101 \tabularnewline
58 & 0.284276 & 0.568551 & 0.715724 \tabularnewline
59 & 0.300597 & 0.601194 & 0.699403 \tabularnewline
60 & 0.295473 & 0.590947 & 0.704527 \tabularnewline
61 & 0.280034 & 0.560068 & 0.719966 \tabularnewline
62 & 0.300574 & 0.601149 & 0.699426 \tabularnewline
63 & 0.319835 & 0.63967 & 0.680165 \tabularnewline
64 & 0.307039 & 0.614079 & 0.692961 \tabularnewline
65 & 0.327204 & 0.654408 & 0.672796 \tabularnewline
66 & 0.317124 & 0.634247 & 0.682876 \tabularnewline
67 & 0.330089 & 0.660179 & 0.669911 \tabularnewline
68 & 0.327531 & 0.655063 & 0.672469 \tabularnewline
69 & 0.315629 & 0.631258 & 0.684371 \tabularnewline
70 & 0.323613 & 0.647226 & 0.676387 \tabularnewline
71 & 0.338722 & 0.677443 & 0.661278 \tabularnewline
72 & 0.351083 & 0.702167 & 0.648917 \tabularnewline
73 & 0.366912 & 0.733824 & 0.633088 \tabularnewline
74 & 0.355886 & 0.711773 & 0.644114 \tabularnewline
75 & 0.364751 & 0.729503 & 0.635249 \tabularnewline
76 & 0.401663 & 0.803325 & 0.598337 \tabularnewline
77 & 0.401913 & 0.803827 & 0.598087 \tabularnewline
78 & 0.406739 & 0.813479 & 0.593261 \tabularnewline
79 & 0.421817 & 0.843634 & 0.578183 \tabularnewline
80 & 0.439297 & 0.878595 & 0.560703 \tabularnewline
81 & 0.41525 & 0.830499 & 0.58475 \tabularnewline
82 & 0.405445 & 0.810889 & 0.594555 \tabularnewline
83 & 0.413054 & 0.826108 & 0.586946 \tabularnewline
84 & 0.433271 & 0.866542 & 0.566729 \tabularnewline
85 & 0.399786 & 0.799571 & 0.600214 \tabularnewline
86 & 0.410801 & 0.821602 & 0.589199 \tabularnewline
87 & 0.390059 & 0.780117 & 0.609941 \tabularnewline
88 & 0.362059 & 0.724118 & 0.637941 \tabularnewline
89 & 0.36143 & 0.72286 & 0.63857 \tabularnewline
90 & 0.373583 & 0.747166 & 0.626417 \tabularnewline
91 & 0.391405 & 0.782811 & 0.608595 \tabularnewline
92 & 0.366487 & 0.732975 & 0.633513 \tabularnewline
93 & 0.348476 & 0.696953 & 0.651524 \tabularnewline
94 & 0.333211 & 0.666422 & 0.666789 \tabularnewline
95 & 0.340415 & 0.680831 & 0.659585 \tabularnewline
96 & 0.323585 & 0.64717 & 0.676415 \tabularnewline
97 & 0.332592 & 0.665185 & 0.667408 \tabularnewline
98 & 0.319807 & 0.639613 & 0.680193 \tabularnewline
99 & 0.324864 & 0.649729 & 0.675136 \tabularnewline
100 & 0.340067 & 0.680135 & 0.659933 \tabularnewline
101 & 0.356774 & 0.713548 & 0.643226 \tabularnewline
102 & 0.434013 & 0.868026 & 0.565987 \tabularnewline
103 & 0.358966 & 0.717933 & 0.641034 \tabularnewline
104 & 0.501946 & 0.996108 & 0.498054 \tabularnewline
105 & 0.625817 & 0.748367 & 0.374183 \tabularnewline
106 & 0.618132 & 0.763735 & 0.381868 \tabularnewline
107 & 0.490897 & 0.981794 & 0.509103 \tabularnewline
108 & 0.372871 & 0.745742 & 0.627129 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263616&T=5


Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     0                     0                     OK
5% type I error level     1                     0.00961538            OK
10% type I error level    3                     0.0288462             OK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 & 0 & OK \tabularnewline
5% type I error level & 1 & 0.00961538 & OK \tabularnewline
10% type I error level & 3 & 0.0288462 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263616&T=6
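Each p-value in the Goldfeld-Quandt table further above comes from lmtest::gqtest evaluated at one candidate breakpoint (see the R code below). A minimal, self-contained sketch of a single such call is shown here; it uses simulated data purely for illustration, so its p-value is not expected to match any entry in the table.

library(lmtest)
set.seed(1)
d <- data.frame(y = rbinom(113, size = 1, prob = 0.4),   # illustrative 0/1 response
                x = rpois(113, lambda = 13))             # illustrative regressor
fit <- lm(y ~ x, data = d)
# split the sample after observation 50 and test for heteroskedasticity (2-sided)
gqtest(fit, point = 50, alternative = "two.sided")$p.value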


Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
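# Module overview: fit an OLS regression of the first column (gendercode) on the
# remaining column(s) (STRESSTOT), report estimates and residual diagnostics, and
# run Goldfeld-Quandt heteroskedasticity tests over all admissible breakpoints.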
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
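# move the column selected by par1 (here column 1, gendercode) to the front;
# it becomes the dependent variable in lm(df) below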
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
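# optional transformation to first differences (par3); not applied in this run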
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
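# optional monthly/quarterly seasonal dummies (par2); not added in this run
# (par2 = 'Do not include Seasonal Dummies')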
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
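# optional linear trend column (par3); not added in this run (par3 = 'No Linear Trend')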
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
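# Goldfeld-Quandt test at every admissible breakpoint (k+3 through n-k-3),
# storing p-values for the 'greater', 'two.sided' and 'less' alternatives and
# counting two-sided rejections at the 1%, 5% and 10% levels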
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
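# residual diagnostics plots: actuals vs. interpolation, residuals, histogram,
# density, normal Q-Q, lag plot with lowess and regression line, ACF, PACF,
# the standard lm diagnostic panel, and the Goldfeld-Quandt p-value plot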
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
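# assemble the output tables shown above: estimated regression equation, OLS
# estimates, regression/residual statistics, actuals and residuals, and the
# Goldfeld-Quandt results with their meta analysis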
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}