Free Statistics

Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sat, 06 Dec 2014 12:25:23 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/06/t1417870080lcjh56voy9sb53v.htm/, Retrieved Thu, 16 May 2024 17:44:41 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=263596, Retrieved Thu, 16 May 2024 17:44:41 +0000
Original text written by user:
Is Private? No (this computation is public)
User-defined keywords:
Estimated Impact: 121
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [paper multiple re...] [2014-12-06 12:25:23] [ec1b40d1a9751af99658fe8fca4f9eca] [Current]
Dataseries X (columns: gendercode, AMS.I, AMS.E, AMS.A):
1 26 50 4
0 57 62 4
1 37 54 5
0 67 71 4
0 43 54 4
0 52 65 9
1 52 73 8
0 43 52 11
0 84 84 4
0 67 42 4
0 49 66 6
0 70 65 4
0 52 78 8
1 58 73 4
1 68 75 4
1 62 72 11
0 43 66 4
1 56 70 4
0 56 61 6
1 74 81 6
0 65 71 4
0 63 69 8
1 58 71 5
0 57 72 4
0 63 68 9
0 53 70 4
0 57 68 7
1 51 61 10
0 64 67 4
1 53 76 4
1 29 70 7
1 54 60 12
0 58 72 7
0 43 69 5
0 51 71 8
0 53 62 5
1 54 70 4
0 56 64 9
0 61 58 7
1 47 76 4
0 39 52 4
0 48 59 4
0 50 68 4
0 35 76 4
0 30 65 7
1 68 67 4
0 49 59 7
0 61 69 4
1 67 76 4
0 47 63 4
0 56 75 4
0 50 63 8
0 43 60 4
0 67 73 4
0 62 63 4
0 57 70 4
1 41 75 7
0 54 66 12
1 45 63 4
0 48 63 4
0 61 64 4
1 56 70 5
1 41 75 15
0 43 61 5
1 53 60 10
0 44 62 9
1 66 73 8
0 58 61 4
0 46 66 5
1 37 64 4
1 51 59 9
1 51 64 4
1 56 60 10
0 66 56 4
1 37 78 4
0 59 53 6
1 42 67 7
0 38 59 5
1 66 66 4
1 34 68 4
0 53 71 4
1 49 66 4
1 55 73 4
1 49 72 4
0 59 71 6
1 40 59 10
0 58 64 7
0 60 66 4
1 63 78 4
1 56 68 7
1 54 73 4
0 52 62 8
0 34 65 11
0 69 68 6
1 32 65 14
0 48 60 5
1 67 71 4
0 58 65 8
0 57 68 9
0 42 64 4
0 64 74 4
0 58 69 5
1 66 76 4
0 26 68 5
0 61 72 4
0 52 67 4
1 51 63 7
1 55 59 10
1 50 73 4
1 60 66 5
1 56 62 4
1 63 69 4
0 61 66 4




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Gwilym Jenkins' @ jenkins.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 5 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ jenkins.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263596&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]5 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ jenkins.wessa.net[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=263596&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=263596&T=0


Multiple Linear Regression - Estimated Regression Equation
gendercode[t] = -0.673793 - 0.00583064 AMS.I[t] + 0.0190811 AMS.E[t] + 0.02217 AMS.A[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
gendercode[t] =  -0.673793 -0.00583064AMS.I[t] +  0.0190811AMS.E[t] +  0.02217AMS.A[t]  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263596&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]gendercode[t] =  -0.673793 -0.00583064AMS.I[t] +  0.0190811AMS.E[t] +  0.02217AMS.A[t]  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=263596&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=263596&T=1
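
The equation above is an ordinary least-squares fit of the first data column on the remaining three. A minimal sketch of how it can be reproduced in plain R is given below; it assumes the data series listed above has been read into a data frame named df with columns gendercode, AMS.I, AMS.E and AMS.A (these names are taken from the output tables, they are not stored with the raw data).

# Minimal sketch (not part of the archived computation):
# df <- read.table("dataseries.txt",
#                  col.names = c("gendercode", "AMS.I", "AMS.E", "AMS.A"))
fit <- lm(gendercode ~ AMS.I + AMS.E + AMS.A, data = df)
coef(fit)      # should reproduce -0.673793, -0.00583064, 0.0190811, 0.02217
summary(fit)   # reproduces the Ordinary Least Squares table below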


Multiple Linear Regression - Ordinary Least Squares
Variable | Parameter | S.D. | T-STAT (H0: parameter = 0) | 2-tail p-value | 1-tail p-value
(Intercept) | -0.673793 | 0.492942 | -1.367 | 0.174475 | 0.0872376
AMS.I | -0.00583064 | 0.00450767 | -1.293 | 0.198575 | 0.0992874
AMS.E | 0.0190811 | 0.00695709 | 2.743 | 0.00712565 | 0.00356283
AMS.A | 0.02217 | 0.0185218 | 1.197 | 0.233915 | 0.116958

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & -0.673793 & 0.492942 & -1.367 & 0.174475 & 0.0872376 \tabularnewline
AMS.I & -0.00583064 & 0.00450767 & -1.293 & 0.198575 & 0.0992874 \tabularnewline
AMS.E & 0.0190811 & 0.00695709 & 2.743 & 0.00712565 & 0.00356283 \tabularnewline
AMS.A & 0.02217 & 0.0185218 & 1.197 & 0.233915 & 0.116958 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263596&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]-0.673793[/C][C]0.492942[/C][C]-1.367[/C][C]0.174475[/C][C]0.0872376[/C][/ROW]
[ROW][C]AMS.I[/C][C]-0.00583064[/C][C]0.00450767[/C][C]-1.293[/C][C]0.198575[/C][C]0.0992874[/C][/ROW]
[ROW][C]AMS.E[/C][C]0.0190811[/C][C]0.00695709[/C][C]2.743[/C][C]0.00712565[/C][C]0.00356283[/C][/ROW]
[ROW][C]AMS.A[/C][C]0.02217[/C][C]0.0185218[/C][C]1.197[/C][C]0.233915[/C][C]0.116958[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=263596&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=263596&T=2
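
The 1-tail p-values in this table are the 2-tail p-values divided by two, which is exactly how the module's R code (listed at the end of this page) derives them. A minimal sketch, assuming the fitted model fit from the sketch above:

mysum <- summary(fit)
mysum$coefficients                        # Parameter, S.D. (standard error), T-STAT, 2-tail p-value
signif(mysum$coefficients[, 4] / 2, 6)    # 1-tail p-values as reported above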


Multiple Linear Regression - Regression Statistics
Multiple R: 0.27648
R-squared: 0.0764414
Adjusted R-squared: 0.0510224
F-TEST (value): 3.00725
F-TEST (DF numerator): 3
F-TEST (DF denominator): 109
p-value: 0.033449
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.482282
Sum Squared Residuals: 25.3529

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.27648 \tabularnewline
R-squared & 0.0764414 \tabularnewline
Adjusted R-squared & 0.0510224 \tabularnewline
F-TEST (value) & 3.00725 \tabularnewline
F-TEST (DF numerator) & 3 \tabularnewline
F-TEST (DF denominator) & 109 \tabularnewline
p-value & 0.033449 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.482282 \tabularnewline
Sum Squared Residuals & 25.3529 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263596&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.27648[/C][/ROW]
[ROW][C]R-squared[/C][C]0.0764414[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.0510224[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]3.00725[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]3[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]109[/C][/ROW]
[ROW][C]p-value[/C][C]0.033449[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]0.482282[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]25.3529[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=263596&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=263596&T=3
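
These statistics come straight from the summary of the fitted model; the module's R code computes the F-test p-value as 1 - pf(F, DF numerator, DF denominator). A minimal sketch, again assuming fit and mysum from the sketches above:

fstat <- mysum$fstatistic                          # F value, DF numerator, DF denominator
signif(1 - pf(fstat[1], fstat[2], fstat[3]), 6)    # p-value (0.033449)
signif(sqrt(mysum$r.squared), 6)                   # Multiple R
signif(mysum$sigma, 6)                             # Residual Standard Deviation
signif(sum(resid(fit)^2), 6)                       # Sum Squared Residuals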


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index | Actuals | Interpolation (Forecast) | Residuals (Prediction Error)
1 | 1 | 0.217347 | 0.782653
2 | 0 | 0.265571 | -0.265571
3 | 1 | 0.251705 | 0.748295
4 | 0 | 0.378995 | -0.378995
5 | 0 | 0.194551 | -0.194551
6 | 0 | 0.462818 | -0.462818
7 | 1 | 0.593297 | 0.406703
8 | 0 | 0.311579 | -0.311579
9 | 0 | 0.527929 | -0.527929
10 | 0 | -0.174358 | 0.174358
11 | 0 | 0.432881 | -0.432881
12 | 0 | 0.247016 | -0.247016
13 | 0 | 0.688702 | -0.688702
14 | 1 | 0.469633 | 0.530367
15 | 1 | 0.449489 | 0.550511
16 | 1 | 0.582419 | 0.417581
17 | 0 | 0.423525 | -0.423525
18 | 1 | 0.424051 | 0.575949
19 | 0 | 0.296661 | -0.296661
20 | 1 | 0.573332 | 0.426668
21 | 0 | 0.390656 | -0.390656
22 | 0 | 0.452835 | -0.452835
23 | 1 | 0.453641 | 0.546359
24 | 0 | 0.456382 | -0.456382
25 | 0 | 0.455924 | -0.455924
26 | 0 | 0.441543 | -0.441543
27 | 0 | 0.446568 | -0.446568
28 | 1 | 0.414494 | 0.585506
29 | 0 | 0.320162 | -0.320162
30 | 1 | 0.55603 | 0.44397
31 | 1 | 0.647988 | 0.352012
32 | 1 | 0.422261 | 0.577739
33 | 0 | 0.517062 | -0.517062
34 | 0 | 0.502938 | -0.502938
35 | 0 | 0.560965 | -0.560965
36 | 0 | 0.311064 | -0.311064
37 | 1 | 0.435712 | 0.564288
38 | 0 | 0.420414 | -0.420414
39 | 0 | 0.232434 | -0.232434
40 | 1 | 0.591013 | 0.408987
41 | 0 | 0.179711 | -0.179711
42 | 0 | 0.260803 | -0.260803
43 | 0 | 0.420872 | -0.420872
44 | 0 | 0.660981 | -0.660981
45 | 0 | 0.546752 | -0.546752
46 | 1 | 0.29684 | 0.70316
47 | 0 | 0.321483 | -0.321483
48 | 0 | 0.375816 | -0.375816
49 | 1 | 0.474401 | 0.525599
50 | 0 | 0.342959 | -0.342959
51 | 0 | 0.519456 | -0.519456
52 | 0 | 0.414147 | -0.414147
53 | 0 | 0.309038 | -0.309038
54 | 0 | 0.417157 | -0.417157
55 | 0 | 0.255499 | -0.255499
56 | 0 | 0.41822 | -0.41822
57 | 1 | 0.673426 | 0.326574
58 | 0 | 0.536748 | -0.536748
59 | 1 | 0.35462 | 0.64538
60 | 0 | 0.337128 | -0.337128
61 | 0 | 0.280411 | -0.280411
62 | 1 | 0.446221 | 0.553779
63 | 1 | 0.850786 | 0.149214
64 | 0 | 0.350289 | -0.350289
65 | 1 | 0.383751 | 0.616249
66 | 0 | 0.452219 | -0.452219
67 | 1 | 0.511668 | 0.488332
68 | 0 | 0.240659 | -0.240659
69 | 0 | 0.428203 | -0.428203
70 | 1 | 0.420346 | 0.579654
71 | 1 | 0.354162 | 0.645838
72 | 1 | 0.338717 | 0.661283
73 | 1 | 0.36626 | 0.63374
74 | 0 | 0.0986085 | -0.0986085
75 | 1 | 0.687482 | 0.312518
76 | 0 | 0.12652 | -0.12652
77 | 1 | 0.514946 | 0.485054
78 | 0 | 0.34128 | -0.34128
79 | 1 | 0.28942 | 0.71058
80 | 1 | 0.514163 | 0.485837
81 | 0 | 0.460624 | -0.460624
82 | 1 | 0.388541 | 0.611459
83 | 1 | 0.487125 | 0.512875
84 | 1 | 0.503028 | 0.496972
85 | 0 | 0.46998 | -0.46998
86 | 1 | 0.440469 | 0.559531
87 | 0 | 0.364413 | -0.364413
88 | 0 | 0.324404 | -0.324404
89 | 1 | 0.535885 | 0.464115
90 | 1 | 0.452399 | 0.547601
91 | 1 | 0.492955 | 0.507045
92 | 0 | 0.383404 | -0.383404
93 | 0 | 0.612109 | -0.612109
94 | 0 | 0.35443 | -0.35443
95 | 1 | 0.690281 | 0.309719
96 | 0 | 0.302055 | -0.302055
97 | 1 | 0.378995 | 0.621005
98 | 0 | 0.405664 | -0.405664
99 | 0 | 0.490908 | -0.490908
100 | 0 | 0.391193 | -0.391193
101 | 0 | 0.45373 | -0.45373
102 | 0 | 0.415478 | -0.415478
103 | 1 | 0.480231 | 0.519769
104 | 0 | 0.582978 | -0.582978
105 | 0 | 0.43306 | -0.43306
106 | 0 | 0.39013 | -0.39013
107 | 1 | 0.386146 | 0.613854
108 | 1 | 0.353009 | 0.646991
109 | 1 | 0.516278 | 0.483722
110 | 1 | 0.346574 | 0.653426
111 | 1 | 0.271402 | 0.728598
112 | 1 | 0.364155 | 0.635845
113 | 0 | 0.318573 | -0.318573

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 1 & 0.217347 & 0.782653 \tabularnewline
2 & 0 & 0.265571 & -0.265571 \tabularnewline
3 & 1 & 0.251705 & 0.748295 \tabularnewline
4 & 0 & 0.378995 & -0.378995 \tabularnewline
5 & 0 & 0.194551 & -0.194551 \tabularnewline
6 & 0 & 0.462818 & -0.462818 \tabularnewline
7 & 1 & 0.593297 & 0.406703 \tabularnewline
8 & 0 & 0.311579 & -0.311579 \tabularnewline
9 & 0 & 0.527929 & -0.527929 \tabularnewline
10 & 0 & -0.174358 & 0.174358 \tabularnewline
11 & 0 & 0.432881 & -0.432881 \tabularnewline
12 & 0 & 0.247016 & -0.247016 \tabularnewline
13 & 0 & 0.688702 & -0.688702 \tabularnewline
14 & 1 & 0.469633 & 0.530367 \tabularnewline
15 & 1 & 0.449489 & 0.550511 \tabularnewline
16 & 1 & 0.582419 & 0.417581 \tabularnewline
17 & 0 & 0.423525 & -0.423525 \tabularnewline
18 & 1 & 0.424051 & 0.575949 \tabularnewline
19 & 0 & 0.296661 & -0.296661 \tabularnewline
20 & 1 & 0.573332 & 0.426668 \tabularnewline
21 & 0 & 0.390656 & -0.390656 \tabularnewline
22 & 0 & 0.452835 & -0.452835 \tabularnewline
23 & 1 & 0.453641 & 0.546359 \tabularnewline
24 & 0 & 0.456382 & -0.456382 \tabularnewline
25 & 0 & 0.455924 & -0.455924 \tabularnewline
26 & 0 & 0.441543 & -0.441543 \tabularnewline
27 & 0 & 0.446568 & -0.446568 \tabularnewline
28 & 1 & 0.414494 & 0.585506 \tabularnewline
29 & 0 & 0.320162 & -0.320162 \tabularnewline
30 & 1 & 0.55603 & 0.44397 \tabularnewline
31 & 1 & 0.647988 & 0.352012 \tabularnewline
32 & 1 & 0.422261 & 0.577739 \tabularnewline
33 & 0 & 0.517062 & -0.517062 \tabularnewline
34 & 0 & 0.502938 & -0.502938 \tabularnewline
35 & 0 & 0.560965 & -0.560965 \tabularnewline
36 & 0 & 0.311064 & -0.311064 \tabularnewline
37 & 1 & 0.435712 & 0.564288 \tabularnewline
38 & 0 & 0.420414 & -0.420414 \tabularnewline
39 & 0 & 0.232434 & -0.232434 \tabularnewline
40 & 1 & 0.591013 & 0.408987 \tabularnewline
41 & 0 & 0.179711 & -0.179711 \tabularnewline
42 & 0 & 0.260803 & -0.260803 \tabularnewline
43 & 0 & 0.420872 & -0.420872 \tabularnewline
44 & 0 & 0.660981 & -0.660981 \tabularnewline
45 & 0 & 0.546752 & -0.546752 \tabularnewline
46 & 1 & 0.29684 & 0.70316 \tabularnewline
47 & 0 & 0.321483 & -0.321483 \tabularnewline
48 & 0 & 0.375816 & -0.375816 \tabularnewline
49 & 1 & 0.474401 & 0.525599 \tabularnewline
50 & 0 & 0.342959 & -0.342959 \tabularnewline
51 & 0 & 0.519456 & -0.519456 \tabularnewline
52 & 0 & 0.414147 & -0.414147 \tabularnewline
53 & 0 & 0.309038 & -0.309038 \tabularnewline
54 & 0 & 0.417157 & -0.417157 \tabularnewline
55 & 0 & 0.255499 & -0.255499 \tabularnewline
56 & 0 & 0.41822 & -0.41822 \tabularnewline
57 & 1 & 0.673426 & 0.326574 \tabularnewline
58 & 0 & 0.536748 & -0.536748 \tabularnewline
59 & 1 & 0.35462 & 0.64538 \tabularnewline
60 & 0 & 0.337128 & -0.337128 \tabularnewline
61 & 0 & 0.280411 & -0.280411 \tabularnewline
62 & 1 & 0.446221 & 0.553779 \tabularnewline
63 & 1 & 0.850786 & 0.149214 \tabularnewline
64 & 0 & 0.350289 & -0.350289 \tabularnewline
65 & 1 & 0.383751 & 0.616249 \tabularnewline
66 & 0 & 0.452219 & -0.452219 \tabularnewline
67 & 1 & 0.511668 & 0.488332 \tabularnewline
68 & 0 & 0.240659 & -0.240659 \tabularnewline
69 & 0 & 0.428203 & -0.428203 \tabularnewline
70 & 1 & 0.420346 & 0.579654 \tabularnewline
71 & 1 & 0.354162 & 0.645838 \tabularnewline
72 & 1 & 0.338717 & 0.661283 \tabularnewline
73 & 1 & 0.36626 & 0.63374 \tabularnewline
74 & 0 & 0.0986085 & -0.0986085 \tabularnewline
75 & 1 & 0.687482 & 0.312518 \tabularnewline
76 & 0 & 0.12652 & -0.12652 \tabularnewline
77 & 1 & 0.514946 & 0.485054 \tabularnewline
78 & 0 & 0.34128 & -0.34128 \tabularnewline
79 & 1 & 0.28942 & 0.71058 \tabularnewline
80 & 1 & 0.514163 & 0.485837 \tabularnewline
81 & 0 & 0.460624 & -0.460624 \tabularnewline
82 & 1 & 0.388541 & 0.611459 \tabularnewline
83 & 1 & 0.487125 & 0.512875 \tabularnewline
84 & 1 & 0.503028 & 0.496972 \tabularnewline
85 & 0 & 0.46998 & -0.46998 \tabularnewline
86 & 1 & 0.440469 & 0.559531 \tabularnewline
87 & 0 & 0.364413 & -0.364413 \tabularnewline
88 & 0 & 0.324404 & -0.324404 \tabularnewline
89 & 1 & 0.535885 & 0.464115 \tabularnewline
90 & 1 & 0.452399 & 0.547601 \tabularnewline
91 & 1 & 0.492955 & 0.507045 \tabularnewline
92 & 0 & 0.383404 & -0.383404 \tabularnewline
93 & 0 & 0.612109 & -0.612109 \tabularnewline
94 & 0 & 0.35443 & -0.35443 \tabularnewline
95 & 1 & 0.690281 & 0.309719 \tabularnewline
96 & 0 & 0.302055 & -0.302055 \tabularnewline
97 & 1 & 0.378995 & 0.621005 \tabularnewline
98 & 0 & 0.405664 & -0.405664 \tabularnewline
99 & 0 & 0.490908 & -0.490908 \tabularnewline
100 & 0 & 0.391193 & -0.391193 \tabularnewline
101 & 0 & 0.45373 & -0.45373 \tabularnewline
102 & 0 & 0.415478 & -0.415478 \tabularnewline
103 & 1 & 0.480231 & 0.519769 \tabularnewline
104 & 0 & 0.582978 & -0.582978 \tabularnewline
105 & 0 & 0.43306 & -0.43306 \tabularnewline
106 & 0 & 0.39013 & -0.39013 \tabularnewline
107 & 1 & 0.386146 & 0.613854 \tabularnewline
108 & 1 & 0.353009 & 0.646991 \tabularnewline
109 & 1 & 0.516278 & 0.483722 \tabularnewline
110 & 1 & 0.346574 & 0.653426 \tabularnewline
111 & 1 & 0.271402 & 0.728598 \tabularnewline
112 & 1 & 0.364155 & 0.635845 \tabularnewline
113 & 0 & 0.318573 & -0.318573 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263596&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]1[/C][C]0.217347[/C][C]0.782653[/C][/ROW]
[ROW][C]2[/C][C]0[/C][C]0.265571[/C][C]-0.265571[/C][/ROW]
[ROW][C]3[/C][C]1[/C][C]0.251705[/C][C]0.748295[/C][/ROW]
[ROW][C]4[/C][C]0[/C][C]0.378995[/C][C]-0.378995[/C][/ROW]
[ROW][C]5[/C][C]0[/C][C]0.194551[/C][C]-0.194551[/C][/ROW]
[ROW][C]6[/C][C]0[/C][C]0.462818[/C][C]-0.462818[/C][/ROW]
[ROW][C]7[/C][C]1[/C][C]0.593297[/C][C]0.406703[/C][/ROW]
[ROW][C]8[/C][C]0[/C][C]0.311579[/C][C]-0.311579[/C][/ROW]
[ROW][C]9[/C][C]0[/C][C]0.527929[/C][C]-0.527929[/C][/ROW]
[ROW][C]10[/C][C]0[/C][C]-0.174358[/C][C]0.174358[/C][/ROW]
[ROW][C]11[/C][C]0[/C][C]0.432881[/C][C]-0.432881[/C][/ROW]
[ROW][C]12[/C][C]0[/C][C]0.247016[/C][C]-0.247016[/C][/ROW]
[ROW][C]13[/C][C]0[/C][C]0.688702[/C][C]-0.688702[/C][/ROW]
[ROW][C]14[/C][C]1[/C][C]0.469633[/C][C]0.530367[/C][/ROW]
[ROW][C]15[/C][C]1[/C][C]0.449489[/C][C]0.550511[/C][/ROW]
[ROW][C]16[/C][C]1[/C][C]0.582419[/C][C]0.417581[/C][/ROW]
[ROW][C]17[/C][C]0[/C][C]0.423525[/C][C]-0.423525[/C][/ROW]
[ROW][C]18[/C][C]1[/C][C]0.424051[/C][C]0.575949[/C][/ROW]
[ROW][C]19[/C][C]0[/C][C]0.296661[/C][C]-0.296661[/C][/ROW]
[ROW][C]20[/C][C]1[/C][C]0.573332[/C][C]0.426668[/C][/ROW]
[ROW][C]21[/C][C]0[/C][C]0.390656[/C][C]-0.390656[/C][/ROW]
[ROW][C]22[/C][C]0[/C][C]0.452835[/C][C]-0.452835[/C][/ROW]
[ROW][C]23[/C][C]1[/C][C]0.453641[/C][C]0.546359[/C][/ROW]
[ROW][C]24[/C][C]0[/C][C]0.456382[/C][C]-0.456382[/C][/ROW]
[ROW][C]25[/C][C]0[/C][C]0.455924[/C][C]-0.455924[/C][/ROW]
[ROW][C]26[/C][C]0[/C][C]0.441543[/C][C]-0.441543[/C][/ROW]
[ROW][C]27[/C][C]0[/C][C]0.446568[/C][C]-0.446568[/C][/ROW]
[ROW][C]28[/C][C]1[/C][C]0.414494[/C][C]0.585506[/C][/ROW]
[ROW][C]29[/C][C]0[/C][C]0.320162[/C][C]-0.320162[/C][/ROW]
[ROW][C]30[/C][C]1[/C][C]0.55603[/C][C]0.44397[/C][/ROW]
[ROW][C]31[/C][C]1[/C][C]0.647988[/C][C]0.352012[/C][/ROW]
[ROW][C]32[/C][C]1[/C][C]0.422261[/C][C]0.577739[/C][/ROW]
[ROW][C]33[/C][C]0[/C][C]0.517062[/C][C]-0.517062[/C][/ROW]
[ROW][C]34[/C][C]0[/C][C]0.502938[/C][C]-0.502938[/C][/ROW]
[ROW][C]35[/C][C]0[/C][C]0.560965[/C][C]-0.560965[/C][/ROW]
[ROW][C]36[/C][C]0[/C][C]0.311064[/C][C]-0.311064[/C][/ROW]
[ROW][C]37[/C][C]1[/C][C]0.435712[/C][C]0.564288[/C][/ROW]
[ROW][C]38[/C][C]0[/C][C]0.420414[/C][C]-0.420414[/C][/ROW]
[ROW][C]39[/C][C]0[/C][C]0.232434[/C][C]-0.232434[/C][/ROW]
[ROW][C]40[/C][C]1[/C][C]0.591013[/C][C]0.408987[/C][/ROW]
[ROW][C]41[/C][C]0[/C][C]0.179711[/C][C]-0.179711[/C][/ROW]
[ROW][C]42[/C][C]0[/C][C]0.260803[/C][C]-0.260803[/C][/ROW]
[ROW][C]43[/C][C]0[/C][C]0.420872[/C][C]-0.420872[/C][/ROW]
[ROW][C]44[/C][C]0[/C][C]0.660981[/C][C]-0.660981[/C][/ROW]
[ROW][C]45[/C][C]0[/C][C]0.546752[/C][C]-0.546752[/C][/ROW]
[ROW][C]46[/C][C]1[/C][C]0.29684[/C][C]0.70316[/C][/ROW]
[ROW][C]47[/C][C]0[/C][C]0.321483[/C][C]-0.321483[/C][/ROW]
[ROW][C]48[/C][C]0[/C][C]0.375816[/C][C]-0.375816[/C][/ROW]
[ROW][C]49[/C][C]1[/C][C]0.474401[/C][C]0.525599[/C][/ROW]
[ROW][C]50[/C][C]0[/C][C]0.342959[/C][C]-0.342959[/C][/ROW]
[ROW][C]51[/C][C]0[/C][C]0.519456[/C][C]-0.519456[/C][/ROW]
[ROW][C]52[/C][C]0[/C][C]0.414147[/C][C]-0.414147[/C][/ROW]
[ROW][C]53[/C][C]0[/C][C]0.309038[/C][C]-0.309038[/C][/ROW]
[ROW][C]54[/C][C]0[/C][C]0.417157[/C][C]-0.417157[/C][/ROW]
[ROW][C]55[/C][C]0[/C][C]0.255499[/C][C]-0.255499[/C][/ROW]
[ROW][C]56[/C][C]0[/C][C]0.41822[/C][C]-0.41822[/C][/ROW]
[ROW][C]57[/C][C]1[/C][C]0.673426[/C][C]0.326574[/C][/ROW]
[ROW][C]58[/C][C]0[/C][C]0.536748[/C][C]-0.536748[/C][/ROW]
[ROW][C]59[/C][C]1[/C][C]0.35462[/C][C]0.64538[/C][/ROW]
[ROW][C]60[/C][C]0[/C][C]0.337128[/C][C]-0.337128[/C][/ROW]
[ROW][C]61[/C][C]0[/C][C]0.280411[/C][C]-0.280411[/C][/ROW]
[ROW][C]62[/C][C]1[/C][C]0.446221[/C][C]0.553779[/C][/ROW]
[ROW][C]63[/C][C]1[/C][C]0.850786[/C][C]0.149214[/C][/ROW]
[ROW][C]64[/C][C]0[/C][C]0.350289[/C][C]-0.350289[/C][/ROW]
[ROW][C]65[/C][C]1[/C][C]0.383751[/C][C]0.616249[/C][/ROW]
[ROW][C]66[/C][C]0[/C][C]0.452219[/C][C]-0.452219[/C][/ROW]
[ROW][C]67[/C][C]1[/C][C]0.511668[/C][C]0.488332[/C][/ROW]
[ROW][C]68[/C][C]0[/C][C]0.240659[/C][C]-0.240659[/C][/ROW]
[ROW][C]69[/C][C]0[/C][C]0.428203[/C][C]-0.428203[/C][/ROW]
[ROW][C]70[/C][C]1[/C][C]0.420346[/C][C]0.579654[/C][/ROW]
[ROW][C]71[/C][C]1[/C][C]0.354162[/C][C]0.645838[/C][/ROW]
[ROW][C]72[/C][C]1[/C][C]0.338717[/C][C]0.661283[/C][/ROW]
[ROW][C]73[/C][C]1[/C][C]0.36626[/C][C]0.63374[/C][/ROW]
[ROW][C]74[/C][C]0[/C][C]0.0986085[/C][C]-0.0986085[/C][/ROW]
[ROW][C]75[/C][C]1[/C][C]0.687482[/C][C]0.312518[/C][/ROW]
[ROW][C]76[/C][C]0[/C][C]0.12652[/C][C]-0.12652[/C][/ROW]
[ROW][C]77[/C][C]1[/C][C]0.514946[/C][C]0.485054[/C][/ROW]
[ROW][C]78[/C][C]0[/C][C]0.34128[/C][C]-0.34128[/C][/ROW]
[ROW][C]79[/C][C]1[/C][C]0.28942[/C][C]0.71058[/C][/ROW]
[ROW][C]80[/C][C]1[/C][C]0.514163[/C][C]0.485837[/C][/ROW]
[ROW][C]81[/C][C]0[/C][C]0.460624[/C][C]-0.460624[/C][/ROW]
[ROW][C]82[/C][C]1[/C][C]0.388541[/C][C]0.611459[/C][/ROW]
[ROW][C]83[/C][C]1[/C][C]0.487125[/C][C]0.512875[/C][/ROW]
[ROW][C]84[/C][C]1[/C][C]0.503028[/C][C]0.496972[/C][/ROW]
[ROW][C]85[/C][C]0[/C][C]0.46998[/C][C]-0.46998[/C][/ROW]
[ROW][C]86[/C][C]1[/C][C]0.440469[/C][C]0.559531[/C][/ROW]
[ROW][C]87[/C][C]0[/C][C]0.364413[/C][C]-0.364413[/C][/ROW]
[ROW][C]88[/C][C]0[/C][C]0.324404[/C][C]-0.324404[/C][/ROW]
[ROW][C]89[/C][C]1[/C][C]0.535885[/C][C]0.464115[/C][/ROW]
[ROW][C]90[/C][C]1[/C][C]0.452399[/C][C]0.547601[/C][/ROW]
[ROW][C]91[/C][C]1[/C][C]0.492955[/C][C]0.507045[/C][/ROW]
[ROW][C]92[/C][C]0[/C][C]0.383404[/C][C]-0.383404[/C][/ROW]
[ROW][C]93[/C][C]0[/C][C]0.612109[/C][C]-0.612109[/C][/ROW]
[ROW][C]94[/C][C]0[/C][C]0.35443[/C][C]-0.35443[/C][/ROW]
[ROW][C]95[/C][C]1[/C][C]0.690281[/C][C]0.309719[/C][/ROW]
[ROW][C]96[/C][C]0[/C][C]0.302055[/C][C]-0.302055[/C][/ROW]
[ROW][C]97[/C][C]1[/C][C]0.378995[/C][C]0.621005[/C][/ROW]
[ROW][C]98[/C][C]0[/C][C]0.405664[/C][C]-0.405664[/C][/ROW]
[ROW][C]99[/C][C]0[/C][C]0.490908[/C][C]-0.490908[/C][/ROW]
[ROW][C]100[/C][C]0[/C][C]0.391193[/C][C]-0.391193[/C][/ROW]
[ROW][C]101[/C][C]0[/C][C]0.45373[/C][C]-0.45373[/C][/ROW]
[ROW][C]102[/C][C]0[/C][C]0.415478[/C][C]-0.415478[/C][/ROW]
[ROW][C]103[/C][C]1[/C][C]0.480231[/C][C]0.519769[/C][/ROW]
[ROW][C]104[/C][C]0[/C][C]0.582978[/C][C]-0.582978[/C][/ROW]
[ROW][C]105[/C][C]0[/C][C]0.43306[/C][C]-0.43306[/C][/ROW]
[ROW][C]106[/C][C]0[/C][C]0.39013[/C][C]-0.39013[/C][/ROW]
[ROW][C]107[/C][C]1[/C][C]0.386146[/C][C]0.613854[/C][/ROW]
[ROW][C]108[/C][C]1[/C][C]0.353009[/C][C]0.646991[/C][/ROW]
[ROW][C]109[/C][C]1[/C][C]0.516278[/C][C]0.483722[/C][/ROW]
[ROW][C]110[/C][C]1[/C][C]0.346574[/C][C]0.653426[/C][/ROW]
[ROW][C]111[/C][C]1[/C][C]0.271402[/C][C]0.728598[/C][/ROW]
[ROW][C]112[/C][C]1[/C][C]0.364155[/C][C]0.635845[/C][/ROW]
[ROW][C]113[/C][C]0[/C][C]0.318573[/C][C]-0.318573[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=263596&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=263596&T=4
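
In this table the Interpolation (Forecast) column contains the fitted values of the regression and the Residuals (Prediction Error) column contains actual minus fitted. A minimal sketch, assuming fit and df from the earlier sketches:

interp <- fitted(fit)                                         # Interpolation (Forecast) column
res    <- resid(fit)                                          # Residuals (Prediction Error) column
all.equal(unname(interp + res), as.numeric(df$gendercode))    # actuals = fitted + residuals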



Goldfeld-Quandt test for Heteroskedasticity
p-values for the three alternative hypotheses
breakpoint index | greater | 2-sided | less
7 | 0.230119 | 0.460238 | 0.769881
8 | 0.170936 | 0.341872 | 0.829064
9 | 0.100184 | 0.200368 | 0.899816
10 | 0.397924 | 0.795848 | 0.602076
11 | 0.388811 | 0.777622 | 0.611189
12 | 0.285511 | 0.571022 | 0.714489
13 | 0.267956 | 0.535913 | 0.732044
14 | 0.346221 | 0.692443 | 0.653779
15 | 0.459404 | 0.918808 | 0.540596
16 | 0.646477 | 0.707047 | 0.353523
17 | 0.668473 | 0.663053 | 0.331527
18 | 0.685452 | 0.629096 | 0.314548
19 | 0.636154 | 0.727692 | 0.363846
20 | 0.65479 | 0.690419 | 0.34521
21 | 0.627393 | 0.745213 | 0.372607
22 | 0.592699 | 0.814601 | 0.407301
23 | 0.603098 | 0.793803 | 0.396902
24 | 0.606207 | 0.787585 | 0.393793
25 | 0.570239 | 0.859523 | 0.429761
26 | 0.568578 | 0.862845 | 0.431422
27 | 0.543258 | 0.913484 | 0.456742
28 | 0.596485 | 0.807031 | 0.403515
29 | 0.553335 | 0.893331 | 0.446665
30 | 0.537338 | 0.925323 | 0.462662
31 | 0.489185 | 0.97837 | 0.510815
32 | 0.522679 | 0.954642 | 0.477321
33 | 0.529984 | 0.940032 | 0.470016
34 | 0.547424 | 0.905152 | 0.452576
35 | 0.568186 | 0.863627 | 0.431814
36 | 0.530953 | 0.938093 | 0.469047
37 | 0.556851 | 0.886298 | 0.443149
38 | 0.538338 | 0.923324 | 0.461662
39 | 0.489236 | 0.978473 | 0.510764
40 | 0.466006 | 0.932011 | 0.533994
41 | 0.424362 | 0.848724 | 0.575638
42 | 0.384117 | 0.768234 | 0.615883
43 | 0.370904 | 0.741808 | 0.629096
44 | 0.426568 | 0.853136 | 0.573432
45 | 0.438111 | 0.876222 | 0.561889
46 | 0.507264 | 0.985472 | 0.492736
47 | 0.471267 | 0.942534 | 0.528733
48 | 0.447419 | 0.894838 | 0.552581
49 | 0.459248 | 0.918496 | 0.540752
50 | 0.428316 | 0.856631 | 0.571684
51 | 0.437156 | 0.874312 | 0.562844
52 | 0.418337 | 0.836674 | 0.581663
53 | 0.382957 | 0.765915 | 0.617043
54 | 0.374364 | 0.748728 | 0.625636
55 | 0.337844 | 0.675688 | 0.662156
56 | 0.329301 | 0.658602 | 0.670699
57 | 0.307212 | 0.614425 | 0.692788
58 | 0.323263 | 0.646526 | 0.676737
59 | 0.371048 | 0.742097 | 0.628952
60 | 0.342957 | 0.685915 | 0.657043
61 | 0.313416 | 0.626832 | 0.686584
62 | 0.326489 | 0.652978 | 0.673511
63 | 0.287309 | 0.574618 | 0.712691
64 | 0.263302 | 0.526603 | 0.736698
65 | 0.288806 | 0.577613 | 0.711194
66 | 0.286744 | 0.573487 | 0.713256
67 | 0.278365 | 0.55673 | 0.721635
68 | 0.246948 | 0.493896 | 0.753052
69 | 0.243271 | 0.486541 | 0.756729
70 | 0.259678 | 0.519356 | 0.740322
71 | 0.288561 | 0.577121 | 0.711439
72 | 0.326086 | 0.652172 | 0.673914
73 | 0.35571 | 0.711419 | 0.64429
74 | 0.304789 | 0.609577 | 0.695211
75 | 0.269968 | 0.539936 | 0.730032
76 | 0.226751 | 0.453501 | 0.773249
77 | 0.223124 | 0.446247 | 0.776876
78 | 0.201232 | 0.402464 | 0.798768
79 | 0.231749 | 0.463499 | 0.768251
80 | 0.229608 | 0.459216 | 0.770392
81 | 0.228823 | 0.457645 | 0.771177
82 | 0.250044 | 0.500089 | 0.749956
83 | 0.245663 | 0.491326 | 0.754337
84 | 0.249632 | 0.499263 | 0.750368
85 | 0.253763 | 0.507525 | 0.746237
86 | 0.285492 | 0.570984 | 0.714508
87 | 0.266518 | 0.533035 | 0.733482
88 | 0.243499 | 0.486998 | 0.756501
89 | 0.220846 | 0.441691 | 0.779154
90 | 0.221493 | 0.442987 | 0.778507
91 | 0.234232 | 0.468463 | 0.765768
92 | 0.220884 | 0.441769 | 0.779116
93 | 0.221024 | 0.442049 | 0.778976
94 | 0.234566 | 0.469133 | 0.765434
95 | 0.240751 | 0.481501 | 0.759249
96 | 0.228247 | 0.456493 | 0.771753
97 | 0.217146 | 0.434292 | 0.782854
98 | 0.223404 | 0.446808 | 0.776596
99 | 0.285535 | 0.571071 | 0.714465
100 | 0.233804 | 0.467609 | 0.766196
101 | 0.244409 | 0.488817 | 0.755591
102 | 0.300465 | 0.60093 | 0.699535
103 | 0.240226 | 0.480452 | 0.759774
104 | 0.207641 | 0.415283 | 0.792359
105 | 0.237364 | 0.474728 | 0.762636
106 | 0.377073 | 0.754147 | 0.622927

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
7 & 0.230119 & 0.460238 & 0.769881 \tabularnewline
8 & 0.170936 & 0.341872 & 0.829064 \tabularnewline
9 & 0.100184 & 0.200368 & 0.899816 \tabularnewline
10 & 0.397924 & 0.795848 & 0.602076 \tabularnewline
11 & 0.388811 & 0.777622 & 0.611189 \tabularnewline
12 & 0.285511 & 0.571022 & 0.714489 \tabularnewline
13 & 0.267956 & 0.535913 & 0.732044 \tabularnewline
14 & 0.346221 & 0.692443 & 0.653779 \tabularnewline
15 & 0.459404 & 0.918808 & 0.540596 \tabularnewline
16 & 0.646477 & 0.707047 & 0.353523 \tabularnewline
17 & 0.668473 & 0.663053 & 0.331527 \tabularnewline
18 & 0.685452 & 0.629096 & 0.314548 \tabularnewline
19 & 0.636154 & 0.727692 & 0.363846 \tabularnewline
20 & 0.65479 & 0.690419 & 0.34521 \tabularnewline
21 & 0.627393 & 0.745213 & 0.372607 \tabularnewline
22 & 0.592699 & 0.814601 & 0.407301 \tabularnewline
23 & 0.603098 & 0.793803 & 0.396902 \tabularnewline
24 & 0.606207 & 0.787585 & 0.393793 \tabularnewline
25 & 0.570239 & 0.859523 & 0.429761 \tabularnewline
26 & 0.568578 & 0.862845 & 0.431422 \tabularnewline
27 & 0.543258 & 0.913484 & 0.456742 \tabularnewline
28 & 0.596485 & 0.807031 & 0.403515 \tabularnewline
29 & 0.553335 & 0.893331 & 0.446665 \tabularnewline
30 & 0.537338 & 0.925323 & 0.462662 \tabularnewline
31 & 0.489185 & 0.97837 & 0.510815 \tabularnewline
32 & 0.522679 & 0.954642 & 0.477321 \tabularnewline
33 & 0.529984 & 0.940032 & 0.470016 \tabularnewline
34 & 0.547424 & 0.905152 & 0.452576 \tabularnewline
35 & 0.568186 & 0.863627 & 0.431814 \tabularnewline
36 & 0.530953 & 0.938093 & 0.469047 \tabularnewline
37 & 0.556851 & 0.886298 & 0.443149 \tabularnewline
38 & 0.538338 & 0.923324 & 0.461662 \tabularnewline
39 & 0.489236 & 0.978473 & 0.510764 \tabularnewline
40 & 0.466006 & 0.932011 & 0.533994 \tabularnewline
41 & 0.424362 & 0.848724 & 0.575638 \tabularnewline
42 & 0.384117 & 0.768234 & 0.615883 \tabularnewline
43 & 0.370904 & 0.741808 & 0.629096 \tabularnewline
44 & 0.426568 & 0.853136 & 0.573432 \tabularnewline
45 & 0.438111 & 0.876222 & 0.561889 \tabularnewline
46 & 0.507264 & 0.985472 & 0.492736 \tabularnewline
47 & 0.471267 & 0.942534 & 0.528733 \tabularnewline
48 & 0.447419 & 0.894838 & 0.552581 \tabularnewline
49 & 0.459248 & 0.918496 & 0.540752 \tabularnewline
50 & 0.428316 & 0.856631 & 0.571684 \tabularnewline
51 & 0.437156 & 0.874312 & 0.562844 \tabularnewline
52 & 0.418337 & 0.836674 & 0.581663 \tabularnewline
53 & 0.382957 & 0.765915 & 0.617043 \tabularnewline
54 & 0.374364 & 0.748728 & 0.625636 \tabularnewline
55 & 0.337844 & 0.675688 & 0.662156 \tabularnewline
56 & 0.329301 & 0.658602 & 0.670699 \tabularnewline
57 & 0.307212 & 0.614425 & 0.692788 \tabularnewline
58 & 0.323263 & 0.646526 & 0.676737 \tabularnewline
59 & 0.371048 & 0.742097 & 0.628952 \tabularnewline
60 & 0.342957 & 0.685915 & 0.657043 \tabularnewline
61 & 0.313416 & 0.626832 & 0.686584 \tabularnewline
62 & 0.326489 & 0.652978 & 0.673511 \tabularnewline
63 & 0.287309 & 0.574618 & 0.712691 \tabularnewline
64 & 0.263302 & 0.526603 & 0.736698 \tabularnewline
65 & 0.288806 & 0.577613 & 0.711194 \tabularnewline
66 & 0.286744 & 0.573487 & 0.713256 \tabularnewline
67 & 0.278365 & 0.55673 & 0.721635 \tabularnewline
68 & 0.246948 & 0.493896 & 0.753052 \tabularnewline
69 & 0.243271 & 0.486541 & 0.756729 \tabularnewline
70 & 0.259678 & 0.519356 & 0.740322 \tabularnewline
71 & 0.288561 & 0.577121 & 0.711439 \tabularnewline
72 & 0.326086 & 0.652172 & 0.673914 \tabularnewline
73 & 0.35571 & 0.711419 & 0.64429 \tabularnewline
74 & 0.304789 & 0.609577 & 0.695211 \tabularnewline
75 & 0.269968 & 0.539936 & 0.730032 \tabularnewline
76 & 0.226751 & 0.453501 & 0.773249 \tabularnewline
77 & 0.223124 & 0.446247 & 0.776876 \tabularnewline
78 & 0.201232 & 0.402464 & 0.798768 \tabularnewline
79 & 0.231749 & 0.463499 & 0.768251 \tabularnewline
80 & 0.229608 & 0.459216 & 0.770392 \tabularnewline
81 & 0.228823 & 0.457645 & 0.771177 \tabularnewline
82 & 0.250044 & 0.500089 & 0.749956 \tabularnewline
83 & 0.245663 & 0.491326 & 0.754337 \tabularnewline
84 & 0.249632 & 0.499263 & 0.750368 \tabularnewline
85 & 0.253763 & 0.507525 & 0.746237 \tabularnewline
86 & 0.285492 & 0.570984 & 0.714508 \tabularnewline
87 & 0.266518 & 0.533035 & 0.733482 \tabularnewline
88 & 0.243499 & 0.486998 & 0.756501 \tabularnewline
89 & 0.220846 & 0.441691 & 0.779154 \tabularnewline
90 & 0.221493 & 0.442987 & 0.778507 \tabularnewline
91 & 0.234232 & 0.468463 & 0.765768 \tabularnewline
92 & 0.220884 & 0.441769 & 0.779116 \tabularnewline
93 & 0.221024 & 0.442049 & 0.778976 \tabularnewline
94 & 0.234566 & 0.469133 & 0.765434 \tabularnewline
95 & 0.240751 & 0.481501 & 0.759249 \tabularnewline
96 & 0.228247 & 0.456493 & 0.771753 \tabularnewline
97 & 0.217146 & 0.434292 & 0.782854 \tabularnewline
98 & 0.223404 & 0.446808 & 0.776596 \tabularnewline
99 & 0.285535 & 0.571071 & 0.714465 \tabularnewline
100 & 0.233804 & 0.467609 & 0.766196 \tabularnewline
101 & 0.244409 & 0.488817 & 0.755591 \tabularnewline
102 & 0.300465 & 0.60093 & 0.699535 \tabularnewline
103 & 0.240226 & 0.480452 & 0.759774 \tabularnewline
104 & 0.207641 & 0.415283 & 0.792359 \tabularnewline
105 & 0.237364 & 0.474728 & 0.762636 \tabularnewline
106 & 0.377073 & 0.754147 & 0.622927 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263596&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]7[/C][C]0.230119[/C][C]0.460238[/C][C]0.769881[/C][/ROW]
[ROW][C]8[/C][C]0.170936[/C][C]0.341872[/C][C]0.829064[/C][/ROW]
[ROW][C]9[/C][C]0.100184[/C][C]0.200368[/C][C]0.899816[/C][/ROW]
[ROW][C]10[/C][C]0.397924[/C][C]0.795848[/C][C]0.602076[/C][/ROW]
[ROW][C]11[/C][C]0.388811[/C][C]0.777622[/C][C]0.611189[/C][/ROW]
[ROW][C]12[/C][C]0.285511[/C][C]0.571022[/C][C]0.714489[/C][/ROW]
[ROW][C]13[/C][C]0.267956[/C][C]0.535913[/C][C]0.732044[/C][/ROW]
[ROW][C]14[/C][C]0.346221[/C][C]0.692443[/C][C]0.653779[/C][/ROW]
[ROW][C]15[/C][C]0.459404[/C][C]0.918808[/C][C]0.540596[/C][/ROW]
[ROW][C]16[/C][C]0.646477[/C][C]0.707047[/C][C]0.353523[/C][/ROW]
[ROW][C]17[/C][C]0.668473[/C][C]0.663053[/C][C]0.331527[/C][/ROW]
[ROW][C]18[/C][C]0.685452[/C][C]0.629096[/C][C]0.314548[/C][/ROW]
[ROW][C]19[/C][C]0.636154[/C][C]0.727692[/C][C]0.363846[/C][/ROW]
[ROW][C]20[/C][C]0.65479[/C][C]0.690419[/C][C]0.34521[/C][/ROW]
[ROW][C]21[/C][C]0.627393[/C][C]0.745213[/C][C]0.372607[/C][/ROW]
[ROW][C]22[/C][C]0.592699[/C][C]0.814601[/C][C]0.407301[/C][/ROW]
[ROW][C]23[/C][C]0.603098[/C][C]0.793803[/C][C]0.396902[/C][/ROW]
[ROW][C]24[/C][C]0.606207[/C][C]0.787585[/C][C]0.393793[/C][/ROW]
[ROW][C]25[/C][C]0.570239[/C][C]0.859523[/C][C]0.429761[/C][/ROW]
[ROW][C]26[/C][C]0.568578[/C][C]0.862845[/C][C]0.431422[/C][/ROW]
[ROW][C]27[/C][C]0.543258[/C][C]0.913484[/C][C]0.456742[/C][/ROW]
[ROW][C]28[/C][C]0.596485[/C][C]0.807031[/C][C]0.403515[/C][/ROW]
[ROW][C]29[/C][C]0.553335[/C][C]0.893331[/C][C]0.446665[/C][/ROW]
[ROW][C]30[/C][C]0.537338[/C][C]0.925323[/C][C]0.462662[/C][/ROW]
[ROW][C]31[/C][C]0.489185[/C][C]0.97837[/C][C]0.510815[/C][/ROW]
[ROW][C]32[/C][C]0.522679[/C][C]0.954642[/C][C]0.477321[/C][/ROW]
[ROW][C]33[/C][C]0.529984[/C][C]0.940032[/C][C]0.470016[/C][/ROW]
[ROW][C]34[/C][C]0.547424[/C][C]0.905152[/C][C]0.452576[/C][/ROW]
[ROW][C]35[/C][C]0.568186[/C][C]0.863627[/C][C]0.431814[/C][/ROW]
[ROW][C]36[/C][C]0.530953[/C][C]0.938093[/C][C]0.469047[/C][/ROW]
[ROW][C]37[/C][C]0.556851[/C][C]0.886298[/C][C]0.443149[/C][/ROW]
[ROW][C]38[/C][C]0.538338[/C][C]0.923324[/C][C]0.461662[/C][/ROW]
[ROW][C]39[/C][C]0.489236[/C][C]0.978473[/C][C]0.510764[/C][/ROW]
[ROW][C]40[/C][C]0.466006[/C][C]0.932011[/C][C]0.533994[/C][/ROW]
[ROW][C]41[/C][C]0.424362[/C][C]0.848724[/C][C]0.575638[/C][/ROW]
[ROW][C]42[/C][C]0.384117[/C][C]0.768234[/C][C]0.615883[/C][/ROW]
[ROW][C]43[/C][C]0.370904[/C][C]0.741808[/C][C]0.629096[/C][/ROW]
[ROW][C]44[/C][C]0.426568[/C][C]0.853136[/C][C]0.573432[/C][/ROW]
[ROW][C]45[/C][C]0.438111[/C][C]0.876222[/C][C]0.561889[/C][/ROW]
[ROW][C]46[/C][C]0.507264[/C][C]0.985472[/C][C]0.492736[/C][/ROW]
[ROW][C]47[/C][C]0.471267[/C][C]0.942534[/C][C]0.528733[/C][/ROW]
[ROW][C]48[/C][C]0.447419[/C][C]0.894838[/C][C]0.552581[/C][/ROW]
[ROW][C]49[/C][C]0.459248[/C][C]0.918496[/C][C]0.540752[/C][/ROW]
[ROW][C]50[/C][C]0.428316[/C][C]0.856631[/C][C]0.571684[/C][/ROW]
[ROW][C]51[/C][C]0.437156[/C][C]0.874312[/C][C]0.562844[/C][/ROW]
[ROW][C]52[/C][C]0.418337[/C][C]0.836674[/C][C]0.581663[/C][/ROW]
[ROW][C]53[/C][C]0.382957[/C][C]0.765915[/C][C]0.617043[/C][/ROW]
[ROW][C]54[/C][C]0.374364[/C][C]0.748728[/C][C]0.625636[/C][/ROW]
[ROW][C]55[/C][C]0.337844[/C][C]0.675688[/C][C]0.662156[/C][/ROW]
[ROW][C]56[/C][C]0.329301[/C][C]0.658602[/C][C]0.670699[/C][/ROW]
[ROW][C]57[/C][C]0.307212[/C][C]0.614425[/C][C]0.692788[/C][/ROW]
[ROW][C]58[/C][C]0.323263[/C][C]0.646526[/C][C]0.676737[/C][/ROW]
[ROW][C]59[/C][C]0.371048[/C][C]0.742097[/C][C]0.628952[/C][/ROW]
[ROW][C]60[/C][C]0.342957[/C][C]0.685915[/C][C]0.657043[/C][/ROW]
[ROW][C]61[/C][C]0.313416[/C][C]0.626832[/C][C]0.686584[/C][/ROW]
[ROW][C]62[/C][C]0.326489[/C][C]0.652978[/C][C]0.673511[/C][/ROW]
[ROW][C]63[/C][C]0.287309[/C][C]0.574618[/C][C]0.712691[/C][/ROW]
[ROW][C]64[/C][C]0.263302[/C][C]0.526603[/C][C]0.736698[/C][/ROW]
[ROW][C]65[/C][C]0.288806[/C][C]0.577613[/C][C]0.711194[/C][/ROW]
[ROW][C]66[/C][C]0.286744[/C][C]0.573487[/C][C]0.713256[/C][/ROW]
[ROW][C]67[/C][C]0.278365[/C][C]0.55673[/C][C]0.721635[/C][/ROW]
[ROW][C]68[/C][C]0.246948[/C][C]0.493896[/C][C]0.753052[/C][/ROW]
[ROW][C]69[/C][C]0.243271[/C][C]0.486541[/C][C]0.756729[/C][/ROW]
[ROW][C]70[/C][C]0.259678[/C][C]0.519356[/C][C]0.740322[/C][/ROW]
[ROW][C]71[/C][C]0.288561[/C][C]0.577121[/C][C]0.711439[/C][/ROW]
[ROW][C]72[/C][C]0.326086[/C][C]0.652172[/C][C]0.673914[/C][/ROW]
[ROW][C]73[/C][C]0.35571[/C][C]0.711419[/C][C]0.64429[/C][/ROW]
[ROW][C]74[/C][C]0.304789[/C][C]0.609577[/C][C]0.695211[/C][/ROW]
[ROW][C]75[/C][C]0.269968[/C][C]0.539936[/C][C]0.730032[/C][/ROW]
[ROW][C]76[/C][C]0.226751[/C][C]0.453501[/C][C]0.773249[/C][/ROW]
[ROW][C]77[/C][C]0.223124[/C][C]0.446247[/C][C]0.776876[/C][/ROW]
[ROW][C]78[/C][C]0.201232[/C][C]0.402464[/C][C]0.798768[/C][/ROW]
[ROW][C]79[/C][C]0.231749[/C][C]0.463499[/C][C]0.768251[/C][/ROW]
[ROW][C]80[/C][C]0.229608[/C][C]0.459216[/C][C]0.770392[/C][/ROW]
[ROW][C]81[/C][C]0.228823[/C][C]0.457645[/C][C]0.771177[/C][/ROW]
[ROW][C]82[/C][C]0.250044[/C][C]0.500089[/C][C]0.749956[/C][/ROW]
[ROW][C]83[/C][C]0.245663[/C][C]0.491326[/C][C]0.754337[/C][/ROW]
[ROW][C]84[/C][C]0.249632[/C][C]0.499263[/C][C]0.750368[/C][/ROW]
[ROW][C]85[/C][C]0.253763[/C][C]0.507525[/C][C]0.746237[/C][/ROW]
[ROW][C]86[/C][C]0.285492[/C][C]0.570984[/C][C]0.714508[/C][/ROW]
[ROW][C]87[/C][C]0.266518[/C][C]0.533035[/C][C]0.733482[/C][/ROW]
[ROW][C]88[/C][C]0.243499[/C][C]0.486998[/C][C]0.756501[/C][/ROW]
[ROW][C]89[/C][C]0.220846[/C][C]0.441691[/C][C]0.779154[/C][/ROW]
[ROW][C]90[/C][C]0.221493[/C][C]0.442987[/C][C]0.778507[/C][/ROW]
[ROW][C]91[/C][C]0.234232[/C][C]0.468463[/C][C]0.765768[/C][/ROW]
[ROW][C]92[/C][C]0.220884[/C][C]0.441769[/C][C]0.779116[/C][/ROW]
[ROW][C]93[/C][C]0.221024[/C][C]0.442049[/C][C]0.778976[/C][/ROW]
[ROW][C]94[/C][C]0.234566[/C][C]0.469133[/C][C]0.765434[/C][/ROW]
[ROW][C]95[/C][C]0.240751[/C][C]0.481501[/C][C]0.759249[/C][/ROW]
[ROW][C]96[/C][C]0.228247[/C][C]0.456493[/C][C]0.771753[/C][/ROW]
[ROW][C]97[/C][C]0.217146[/C][C]0.434292[/C][C]0.782854[/C][/ROW]
[ROW][C]98[/C][C]0.223404[/C][C]0.446808[/C][C]0.776596[/C][/ROW]
[ROW][C]99[/C][C]0.285535[/C][C]0.571071[/C][C]0.714465[/C][/ROW]
[ROW][C]100[/C][C]0.233804[/C][C]0.467609[/C][C]0.766196[/C][/ROW]
[ROW][C]101[/C][C]0.244409[/C][C]0.488817[/C][C]0.755591[/C][/ROW]
[ROW][C]102[/C][C]0.300465[/C][C]0.60093[/C][C]0.699535[/C][/ROW]
[ROW][C]103[/C][C]0.240226[/C][C]0.480452[/C][C]0.759774[/C][/ROW]
[ROW][C]104[/C][C]0.207641[/C][C]0.415283[/C][C]0.792359[/C][/ROW]
[ROW][C]105[/C][C]0.237364[/C][C]0.474728[/C][C]0.762636[/C][/ROW]
[ROW][C]106[/C][C]0.377073[/C][C]0.754147[/C][C]0.622927[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=263596&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=263596&T=5
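
Each row above is a separate Goldfeld-Quandt test in which the ordered observations are split at the stated breakpoint; the module's R code (end of this page) loops over all breakpoints from k+3 to n-k-3 and stores the p-values for the three alternatives. A minimal stand-alone sketch, assuming the fitted model fit and data frame df from the earlier sketches:

library(lmtest)
k <- length(coef(fit))                    # number of estimated parameters (4)
n <- nrow(df)                             # number of observations (113)
p2 <- sapply((k + 3):(n - k - 3), function(bp)
  gqtest(fit, point = bp, alternative = "two.sided")$p.value)
sum(p2 < 0.05) / length(p2)               # share of significant 2-sided tests, summarised in the meta analysis below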



Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description | # significant tests | % significant tests | OK/NOK
1% type I error level | 0 | 0 | OK
5% type I error level | 0 | 0 | OK
10% type I error level | 0 | 0 | OK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 & 0 & OK \tabularnewline
5% type I error level & 0 & 0 & OK \tabularnewline
10% type I error level & 0 & 0 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=263596&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=263596&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=263596&T=6



Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
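# put the column selected by par1 first so that it becomes the dependent variable in lm() below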
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
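# Goldfeld-Quandt tests at every admissible breakpoint (only computed when n > n25 = 25 observations)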
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
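# diagnostic plots (test0.png - test9.png): actuals and interpolation, residuals, histogram, density,
# normal Q-Q plot, lag plot, ACF, PACF, lm() diagnostics and the Goldfeld-Quandt 2-sided p-values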
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
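# build the output tables shown above: regression equation, OLS estimates, regression statistics,
# actuals/interpolation/residuals and the Goldfeld-Quandt results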
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}