Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 11 Dec 2014 11:07:53 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/11/t1418296093vi823catwzzxsov.htm/, Retrieved Thu, 16 May 2024 04:51:49 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=265739, Retrieved Thu, 16 May 2024 04:51:49 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 78
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Multiple regressi...] [2014-12-11 11:07:53] [10320d42b3a1ca1321e6e126fa928a8a] [Current]
Dataseries X (two columns per row: V1, V2):
9 769
9 321
9 939
9 336
10 195
9 464
10 010
10 213
9 563
9 890
9 305
9 391
9 928
8 686
9 843
9 627
10 074
9 503
10 119
10 000
9 313
9 866
9 172
9 241
9 659
8 904
9 755
9 080
9 435
8 971
10 063
9 793
9 454
9 759
8 820
9 403
9 676
8 642
9 402
9 610
9 294
9 448
10 319
9 548
9 801
9 596
8 923
9 746
9 829
9 125
9 782
9 441
9 162
9 915
10 444
10 209
9 985
9 842
9 429
10 132
9 849
9 172
10 313
9 819
9 955
10 048
10 082
10 541
10 208
10 233
9 439
9 963
10 158
9 225
10 474
9 757
10 490
10 281
10 444
10 640
10 695
10 786
9 832
9 747
10 411
9 511
10 402
9 701
10 540
10 112
10 915
11 183
10 384
10 834
9 886
10 216
10 943
9 867
10 203
10 837
10 573
10 647
11 502
10 656
10 866
10 835
9 945
10 331
10 718
9 462
10 579
10 633
10 346
10 757
11 207
11 013
11 015
10 765
10 042
10 661




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: 'Sir Ronald Aylmer Fisher' @ fisher.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 6 seconds \tabularnewline
R Server & 'Sir Ronald Aylmer Fisher' @ fisher.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265739&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]6 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Sir Ronald Aylmer Fisher' @ fisher.wessa.net[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265739&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265739&T=0








Multiple Linear Regression - Estimated Regression Equation
V1[t] = + 9.14515 -0.000983315V2[t] + 0.32903M1[t] -0.581151M2[t] + 0.284194M3[t] -0.0122136M4[t] + 0.201992M5[t] + 0.0973281M6[t] + 0.685574M7[t] + 0.542463M8[t] + 0.277721M9[t] + 0.375322M10[t] -0.394346M11[t] + 0.0110589t + e[t]
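A quick consistency check (plain R arithmetic, using the coefficients in the equation above): for observation 1 we have V2 = 769, M1 = 1, all other dummies 0, and t = 1, which reproduces the first interpolated value in the Actuals table further down.

9.14515 - 0.000983315*769 + 0.32903*1 + 0.0110589*1   # 8.72907, so the residual is 9 - 8.72907 = 0.270931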

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
V1[t] =  +  9.14515 -0.000983315V2[t] +  0.32903M1[t] -0.581151M2[t] +  0.284194M3[t] -0.0122136M4[t] +  0.201992M5[t] +  0.0973281M6[t] +  0.685574M7[t] +  0.542463M8[t] +  0.277721M9[t] +  0.375322M10[t] -0.394346M11[t] +  0.0110589t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265739&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]V1[t] =  +  9.14515 -0.000983315V2[t] +  0.32903M1[t] -0.581151M2[t] +  0.284194M3[t] -0.0122136M4[t] +  0.201992M5[t] +  0.0973281M6[t] +  0.685574M7[t] +  0.542463M8[t] +  0.277721M9[t] +  0.375322M10[t] -0.394346M11[t] +  0.0110589t  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265739&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265739&T=1








Multiple Linear Regression - Ordinary Least Squares
Variable | Parameter | S.D. | T-STAT (H0: parameter = 0) | 2-tail p-value | 1-tail p-value
(Intercept) | 9.14515 | 0.116977 | 78.18 | 1.47882e-95 | 7.39412e-96
V2 | -0.000983315 | 0.000103993 | -9.456 | 9.43427e-16 | 4.71713e-16
M1 | 0.32903 | 0.134707 | 2.443 | 0.0162368 | 0.00811839
M2 | -0.581151 | 0.13278 | -4.377 | 2.83174e-05 | 1.41587e-05
M3 | 0.284194 | 0.13306 | 2.136 | 0.0349954 | 0.0174977
M4 | -0.0122136 | 0.133135 | -0.09174 | 0.927079 | 0.463539
M5 | 0.201992 | 0.132874 | 1.52 | 0.131444 | 0.0657221
M6 | 0.0973281 | 0.132672 | 0.7336 | 0.464813 | 0.232406
M7 | 0.685574 | 0.133776 | 5.125 | 1.34661e-06 | 6.73306e-07
M8 | 0.542463 | 0.132999 | 4.079 | 8.78721e-05 | 4.39361e-05
M9 | 0.277721 | 0.13265 | 2.094 | 0.0386782 | 0.0193391
M10 | 0.375322 | 0.135247 | 2.775 | 0.00652444 | 0.00326222
M11 | -0.394346 | 0.132924 | -2.967 | 0.00372032 | 0.00186016
t | 0.0110589 | 0.000786397 | 14.06 | 5.57521e-26 | 2.78761e-26
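The last three columns are linked in the usual way (this is exactly how the archived R code below fills them): T-STAT = Parameter / S.D., and the 1-tail p-value is half the 2-tail p-value. For example, for the intercept row:

9.14515 / 0.116977     # 78.18  (T-STAT)
1.47882e-95 / 2        # 7.39412e-96  (1-tail p-value)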

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 9.14515 & 0.116977 & 78.18 & 1.47882e-95 & 7.39412e-96 \tabularnewline
V2 & -0.000983315 & 0.000103993 & -9.456 & 9.43427e-16 & 4.71713e-16 \tabularnewline
M1 & 0.32903 & 0.134707 & 2.443 & 0.0162368 & 0.00811839 \tabularnewline
M2 & -0.581151 & 0.13278 & -4.377 & 2.83174e-05 & 1.41587e-05 \tabularnewline
M3 & 0.284194 & 0.13306 & 2.136 & 0.0349954 & 0.0174977 \tabularnewline
M4 & -0.0122136 & 0.133135 & -0.09174 & 0.927079 & 0.463539 \tabularnewline
M5 & 0.201992 & 0.132874 & 1.52 & 0.131444 & 0.0657221 \tabularnewline
M6 & 0.0973281 & 0.132672 & 0.7336 & 0.464813 & 0.232406 \tabularnewline
M7 & 0.685574 & 0.133776 & 5.125 & 1.34661e-06 & 6.73306e-07 \tabularnewline
M8 & 0.542463 & 0.132999 & 4.079 & 8.78721e-05 & 4.39361e-05 \tabularnewline
M9 & 0.277721 & 0.13265 & 2.094 & 0.0386782 & 0.0193391 \tabularnewline
M10 & 0.375322 & 0.135247 & 2.775 & 0.00652444 & 0.00326222 \tabularnewline
M11 & -0.394346 & 0.132924 & -2.967 & 0.00372032 & 0.00186016 \tabularnewline
t & 0.0110589 & 0.000786397 & 14.06 & 5.57521e-26 & 2.78761e-26 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265739&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]9.14515[/C][C]0.116977[/C][C]78.18[/C][C]1.47882e-95[/C][C]7.39412e-96[/C][/ROW]
[ROW][C]V2[/C][C]-0.000983315[/C][C]0.000103993[/C][C]-9.456[/C][C]9.43427e-16[/C][C]4.71713e-16[/C][/ROW]
[ROW][C]M1[/C][C]0.32903[/C][C]0.134707[/C][C]2.443[/C][C]0.0162368[/C][C]0.00811839[/C][/ROW]
[ROW][C]M2[/C][C]-0.581151[/C][C]0.13278[/C][C]-4.377[/C][C]2.83174e-05[/C][C]1.41587e-05[/C][/ROW]
[ROW][C]M3[/C][C]0.284194[/C][C]0.13306[/C][C]2.136[/C][C]0.0349954[/C][C]0.0174977[/C][/ROW]
[ROW][C]M4[/C][C]-0.0122136[/C][C]0.133135[/C][C]-0.09174[/C][C]0.927079[/C][C]0.463539[/C][/ROW]
[ROW][C]M5[/C][C]0.201992[/C][C]0.132874[/C][C]1.52[/C][C]0.131444[/C][C]0.0657221[/C][/ROW]
[ROW][C]M6[/C][C]0.0973281[/C][C]0.132672[/C][C]0.7336[/C][C]0.464813[/C][C]0.232406[/C][/ROW]
[ROW][C]M7[/C][C]0.685574[/C][C]0.133776[/C][C]5.125[/C][C]1.34661e-06[/C][C]6.73306e-07[/C][/ROW]
[ROW][C]M8[/C][C]0.542463[/C][C]0.132999[/C][C]4.079[/C][C]8.78721e-05[/C][C]4.39361e-05[/C][/ROW]
[ROW][C]M9[/C][C]0.277721[/C][C]0.13265[/C][C]2.094[/C][C]0.0386782[/C][C]0.0193391[/C][/ROW]
[ROW][C]M10[/C][C]0.375322[/C][C]0.135247[/C][C]2.775[/C][C]0.00652444[/C][C]0.00326222[/C][/ROW]
[ROW][C]M11[/C][C]-0.394346[/C][C]0.132924[/C][C]-2.967[/C][C]0.00372032[/C][C]0.00186016[/C][/ROW]
[ROW][C]t[/C][C]0.0110589[/C][C]0.000786397[/C][C]14.06[/C][C]5.57521e-26[/C][C]2.78761e-26[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265739&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265739&T=2








Multiple Linear Regression - Regression Statistics
Multiple R: 0.90549
R-squared: 0.819913
Adjusted R-squared: 0.797827
F-TEST (value): 37.1233
F-TEST (DF numerator): 13
F-TEST (DF denominator): 106
p-value: 0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.29637
Sum Squared Residuals: 9.31051
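These statistics are mutually consistent; a minimal check in plain R, using n = 120 observations and 14 estimated parameters (13 regressors plus the intercept, hence 106 denominator degrees of freedom):

sqrt(0.819913)                               # 0.90549   Multiple R = sqrt(R-squared)
1 - (1 - 0.819913) * (120 - 1) / (120 - 14)  # 0.797827  Adjusted R-squared
sqrt(9.31051 / 106)                          # 0.29637   Residual Standard Deviation = sqrt(SSR / DF denominator)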

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.90549 \tabularnewline
R-squared & 0.819913 \tabularnewline
Adjusted R-squared & 0.797827 \tabularnewline
F-TEST (value) & 37.1233 \tabularnewline
F-TEST (DF numerator) & 13 \tabularnewline
F-TEST (DF denominator) & 106 \tabularnewline
p-value & 0 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 0.29637 \tabularnewline
Sum Squared Residuals & 9.31051 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265739&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.90549[/C][/ROW]
[ROW][C]R-squared[/C][C]0.819913[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.797827[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]37.1233[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]13[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]106[/C][/ROW]
[ROW][C]p-value[/C][C]0[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]0.29637[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]9.31051[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265739&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265739&T=3








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index | Actuals | Interpolation (Forecast) | Residuals (Prediction Error)
1 | 9 | 8.72907 | 0.270931
2 | 9 | 8.27047 | 0.729527
3 | 9 | 8.53919 | 0.460813
4 | 9 | 8.84678 | 0.153222
5 | 10 | 9.21069 | 0.78931
6 | 9 | 8.85257 | 0.147427
7 | 10 | 9.8983 | 0.101696
8 | 10 | 9.56664 | 0.433362
9 | 9 | 8.96879 | 0.0312055
10 | 9 | 8.75591 | 0.24409
11 | 9 | 8.57254 | 0.427459
12 | 9 | 8.89338 | 0.106619
13 | 9 | 8.70543 | 0.294571
14 | 8 | 8.04427 | -0.0442697
15 | 9 | 8.76629 | 0.233707
16 | 9 | 8.69334 | 0.30666
17 | 10 | 9.46238 | 0.537621
18 | 9 | 8.94693 | 0.053069
19 | 10 | 9.92383 | 0.0761706
20 | 10 | 9.90879 | 0.091209
21 | 9 | 9.34733 | -0.347331
22 | 9 | 8.91222 | 0.0877832
23 | 9 | 8.83603 | 0.163971
24 | 9 | 9.17359 | -0.173585
25 | 9 | 9.10265 | -0.102648
26 | 8 | 7.96261 | 0.0373858
27 | 9 | 8.98553 | 0.0144682
28 | 9 | 9.36392 | -0.363921
29 | 9 | 9.24011 | -0.240109
30 | 8 | 8.61945 | -0.619447
31 | 10 | 10.1116 | -0.111602
32 | 9 | 9.26173 | -0.261729
33 | 9 | 9.34139 | -0.34139
34 | 9 | 9.15014 | -0.150139
35 | 8 | 8.33155 | -0.331548
36 | 9 | 9.147 | -0.146995
37 | 9 | 9.21864 | -0.218639
38 | 8 | 8.35295 | -0.35295
39 | 9 | 9.46535 | -0.465349
40 | 9 | 8.97547 | 0.0245288
41 | 9 | 9.51146 | -0.511464
42 | 9 | 9.26643 | -0.266428
43 | 10 | 9.99258 | 0.00741911
44 | 9 | 9.63535 | -0.635349
45 | 9 | 9.13289 | -0.132887
46 | 9 | 9.44313 | -0.443126
47 | 8 | 8.36297 | -0.362974
48 | 9 | 8.94243 | 0.0575746
49 | 9 | 9.2009 | -0.200899
50 | 9 | 8.99403 | 0.00596853
51 | 9 | 9.2244 | -0.224397
52 | 9 | 9.27436 | -0.274359
53 | 9 | 9.77397 | -0.773969
54 | 9 | 8.93993 | 0.0600731
55 | 10 | 10.0024 | -0.00237375
56 | 10 | 10.1014 | -0.1014
57 | 9 | 9.08466 | -0.0846645
58 | 9 | 9.33394 | -0.333938
59 | 9 | 8.98144 | 0.0185613
60 | 10 | 9.67889 | 0.321112
61 | 9 | 9.31394 | -0.31394
62 | 9 | 9.08052 | -0.0805229
63 | 10 | 9.81828 | 0.181721
64 | 9 | 9.03537 | -0.0353728
65 | 9 | 9.12691 | -0.126907
66 | 10 | 9.92517 | 0.0748314
67 | 10 | 10.491 | -0.491041
68 | 10 | 9.90765 | 0.0923535
69 | 10 | 9.98141 | 0.0185921
70 | 10 | 10.0655 | -0.0654846
71 | 9 | 9.10431 | -0.104313
72 | 9 | 8.99446 | 0.00553944
73 | 10 | 10.1261 | -0.126118
74 | 9 | 9.16111 | -0.161114
75 | 10 | 9.79267 | 0.207327
76 | 9 | 9.22905 | -0.229046
77 | 10 | 9.71686 | 0.283144
78 | 10 | 9.82876 | 0.171237
79 | 10 | 10.2678 | -0.267788
80 | 10 | 9.94301 | 0.0569944
81 | 10 | 9.63524 | 0.364759
82 | 10 | 9.65442 | 0.345582
83 | 9 | 8.85058 | 0.149423
84 | 9 | 9.33956 | -0.339564
85 | 10 | 10.01 | -0.0100465
86 | 9 | 9.01259 | -0.0125936
87 | 10 | 9.99618 | 0.00382145
88 | 9 | 9.41682 | -0.416819
89 | 10 | 9.8004 | 0.199603
90 | 10 | 10.1277 | -0.127651
91 | 10 | 9.93735 | 0.062646
92 | 11 | 10.5251 | 0.474912
93 | 10 | 10.0738 | -0.073759
94 | 10 | 9.73993 | 0.260073
95 | 9 | 8.93019 | 0.0698146
96 | 10 | 9.99441 | 0.00558824
97 | 10 | 9.61963 | 0.38037
98 | 9 | 8.79524 | 0.204759
99 | 10 | 10.3246 | -0.324566
100 | 10 | 9.4158 | 0.584205
101 | 10 | 9.90066 | 0.0993449
102 | 10 | 9.73428 | 0.265715
103 | 11 | 10.4762 | 0.523829
104 | 10 | 10.1927 | -0.192687
105 | 10 | 9.73251 | 0.267492
106 | 10 | 9.87165 | 0.128349
107 | 9 | 9.00488 | -0.00487704
108 | 10 | 10.014 | -0.0140378
109 | 10 | 9.97358 | 0.0264168
110 | 9 | 9.32619 | -0.326191
111 | 10 | 10.0875 | -0.0875463
112 | 10 | 9.7491 | 0.250901
113 | 10 | 10.2566 | -0.256575
114 | 10 | 9.75883 | 0.241173
115 | 11 | 10.899 | 0.101044
116 | 11 | 10.9577 | 0.0423338
117 | 11 | 10.702 | 0.297983
118 | 10 | 10.0732 | -0.0731899
119 | 10 | 10.0255 | -0.0255182
120 | 10 | 9.82225 | 0.177749
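The Interpolation column holds the model's fitted values and the Residuals column the prediction errors (Actuals minus Interpolation). A minimal sketch that regenerates these columns, assuming the objects x and mylm created by the archived R code at the bottom of this page:

head(cbind(Actuals = x[, 1],                 # observed V1
           Interpolation = fitted(mylm),     # fitted values
           Residuals = residuals(mylm)))     # Actuals minus Interpolation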

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 9 & 8.72907 & 0.270931 \tabularnewline
2 & 9 & 8.27047 & 0.729527 \tabularnewline
3 & 9 & 8.53919 & 0.460813 \tabularnewline
4 & 9 & 8.84678 & 0.153222 \tabularnewline
5 & 10 & 9.21069 & 0.78931 \tabularnewline
6 & 9 & 8.85257 & 0.147427 \tabularnewline
7 & 10 & 9.8983 & 0.101696 \tabularnewline
8 & 10 & 9.56664 & 0.433362 \tabularnewline
9 & 9 & 8.96879 & 0.0312055 \tabularnewline
10 & 9 & 8.75591 & 0.24409 \tabularnewline
11 & 9 & 8.57254 & 0.427459 \tabularnewline
12 & 9 & 8.89338 & 0.106619 \tabularnewline
13 & 9 & 8.70543 & 0.294571 \tabularnewline
14 & 8 & 8.04427 & -0.0442697 \tabularnewline
15 & 9 & 8.76629 & 0.233707 \tabularnewline
16 & 9 & 8.69334 & 0.30666 \tabularnewline
17 & 10 & 9.46238 & 0.537621 \tabularnewline
18 & 9 & 8.94693 & 0.053069 \tabularnewline
19 & 10 & 9.92383 & 0.0761706 \tabularnewline
20 & 10 & 9.90879 & 0.091209 \tabularnewline
21 & 9 & 9.34733 & -0.347331 \tabularnewline
22 & 9 & 8.91222 & 0.0877832 \tabularnewline
23 & 9 & 8.83603 & 0.163971 \tabularnewline
24 & 9 & 9.17359 & -0.173585 \tabularnewline
25 & 9 & 9.10265 & -0.102648 \tabularnewline
26 & 8 & 7.96261 & 0.0373858 \tabularnewline
27 & 9 & 8.98553 & 0.0144682 \tabularnewline
28 & 9 & 9.36392 & -0.363921 \tabularnewline
29 & 9 & 9.24011 & -0.240109 \tabularnewline
30 & 8 & 8.61945 & -0.619447 \tabularnewline
31 & 10 & 10.1116 & -0.111602 \tabularnewline
32 & 9 & 9.26173 & -0.261729 \tabularnewline
33 & 9 & 9.34139 & -0.34139 \tabularnewline
34 & 9 & 9.15014 & -0.150139 \tabularnewline
35 & 8 & 8.33155 & -0.331548 \tabularnewline
36 & 9 & 9.147 & -0.146995 \tabularnewline
37 & 9 & 9.21864 & -0.218639 \tabularnewline
38 & 8 & 8.35295 & -0.35295 \tabularnewline
39 & 9 & 9.46535 & -0.465349 \tabularnewline
40 & 9 & 8.97547 & 0.0245288 \tabularnewline
41 & 9 & 9.51146 & -0.511464 \tabularnewline
42 & 9 & 9.26643 & -0.266428 \tabularnewline
43 & 10 & 9.99258 & 0.00741911 \tabularnewline
44 & 9 & 9.63535 & -0.635349 \tabularnewline
45 & 9 & 9.13289 & -0.132887 \tabularnewline
46 & 9 & 9.44313 & -0.443126 \tabularnewline
47 & 8 & 8.36297 & -0.362974 \tabularnewline
48 & 9 & 8.94243 & 0.0575746 \tabularnewline
49 & 9 & 9.2009 & -0.200899 \tabularnewline
50 & 9 & 8.99403 & 0.00596853 \tabularnewline
51 & 9 & 9.2244 & -0.224397 \tabularnewline
52 & 9 & 9.27436 & -0.274359 \tabularnewline
53 & 9 & 9.77397 & -0.773969 \tabularnewline
54 & 9 & 8.93993 & 0.0600731 \tabularnewline
55 & 10 & 10.0024 & -0.00237375 \tabularnewline
56 & 10 & 10.1014 & -0.1014 \tabularnewline
57 & 9 & 9.08466 & -0.0846645 \tabularnewline
58 & 9 & 9.33394 & -0.333938 \tabularnewline
59 & 9 & 8.98144 & 0.0185613 \tabularnewline
60 & 10 & 9.67889 & 0.321112 \tabularnewline
61 & 9 & 9.31394 & -0.31394 \tabularnewline
62 & 9 & 9.08052 & -0.0805229 \tabularnewline
63 & 10 & 9.81828 & 0.181721 \tabularnewline
64 & 9 & 9.03537 & -0.0353728 \tabularnewline
65 & 9 & 9.12691 & -0.126907 \tabularnewline
66 & 10 & 9.92517 & 0.0748314 \tabularnewline
67 & 10 & 10.491 & -0.491041 \tabularnewline
68 & 10 & 9.90765 & 0.0923535 \tabularnewline
69 & 10 & 9.98141 & 0.0185921 \tabularnewline
70 & 10 & 10.0655 & -0.0654846 \tabularnewline
71 & 9 & 9.10431 & -0.104313 \tabularnewline
72 & 9 & 8.99446 & 0.00553944 \tabularnewline
73 & 10 & 10.1261 & -0.126118 \tabularnewline
74 & 9 & 9.16111 & -0.161114 \tabularnewline
75 & 10 & 9.79267 & 0.207327 \tabularnewline
76 & 9 & 9.22905 & -0.229046 \tabularnewline
77 & 10 & 9.71686 & 0.283144 \tabularnewline
78 & 10 & 9.82876 & 0.171237 \tabularnewline
79 & 10 & 10.2678 & -0.267788 \tabularnewline
80 & 10 & 9.94301 & 0.0569944 \tabularnewline
81 & 10 & 9.63524 & 0.364759 \tabularnewline
82 & 10 & 9.65442 & 0.345582 \tabularnewline
83 & 9 & 8.85058 & 0.149423 \tabularnewline
84 & 9 & 9.33956 & -0.339564 \tabularnewline
85 & 10 & 10.01 & -0.0100465 \tabularnewline
86 & 9 & 9.01259 & -0.0125936 \tabularnewline
87 & 10 & 9.99618 & 0.00382145 \tabularnewline
88 & 9 & 9.41682 & -0.416819 \tabularnewline
89 & 10 & 9.8004 & 0.199603 \tabularnewline
90 & 10 & 10.1277 & -0.127651 \tabularnewline
91 & 10 & 9.93735 & 0.062646 \tabularnewline
92 & 11 & 10.5251 & 0.474912 \tabularnewline
93 & 10 & 10.0738 & -0.073759 \tabularnewline
94 & 10 & 9.73993 & 0.260073 \tabularnewline
95 & 9 & 8.93019 & 0.0698146 \tabularnewline
96 & 10 & 9.99441 & 0.00558824 \tabularnewline
97 & 10 & 9.61963 & 0.38037 \tabularnewline
98 & 9 & 8.79524 & 0.204759 \tabularnewline
99 & 10 & 10.3246 & -0.324566 \tabularnewline
100 & 10 & 9.4158 & 0.584205 \tabularnewline
101 & 10 & 9.90066 & 0.0993449 \tabularnewline
102 & 10 & 9.73428 & 0.265715 \tabularnewline
103 & 11 & 10.4762 & 0.523829 \tabularnewline
104 & 10 & 10.1927 & -0.192687 \tabularnewline
105 & 10 & 9.73251 & 0.267492 \tabularnewline
106 & 10 & 9.87165 & 0.128349 \tabularnewline
107 & 9 & 9.00488 & -0.00487704 \tabularnewline
108 & 10 & 10.014 & -0.0140378 \tabularnewline
109 & 10 & 9.97358 & 0.0264168 \tabularnewline
110 & 9 & 9.32619 & -0.326191 \tabularnewline
111 & 10 & 10.0875 & -0.0875463 \tabularnewline
112 & 10 & 9.7491 & 0.250901 \tabularnewline
113 & 10 & 10.2566 & -0.256575 \tabularnewline
114 & 10 & 9.75883 & 0.241173 \tabularnewline
115 & 11 & 10.899 & 0.101044 \tabularnewline
116 & 11 & 10.9577 & 0.0423338 \tabularnewline
117 & 11 & 10.702 & 0.297983 \tabularnewline
118 & 10 & 10.0732 & -0.0731899 \tabularnewline
119 & 10 & 10.0255 & -0.0255182 \tabularnewline
120 & 10 & 9.82225 & 0.177749 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265739&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]9[/C][C]8.72907[/C][C]0.270931[/C][/ROW]
[ROW][C]2[/C][C]9[/C][C]8.27047[/C][C]0.729527[/C][/ROW]
[ROW][C]3[/C][C]9[/C][C]8.53919[/C][C]0.460813[/C][/ROW]
[ROW][C]4[/C][C]9[/C][C]8.84678[/C][C]0.153222[/C][/ROW]
[ROW][C]5[/C][C]10[/C][C]9.21069[/C][C]0.78931[/C][/ROW]
[ROW][C]6[/C][C]9[/C][C]8.85257[/C][C]0.147427[/C][/ROW]
[ROW][C]7[/C][C]10[/C][C]9.8983[/C][C]0.101696[/C][/ROW]
[ROW][C]8[/C][C]10[/C][C]9.56664[/C][C]0.433362[/C][/ROW]
[ROW][C]9[/C][C]9[/C][C]8.96879[/C][C]0.0312055[/C][/ROW]
[ROW][C]10[/C][C]9[/C][C]8.75591[/C][C]0.24409[/C][/ROW]
[ROW][C]11[/C][C]9[/C][C]8.57254[/C][C]0.427459[/C][/ROW]
[ROW][C]12[/C][C]9[/C][C]8.89338[/C][C]0.106619[/C][/ROW]
[ROW][C]13[/C][C]9[/C][C]8.70543[/C][C]0.294571[/C][/ROW]
[ROW][C]14[/C][C]8[/C][C]8.04427[/C][C]-0.0442697[/C][/ROW]
[ROW][C]15[/C][C]9[/C][C]8.76629[/C][C]0.233707[/C][/ROW]
[ROW][C]16[/C][C]9[/C][C]8.69334[/C][C]0.30666[/C][/ROW]
[ROW][C]17[/C][C]10[/C][C]9.46238[/C][C]0.537621[/C][/ROW]
[ROW][C]18[/C][C]9[/C][C]8.94693[/C][C]0.053069[/C][/ROW]
[ROW][C]19[/C][C]10[/C][C]9.92383[/C][C]0.0761706[/C][/ROW]
[ROW][C]20[/C][C]10[/C][C]9.90879[/C][C]0.091209[/C][/ROW]
[ROW][C]21[/C][C]9[/C][C]9.34733[/C][C]-0.347331[/C][/ROW]
[ROW][C]22[/C][C]9[/C][C]8.91222[/C][C]0.0877832[/C][/ROW]
[ROW][C]23[/C][C]9[/C][C]8.83603[/C][C]0.163971[/C][/ROW]
[ROW][C]24[/C][C]9[/C][C]9.17359[/C][C]-0.173585[/C][/ROW]
[ROW][C]25[/C][C]9[/C][C]9.10265[/C][C]-0.102648[/C][/ROW]
[ROW][C]26[/C][C]8[/C][C]7.96261[/C][C]0.0373858[/C][/ROW]
[ROW][C]27[/C][C]9[/C][C]8.98553[/C][C]0.0144682[/C][/ROW]
[ROW][C]28[/C][C]9[/C][C]9.36392[/C][C]-0.363921[/C][/ROW]
[ROW][C]29[/C][C]9[/C][C]9.24011[/C][C]-0.240109[/C][/ROW]
[ROW][C]30[/C][C]8[/C][C]8.61945[/C][C]-0.619447[/C][/ROW]
[ROW][C]31[/C][C]10[/C][C]10.1116[/C][C]-0.111602[/C][/ROW]
[ROW][C]32[/C][C]9[/C][C]9.26173[/C][C]-0.261729[/C][/ROW]
[ROW][C]33[/C][C]9[/C][C]9.34139[/C][C]-0.34139[/C][/ROW]
[ROW][C]34[/C][C]9[/C][C]9.15014[/C][C]-0.150139[/C][/ROW]
[ROW][C]35[/C][C]8[/C][C]8.33155[/C][C]-0.331548[/C][/ROW]
[ROW][C]36[/C][C]9[/C][C]9.147[/C][C]-0.146995[/C][/ROW]
[ROW][C]37[/C][C]9[/C][C]9.21864[/C][C]-0.218639[/C][/ROW]
[ROW][C]38[/C][C]8[/C][C]8.35295[/C][C]-0.35295[/C][/ROW]
[ROW][C]39[/C][C]9[/C][C]9.46535[/C][C]-0.465349[/C][/ROW]
[ROW][C]40[/C][C]9[/C][C]8.97547[/C][C]0.0245288[/C][/ROW]
[ROW][C]41[/C][C]9[/C][C]9.51146[/C][C]-0.511464[/C][/ROW]
[ROW][C]42[/C][C]9[/C][C]9.26643[/C][C]-0.266428[/C][/ROW]
[ROW][C]43[/C][C]10[/C][C]9.99258[/C][C]0.00741911[/C][/ROW]
[ROW][C]44[/C][C]9[/C][C]9.63535[/C][C]-0.635349[/C][/ROW]
[ROW][C]45[/C][C]9[/C][C]9.13289[/C][C]-0.132887[/C][/ROW]
[ROW][C]46[/C][C]9[/C][C]9.44313[/C][C]-0.443126[/C][/ROW]
[ROW][C]47[/C][C]8[/C][C]8.36297[/C][C]-0.362974[/C][/ROW]
[ROW][C]48[/C][C]9[/C][C]8.94243[/C][C]0.0575746[/C][/ROW]
[ROW][C]49[/C][C]9[/C][C]9.2009[/C][C]-0.200899[/C][/ROW]
[ROW][C]50[/C][C]9[/C][C]8.99403[/C][C]0.00596853[/C][/ROW]
[ROW][C]51[/C][C]9[/C][C]9.2244[/C][C]-0.224397[/C][/ROW]
[ROW][C]52[/C][C]9[/C][C]9.27436[/C][C]-0.274359[/C][/ROW]
[ROW][C]53[/C][C]9[/C][C]9.77397[/C][C]-0.773969[/C][/ROW]
[ROW][C]54[/C][C]9[/C][C]8.93993[/C][C]0.0600731[/C][/ROW]
[ROW][C]55[/C][C]10[/C][C]10.0024[/C][C]-0.00237375[/C][/ROW]
[ROW][C]56[/C][C]10[/C][C]10.1014[/C][C]-0.1014[/C][/ROW]
[ROW][C]57[/C][C]9[/C][C]9.08466[/C][C]-0.0846645[/C][/ROW]
[ROW][C]58[/C][C]9[/C][C]9.33394[/C][C]-0.333938[/C][/ROW]
[ROW][C]59[/C][C]9[/C][C]8.98144[/C][C]0.0185613[/C][/ROW]
[ROW][C]60[/C][C]10[/C][C]9.67889[/C][C]0.321112[/C][/ROW]
[ROW][C]61[/C][C]9[/C][C]9.31394[/C][C]-0.31394[/C][/ROW]
[ROW][C]62[/C][C]9[/C][C]9.08052[/C][C]-0.0805229[/C][/ROW]
[ROW][C]63[/C][C]10[/C][C]9.81828[/C][C]0.181721[/C][/ROW]
[ROW][C]64[/C][C]9[/C][C]9.03537[/C][C]-0.0353728[/C][/ROW]
[ROW][C]65[/C][C]9[/C][C]9.12691[/C][C]-0.126907[/C][/ROW]
[ROW][C]66[/C][C]10[/C][C]9.92517[/C][C]0.0748314[/C][/ROW]
[ROW][C]67[/C][C]10[/C][C]10.491[/C][C]-0.491041[/C][/ROW]
[ROW][C]68[/C][C]10[/C][C]9.90765[/C][C]0.0923535[/C][/ROW]
[ROW][C]69[/C][C]10[/C][C]9.98141[/C][C]0.0185921[/C][/ROW]
[ROW][C]70[/C][C]10[/C][C]10.0655[/C][C]-0.0654846[/C][/ROW]
[ROW][C]71[/C][C]9[/C][C]9.10431[/C][C]-0.104313[/C][/ROW]
[ROW][C]72[/C][C]9[/C][C]8.99446[/C][C]0.00553944[/C][/ROW]
[ROW][C]73[/C][C]10[/C][C]10.1261[/C][C]-0.126118[/C][/ROW]
[ROW][C]74[/C][C]9[/C][C]9.16111[/C][C]-0.161114[/C][/ROW]
[ROW][C]75[/C][C]10[/C][C]9.79267[/C][C]0.207327[/C][/ROW]
[ROW][C]76[/C][C]9[/C][C]9.22905[/C][C]-0.229046[/C][/ROW]
[ROW][C]77[/C][C]10[/C][C]9.71686[/C][C]0.283144[/C][/ROW]
[ROW][C]78[/C][C]10[/C][C]9.82876[/C][C]0.171237[/C][/ROW]
[ROW][C]79[/C][C]10[/C][C]10.2678[/C][C]-0.267788[/C][/ROW]
[ROW][C]80[/C][C]10[/C][C]9.94301[/C][C]0.0569944[/C][/ROW]
[ROW][C]81[/C][C]10[/C][C]9.63524[/C][C]0.364759[/C][/ROW]
[ROW][C]82[/C][C]10[/C][C]9.65442[/C][C]0.345582[/C][/ROW]
[ROW][C]83[/C][C]9[/C][C]8.85058[/C][C]0.149423[/C][/ROW]
[ROW][C]84[/C][C]9[/C][C]9.33956[/C][C]-0.339564[/C][/ROW]
[ROW][C]85[/C][C]10[/C][C]10.01[/C][C]-0.0100465[/C][/ROW]
[ROW][C]86[/C][C]9[/C][C]9.01259[/C][C]-0.0125936[/C][/ROW]
[ROW][C]87[/C][C]10[/C][C]9.99618[/C][C]0.00382145[/C][/ROW]
[ROW][C]88[/C][C]9[/C][C]9.41682[/C][C]-0.416819[/C][/ROW]
[ROW][C]89[/C][C]10[/C][C]9.8004[/C][C]0.199603[/C][/ROW]
[ROW][C]90[/C][C]10[/C][C]10.1277[/C][C]-0.127651[/C][/ROW]
[ROW][C]91[/C][C]10[/C][C]9.93735[/C][C]0.062646[/C][/ROW]
[ROW][C]92[/C][C]11[/C][C]10.5251[/C][C]0.474912[/C][/ROW]
[ROW][C]93[/C][C]10[/C][C]10.0738[/C][C]-0.073759[/C][/ROW]
[ROW][C]94[/C][C]10[/C][C]9.73993[/C][C]0.260073[/C][/ROW]
[ROW][C]95[/C][C]9[/C][C]8.93019[/C][C]0.0698146[/C][/ROW]
[ROW][C]96[/C][C]10[/C][C]9.99441[/C][C]0.00558824[/C][/ROW]
[ROW][C]97[/C][C]10[/C][C]9.61963[/C][C]0.38037[/C][/ROW]
[ROW][C]98[/C][C]9[/C][C]8.79524[/C][C]0.204759[/C][/ROW]
[ROW][C]99[/C][C]10[/C][C]10.3246[/C][C]-0.324566[/C][/ROW]
[ROW][C]100[/C][C]10[/C][C]9.4158[/C][C]0.584205[/C][/ROW]
[ROW][C]101[/C][C]10[/C][C]9.90066[/C][C]0.0993449[/C][/ROW]
[ROW][C]102[/C][C]10[/C][C]9.73428[/C][C]0.265715[/C][/ROW]
[ROW][C]103[/C][C]11[/C][C]10.4762[/C][C]0.523829[/C][/ROW]
[ROW][C]104[/C][C]10[/C][C]10.1927[/C][C]-0.192687[/C][/ROW]
[ROW][C]105[/C][C]10[/C][C]9.73251[/C][C]0.267492[/C][/ROW]
[ROW][C]106[/C][C]10[/C][C]9.87165[/C][C]0.128349[/C][/ROW]
[ROW][C]107[/C][C]9[/C][C]9.00488[/C][C]-0.00487704[/C][/ROW]
[ROW][C]108[/C][C]10[/C][C]10.014[/C][C]-0.0140378[/C][/ROW]
[ROW][C]109[/C][C]10[/C][C]9.97358[/C][C]0.0264168[/C][/ROW]
[ROW][C]110[/C][C]9[/C][C]9.32619[/C][C]-0.326191[/C][/ROW]
[ROW][C]111[/C][C]10[/C][C]10.0875[/C][C]-0.0875463[/C][/ROW]
[ROW][C]112[/C][C]10[/C][C]9.7491[/C][C]0.250901[/C][/ROW]
[ROW][C]113[/C][C]10[/C][C]10.2566[/C][C]-0.256575[/C][/ROW]
[ROW][C]114[/C][C]10[/C][C]9.75883[/C][C]0.241173[/C][/ROW]
[ROW][C]115[/C][C]11[/C][C]10.899[/C][C]0.101044[/C][/ROW]
[ROW][C]116[/C][C]11[/C][C]10.9577[/C][C]0.0423338[/C][/ROW]
[ROW][C]117[/C][C]11[/C][C]10.702[/C][C]0.297983[/C][/ROW]
[ROW][C]118[/C][C]10[/C][C]10.0732[/C][C]-0.0731899[/C][/ROW]
[ROW][C]119[/C][C]10[/C][C]10.0255[/C][C]-0.0255182[/C][/ROW]
[ROW][C]120[/C][C]10[/C][C]9.82225[/C][C]0.177749[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265739&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265739&T=4








Goldfeld-Quandt test for Heteroskedasticity (p-values per Alternative Hypothesis)
breakpoint index | greater | 2-sided | less
17 | 0.749265 | 0.501471 | 0.250735
18 | 0.619586 | 0.760828 | 0.380414
19 | 0.512698 | 0.974604 | 0.487302
20 | 0.444299 | 0.888597 | 0.555701
21 | 0.355687 | 0.711374 | 0.644313
22 | 0.267477 | 0.534954 | 0.732523
23 | 0.208181 | 0.416362 | 0.791819
24 | 0.14175 | 0.2835 | 0.85825
25 | 0.0939392 | 0.187878 | 0.906061
26 | 0.0632275 | 0.126455 | 0.936772
27 | 0.0418452 | 0.0836903 | 0.958155
28 | 0.0331896 | 0.0663792 | 0.96681
29 | 0.143665 | 0.28733 | 0.856335
30 | 0.158271 | 0.316541 | 0.841729
31 | 0.148633 | 0.297267 | 0.851367
32 | 0.106944 | 0.213889 | 0.893056
33 | 0.100891 | 0.201782 | 0.899109
34 | 0.0730881 | 0.146176 | 0.926912
35 | 0.0541901 | 0.10838 | 0.94581
36 | 0.0586856 | 0.117371 | 0.941314
37 | 0.042958 | 0.0859159 | 0.957042
38 | 0.0307655 | 0.061531 | 0.969235
39 | 0.0324441 | 0.0648882 | 0.967556
40 | 0.0968922 | 0.193784 | 0.903108
41 | 0.118969 | 0.237939 | 0.881031
42 | 0.141844 | 0.283687 | 0.858156
43 | 0.217957 | 0.435914 | 0.782043
44 | 0.257317 | 0.514633 | 0.742683
45 | 0.382094 | 0.764189 | 0.617906
46 | 0.351955 | 0.703911 | 0.648045
47 | 0.316381 | 0.632763 | 0.683619
48 | 0.425516 | 0.851031 | 0.574484
49 | 0.40453 | 0.80906 | 0.59547
50 | 0.450254 | 0.900509 | 0.549746
51 | 0.414684 | 0.829367 | 0.585316
52 | 0.373329 | 0.746658 | 0.626671
53 | 0.558625 | 0.88275 | 0.441375
54 | 0.699043 | 0.601915 | 0.300957
55 | 0.705711 | 0.588578 | 0.294289
56 | 0.712351 | 0.575299 | 0.287649
57 | 0.748815 | 0.502369 | 0.251185
58 | 0.755803 | 0.488395 | 0.244197
59 | 0.758957 | 0.482086 | 0.241043
60 | 0.886828 | 0.226345 | 0.113172
61 | 0.887156 | 0.225688 | 0.112844
62 | 0.867029 | 0.265943 | 0.132971
63 | 0.895761 | 0.208479 | 0.104239
64 | 0.885509 | 0.228982 | 0.114491
65 | 0.88007 | 0.239859 | 0.11993
66 | 0.878993 | 0.242015 | 0.121007
67 | 0.896751 | 0.206497 | 0.103249
68 | 0.895922 | 0.208155 | 0.104078
69 | 0.887463 | 0.225075 | 0.112537
70 | 0.862624 | 0.274752 | 0.137376
71 | 0.829852 | 0.340295 | 0.170148
72 | 0.801412 | 0.397176 | 0.198588
73 | 0.762533 | 0.474935 | 0.237467
74 | 0.711542 | 0.576916 | 0.288458
75 | 0.736252 | 0.527495 | 0.263748
76 | 0.731527 | 0.536946 | 0.268473
77 | 0.770062 | 0.459875 | 0.229938
78 | 0.753899 | 0.492201 | 0.246101
79 | 0.781523 | 0.436954 | 0.218477
80 | 0.75231 | 0.49538 | 0.24769
81 | 0.776481 | 0.447038 | 0.223519
82 | 0.798824 | 0.402352 | 0.201176
83 | 0.769802 | 0.460396 | 0.230198
84 | 0.794278 | 0.411443 | 0.205722
85 | 0.748506 | 0.502987 | 0.251494
86 | 0.691226 | 0.617548 | 0.308774
87 | 0.645848 | 0.708303 | 0.354152
88 | 0.888231 | 0.223538 | 0.111769
89 | 0.873755 | 0.25249 | 0.126245
90 | 0.882323 | 0.235354 | 0.117677
91 | 0.905461 | 0.189079 | 0.0945394
92 | 0.957835 | 0.0843306 | 0.0421653
93 | 0.974824 | 0.0503528 | 0.0251764
94 | 0.963564 | 0.0728722 | 0.0364361
95 | 0.940698 | 0.118604 | 0.059302
96 | 0.915234 | 0.169533 | 0.0847665
97 | 0.89934 | 0.201321 | 0.10066
98 | 0.925074 | 0.149851 | 0.0749256
99 | 0.921669 | 0.156662 | 0.0783308
100 | 0.908631 | 0.182739 | 0.0913695
101 | 0.895969 | 0.208063 | 0.104031
102 | 0.804061 | 0.391877 | 0.195939
103 | 0.889068 | 0.221863 | 0.110932
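Each row above is a set of p-values from lmtest's gqtest() applied at one candidate breakpoint, exactly as in the archived R code at the bottom of this page. A minimal sketch for the first breakpoint, assuming the fitted model mylm from that code:

library(lmtest)
sapply(c('greater', 'two.sided', 'less'),
       function(alt) gqtest(mylm, point = 17, alternative = alt)$p.value)
# 0.749265 0.501471 0.250735  (the first row of the table above)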

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
17 & 0.749265 & 0.501471 & 0.250735 \tabularnewline
18 & 0.619586 & 0.760828 & 0.380414 \tabularnewline
19 & 0.512698 & 0.974604 & 0.487302 \tabularnewline
20 & 0.444299 & 0.888597 & 0.555701 \tabularnewline
21 & 0.355687 & 0.711374 & 0.644313 \tabularnewline
22 & 0.267477 & 0.534954 & 0.732523 \tabularnewline
23 & 0.208181 & 0.416362 & 0.791819 \tabularnewline
24 & 0.14175 & 0.2835 & 0.85825 \tabularnewline
25 & 0.0939392 & 0.187878 & 0.906061 \tabularnewline
26 & 0.0632275 & 0.126455 & 0.936772 \tabularnewline
27 & 0.0418452 & 0.0836903 & 0.958155 \tabularnewline
28 & 0.0331896 & 0.0663792 & 0.96681 \tabularnewline
29 & 0.143665 & 0.28733 & 0.856335 \tabularnewline
30 & 0.158271 & 0.316541 & 0.841729 \tabularnewline
31 & 0.148633 & 0.297267 & 0.851367 \tabularnewline
32 & 0.106944 & 0.213889 & 0.893056 \tabularnewline
33 & 0.100891 & 0.201782 & 0.899109 \tabularnewline
34 & 0.0730881 & 0.146176 & 0.926912 \tabularnewline
35 & 0.0541901 & 0.10838 & 0.94581 \tabularnewline
36 & 0.0586856 & 0.117371 & 0.941314 \tabularnewline
37 & 0.042958 & 0.0859159 & 0.957042 \tabularnewline
38 & 0.0307655 & 0.061531 & 0.969235 \tabularnewline
39 & 0.0324441 & 0.0648882 & 0.967556 \tabularnewline
40 & 0.0968922 & 0.193784 & 0.903108 \tabularnewline
41 & 0.118969 & 0.237939 & 0.881031 \tabularnewline
42 & 0.141844 & 0.283687 & 0.858156 \tabularnewline
43 & 0.217957 & 0.435914 & 0.782043 \tabularnewline
44 & 0.257317 & 0.514633 & 0.742683 \tabularnewline
45 & 0.382094 & 0.764189 & 0.617906 \tabularnewline
46 & 0.351955 & 0.703911 & 0.648045 \tabularnewline
47 & 0.316381 & 0.632763 & 0.683619 \tabularnewline
48 & 0.425516 & 0.851031 & 0.574484 \tabularnewline
49 & 0.40453 & 0.80906 & 0.59547 \tabularnewline
50 & 0.450254 & 0.900509 & 0.549746 \tabularnewline
51 & 0.414684 & 0.829367 & 0.585316 \tabularnewline
52 & 0.373329 & 0.746658 & 0.626671 \tabularnewline
53 & 0.558625 & 0.88275 & 0.441375 \tabularnewline
54 & 0.699043 & 0.601915 & 0.300957 \tabularnewline
55 & 0.705711 & 0.588578 & 0.294289 \tabularnewline
56 & 0.712351 & 0.575299 & 0.287649 \tabularnewline
57 & 0.748815 & 0.502369 & 0.251185 \tabularnewline
58 & 0.755803 & 0.488395 & 0.244197 \tabularnewline
59 & 0.758957 & 0.482086 & 0.241043 \tabularnewline
60 & 0.886828 & 0.226345 & 0.113172 \tabularnewline
61 & 0.887156 & 0.225688 & 0.112844 \tabularnewline
62 & 0.867029 & 0.265943 & 0.132971 \tabularnewline
63 & 0.895761 & 0.208479 & 0.104239 \tabularnewline
64 & 0.885509 & 0.228982 & 0.114491 \tabularnewline
65 & 0.88007 & 0.239859 & 0.11993 \tabularnewline
66 & 0.878993 & 0.242015 & 0.121007 \tabularnewline
67 & 0.896751 & 0.206497 & 0.103249 \tabularnewline
68 & 0.895922 & 0.208155 & 0.104078 \tabularnewline
69 & 0.887463 & 0.225075 & 0.112537 \tabularnewline
70 & 0.862624 & 0.274752 & 0.137376 \tabularnewline
71 & 0.829852 & 0.340295 & 0.170148 \tabularnewline
72 & 0.801412 & 0.397176 & 0.198588 \tabularnewline
73 & 0.762533 & 0.474935 & 0.237467 \tabularnewline
74 & 0.711542 & 0.576916 & 0.288458 \tabularnewline
75 & 0.736252 & 0.527495 & 0.263748 \tabularnewline
76 & 0.731527 & 0.536946 & 0.268473 \tabularnewline
77 & 0.770062 & 0.459875 & 0.229938 \tabularnewline
78 & 0.753899 & 0.492201 & 0.246101 \tabularnewline
79 & 0.781523 & 0.436954 & 0.218477 \tabularnewline
80 & 0.75231 & 0.49538 & 0.24769 \tabularnewline
81 & 0.776481 & 0.447038 & 0.223519 \tabularnewline
82 & 0.798824 & 0.402352 & 0.201176 \tabularnewline
83 & 0.769802 & 0.460396 & 0.230198 \tabularnewline
84 & 0.794278 & 0.411443 & 0.205722 \tabularnewline
85 & 0.748506 & 0.502987 & 0.251494 \tabularnewline
86 & 0.691226 & 0.617548 & 0.308774 \tabularnewline
87 & 0.645848 & 0.708303 & 0.354152 \tabularnewline
88 & 0.888231 & 0.223538 & 0.111769 \tabularnewline
89 & 0.873755 & 0.25249 & 0.126245 \tabularnewline
90 & 0.882323 & 0.235354 & 0.117677 \tabularnewline
91 & 0.905461 & 0.189079 & 0.0945394 \tabularnewline
92 & 0.957835 & 0.0843306 & 0.0421653 \tabularnewline
93 & 0.974824 & 0.0503528 & 0.0251764 \tabularnewline
94 & 0.963564 & 0.0728722 & 0.0364361 \tabularnewline
95 & 0.940698 & 0.118604 & 0.059302 \tabularnewline
96 & 0.915234 & 0.169533 & 0.0847665 \tabularnewline
97 & 0.89934 & 0.201321 & 0.10066 \tabularnewline
98 & 0.925074 & 0.149851 & 0.0749256 \tabularnewline
99 & 0.921669 & 0.156662 & 0.0783308 \tabularnewline
100 & 0.908631 & 0.182739 & 0.0913695 \tabularnewline
101 & 0.895969 & 0.208063 & 0.104031 \tabularnewline
102 & 0.804061 & 0.391877 & 0.195939 \tabularnewline
103 & 0.889068 & 0.221863 & 0.110932 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265739&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]17[/C][C]0.749265[/C][C]0.501471[/C][C]0.250735[/C][/ROW]
[ROW][C]18[/C][C]0.619586[/C][C]0.760828[/C][C]0.380414[/C][/ROW]
[ROW][C]19[/C][C]0.512698[/C][C]0.974604[/C][C]0.487302[/C][/ROW]
[ROW][C]20[/C][C]0.444299[/C][C]0.888597[/C][C]0.555701[/C][/ROW]
[ROW][C]21[/C][C]0.355687[/C][C]0.711374[/C][C]0.644313[/C][/ROW]
[ROW][C]22[/C][C]0.267477[/C][C]0.534954[/C][C]0.732523[/C][/ROW]
[ROW][C]23[/C][C]0.208181[/C][C]0.416362[/C][C]0.791819[/C][/ROW]
[ROW][C]24[/C][C]0.14175[/C][C]0.2835[/C][C]0.85825[/C][/ROW]
[ROW][C]25[/C][C]0.0939392[/C][C]0.187878[/C][C]0.906061[/C][/ROW]
[ROW][C]26[/C][C]0.0632275[/C][C]0.126455[/C][C]0.936772[/C][/ROW]
[ROW][C]27[/C][C]0.0418452[/C][C]0.0836903[/C][C]0.958155[/C][/ROW]
[ROW][C]28[/C][C]0.0331896[/C][C]0.0663792[/C][C]0.96681[/C][/ROW]
[ROW][C]29[/C][C]0.143665[/C][C]0.28733[/C][C]0.856335[/C][/ROW]
[ROW][C]30[/C][C]0.158271[/C][C]0.316541[/C][C]0.841729[/C][/ROW]
[ROW][C]31[/C][C]0.148633[/C][C]0.297267[/C][C]0.851367[/C][/ROW]
[ROW][C]32[/C][C]0.106944[/C][C]0.213889[/C][C]0.893056[/C][/ROW]
[ROW][C]33[/C][C]0.100891[/C][C]0.201782[/C][C]0.899109[/C][/ROW]
[ROW][C]34[/C][C]0.0730881[/C][C]0.146176[/C][C]0.926912[/C][/ROW]
[ROW][C]35[/C][C]0.0541901[/C][C]0.10838[/C][C]0.94581[/C][/ROW]
[ROW][C]36[/C][C]0.0586856[/C][C]0.117371[/C][C]0.941314[/C][/ROW]
[ROW][C]37[/C][C]0.042958[/C][C]0.0859159[/C][C]0.957042[/C][/ROW]
[ROW][C]38[/C][C]0.0307655[/C][C]0.061531[/C][C]0.969235[/C][/ROW]
[ROW][C]39[/C][C]0.0324441[/C][C]0.0648882[/C][C]0.967556[/C][/ROW]
[ROW][C]40[/C][C]0.0968922[/C][C]0.193784[/C][C]0.903108[/C][/ROW]
[ROW][C]41[/C][C]0.118969[/C][C]0.237939[/C][C]0.881031[/C][/ROW]
[ROW][C]42[/C][C]0.141844[/C][C]0.283687[/C][C]0.858156[/C][/ROW]
[ROW][C]43[/C][C]0.217957[/C][C]0.435914[/C][C]0.782043[/C][/ROW]
[ROW][C]44[/C][C]0.257317[/C][C]0.514633[/C][C]0.742683[/C][/ROW]
[ROW][C]45[/C][C]0.382094[/C][C]0.764189[/C][C]0.617906[/C][/ROW]
[ROW][C]46[/C][C]0.351955[/C][C]0.703911[/C][C]0.648045[/C][/ROW]
[ROW][C]47[/C][C]0.316381[/C][C]0.632763[/C][C]0.683619[/C][/ROW]
[ROW][C]48[/C][C]0.425516[/C][C]0.851031[/C][C]0.574484[/C][/ROW]
[ROW][C]49[/C][C]0.40453[/C][C]0.80906[/C][C]0.59547[/C][/ROW]
[ROW][C]50[/C][C]0.450254[/C][C]0.900509[/C][C]0.549746[/C][/ROW]
[ROW][C]51[/C][C]0.414684[/C][C]0.829367[/C][C]0.585316[/C][/ROW]
[ROW][C]52[/C][C]0.373329[/C][C]0.746658[/C][C]0.626671[/C][/ROW]
[ROW][C]53[/C][C]0.558625[/C][C]0.88275[/C][C]0.441375[/C][/ROW]
[ROW][C]54[/C][C]0.699043[/C][C]0.601915[/C][C]0.300957[/C][/ROW]
[ROW][C]55[/C][C]0.705711[/C][C]0.588578[/C][C]0.294289[/C][/ROW]
[ROW][C]56[/C][C]0.712351[/C][C]0.575299[/C][C]0.287649[/C][/ROW]
[ROW][C]57[/C][C]0.748815[/C][C]0.502369[/C][C]0.251185[/C][/ROW]
[ROW][C]58[/C][C]0.755803[/C][C]0.488395[/C][C]0.244197[/C][/ROW]
[ROW][C]59[/C][C]0.758957[/C][C]0.482086[/C][C]0.241043[/C][/ROW]
[ROW][C]60[/C][C]0.886828[/C][C]0.226345[/C][C]0.113172[/C][/ROW]
[ROW][C]61[/C][C]0.887156[/C][C]0.225688[/C][C]0.112844[/C][/ROW]
[ROW][C]62[/C][C]0.867029[/C][C]0.265943[/C][C]0.132971[/C][/ROW]
[ROW][C]63[/C][C]0.895761[/C][C]0.208479[/C][C]0.104239[/C][/ROW]
[ROW][C]64[/C][C]0.885509[/C][C]0.228982[/C][C]0.114491[/C][/ROW]
[ROW][C]65[/C][C]0.88007[/C][C]0.239859[/C][C]0.11993[/C][/ROW]
[ROW][C]66[/C][C]0.878993[/C][C]0.242015[/C][C]0.121007[/C][/ROW]
[ROW][C]67[/C][C]0.896751[/C][C]0.206497[/C][C]0.103249[/C][/ROW]
[ROW][C]68[/C][C]0.895922[/C][C]0.208155[/C][C]0.104078[/C][/ROW]
[ROW][C]69[/C][C]0.887463[/C][C]0.225075[/C][C]0.112537[/C][/ROW]
[ROW][C]70[/C][C]0.862624[/C][C]0.274752[/C][C]0.137376[/C][/ROW]
[ROW][C]71[/C][C]0.829852[/C][C]0.340295[/C][C]0.170148[/C][/ROW]
[ROW][C]72[/C][C]0.801412[/C][C]0.397176[/C][C]0.198588[/C][/ROW]
[ROW][C]73[/C][C]0.762533[/C][C]0.474935[/C][C]0.237467[/C][/ROW]
[ROW][C]74[/C][C]0.711542[/C][C]0.576916[/C][C]0.288458[/C][/ROW]
[ROW][C]75[/C][C]0.736252[/C][C]0.527495[/C][C]0.263748[/C][/ROW]
[ROW][C]76[/C][C]0.731527[/C][C]0.536946[/C][C]0.268473[/C][/ROW]
[ROW][C]77[/C][C]0.770062[/C][C]0.459875[/C][C]0.229938[/C][/ROW]
[ROW][C]78[/C][C]0.753899[/C][C]0.492201[/C][C]0.246101[/C][/ROW]
[ROW][C]79[/C][C]0.781523[/C][C]0.436954[/C][C]0.218477[/C][/ROW]
[ROW][C]80[/C][C]0.75231[/C][C]0.49538[/C][C]0.24769[/C][/ROW]
[ROW][C]81[/C][C]0.776481[/C][C]0.447038[/C][C]0.223519[/C][/ROW]
[ROW][C]82[/C][C]0.798824[/C][C]0.402352[/C][C]0.201176[/C][/ROW]
[ROW][C]83[/C][C]0.769802[/C][C]0.460396[/C][C]0.230198[/C][/ROW]
[ROW][C]84[/C][C]0.794278[/C][C]0.411443[/C][C]0.205722[/C][/ROW]
[ROW][C]85[/C][C]0.748506[/C][C]0.502987[/C][C]0.251494[/C][/ROW]
[ROW][C]86[/C][C]0.691226[/C][C]0.617548[/C][C]0.308774[/C][/ROW]
[ROW][C]87[/C][C]0.645848[/C][C]0.708303[/C][C]0.354152[/C][/ROW]
[ROW][C]88[/C][C]0.888231[/C][C]0.223538[/C][C]0.111769[/C][/ROW]
[ROW][C]89[/C][C]0.873755[/C][C]0.25249[/C][C]0.126245[/C][/ROW]
[ROW][C]90[/C][C]0.882323[/C][C]0.235354[/C][C]0.117677[/C][/ROW]
[ROW][C]91[/C][C]0.905461[/C][C]0.189079[/C][C]0.0945394[/C][/ROW]
[ROW][C]92[/C][C]0.957835[/C][C]0.0843306[/C][C]0.0421653[/C][/ROW]
[ROW][C]93[/C][C]0.974824[/C][C]0.0503528[/C][C]0.0251764[/C][/ROW]
[ROW][C]94[/C][C]0.963564[/C][C]0.0728722[/C][C]0.0364361[/C][/ROW]
[ROW][C]95[/C][C]0.940698[/C][C]0.118604[/C][C]0.059302[/C][/ROW]
[ROW][C]96[/C][C]0.915234[/C][C]0.169533[/C][C]0.0847665[/C][/ROW]
[ROW][C]97[/C][C]0.89934[/C][C]0.201321[/C][C]0.10066[/C][/ROW]
[ROW][C]98[/C][C]0.925074[/C][C]0.149851[/C][C]0.0749256[/C][/ROW]
[ROW][C]99[/C][C]0.921669[/C][C]0.156662[/C][C]0.0783308[/C][/ROW]
[ROW][C]100[/C][C]0.908631[/C][C]0.182739[/C][C]0.0913695[/C][/ROW]
[ROW][C]101[/C][C]0.895969[/C][C]0.208063[/C][C]0.104031[/C][/ROW]
[ROW][C]102[/C][C]0.804061[/C][C]0.391877[/C][C]0.195939[/C][/ROW]
[ROW][C]103[/C][C]0.889068[/C][C]0.221863[/C][C]0.110932[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265739&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265739&T=5








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description | # significant tests | % significant tests | OK/NOK
1% type I error level | 0 | 0 | OK
5% type I error level | 0 | 0 | OK
10% type I error level | 8 | 0.091954 | OK
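The table above covers breakpoints 17 through 103, i.e. 87 Goldfeld-Quandt tests in total, and the "% significant tests" column reports the share of those tests (as a fraction) whose 2-sided p-value falls below the stated type I error level:

8 / 87    # 0.091954, the value reported for the 10% level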

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 & 0 & OK \tabularnewline
5% type I error level & 0 & 0 & OK \tabularnewline
10% type I error level & 8 & 0.091954 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=265739&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]8[/C][C]0.091954[/C][C]OK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=265739&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=265739&T=6




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
par3 <- 'Linear Trend' # session value, as given in the Parameters (R input) above
par2 <- 'Include Monthly Dummies'
par1 <- '1'
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
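# par1 selects the dependent variable: the chosen column (here column 1, V1) is moved
# to the front so that lm(df) further below treats it as the response.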
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
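# optional (1-B) first-differencing of all columns; not triggered in this run,
# since par3 is 'Linear Trend' rather than 'First Differences'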
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
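# M1..M11 are 0/1 seasonal dummies for months 1..11 of each 12-month cycle;
# the twelfth month is the reference category absorbed by the intercept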
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
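# adds the deterministic linear trend regressor t = 1, 2, ..., n (here 1..120)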
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
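# candidate breakpoints for the Goldfeld-Quandt test run from k+3 to n-k-3 so that both
# subsamples keep enough residual degrees of freedom; with n = 120 and k = 14 this gives
# indices 17..103, matching the table reported above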
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}