Free Statistics


Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 18 Nov 2009 09:31:58 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2009/Nov/18/t1258562002kaarbsh5v8xc2v7.htm/, Retrieved Sun, 05 May 2024 16:32:32 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=57521, Retrieved Sun, 05 May 2024 16:32:32 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 126
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [Q1 The Seatbeltlaw] [2007-11-14 19:27:43] [8cd6641b921d30ebe00b648d1481bba0]
- RMPD  [Multiple Regression] [Seatbelt] [2009-11-12 13:54:52] [b98453cac15ba1066b407e146608df68]
-   PD      [Multiple Regression] [Ws 7 regressie an...] [2009-11-18 16:31:58] [51d49d3536f6a59f2486a67bf50b2759] [Current]
Dataseries X (columns: aanbod, invoer, y(t-1), y(t-2)):
1643	8997	1639	1395
1751	9062	1643	1639
1797	8885	1751	1643
1373	9058	1797	1751
1558	9095	1373	1797
1555	9149	1558	1373
2061	9857	1555	1558
2010	9848	2061	1555
2119	10269	2010	2061
1985	10341	2119	2010
1963	9690	1985	2119
2017	10125	1963	1985
1975	9349	2017	1963
1589	9224	1975	2017
1679	9224	1589	1975
1392	9454	1679	1589
1511	9347	1392	1679
1449	9430	1511	1392
1767	9933	1449	1511
1899	10148	1767	1449
2179	10677	1899	1767
2217	10735	2179	1899
2049	9760	2217	2179
2343	10567	2049	2217
2175	9333	2343	2049
1607	9409	2175	2343
1702	9502	1607	2175
1764	9348	1702	1607
1766	9319	1764	1702
1615	9594	1766	1764
1953	10160	1615	1766
2091	10182	1953	1615
2411	10810	2091	1953
2550	11105	2411	2091
2351	9874	2550	2411
2786	10958	2351	2550
2525	9311	2786	2351
2474	9610	2525	2786
2332	9398	2474	2525
1978	9784	2332	2474
1789	9425	1978	2332
1904	9557	1789	1978
1997	10166	1904	1789
2207	10337	1997	1904
2453	10770	2207	1997
1948	11265	2453	2207
1384	10183	1948	2453
1989	10941	1384	1948
2140	9628	1989	1384
2100	9709	2140	1989
2045	9637	2100	2140
2083	9579	2045	2100
2022	9741	2083	2045
1950	9754	2022	2083
1422	10508	1950	2022
1859	10749	1422	1950
2147	11079	1859	1422




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 3 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57521&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]3 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57521&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57521&T=0








Multiple Linear Regression - Estimated Regression Equation
aanbod[t] = + 617.674552839261 + 0.0404402210843701invoer[t] + 0.710157341244884`y(t-1)`[t] -0.0773485495535648`y(t-2)`[t] -315.540905496075M1[t] -436.959058722751M2[t] -299.931399291189M3[t] -517.866187168882M4[t] -367.629587341215M5[t] -430.560458513548M6[t] -286.062967850213M7[t] -224.916251638326M8[t] -107.946933806307M9[t] -376.257868939131M10[t] -475.138020364354M11[t] + 0.943798257753726t + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
aanbod[t] =  +  617.674552839261 +  0.0404402210843701invoer[t] +  0.710157341244884`y(t-1)`[t] -0.0773485495535648`y(t-2)`[t] -315.540905496075M1[t] -436.959058722751M2[t] -299.931399291189M3[t] -517.866187168882M4[t] -367.629587341215M5[t] -430.560458513548M6[t] -286.062967850213M7[t] -224.916251638326M8[t] -107.946933806307M9[t] -376.257868939131M10[t] -475.138020364354M11[t] +  0.943798257753726t  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57521&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]aanbod[t] =  +  617.674552839261 +  0.0404402210843701invoer[t] +  0.710157341244884`y(t-1)`[t] -0.0773485495535648`y(t-2)`[t] -315.540905496075M1[t] -436.959058722751M2[t] -299.931399291189M3[t] -517.866187168882M4[t] -367.629587341215M5[t] -430.560458513548M6[t] -286.062967850213M7[t] -224.916251638326M8[t] -107.946933806307M9[t] -376.257868939131M10[t] -475.138020364354M11[t] +  0.943798257753726t  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57521&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57521&T=1
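
To make the estimated equation above easier to reproduce, the following sketch shows how the same design matrix (the invoer and lagged-y columns of Dataseries X, monthly dummies M1-M11 and a linear trend) could be assembled and refitted with lm(). This is not the module's own code; the data frame name dat and the column labels y_lag1/y_lag2 are assumptions standing in for the `y(t-1)` and `y(t-2)` columns of the data block above:
# Minimal sketch: 'dat' is assumed to hold Dataseries X with columns aanbod, invoer, y_lag1, y_lag2
n <- nrow(dat)
M <- sapply(1:11, function(m) as.numeric(seq_len(n) %% 12 == m))   # monthly dummies M1..M11 (December = reference)
colnames(M) <- paste0("M", 1:11)
df <- data.frame(dat, M, t = seq_len(n))                           # add a deterministic linear trend
fit <- lm(aanbod ~ ., data = df)                                   # regress aanbod on all other columns
summary(fit)                                                       # should match the coefficient table reported below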








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	617.674552839261	2726.586818	0.2265	0.821909	0.410954
invoer	0.0404402210843701	0.262938	0.1538	0.878521	0.43926
`y(t-1)`	0.710157341244884	0.165066	4.3023	0.000102	5.1e-05
`y(t-2)`	-0.0773485495535648	0.160768	-0.4811	0.63299	0.316495
M1	-315.540905496075	341.592062	-0.9237	0.361029	0.180514
M2	-436.959058722751	333.404964	-1.3106	0.197289	0.098644
M3	-299.931399291189	361.535908	-0.8296	0.411567	0.205784
M4	-517.866187168882	331.986235	-1.5599	0.126469	0.063235
M5	-367.629587341215	359.605365	-1.0223	0.312627	0.156313
M6	-430.560458513548	331.870048	-1.2974	0.201754	0.100877
M7	-286.062967850213	202.444197	-1.413	0.165189	0.082595
M8	-224.916251638326	182.255332	-1.2341	0.224203	0.112102
M9	-107.946933806307	153.819601	-0.7018	0.486784	0.243392
M10	-376.257868939131	184.584485	-2.0384	0.047994	0.023997
M11	-475.138020364354	240.700152	-1.974	0.055147	0.027574
t	0.943798257753726	4.263194	0.2214	0.825893	0.412947

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 617.674552839261 & 2726.586818 & 0.2265 & 0.821909 & 0.410954 \tabularnewline
invoer & 0.0404402210843701 & 0.262938 & 0.1538 & 0.878521 & 0.43926 \tabularnewline
`y(t-1)` & 0.710157341244884 & 0.165066 & 4.3023 & 0.000102 & 5.1e-05 \tabularnewline
`y(t-2)` & -0.0773485495535648 & 0.160768 & -0.4811 & 0.63299 & 0.316495 \tabularnewline
M1 & -315.540905496075 & 341.592062 & -0.9237 & 0.361029 & 0.180514 \tabularnewline
M2 & -436.959058722751 & 333.404964 & -1.3106 & 0.197289 & 0.098644 \tabularnewline
M3 & -299.931399291189 & 361.535908 & -0.8296 & 0.411567 & 0.205784 \tabularnewline
M4 & -517.866187168882 & 331.986235 & -1.5599 & 0.126469 & 0.063235 \tabularnewline
M5 & -367.629587341215 & 359.605365 & -1.0223 & 0.312627 & 0.156313 \tabularnewline
M6 & -430.560458513548 & 331.870048 & -1.2974 & 0.201754 & 0.100877 \tabularnewline
M7 & -286.062967850213 & 202.444197 & -1.413 & 0.165189 & 0.082595 \tabularnewline
M8 & -224.916251638326 & 182.255332 & -1.2341 & 0.224203 & 0.112102 \tabularnewline
M9 & -107.946933806307 & 153.819601 & -0.7018 & 0.486784 & 0.243392 \tabularnewline
M10 & -376.257868939131 & 184.584485 & -2.0384 & 0.047994 & 0.023997 \tabularnewline
M11 & -475.138020364354 & 240.700152 & -1.974 & 0.055147 & 0.027574 \tabularnewline
t & 0.943798257753726 & 4.263194 & 0.2214 & 0.825893 & 0.412947 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57521&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]617.674552839261[/C][C]2726.586818[/C][C]0.2265[/C][C]0.821909[/C][C]0.410954[/C][/ROW]
[ROW][C]invoer[/C][C]0.0404402210843701[/C][C]0.262938[/C][C]0.1538[/C][C]0.878521[/C][C]0.43926[/C][/ROW]
[ROW][C]`y(t-1)`[/C][C]0.710157341244884[/C][C]0.165066[/C][C]4.3023[/C][C]0.000102[/C][C]5.1e-05[/C][/ROW]
[ROW][C]`y(t-2)`[/C][C]-0.0773485495535648[/C][C]0.160768[/C][C]-0.4811[/C][C]0.63299[/C][C]0.316495[/C][/ROW]
[ROW][C]M1[/C][C]-315.540905496075[/C][C]341.592062[/C][C]-0.9237[/C][C]0.361029[/C][C]0.180514[/C][/ROW]
[ROW][C]M2[/C][C]-436.959058722751[/C][C]333.404964[/C][C]-1.3106[/C][C]0.197289[/C][C]0.098644[/C][/ROW]
[ROW][C]M3[/C][C]-299.931399291189[/C][C]361.535908[/C][C]-0.8296[/C][C]0.411567[/C][C]0.205784[/C][/ROW]
[ROW][C]M4[/C][C]-517.866187168882[/C][C]331.986235[/C][C]-1.5599[/C][C]0.126469[/C][C]0.063235[/C][/ROW]
[ROW][C]M5[/C][C]-367.629587341215[/C][C]359.605365[/C][C]-1.0223[/C][C]0.312627[/C][C]0.156313[/C][/ROW]
[ROW][C]M6[/C][C]-430.560458513548[/C][C]331.870048[/C][C]-1.2974[/C][C]0.201754[/C][C]0.100877[/C][/ROW]
[ROW][C]M7[/C][C]-286.062967850213[/C][C]202.444197[/C][C]-1.413[/C][C]0.165189[/C][C]0.082595[/C][/ROW]
[ROW][C]M8[/C][C]-224.916251638326[/C][C]182.255332[/C][C]-1.2341[/C][C]0.224203[/C][C]0.112102[/C][/ROW]
[ROW][C]M9[/C][C]-107.946933806307[/C][C]153.819601[/C][C]-0.7018[/C][C]0.486784[/C][C]0.243392[/C][/ROW]
[ROW][C]M10[/C][C]-376.257868939131[/C][C]184.584485[/C][C]-2.0384[/C][C]0.047994[/C][C]0.023997[/C][/ROW]
[ROW][C]M11[/C][C]-475.138020364354[/C][C]240.700152[/C][C]-1.974[/C][C]0.055147[/C][C]0.027574[/C][/ROW]
[ROW][C]t[/C][C]0.943798257753726[/C][C]4.263194[/C][C]0.2214[/C][C]0.825893[/C][C]0.412947[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57521&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57521&T=2
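
The columns of the coefficient table can be recovered from any such fitted model: the T-STAT is the parameter estimate divided by its standard deviation, the 2-tail p-value comes from the t distribution, and the 1-tail p-value is simply half of it (exactly as in the module code at the bottom of this page). A minimal sketch, assuming fit is the hypothetical refitted model from the earlier sketch:
# Sketch: rebuild the columns of the OLS table from a fitted model 'fit'
cf <- summary(fit)$coefficients          # columns: Estimate, Std. Error, t value, Pr(>|t|)
data.frame(
  Variable  = rownames(cf),
  Parameter = cf[, "Estimate"],
  S.D.      = cf[, "Std. Error"],
  T.STAT    = cf[, "Estimate"] / cf[, "Std. Error"],   # H0: parameter = 0
  p.2tail   = cf[, "Pr(>|t|)"],
  p.1tail   = cf[, "Pr(>|t|)"] / 2)                    # one-sided p-value, as in the module code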








Multiple Linear Regression - Regression Statistics
Multiple R: 0.837502903368616
R-squared: 0.701411113150861
Adjusted R-squared: 0.592171276498738
F-TEST (value): 6.42083634182388
F-TEST (DF numerator): 15
F-TEST (DF denominator): 41
p-value: 1.03794539541013e-06
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 206.136457647118
Sum Squared Residuals: 1742181.80602339

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.837502903368616 \tabularnewline
R-squared & 0.701411113150861 \tabularnewline
Adjusted R-squared & 0.592171276498738 \tabularnewline
F-TEST (value) & 6.42083634182388 \tabularnewline
F-TEST (DF numerator) & 15 \tabularnewline
F-TEST (DF denominator) & 41 \tabularnewline
p-value & 1.03794539541013e-06 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 206.136457647118 \tabularnewline
Sum Squared Residuals & 1742181.80602339 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57521&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.837502903368616[/C][/ROW]
[ROW][C]R-squared[/C][C]0.701411113150861[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.592171276498738[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]6.42083634182388[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]15[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]41[/C][/ROW]
[ROW][C]p-value[/C][C]1.03794539541013e-06[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]206.136457647118[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]1742181.80602339[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57521&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57521&T=3
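
All of the figures in this table are available from summary() of the fitted model; the sketch below (again assuming the hypothetical refit fit) shows the correspondence used by the module code listed at the end of this page:
# Sketch: regression and residual statistics from a fitted model 'fit'
s <- summary(fit)
sqrt(s$r.squared)                                           # Multiple R
s$r.squared                                                 # R-squared
s$adj.r.squared                                             # Adjusted R-squared
s$fstatistic                                                # F value, DF numerator, DF denominator
1 - pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3])   # p-value of the F-test
s$sigma                                                     # Residual Standard Deviation
sum(resid(fit)^2)                                           # Sum Squared Residuals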








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	1643	1722.96477037016	-79.9647703701586
2	1751	1589.08661304563	161.913386954369
3	1797	1796.28775025925	0.712249740753709
4	1373	1610.60651323238	-237.606513232383
5	1558	1458.61845353063	99.3815464693688
6	1555	1562.99004569562	-7.99004569562253
7	2061	1720.6230574533	340.376942546699
8	2010	2141.92127025175	-131.921270251755
9	2119	2201.50332894045	-82.5033289404535
10	1985	2018.39981420638	-33.3998142063823
11	1963	1790.54480148484	172.455198515165
12	2017	2278.95936041143	-261.959360411434
13	1975	1973.03080612904	1.96919387095642
14	1589	1813.49799351640	-224.497993516397
15	1679	1680.59735656644	-1.59735656643769
16	1392	1566.67831863562	-174.678318635619
17	1511	1502.75508666791	8.24491333208958
18	1449	1550.83230943335	-101.832309433348
19	1767	1663.38079700582	103.619202994182
20	1899	1964.79160359679	-65.7916035967924
21	2179	2173.24152692649	5.75847307351248
22	2217	2096.85396988181	120.146030118193
23	2049	1964.81678624939	84.1832137506149
24	2343	2351.28818507440	-8.2881850744033
25	2175	2208.56865966896	-33.5686596689639
26	1607	1949.12085460457	-342.120854604565
27	1702	1700.47843935263	1.52156064736768
28	1764	1588.65857925039	175.341420749611
29	1766	1775.34785387396	-9.34785387395748
30	1615	1721.10654636775	-106.106546367748
31	1953	1782.04854479551	170.951455204494
32	2091	2096.74155645236	-5.74155645236218
33	2411	2311.90903472581	99.0909652741918
34	2550	2273.04801243060	276.951987569403
35	2351	2199.29008168417	151.709918315833
36	2786	2567.13634066605	218.863659333945
37	2525	2510.24499410446	14.7550058955406
38	2474	2182.86458011905	291.135419880951
39	2332	2296.23265796847	35.7673420315308
40	1978	1997.95402725755	-19.9540272575546
41	1789	1894.20418120960	-105.204181209605
42	1904	1730.71686652484	173.283133475159
43	1997	1997.07322019510	-0.0732201950963134
44	2207	2123.22856200728	83.771437992721
45	2453	2400.59192037953	52.4080796204722
46	1948	2311.69820348121	-363.698203481213
47	1384	1792.34833058161	-408.348330581613
48	1989	1937.61611384811	51.3838861518918
49	2140	2043.19076972737	96.8092302726257
50	2100	1986.42995871436	113.570041285643
51	2045	2081.40379585321	-36.4037958532144
52	2083	1826.10256162405	256.897438375945
53	2022	2015.07442471790	6.92557528210368
54	1950	1907.35423197844	42.6457680215599
55	1422	2036.87438055028	-614.87438055028
56	1859	1739.31700769181	119.682992308188
57	2147	2221.75418902772	-74.754189027723

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 1643 & 1722.96477037016 & -79.9647703701586 \tabularnewline
2 & 1751 & 1589.08661304563 & 161.913386954369 \tabularnewline
3 & 1797 & 1796.28775025925 & 0.712249740753709 \tabularnewline
4 & 1373 & 1610.60651323238 & -237.606513232383 \tabularnewline
5 & 1558 & 1458.61845353063 & 99.3815464693688 \tabularnewline
6 & 1555 & 1562.99004569562 & -7.99004569562253 \tabularnewline
7 & 2061 & 1720.6230574533 & 340.376942546699 \tabularnewline
8 & 2010 & 2141.92127025175 & -131.921270251755 \tabularnewline
9 & 2119 & 2201.50332894045 & -82.5033289404535 \tabularnewline
10 & 1985 & 2018.39981420638 & -33.3998142063823 \tabularnewline
11 & 1963 & 1790.54480148484 & 172.455198515165 \tabularnewline
12 & 2017 & 2278.95936041143 & -261.959360411434 \tabularnewline
13 & 1975 & 1973.03080612904 & 1.96919387095642 \tabularnewline
14 & 1589 & 1813.49799351640 & -224.497993516397 \tabularnewline
15 & 1679 & 1680.59735656644 & -1.59735656643769 \tabularnewline
16 & 1392 & 1566.67831863562 & -174.678318635619 \tabularnewline
17 & 1511 & 1502.75508666791 & 8.24491333208958 \tabularnewline
18 & 1449 & 1550.83230943335 & -101.832309433348 \tabularnewline
19 & 1767 & 1663.38079700582 & 103.619202994182 \tabularnewline
20 & 1899 & 1964.79160359679 & -65.7916035967924 \tabularnewline
21 & 2179 & 2173.24152692649 & 5.75847307351248 \tabularnewline
22 & 2217 & 2096.85396988181 & 120.146030118193 \tabularnewline
23 & 2049 & 1964.81678624939 & 84.1832137506149 \tabularnewline
24 & 2343 & 2351.28818507440 & -8.2881850744033 \tabularnewline
25 & 2175 & 2208.56865966896 & -33.5686596689639 \tabularnewline
26 & 1607 & 1949.12085460457 & -342.120854604565 \tabularnewline
27 & 1702 & 1700.47843935263 & 1.52156064736768 \tabularnewline
28 & 1764 & 1588.65857925039 & 175.341420749611 \tabularnewline
29 & 1766 & 1775.34785387396 & -9.34785387395748 \tabularnewline
30 & 1615 & 1721.10654636775 & -106.106546367748 \tabularnewline
31 & 1953 & 1782.04854479551 & 170.951455204494 \tabularnewline
32 & 2091 & 2096.74155645236 & -5.74155645236218 \tabularnewline
33 & 2411 & 2311.90903472581 & 99.0909652741918 \tabularnewline
34 & 2550 & 2273.04801243060 & 276.951987569403 \tabularnewline
35 & 2351 & 2199.29008168417 & 151.709918315833 \tabularnewline
36 & 2786 & 2567.13634066605 & 218.863659333945 \tabularnewline
37 & 2525 & 2510.24499410446 & 14.7550058955406 \tabularnewline
38 & 2474 & 2182.86458011905 & 291.135419880951 \tabularnewline
39 & 2332 & 2296.23265796847 & 35.7673420315308 \tabularnewline
40 & 1978 & 1997.95402725755 & -19.9540272575546 \tabularnewline
41 & 1789 & 1894.20418120960 & -105.204181209605 \tabularnewline
42 & 1904 & 1730.71686652484 & 173.283133475159 \tabularnewline
43 & 1997 & 1997.07322019510 & -0.0732201950963134 \tabularnewline
44 & 2207 & 2123.22856200728 & 83.771437992721 \tabularnewline
45 & 2453 & 2400.59192037953 & 52.4080796204722 \tabularnewline
46 & 1948 & 2311.69820348121 & -363.698203481213 \tabularnewline
47 & 1384 & 1792.34833058161 & -408.348330581613 \tabularnewline
48 & 1989 & 1937.61611384811 & 51.3838861518918 \tabularnewline
49 & 2140 & 2043.19076972737 & 96.8092302726257 \tabularnewline
50 & 2100 & 1986.42995871436 & 113.570041285643 \tabularnewline
51 & 2045 & 2081.40379585321 & -36.4037958532144 \tabularnewline
52 & 2083 & 1826.10256162405 & 256.897438375945 \tabularnewline
53 & 2022 & 2015.07442471790 & 6.92557528210368 \tabularnewline
54 & 1950 & 1907.35423197844 & 42.6457680215599 \tabularnewline
55 & 1422 & 2036.87438055028 & -614.87438055028 \tabularnewline
56 & 1859 & 1739.31700769181 & 119.682992308188 \tabularnewline
57 & 2147 & 2221.75418902772 & -74.754189027723 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57521&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]1643[/C][C]1722.96477037016[/C][C]-79.9647703701586[/C][/ROW]
[ROW][C]2[/C][C]1751[/C][C]1589.08661304563[/C][C]161.913386954369[/C][/ROW]
[ROW][C]3[/C][C]1797[/C][C]1796.28775025925[/C][C]0.712249740753709[/C][/ROW]
[ROW][C]4[/C][C]1373[/C][C]1610.60651323238[/C][C]-237.606513232383[/C][/ROW]
[ROW][C]5[/C][C]1558[/C][C]1458.61845353063[/C][C]99.3815464693688[/C][/ROW]
[ROW][C]6[/C][C]1555[/C][C]1562.99004569562[/C][C]-7.99004569562253[/C][/ROW]
[ROW][C]7[/C][C]2061[/C][C]1720.6230574533[/C][C]340.376942546699[/C][/ROW]
[ROW][C]8[/C][C]2010[/C][C]2141.92127025175[/C][C]-131.921270251755[/C][/ROW]
[ROW][C]9[/C][C]2119[/C][C]2201.50332894045[/C][C]-82.5033289404535[/C][/ROW]
[ROW][C]10[/C][C]1985[/C][C]2018.39981420638[/C][C]-33.3998142063823[/C][/ROW]
[ROW][C]11[/C][C]1963[/C][C]1790.54480148484[/C][C]172.455198515165[/C][/ROW]
[ROW][C]12[/C][C]2017[/C][C]2278.95936041143[/C][C]-261.959360411434[/C][/ROW]
[ROW][C]13[/C][C]1975[/C][C]1973.03080612904[/C][C]1.96919387095642[/C][/ROW]
[ROW][C]14[/C][C]1589[/C][C]1813.49799351640[/C][C]-224.497993516397[/C][/ROW]
[ROW][C]15[/C][C]1679[/C][C]1680.59735656644[/C][C]-1.59735656643769[/C][/ROW]
[ROW][C]16[/C][C]1392[/C][C]1566.67831863562[/C][C]-174.678318635619[/C][/ROW]
[ROW][C]17[/C][C]1511[/C][C]1502.75508666791[/C][C]8.24491333208958[/C][/ROW]
[ROW][C]18[/C][C]1449[/C][C]1550.83230943335[/C][C]-101.832309433348[/C][/ROW]
[ROW][C]19[/C][C]1767[/C][C]1663.38079700582[/C][C]103.619202994182[/C][/ROW]
[ROW][C]20[/C][C]1899[/C][C]1964.79160359679[/C][C]-65.7916035967924[/C][/ROW]
[ROW][C]21[/C][C]2179[/C][C]2173.24152692649[/C][C]5.75847307351248[/C][/ROW]
[ROW][C]22[/C][C]2217[/C][C]2096.85396988181[/C][C]120.146030118193[/C][/ROW]
[ROW][C]23[/C][C]2049[/C][C]1964.81678624939[/C][C]84.1832137506149[/C][/ROW]
[ROW][C]24[/C][C]2343[/C][C]2351.28818507440[/C][C]-8.2881850744033[/C][/ROW]
[ROW][C]25[/C][C]2175[/C][C]2208.56865966896[/C][C]-33.5686596689639[/C][/ROW]
[ROW][C]26[/C][C]1607[/C][C]1949.12085460457[/C][C]-342.120854604565[/C][/ROW]
[ROW][C]27[/C][C]1702[/C][C]1700.47843935263[/C][C]1.52156064736768[/C][/ROW]
[ROW][C]28[/C][C]1764[/C][C]1588.65857925039[/C][C]175.341420749611[/C][/ROW]
[ROW][C]29[/C][C]1766[/C][C]1775.34785387396[/C][C]-9.34785387395748[/C][/ROW]
[ROW][C]30[/C][C]1615[/C][C]1721.10654636775[/C][C]-106.106546367748[/C][/ROW]
[ROW][C]31[/C][C]1953[/C][C]1782.04854479551[/C][C]170.951455204494[/C][/ROW]
[ROW][C]32[/C][C]2091[/C][C]2096.74155645236[/C][C]-5.74155645236218[/C][/ROW]
[ROW][C]33[/C][C]2411[/C][C]2311.90903472581[/C][C]99.0909652741918[/C][/ROW]
[ROW][C]34[/C][C]2550[/C][C]2273.04801243060[/C][C]276.951987569403[/C][/ROW]
[ROW][C]35[/C][C]2351[/C][C]2199.29008168417[/C][C]151.709918315833[/C][/ROW]
[ROW][C]36[/C][C]2786[/C][C]2567.13634066605[/C][C]218.863659333945[/C][/ROW]
[ROW][C]37[/C][C]2525[/C][C]2510.24499410446[/C][C]14.7550058955406[/C][/ROW]
[ROW][C]38[/C][C]2474[/C][C]2182.86458011905[/C][C]291.135419880951[/C][/ROW]
[ROW][C]39[/C][C]2332[/C][C]2296.23265796847[/C][C]35.7673420315308[/C][/ROW]
[ROW][C]40[/C][C]1978[/C][C]1997.95402725755[/C][C]-19.9540272575546[/C][/ROW]
[ROW][C]41[/C][C]1789[/C][C]1894.20418120960[/C][C]-105.204181209605[/C][/ROW]
[ROW][C]42[/C][C]1904[/C][C]1730.71686652484[/C][C]173.283133475159[/C][/ROW]
[ROW][C]43[/C][C]1997[/C][C]1997.07322019510[/C][C]-0.0732201950963134[/C][/ROW]
[ROW][C]44[/C][C]2207[/C][C]2123.22856200728[/C][C]83.771437992721[/C][/ROW]
[ROW][C]45[/C][C]2453[/C][C]2400.59192037953[/C][C]52.4080796204722[/C][/ROW]
[ROW][C]46[/C][C]1948[/C][C]2311.69820348121[/C][C]-363.698203481213[/C][/ROW]
[ROW][C]47[/C][C]1384[/C][C]1792.34833058161[/C][C]-408.348330581613[/C][/ROW]
[ROW][C]48[/C][C]1989[/C][C]1937.61611384811[/C][C]51.3838861518918[/C][/ROW]
[ROW][C]49[/C][C]2140[/C][C]2043.19076972737[/C][C]96.8092302726257[/C][/ROW]
[ROW][C]50[/C][C]2100[/C][C]1986.42995871436[/C][C]113.570041285643[/C][/ROW]
[ROW][C]51[/C][C]2045[/C][C]2081.40379585321[/C][C]-36.4037958532144[/C][/ROW]
[ROW][C]52[/C][C]2083[/C][C]1826.10256162405[/C][C]256.897438375945[/C][/ROW]
[ROW][C]53[/C][C]2022[/C][C]2015.07442471790[/C][C]6.92557528210368[/C][/ROW]
[ROW][C]54[/C][C]1950[/C][C]1907.35423197844[/C][C]42.6457680215599[/C][/ROW]
[ROW][C]55[/C][C]1422[/C][C]2036.87438055028[/C][C]-614.87438055028[/C][/ROW]
[ROW][C]56[/C][C]1859[/C][C]1739.31700769181[/C][C]119.682992308188[/C][/ROW]
[ROW][C]57[/C][C]2147[/C][C]2221.75418902772[/C][C]-74.754189027723[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57521&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57521&T=4
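
The Interpolation (Forecast) column is the in-sample fitted value and the Residuals (Prediction Error) column is actual minus fitted, so the table can be rebuilt directly from a fitted model; a brief sketch, again assuming the hypothetical fit:
# Sketch: actuals, fitted values, and residuals of 'fit'
data.frame(
  index         = seq_along(fitted(fit)),
  actuals       = fitted(fit) + resid(fit),   # equals the observed aanbod series
  interpolation = fitted(fit),                # in-sample forecast
  residuals     = resid(fit))                 # prediction error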








Goldfeld-Quandt test for Heteroskedasticity
p-values	Alternative Hypothesis
breakpoint index	greater	2-sided	less
19	0.0565705249168277	0.113141049833655	0.943429475083172
20	0.0195890138288041	0.0391780276576082	0.980410986171196
21	0.00520366919487076	0.0104073383897415	0.99479633080513
22	0.00187726410208006	0.00375452820416013	0.99812273589792
23	0.00375666304374266	0.00751332608748531	0.996243336956257
24	0.00228057792597236	0.00456115585194473	0.997719422074028
25	0.00484807458406905	0.0096961491681381	0.99515192541593
26	0.0134908570307777	0.0269817140615554	0.986509142969222
27	0.00625721885149454	0.0125144377029891	0.993742781148505
28	0.0852496727107245	0.170499345421449	0.914750327289276
29	0.0535923759402655	0.107184751880531	0.946407624059735
30	0.0550186877984614	0.110037375596923	0.944981312201539
31	0.0388147160502103	0.0776294321004207	0.96118528394979
32	0.0910732798310168	0.182146559662034	0.908926720168983
33	0.059472449420021	0.118944898840042	0.94052755057998
34	0.0954474973829158	0.190894994765832	0.904552502617084
35	0.0964601311275386	0.192920262255077	0.903539868872461
36	0.136323391820261	0.272646783640522	0.86367660817974
37	0.0844450441021759	0.168890088204352	0.915554955897824
38	0.169334097177306	0.338668194354613	0.830665902822694

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
19 & 0.0565705249168277 & 0.113141049833655 & 0.943429475083172 \tabularnewline
20 & 0.0195890138288041 & 0.0391780276576082 & 0.980410986171196 \tabularnewline
21 & 0.00520366919487076 & 0.0104073383897415 & 0.99479633080513 \tabularnewline
22 & 0.00187726410208006 & 0.00375452820416013 & 0.99812273589792 \tabularnewline
23 & 0.00375666304374266 & 0.00751332608748531 & 0.996243336956257 \tabularnewline
24 & 0.00228057792597236 & 0.00456115585194473 & 0.997719422074028 \tabularnewline
25 & 0.00484807458406905 & 0.0096961491681381 & 0.99515192541593 \tabularnewline
26 & 0.0134908570307777 & 0.0269817140615554 & 0.986509142969222 \tabularnewline
27 & 0.00625721885149454 & 0.0125144377029891 & 0.993742781148505 \tabularnewline
28 & 0.0852496727107245 & 0.170499345421449 & 0.914750327289276 \tabularnewline
29 & 0.0535923759402655 & 0.107184751880531 & 0.946407624059735 \tabularnewline
30 & 0.0550186877984614 & 0.110037375596923 & 0.944981312201539 \tabularnewline
31 & 0.0388147160502103 & 0.0776294321004207 & 0.96118528394979 \tabularnewline
32 & 0.0910732798310168 & 0.182146559662034 & 0.908926720168983 \tabularnewline
33 & 0.059472449420021 & 0.118944898840042 & 0.94052755057998 \tabularnewline
34 & 0.0954474973829158 & 0.190894994765832 & 0.904552502617084 \tabularnewline
35 & 0.0964601311275386 & 0.192920262255077 & 0.903539868872461 \tabularnewline
36 & 0.136323391820261 & 0.272646783640522 & 0.86367660817974 \tabularnewline
37 & 0.0844450441021759 & 0.168890088204352 & 0.915554955897824 \tabularnewline
38 & 0.169334097177306 & 0.338668194354613 & 0.830665902822694 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57521&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]19[/C][C]0.0565705249168277[/C][C]0.113141049833655[/C][C]0.943429475083172[/C][/ROW]
[ROW][C]20[/C][C]0.0195890138288041[/C][C]0.0391780276576082[/C][C]0.980410986171196[/C][/ROW]
[ROW][C]21[/C][C]0.00520366919487076[/C][C]0.0104073383897415[/C][C]0.99479633080513[/C][/ROW]
[ROW][C]22[/C][C]0.00187726410208006[/C][C]0.00375452820416013[/C][C]0.99812273589792[/C][/ROW]
[ROW][C]23[/C][C]0.00375666304374266[/C][C]0.00751332608748531[/C][C]0.996243336956257[/C][/ROW]
[ROW][C]24[/C][C]0.00228057792597236[/C][C]0.00456115585194473[/C][C]0.997719422074028[/C][/ROW]
[ROW][C]25[/C][C]0.00484807458406905[/C][C]0.0096961491681381[/C][C]0.99515192541593[/C][/ROW]
[ROW][C]26[/C][C]0.0134908570307777[/C][C]0.0269817140615554[/C][C]0.986509142969222[/C][/ROW]
[ROW][C]27[/C][C]0.00625721885149454[/C][C]0.0125144377029891[/C][C]0.993742781148505[/C][/ROW]
[ROW][C]28[/C][C]0.0852496727107245[/C][C]0.170499345421449[/C][C]0.914750327289276[/C][/ROW]
[ROW][C]29[/C][C]0.0535923759402655[/C][C]0.107184751880531[/C][C]0.946407624059735[/C][/ROW]
[ROW][C]30[/C][C]0.0550186877984614[/C][C]0.110037375596923[/C][C]0.944981312201539[/C][/ROW]
[ROW][C]31[/C][C]0.0388147160502103[/C][C]0.0776294321004207[/C][C]0.96118528394979[/C][/ROW]
[ROW][C]32[/C][C]0.0910732798310168[/C][C]0.182146559662034[/C][C]0.908926720168983[/C][/ROW]
[ROW][C]33[/C][C]0.059472449420021[/C][C]0.118944898840042[/C][C]0.94052755057998[/C][/ROW]
[ROW][C]34[/C][C]0.0954474973829158[/C][C]0.190894994765832[/C][C]0.904552502617084[/C][/ROW]
[ROW][C]35[/C][C]0.0964601311275386[/C][C]0.192920262255077[/C][C]0.903539868872461[/C][/ROW]
[ROW][C]36[/C][C]0.136323391820261[/C][C]0.272646783640522[/C][C]0.86367660817974[/C][/ROW]
[ROW][C]37[/C][C]0.0844450441021759[/C][C]0.168890088204352[/C][C]0.915554955897824[/C][/ROW]
[ROW][C]38[/C][C]0.169334097177306[/C][C]0.338668194354613[/C][C]0.830665902822694[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57521&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57521&T=5
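
Each row of this table is a separate Goldfeld-Quandt test with the sample split at the indicated breakpoint; the module obtains the three p-values from gqtest() in the lmtest package (see the code below). A one-breakpoint sketch, with breakpoint 24 chosen arbitrarily and fit assumed as before:
# Sketch: Goldfeld-Quandt test at a single breakpoint, for each alternative hypothesis
library(lmtest)
sapply(c('greater', 'two.sided', 'less'),
       function(alt) gqtest(fit, point = 24, alternative = alt)$p.value)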








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description	# significant tests	% significant tests	OK/NOK
1% type I error level	4	0.2	NOK
5% type I error level	8	0.4	NOK
10% type I error level	9	0.45	NOK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 4 & 0.2 & NOK \tabularnewline
5% type I error level & 8 & 0.4 & NOK \tabularnewline
10% type I error level & 9 & 0.45 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57521&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]4[/C][C]0.2[/C][C]NOK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]8[/C][C]0.4[/C][C]NOK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]9[/C][C]0.45[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57521&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57521&T=6
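
The meta analysis simply counts how many of the 2-sided Goldfeld-Quandt p-values above fall below each nominal type I error level and reports NOK when that fraction exceeds the level itself; a compact sketch, where gq_p is assumed to hold the 2-sided p-values from the previous table:
# Sketch: count significant 2-sided GQ p-values at each nominal level
for (alpha in c(0.01, 0.05, 0.10)) {
  n_sig <- sum(gq_p < alpha)
  cat(100 * alpha, '% level:', n_sig, 'significant, fraction', n_sig / length(gq_p),
      if (n_sig / length(gq_p) < alpha) 'OK' else 'NOK', '\n')
}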




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
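The module code listed below is not self-contained: it expects the session parameters par1, par2, par3 and the user data matrix y to already be defined. A minimal sketch of those assignments (hypothetical, not part of the archived code):
# Sketch: session objects the module code below assumes to exist
par1 <- '1'                         # column number of the endogenous variable (aanbod)
par2 <- 'Include Monthly Dummies'   # seasonality option
par3 <- 'Linear Trend'              # deterministic trend option
# y: the user data matrix, with variables in rows and observations in columns
# (the module's line x <- t(y) transposes it)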
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)                                # user data has variables in rows; transpose so rows are observations
k <- length(x[1,])                       # number of variables
n <- length(x[,1])                       # number of observations
x1 <- cbind(x[,par1], x[,1:k!=par1])     # move the endogenous variable (column par1) to the first column
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {                     # note: the original 1:n-1 evaluates as 0:(n-1) in R
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1                   # dummy M_i equals 1 in months i, i+12, i+24, ... (month 12 is the reference)
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))                    # deterministic linear trend
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
kp3 <- k + 3                             # first admissible Goldfeld-Quandt breakpoint
nmkm3 <- n - k - 3                       # last admissible breakpoint
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3)) # p-values for the three alternative hypotheses
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}