Free Statistics

Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Fri, 20 Nov 2009 00:41:25 -0700
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2009/Nov/20/t1258702963azcaalt7gwjjguj.htm/, Retrieved Tue, 16 Apr 2024 16:00:32 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=57977, Retrieved Tue, 16 Apr 2024 16:00:32 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 170
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-      [Multiple Regression] [Q1 The Seatbeltlaw] [2007-11-14 19:27:43] [8cd6641b921d30ebe00b648d1481bba0]
- RMPD [Multiple Regression] [Seatbelt] [2009-11-12 14:03:14] [b98453cac15ba1066b407e146608df68]
- D    [Multiple Regression] [] [2009-11-20 07:41:25] [2b679e8ec54382eeb0ec0b6bb527570a] [Current]
Dataseries X:
100.03	2
100.25	1.8
99.6	2.7
100.16	2.3
100.49	1.9
99.72	2
100.14	2.3
98.48	2.8
100.38	2.4
101.45	2.3
98.42	2.7
98.6	2.7
100.06	2.9
98.62	3
100.84	2.2
100.02	2.3
97.95	2.8
98.32	2.8
98.27	2.8
97.22	2.2
99.28	2.6
100.38	2.8
99.02	2.5
100.32	2.4
99.81	2.3
100.6	1.9
101.19	1.7
100.47	2
101.77	2.1
102.32	1.7
102.39	1.8
101.16	1.8
100.63	1.8
101.48	1.3
101.44	1.3
100.09	1.3
100.7	1.2
100.78	1.4
99.81	2.2
98.45	2.9
98.49	3.1
97.48	3.5
97.91	3.6
96.94	4.4
98.53	4.1
96.82	5.1
95.76	5.8
95.27	5.9
97.32	5.4
96.68	5.5
97.87	4.8
97.42	3.2
97.94	2.7
99.52	2.1
100.99	1.9
99.92	0.6
101.97	0.7
101.58	-0.2
99.54	-1
100.83	-1.7
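
The series above has two tab-separated columns. A minimal sketch (not part of the original module) for loading it into R, assuming the first column is the dependent variable Y and the second the regressor X (consistent with par1 = 1 in the parameters below), and assuming a hypothetical file name 'dataseries.txt':

# read the two tab-separated columns; the file name and column labels are illustrative only
dat <- read.table('dataseries.txt', sep = '\t', col.names = c('Y', 'X'))
head(dat)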




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Gwilym Jenkins' @ 72.249.127.135

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 5 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ 72.249.127.135 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57977&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]5 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Gwilym Jenkins' @ 72.249.127.135[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57977&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57977&T=0








Multiple Linear Regression - Estimated Regression Equation
Y[t] = + 100.937861512697 -0.903708260706107X[t] + 1.14037328685191M1[t] + 0.906224956423668M2[t] + 1.38222495642367M3[t] + 0.661557469496567M4[t] + 0.667483304282443M5[t] + 0.721112478211832M6[t] + 1.2433349738542M7[t] -0.0611100174305318M8[t] + 1.31674165214122M9[t] + 1.44651915649886M10[t] -0.0594808435011405M11[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
Y[t] =  +  100.937861512697 -0.903708260706107X[t] +  1.14037328685191M1[t] +  0.906224956423668M2[t] +  1.38222495642367M3[t] +  0.661557469496567M4[t] +  0.667483304282443M5[t] +  0.721112478211832M6[t] +  1.2433349738542M7[t] -0.0611100174305318M8[t] +  1.31674165214122M9[t] +  1.44651915649886M10[t] -0.0594808435011405M11[t]  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57977&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]Y[t] =  +  100.937861512697 -0.903708260706107X[t] +  1.14037328685191M1[t] +  0.906224956423668M2[t] +  1.38222495642367M3[t] +  0.661557469496567M4[t] +  0.667483304282443M5[t] +  0.721112478211832M6[t] +  1.2433349738542M7[t] -0.0611100174305318M8[t] +  1.31674165214122M9[t] +  1.44651915649886M10[t] -0.0594808435011405M11[t]  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57977&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57977&T=1
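
As a quick check, the first fitted value in the "Actuals, Interpolation, and Residuals" table further below can be reproduced directly from this equation: observation 1 has X = 2 and falls in month 1, so M1 = 1 and all other dummies are 0.

# intercept + slope * X + month-1 dummy effect
100.937861512697 - 0.903708260706107*2 + 1.14037328685191
# [1] 100.2708  (the reported interpolation for t = 1 is 100.270818278137)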








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	100.937861512697	0.503342	200.5355	0	0
X	-0.903708260706107	0.09529	-9.4837	0	0
M1	1.14037328685191	0.654832	1.7415	0.088143	0.044071
M2	0.906224956423668	0.654488	1.3846	0.172703	0.086352
M3	1.38222495642367	0.654488	2.1119	0.040037	0.020018
M4	0.661557469496567	0.653213	1.0128	0.316353	0.158177
M5	0.667483304282443	0.653099	1.022	0.312001	0.156001
M6	0.721112478211832	0.652612	1.105	0.2748	0.1374
M7	1.2433349738542	0.652887	1.9044	0.062994	0.031497
M8	-0.0611100174305318	0.652387	-0.0937	0.925768	0.462884
M9	1.31674165214122	0.652264	2.0187	0.049242	0.024621
M10	1.44651915649886	0.652122	2.2182	0.031412	0.015706
M11	-0.0594808435011405	0.652122	-0.0912	0.927712	0.463856

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 100.937861512697 & 0.503342 & 200.5355 & 0 & 0 \tabularnewline
X & -0.903708260706107 & 0.09529 & -9.4837 & 0 & 0 \tabularnewline
M1 & 1.14037328685191 & 0.654832 & 1.7415 & 0.088143 & 0.044071 \tabularnewline
M2 & 0.906224956423668 & 0.654488 & 1.3846 & 0.172703 & 0.086352 \tabularnewline
M3 & 1.38222495642367 & 0.654488 & 2.1119 & 0.040037 & 0.020018 \tabularnewline
M4 & 0.661557469496567 & 0.653213 & 1.0128 & 0.316353 & 0.158177 \tabularnewline
M5 & 0.667483304282443 & 0.653099 & 1.022 & 0.312001 & 0.156001 \tabularnewline
M6 & 0.721112478211832 & 0.652612 & 1.105 & 0.2748 & 0.1374 \tabularnewline
M7 & 1.2433349738542 & 0.652887 & 1.9044 & 0.062994 & 0.031497 \tabularnewline
M8 & -0.0611100174305318 & 0.652387 & -0.0937 & 0.925768 & 0.462884 \tabularnewline
M9 & 1.31674165214122 & 0.652264 & 2.0187 & 0.049242 & 0.024621 \tabularnewline
M10 & 1.44651915649886 & 0.652122 & 2.2182 & 0.031412 & 0.015706 \tabularnewline
M11 & -0.0594808435011405 & 0.652122 & -0.0912 & 0.927712 & 0.463856 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57977&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]100.937861512697[/C][C]0.503342[/C][C]200.5355[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]X[/C][C]-0.903708260706107[/C][C]0.09529[/C][C]-9.4837[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]M1[/C][C]1.14037328685191[/C][C]0.654832[/C][C]1.7415[/C][C]0.088143[/C][C]0.044071[/C][/ROW]
[ROW][C]M2[/C][C]0.906224956423668[/C][C]0.654488[/C][C]1.3846[/C][C]0.172703[/C][C]0.086352[/C][/ROW]
[ROW][C]M3[/C][C]1.38222495642367[/C][C]0.654488[/C][C]2.1119[/C][C]0.040037[/C][C]0.020018[/C][/ROW]
[ROW][C]M4[/C][C]0.661557469496567[/C][C]0.653213[/C][C]1.0128[/C][C]0.316353[/C][C]0.158177[/C][/ROW]
[ROW][C]M5[/C][C]0.667483304282443[/C][C]0.653099[/C][C]1.022[/C][C]0.312001[/C][C]0.156001[/C][/ROW]
[ROW][C]M6[/C][C]0.721112478211832[/C][C]0.652612[/C][C]1.105[/C][C]0.2748[/C][C]0.1374[/C][/ROW]
[ROW][C]M7[/C][C]1.2433349738542[/C][C]0.652887[/C][C]1.9044[/C][C]0.062994[/C][C]0.031497[/C][/ROW]
[ROW][C]M8[/C][C]-0.0611100174305318[/C][C]0.652387[/C][C]-0.0937[/C][C]0.925768[/C][C]0.462884[/C][/ROW]
[ROW][C]M9[/C][C]1.31674165214122[/C][C]0.652264[/C][C]2.0187[/C][C]0.049242[/C][C]0.024621[/C][/ROW]
[ROW][C]M10[/C][C]1.44651915649886[/C][C]0.652122[/C][C]2.2182[/C][C]0.031412[/C][C]0.015706[/C][/ROW]
[ROW][C]M11[/C][C]-0.0594808435011405[/C][C]0.652122[/C][C]-0.0912[/C][C]0.927712[/C][C]0.463856[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57977&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57977&T=2
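
In this table the T-STAT is the parameter estimate divided by its S.D., and the 1-tail p-value is half of the 2-tail p-value (the module's R code below derives the 1-tail value exactly that way). A minimal check for the X coefficient, assuming the 47 residual degrees of freedom reported in the F-test table:

t_x <- -0.903708260706107 / 0.09529                      # about -9.48, the reported T-STAT
p_two <- 2 * pt(abs(t_x), df = 47, lower.tail = FALSE)   # essentially 0, as reported
p_one <- p_two / 2                                       # corresponding 1-tail p-value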








Multiple Linear Regression - Regression Statistics
Multiple R	0.828776724306222
R-squared	0.68687085875175
Adjusted R-squared	0.606922992901134
F-TEST (value)	8.5914846061704
F-TEST (DF numerator)	12
F-TEST (DF denominator)	47
p-value	2.56587797675678e-08
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	1.03087971742847
Sum Squared Residuals	49.9475106148534

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.828776724306222 \tabularnewline
R-squared & 0.68687085875175 \tabularnewline
Adjusted R-squared & 0.606922992901134 \tabularnewline
F-TEST (value) & 8.5914846061704 \tabularnewline
F-TEST (DF numerator) & 12 \tabularnewline
F-TEST (DF denominator) & 47 \tabularnewline
p-value & 2.56587797675678e-08 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 1.03087971742847 \tabularnewline
Sum Squared Residuals & 49.9475106148534 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57977&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.828776724306222[/C][/ROW]
[ROW][C]R-squared[/C][C]0.68687085875175[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.606922992901134[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]8.5914846061704[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]12[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]47[/C][/ROW]
[ROW][C]p-value[/C][C]2.56587797675678e-08[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]1.03087971742847[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]49.9475106148534[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57977&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57977&T=3
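
These statistics are simple transformations of one another; the snippet below reproduces them from the reported values, using n = 60 observations, 12 non-intercept regressors and 47 residual degrees of freedom:

sqrt(0.68687085875175)                                  # Multiple R = sqrt(R-squared), about 0.8288
1 - (1 - 0.68687085875175) * (60 - 1) / (60 - 12 - 1)   # Adjusted R-squared, about 0.6069
pf(8.5914846061704, 12, 47, lower.tail = FALSE)         # F-test p-value, about 2.57e-08
sqrt(49.9475106148534 / 47)                             # Residual Standard Deviation, about 1.0309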








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	100.03	100.270818278137	-0.24081827813664
2	100.25	100.217411599850	0.0325884001503753
3	99.6	99.880074165214	-0.280074165214128
4	100.16	99.5208899825695	0.639110017430531
5	100.49	99.8882991216378	0.60170087836221
6	99.72	99.8515574694966	-0.131557469496564
7	100.14	100.102667486927	0.0373325130729031
8	98.48	98.3463683652893	0.133631634710691
9	100.38	100.085703339144	0.294296660856486
10	101.45	100.305851669572	1.14414833042825
11	98.42	98.4383683652893	-0.0183683652893126
12	98.6	98.4978492087905	0.102150791209539
13	100.06	99.4574808435011	0.602519156498856
14	98.62	99.1329616870023	-0.512961687002286
15	100.84	100.331928295567	0.508071704432827
16	100.02	99.5208899825695	0.499110017430531
17	97.95	99.0749616870023	-1.12496168700228
18	98.32	99.1285908609317	-0.808590860931683
19	98.27	99.650813356574	-1.38081335657405
20	97.22	98.888593321713	-1.66859332171298
21	99.28	99.9049616870023	-0.624961687002287
22	100.38	99.8539975392187	0.526002460781295
23	99.02	98.6191100174305	0.40088998256946
24	100.32	98.7689616870023	1.55103831299771
25	99.81	99.9997057999248	-0.189705799924808
26	100.6	100.127040773779	0.472959226220986
27	101.19	100.783782425920	0.406217574079767
28	100.47	99.7920024607813	0.677997539218702
29	101.77	99.7075574694966	2.06244253050343
30	102.32	100.122669947708	2.1973300522916
31	102.39	100.554521617280	1.83547838271985
32	101.16	99.2500766259954	1.90992337400458
33	100.63	100.627928295567	0.00207170443282080
34	101.48	101.209559930278	0.270440069722142
35	101.44	99.7035599302779	1.73644006972213
36	100.09	99.763040773779	0.326959226220998
37	100.7	100.993784886702	-0.293784886701526
38	100.78	100.578894904132	0.201105095867939
39	99.81	100.331928295567	-0.521928295567174
40	98.45	98.9786650261458	-0.528665026145798
41	98.49	98.8038492087905	-0.31384920879046
42	97.48	98.4959950784374	-1.01599507843740
43	97.91	98.9278467480092	-1.01784674800916
44	96.94	96.9004351481595	0.0395648518404578
45	98.53	98.5493992959431	-0.0193992959431265
46	96.82	97.7754685395947	-0.95546853959466
47	95.76	95.6368727571004	0.123127242899625
48	95.27	95.605982774531	-0.335982774530914
49	97.32	97.1982101917359	0.121789808264117
50	96.68	96.873691035237	-0.193691035237014
51	97.87	97.9822868177313	-0.112286817731292
52	97.42	98.707552547934	-1.28755254793397
53	97.94	99.1653325130729	-1.2253325130729
54	99.52	99.761186643426	-0.241186643425955
55	100.99	100.464150791210	0.525849208790454
56	99.92	100.334526538843	-0.414526538842747
57	101.97	101.622007382344	0.347992617656106
58	101.58	102.565122321337	-0.985122321337025
59	99.54	101.782088929902	-2.24208892990191
60	100.83	102.474165555897	-1.64416555589733

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 100.03 & 100.270818278137 & -0.24081827813664 \tabularnewline
2 & 100.25 & 100.217411599850 & 0.0325884001503753 \tabularnewline
3 & 99.6 & 99.880074165214 & -0.280074165214128 \tabularnewline
4 & 100.16 & 99.5208899825695 & 0.639110017430531 \tabularnewline
5 & 100.49 & 99.8882991216378 & 0.60170087836221 \tabularnewline
6 & 99.72 & 99.8515574694966 & -0.131557469496564 \tabularnewline
7 & 100.14 & 100.102667486927 & 0.0373325130729031 \tabularnewline
8 & 98.48 & 98.3463683652893 & 0.133631634710691 \tabularnewline
9 & 100.38 & 100.085703339144 & 0.294296660856486 \tabularnewline
10 & 101.45 & 100.305851669572 & 1.14414833042825 \tabularnewline
11 & 98.42 & 98.4383683652893 & -0.0183683652893126 \tabularnewline
12 & 98.6 & 98.4978492087905 & 0.102150791209539 \tabularnewline
13 & 100.06 & 99.4574808435011 & 0.602519156498856 \tabularnewline
14 & 98.62 & 99.1329616870023 & -0.512961687002286 \tabularnewline
15 & 100.84 & 100.331928295567 & 0.508071704432827 \tabularnewline
16 & 100.02 & 99.5208899825695 & 0.499110017430531 \tabularnewline
17 & 97.95 & 99.0749616870023 & -1.12496168700228 \tabularnewline
18 & 98.32 & 99.1285908609317 & -0.808590860931683 \tabularnewline
19 & 98.27 & 99.650813356574 & -1.38081335657405 \tabularnewline
20 & 97.22 & 98.888593321713 & -1.66859332171298 \tabularnewline
21 & 99.28 & 99.9049616870023 & -0.624961687002287 \tabularnewline
22 & 100.38 & 99.8539975392187 & 0.526002460781295 \tabularnewline
23 & 99.02 & 98.6191100174305 & 0.40088998256946 \tabularnewline
24 & 100.32 & 98.7689616870023 & 1.55103831299771 \tabularnewline
25 & 99.81 & 99.9997057999248 & -0.189705799924808 \tabularnewline
26 & 100.6 & 100.127040773779 & 0.472959226220986 \tabularnewline
27 & 101.19 & 100.783782425920 & 0.406217574079767 \tabularnewline
28 & 100.47 & 99.7920024607813 & 0.677997539218702 \tabularnewline
29 & 101.77 & 99.7075574694966 & 2.06244253050343 \tabularnewline
30 & 102.32 & 100.122669947708 & 2.1973300522916 \tabularnewline
31 & 102.39 & 100.554521617280 & 1.83547838271985 \tabularnewline
32 & 101.16 & 99.2500766259954 & 1.90992337400458 \tabularnewline
33 & 100.63 & 100.627928295567 & 0.00207170443282080 \tabularnewline
34 & 101.48 & 101.209559930278 & 0.270440069722142 \tabularnewline
35 & 101.44 & 99.7035599302779 & 1.73644006972213 \tabularnewline
36 & 100.09 & 99.763040773779 & 0.326959226220998 \tabularnewline
37 & 100.7 & 100.993784886702 & -0.293784886701526 \tabularnewline
38 & 100.78 & 100.578894904132 & 0.201105095867939 \tabularnewline
39 & 99.81 & 100.331928295567 & -0.521928295567174 \tabularnewline
40 & 98.45 & 98.9786650261458 & -0.528665026145798 \tabularnewline
41 & 98.49 & 98.8038492087905 & -0.31384920879046 \tabularnewline
42 & 97.48 & 98.4959950784374 & -1.01599507843740 \tabularnewline
43 & 97.91 & 98.9278467480092 & -1.01784674800916 \tabularnewline
44 & 96.94 & 96.9004351481595 & 0.0395648518404578 \tabularnewline
45 & 98.53 & 98.5493992959431 & -0.0193992959431265 \tabularnewline
46 & 96.82 & 97.7754685395947 & -0.95546853959466 \tabularnewline
47 & 95.76 & 95.6368727571004 & 0.123127242899625 \tabularnewline
48 & 95.27 & 95.605982774531 & -0.335982774530914 \tabularnewline
49 & 97.32 & 97.1982101917359 & 0.121789808264117 \tabularnewline
50 & 96.68 & 96.873691035237 & -0.193691035237014 \tabularnewline
51 & 97.87 & 97.9822868177313 & -0.112286817731292 \tabularnewline
52 & 97.42 & 98.707552547934 & -1.28755254793397 \tabularnewline
53 & 97.94 & 99.1653325130729 & -1.2253325130729 \tabularnewline
54 & 99.52 & 99.761186643426 & -0.241186643425955 \tabularnewline
55 & 100.99 & 100.464150791210 & 0.525849208790454 \tabularnewline
56 & 99.92 & 100.334526538843 & -0.414526538842747 \tabularnewline
57 & 101.97 & 101.622007382344 & 0.347992617656106 \tabularnewline
58 & 101.58 & 102.565122321337 & -0.985122321337025 \tabularnewline
59 & 99.54 & 101.782088929902 & -2.24208892990191 \tabularnewline
60 & 100.83 & 102.474165555897 & -1.64416555589733 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57977&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]100.03[/C][C]100.270818278137[/C][C]-0.24081827813664[/C][/ROW]
[ROW][C]2[/C][C]100.25[/C][C]100.217411599850[/C][C]0.0325884001503753[/C][/ROW]
[ROW][C]3[/C][C]99.6[/C][C]99.880074165214[/C][C]-0.280074165214128[/C][/ROW]
[ROW][C]4[/C][C]100.16[/C][C]99.5208899825695[/C][C]0.639110017430531[/C][/ROW]
[ROW][C]5[/C][C]100.49[/C][C]99.8882991216378[/C][C]0.60170087836221[/C][/ROW]
[ROW][C]6[/C][C]99.72[/C][C]99.8515574694966[/C][C]-0.131557469496564[/C][/ROW]
[ROW][C]7[/C][C]100.14[/C][C]100.102667486927[/C][C]0.0373325130729031[/C][/ROW]
[ROW][C]8[/C][C]98.48[/C][C]98.3463683652893[/C][C]0.133631634710691[/C][/ROW]
[ROW][C]9[/C][C]100.38[/C][C]100.085703339144[/C][C]0.294296660856486[/C][/ROW]
[ROW][C]10[/C][C]101.45[/C][C]100.305851669572[/C][C]1.14414833042825[/C][/ROW]
[ROW][C]11[/C][C]98.42[/C][C]98.4383683652893[/C][C]-0.0183683652893126[/C][/ROW]
[ROW][C]12[/C][C]98.6[/C][C]98.4978492087905[/C][C]0.102150791209539[/C][/ROW]
[ROW][C]13[/C][C]100.06[/C][C]99.4574808435011[/C][C]0.602519156498856[/C][/ROW]
[ROW][C]14[/C][C]98.62[/C][C]99.1329616870023[/C][C]-0.512961687002286[/C][/ROW]
[ROW][C]15[/C][C]100.84[/C][C]100.331928295567[/C][C]0.508071704432827[/C][/ROW]
[ROW][C]16[/C][C]100.02[/C][C]99.5208899825695[/C][C]0.499110017430531[/C][/ROW]
[ROW][C]17[/C][C]97.95[/C][C]99.0749616870023[/C][C]-1.12496168700228[/C][/ROW]
[ROW][C]18[/C][C]98.32[/C][C]99.1285908609317[/C][C]-0.808590860931683[/C][/ROW]
[ROW][C]19[/C][C]98.27[/C][C]99.650813356574[/C][C]-1.38081335657405[/C][/ROW]
[ROW][C]20[/C][C]97.22[/C][C]98.888593321713[/C][C]-1.66859332171298[/C][/ROW]
[ROW][C]21[/C][C]99.28[/C][C]99.9049616870023[/C][C]-0.624961687002287[/C][/ROW]
[ROW][C]22[/C][C]100.38[/C][C]99.8539975392187[/C][C]0.526002460781295[/C][/ROW]
[ROW][C]23[/C][C]99.02[/C][C]98.6191100174305[/C][C]0.40088998256946[/C][/ROW]
[ROW][C]24[/C][C]100.32[/C][C]98.7689616870023[/C][C]1.55103831299771[/C][/ROW]
[ROW][C]25[/C][C]99.81[/C][C]99.9997057999248[/C][C]-0.189705799924808[/C][/ROW]
[ROW][C]26[/C][C]100.6[/C][C]100.127040773779[/C][C]0.472959226220986[/C][/ROW]
[ROW][C]27[/C][C]101.19[/C][C]100.783782425920[/C][C]0.406217574079767[/C][/ROW]
[ROW][C]28[/C][C]100.47[/C][C]99.7920024607813[/C][C]0.677997539218702[/C][/ROW]
[ROW][C]29[/C][C]101.77[/C][C]99.7075574694966[/C][C]2.06244253050343[/C][/ROW]
[ROW][C]30[/C][C]102.32[/C][C]100.122669947708[/C][C]2.1973300522916[/C][/ROW]
[ROW][C]31[/C][C]102.39[/C][C]100.554521617280[/C][C]1.83547838271985[/C][/ROW]
[ROW][C]32[/C][C]101.16[/C][C]99.2500766259954[/C][C]1.90992337400458[/C][/ROW]
[ROW][C]33[/C][C]100.63[/C][C]100.627928295567[/C][C]0.00207170443282080[/C][/ROW]
[ROW][C]34[/C][C]101.48[/C][C]101.209559930278[/C][C]0.270440069722142[/C][/ROW]
[ROW][C]35[/C][C]101.44[/C][C]99.7035599302779[/C][C]1.73644006972213[/C][/ROW]
[ROW][C]36[/C][C]100.09[/C][C]99.763040773779[/C][C]0.326959226220998[/C][/ROW]
[ROW][C]37[/C][C]100.7[/C][C]100.993784886702[/C][C]-0.293784886701526[/C][/ROW]
[ROW][C]38[/C][C]100.78[/C][C]100.578894904132[/C][C]0.201105095867939[/C][/ROW]
[ROW][C]39[/C][C]99.81[/C][C]100.331928295567[/C][C]-0.521928295567174[/C][/ROW]
[ROW][C]40[/C][C]98.45[/C][C]98.9786650261458[/C][C]-0.528665026145798[/C][/ROW]
[ROW][C]41[/C][C]98.49[/C][C]98.8038492087905[/C][C]-0.31384920879046[/C][/ROW]
[ROW][C]42[/C][C]97.48[/C][C]98.4959950784374[/C][C]-1.01599507843740[/C][/ROW]
[ROW][C]43[/C][C]97.91[/C][C]98.9278467480092[/C][C]-1.01784674800916[/C][/ROW]
[ROW][C]44[/C][C]96.94[/C][C]96.9004351481595[/C][C]0.0395648518404578[/C][/ROW]
[ROW][C]45[/C][C]98.53[/C][C]98.5493992959431[/C][C]-0.0193992959431265[/C][/ROW]
[ROW][C]46[/C][C]96.82[/C][C]97.7754685395947[/C][C]-0.95546853959466[/C][/ROW]
[ROW][C]47[/C][C]95.76[/C][C]95.6368727571004[/C][C]0.123127242899625[/C][/ROW]
[ROW][C]48[/C][C]95.27[/C][C]95.605982774531[/C][C]-0.335982774530914[/C][/ROW]
[ROW][C]49[/C][C]97.32[/C][C]97.1982101917359[/C][C]0.121789808264117[/C][/ROW]
[ROW][C]50[/C][C]96.68[/C][C]96.873691035237[/C][C]-0.193691035237014[/C][/ROW]
[ROW][C]51[/C][C]97.87[/C][C]97.9822868177313[/C][C]-0.112286817731292[/C][/ROW]
[ROW][C]52[/C][C]97.42[/C][C]98.707552547934[/C][C]-1.28755254793397[/C][/ROW]
[ROW][C]53[/C][C]97.94[/C][C]99.1653325130729[/C][C]-1.2253325130729[/C][/ROW]
[ROW][C]54[/C][C]99.52[/C][C]99.761186643426[/C][C]-0.241186643425955[/C][/ROW]
[ROW][C]55[/C][C]100.99[/C][C]100.464150791210[/C][C]0.525849208790454[/C][/ROW]
[ROW][C]56[/C][C]99.92[/C][C]100.334526538843[/C][C]-0.414526538842747[/C][/ROW]
[ROW][C]57[/C][C]101.97[/C][C]101.622007382344[/C][C]0.347992617656106[/C][/ROW]
[ROW][C]58[/C][C]101.58[/C][C]102.565122321337[/C][C]-0.985122321337025[/C][/ROW]
[ROW][C]59[/C][C]99.54[/C][C]101.782088929902[/C][C]-2.24208892990191[/C][/ROW]
[ROW][C]60[/C][C]100.83[/C][C]102.474165555897[/C][C]-1.64416555589733[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57977&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57977&T=4
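
Each residual (prediction error) in this table is simply the actual value minus the interpolation (fitted value); in the R code below these are mysum$resid[i] and x[i] - mysum$resid[i]. For example, for t = 3:

99.6 - 99.880074165214   # -0.280074165214, the residual reported for t = 3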








Goldfeld-Quandt test for Heteroskedasticity
p-values	Alternative Hypothesis
breakpoint index	greater	2-sided	less
16	0.093377256290533	0.186754512581066	0.906622743709467
17	0.128343192461834	0.256686384923668	0.871656807538166
18	0.0609806336679589	0.121961267335918	0.93901936633204
19	0.05667594053722	0.11335188107444	0.94332405946278
20	0.167555224161334	0.335110448322668	0.832444775838666
21	0.119938374781386	0.239876749562772	0.880061625218614
22	0.0757757315049425	0.151551463009885	0.924224268495057
23	0.0432512796556601	0.0865025593113203	0.95674872034434
24	0.0527618225758892	0.105523645151778	0.94723817742411
25	0.0312898219827941	0.0625796439655881	0.968710178017206
26	0.0178183108396721	0.0356366216793442	0.982181689160328
27	0.00945218629562	0.01890437259124	0.99054781370438
28	0.00591539844874431	0.0118307968974886	0.994084601551256
29	0.0471966389344575	0.094393277868915	0.952803361065543
30	0.181854678661941	0.363709357323881	0.81814532133806
31	0.301158267732361	0.602316535464722	0.698841732267639
32	0.473421265777375	0.94684253155475	0.526578734222624
33	0.39652107770793	0.79304215541586	0.60347892229207
34	0.493648509761575	0.98729701952315	0.506351490238425
35	0.831659113601575	0.33668177279685	0.168340886398425
36	0.899017117404902	0.201965765190196	0.100982882595098
37	0.861632025541726	0.276735948916549	0.138367974458274
38	0.830357879625222	0.339284240749556	0.169642120374778
39	0.751528170692735	0.49694365861453	0.248471829307265
40	0.691369985331837	0.617260029336326	0.308630014668163
41	0.620353882613647	0.759292234772705	0.379646117386353
42	0.559848203396805	0.88030359320639	0.440151796603195
43	0.732549255768553	0.534901488462894	0.267450744231447
44	0.591623034770257	0.816753930459486	0.408376965229743

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
16 & 0.093377256290533 & 0.186754512581066 & 0.906622743709467 \tabularnewline
17 & 0.128343192461834 & 0.256686384923668 & 0.871656807538166 \tabularnewline
18 & 0.0609806336679589 & 0.121961267335918 & 0.93901936633204 \tabularnewline
19 & 0.05667594053722 & 0.11335188107444 & 0.94332405946278 \tabularnewline
20 & 0.167555224161334 & 0.335110448322668 & 0.832444775838666 \tabularnewline
21 & 0.119938374781386 & 0.239876749562772 & 0.880061625218614 \tabularnewline
22 & 0.0757757315049425 & 0.151551463009885 & 0.924224268495057 \tabularnewline
23 & 0.0432512796556601 & 0.0865025593113203 & 0.95674872034434 \tabularnewline
24 & 0.0527618225758892 & 0.105523645151778 & 0.94723817742411 \tabularnewline
25 & 0.0312898219827941 & 0.0625796439655881 & 0.968710178017206 \tabularnewline
26 & 0.0178183108396721 & 0.0356366216793442 & 0.982181689160328 \tabularnewline
27 & 0.00945218629562 & 0.01890437259124 & 0.99054781370438 \tabularnewline
28 & 0.00591539844874431 & 0.0118307968974886 & 0.994084601551256 \tabularnewline
29 & 0.0471966389344575 & 0.094393277868915 & 0.952803361065543 \tabularnewline
30 & 0.181854678661941 & 0.363709357323881 & 0.81814532133806 \tabularnewline
31 & 0.301158267732361 & 0.602316535464722 & 0.698841732267639 \tabularnewline
32 & 0.473421265777375 & 0.94684253155475 & 0.526578734222624 \tabularnewline
33 & 0.39652107770793 & 0.79304215541586 & 0.60347892229207 \tabularnewline
34 & 0.493648509761575 & 0.98729701952315 & 0.506351490238425 \tabularnewline
35 & 0.831659113601575 & 0.33668177279685 & 0.168340886398425 \tabularnewline
36 & 0.899017117404902 & 0.201965765190196 & 0.100982882595098 \tabularnewline
37 & 0.861632025541726 & 0.276735948916549 & 0.138367974458274 \tabularnewline
38 & 0.830357879625222 & 0.339284240749556 & 0.169642120374778 \tabularnewline
39 & 0.751528170692735 & 0.49694365861453 & 0.248471829307265 \tabularnewline
40 & 0.691369985331837 & 0.617260029336326 & 0.308630014668163 \tabularnewline
41 & 0.620353882613647 & 0.759292234772705 & 0.379646117386353 \tabularnewline
42 & 0.559848203396805 & 0.88030359320639 & 0.440151796603195 \tabularnewline
43 & 0.732549255768553 & 0.534901488462894 & 0.267450744231447 \tabularnewline
44 & 0.591623034770257 & 0.816753930459486 & 0.408376965229743 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57977&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]16[/C][C]0.093377256290533[/C][C]0.186754512581066[/C][C]0.906622743709467[/C][/ROW]
[ROW][C]17[/C][C]0.128343192461834[/C][C]0.256686384923668[/C][C]0.871656807538166[/C][/ROW]
[ROW][C]18[/C][C]0.0609806336679589[/C][C]0.121961267335918[/C][C]0.93901936633204[/C][/ROW]
[ROW][C]19[/C][C]0.05667594053722[/C][C]0.11335188107444[/C][C]0.94332405946278[/C][/ROW]
[ROW][C]20[/C][C]0.167555224161334[/C][C]0.335110448322668[/C][C]0.832444775838666[/C][/ROW]
[ROW][C]21[/C][C]0.119938374781386[/C][C]0.239876749562772[/C][C]0.880061625218614[/C][/ROW]
[ROW][C]22[/C][C]0.0757757315049425[/C][C]0.151551463009885[/C][C]0.924224268495057[/C][/ROW]
[ROW][C]23[/C][C]0.0432512796556601[/C][C]0.0865025593113203[/C][C]0.95674872034434[/C][/ROW]
[ROW][C]24[/C][C]0.0527618225758892[/C][C]0.105523645151778[/C][C]0.94723817742411[/C][/ROW]
[ROW][C]25[/C][C]0.0312898219827941[/C][C]0.0625796439655881[/C][C]0.968710178017206[/C][/ROW]
[ROW][C]26[/C][C]0.0178183108396721[/C][C]0.0356366216793442[/C][C]0.982181689160328[/C][/ROW]
[ROW][C]27[/C][C]0.00945218629562[/C][C]0.01890437259124[/C][C]0.99054781370438[/C][/ROW]
[ROW][C]28[/C][C]0.00591539844874431[/C][C]0.0118307968974886[/C][C]0.994084601551256[/C][/ROW]
[ROW][C]29[/C][C]0.0471966389344575[/C][C]0.094393277868915[/C][C]0.952803361065543[/C][/ROW]
[ROW][C]30[/C][C]0.181854678661941[/C][C]0.363709357323881[/C][C]0.81814532133806[/C][/ROW]
[ROW][C]31[/C][C]0.301158267732361[/C][C]0.602316535464722[/C][C]0.698841732267639[/C][/ROW]
[ROW][C]32[/C][C]0.473421265777375[/C][C]0.94684253155475[/C][C]0.526578734222624[/C][/ROW]
[ROW][C]33[/C][C]0.39652107770793[/C][C]0.79304215541586[/C][C]0.60347892229207[/C][/ROW]
[ROW][C]34[/C][C]0.493648509761575[/C][C]0.98729701952315[/C][C]0.506351490238425[/C][/ROW]
[ROW][C]35[/C][C]0.831659113601575[/C][C]0.33668177279685[/C][C]0.168340886398425[/C][/ROW]
[ROW][C]36[/C][C]0.899017117404902[/C][C]0.201965765190196[/C][C]0.100982882595098[/C][/ROW]
[ROW][C]37[/C][C]0.861632025541726[/C][C]0.276735948916549[/C][C]0.138367974458274[/C][/ROW]
[ROW][C]38[/C][C]0.830357879625222[/C][C]0.339284240749556[/C][C]0.169642120374778[/C][/ROW]
[ROW][C]39[/C][C]0.751528170692735[/C][C]0.49694365861453[/C][C]0.248471829307265[/C][/ROW]
[ROW][C]40[/C][C]0.691369985331837[/C][C]0.617260029336326[/C][C]0.308630014668163[/C][/ROW]
[ROW][C]41[/C][C]0.620353882613647[/C][C]0.759292234772705[/C][C]0.379646117386353[/C][/ROW]
[ROW][C]42[/C][C]0.559848203396805[/C][C]0.88030359320639[/C][C]0.440151796603195[/C][/ROW]
[ROW][C]43[/C][C]0.732549255768553[/C][C]0.534901488462894[/C][C]0.267450744231447[/C][/ROW]
[ROW][C]44[/C][C]0.591623034770257[/C][C]0.816753930459486[/C][C]0.408376965229743[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57977&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57977&T=5
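
Each row of this table corresponds to one call to gqtest() from the lmtest package, splitting the sample at the given breakpoint (see the loop in the R code below). A single test can be reproduced as follows, assuming the fitted model object mylm from that code is available:

library(lmtest)
# breakpoint 28 gives the smallest 2-sided p-value in the table (about 0.0118)
gqtest(mylm, point = 28, alternative = 'two.sided')$p.value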








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description	# significant tests	% significant tests	OK/NOK
1% type I error level	0	0	OK
5% type I error level	3	0.103448275862069	NOK
10% type I error level	6	0.206896551724138	NOK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 & 0 & OK \tabularnewline
5% type I error level & 3 & 0.103448275862069 & NOK \tabularnewline
10% type I error level & 6 & 0.206896551724138 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=57977&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]3[/C][C]0.103448275862069[/C][C]NOK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]6[/C][C]0.206896551724138[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=57977&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=57977&T=6
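
The "% significant tests" column is the count of significant 2-sided Goldfeld-Quandt tests divided by the number of breakpoints examined, which is 29 here (breakpoints 16 through 44):

3 / 29   # about 0.1034; not below 0.05, hence NOK at the 5% type I error level
6 / 29   # about 0.2069; not below 0.10, hence NOK at the 10% type I error level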




Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
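# Inputs supplied by the web interface (not defined in this listing):
#   y    - the data series pasted by the user (transposed into x below)
#   par1 - index of the column that becomes the dependent variable (here 1)
#   par2 - seasonal dummy option (here 'Include Monthly Dummies')
#   par3 - trend/differencing option (here 'No Linear Trend')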
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
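# optionally replace every series by its first differences, (1-B)X[t] = X[t] - X[t-1]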
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
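# add 11 monthly dummy columns M1..M11: every 12th observation starting at position i
# gets a 1, so the 12th month serves as the reference category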
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
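# run the Goldfeld-Quandt test at every admissible breakpoint, storing the p-value for the
# 'greater', 'two.sided' and 'less' alternatives and counting significant 2-sided tests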
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
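# diagnostic plots written as PNG files: actuals and interpolation, residuals, residual
# histogram, density plot, normal Q-Q plot, lag plot, ACF, PACF, lm diagnostics and,
# if computed, the Goldfeld-Quandt p-values per breakpoint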
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
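# 'createtable' is assumed to provide the table-building helpers used below
# (table.start, table.row.start, table.element, table.row.end, table.end, table.save)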
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
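# assemble the estimated regression equation as a text string from the fitted coefficients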
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}