Free Statistics


Author's title:
Author: The author of this computation has been verified
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sat, 10 Nov 2012 06:47:28 -0500
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2012/Nov/10/t1352548068lz1hec4l9h09wjm.htm/, Retrieved Fri, 29 Mar 2024 07:20:54 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=187304, Retrieved Fri, 29 Mar 2024 07:20:54 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 129
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Univariate Data Series] [data set] [2008-12-01 19:54:57] [b98453cac15ba1066b407e146608df68]
- RMP   [Classical Decomposition] [Unemployment] [2010-11-30 13:33:27] [b98453cac15ba1066b407e146608df68]
- RMPD      [Multiple Regression] [] [2012-11-10 11:47:28] [eeec99d459a890eb36d32eb90406e4cb] [Current]
Dataseries X:
99.2	96.7	101.0
99.0	98.1	100.1
100.0	100.0	100.0
111.6	104.9	90.6
122.2	104.9	86.5
117.6	109.5	89.7
121.1	110.8	90.6
136.0	112.3	82.8
154.2	109.3	70.1
153.6	105.3	65.4
158.5	101.7	61.3
140.6	95.4	62.5
136.2	96.4	63.6
168.0	97.6	52.6
154.3	102.4	59.7
149.0	101.6	59.5
165.5	103.8	61.3
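
The series can be keyed in directly. A minimal sketch, assuming the three columns are Cons, Inc and Price in that order (the variable names used in the regression output below; the first column is the dependent variable because par1 = 1):

Cons  <- c(99.2, 99.0, 100.0, 111.6, 122.2, 117.6, 121.1, 136.0, 154.2,
           153.6, 158.5, 140.6, 136.2, 168.0, 154.3, 149.0, 165.5)
Inc   <- c(96.7, 98.1, 100.0, 104.9, 104.9, 109.5, 110.8, 112.3, 109.3,
           105.3, 101.7, 95.4, 96.4, 97.6, 102.4, 101.6, 103.8)
Price <- c(101.0, 100.1, 100.0, 90.6, 86.5, 89.7, 90.6, 82.8, 70.1,
           65.4, 61.3, 62.5, 63.6, 52.6, 59.7, 59.5, 61.3)
df <- data.frame(Cons, Inc, Price) #17 monthly observations, 3 variables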




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: 'Sir Ronald Aylmer Fisher' @ fisher.wessa.net
Source: https://freestatistics.org/blog/index.php?pk=187304&T=0


Multiple Linear Regression - Estimated Regression Equation
Cons[t] = 141.648090178319 + 0.660084237790883 Inc[t] - 1.13957260765924 Price[t] + 1.90575805616525 M1[t] + 9.46692807187031 M2[t] + 4.2938865424838 M3[t] + 0.0205018786537541 M4[t] + 10.9336372586813 M5[t] + 2.29076793304609 M6[t] + 5.35801031121686 M7[t] + 9.77895415519412 M8[t] + 14.8863712917001 M9[t] + 10.9704535272708 M10[t] + 12.9742456323207 M11[t] + 0.600263459594387 t + e[t]
Source: https://freestatistics.org/blog/index.php?pk=187304&T=1

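This equation can be reproduced with lm() once the monthly dummies (M1 to M11, with month 12 as the reference) and the linear trend t are appended to the data. A minimal sketch, assuming the data frame df defined above and mirroring the design matrix built by the R code listed at the end of this page:

n <- nrow(df) #17 observations
month <- ((seq_len(n) - 1) %% 12) + 1 #month index 1..12, 1..5
M <- 1 * outer(month, 1:11, "==") #dummy columns M1..M11
colnames(M) <- paste0("M", 1:11)
dat <- data.frame(df, M, t = seq_len(n))
mylm <- lm(Cons ~ ., data = dat) #regress Cons on all remaining columns
coef(mylm) #should match the coefficients in the equation above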

Multiple Linear Regression - Ordinary Least Squares
Variable  Parameter  S.D.  T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)  141.648090178319  371.220354  0.3816  0.739502  0.369751
Inc  0.660084237790883  3.401459  0.1941  0.864053  0.432027
Price  -1.13957260765924  0.810546  -1.4059  0.294973  0.147486
M1  1.90575805616525  12.346393  0.1544  0.891497  0.445749
M2  9.46692807187031  13.671871  0.6924  0.560254  0.280127
M3  4.2938865424838  25.529205  0.1682  0.8819  0.44095
M4  0.0205018786537541  31.11733  7e-04  0.999534  0.499767
M5  10.9336372586813  35.637695  0.3068  0.787991  0.393996
M6  2.29076793304609  52.410089  0.0437  0.969108  0.484554
M7  5.35801031121686  58.652923  0.0914  0.935539  0.46777
M8  9.77895415519412  61.533945  0.1589  0.88833  0.444165
M9  14.8863712917001  47.856552  0.3111  0.785181  0.39259
M10  10.9704535272708  34.477717  0.3182  0.780493  0.390247
M11  12.9742456323207  23.339099  0.5559  0.634166  0.317083
t  0.600263459594387  2.571551  0.2334  0.837147  0.418574
Source: https://freestatistics.org/blog/index.php?pk=187304&T=2

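The standard errors, t statistics and 2-tail p-values in this table are the usual summary(lm) output; the 1-tail column is simply the 2-tail value divided by two, as in the module's R code below. A minimal sketch, assuming mylm from the sketch above:

coefs <- summary(mylm)$coefficients
cbind(Parameter = coefs[, "Estimate"],
      S.D. = coefs[, "Std. Error"],
      T.STAT = coefs[, "t value"],
      "2-tail p" = coefs[, "Pr(>|t|)"],
      "1-tail p" = coefs[, "Pr(>|t|)"] / 2)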

Multiple Linear Regression - Regression Statistics
Multiple R  0.991570863798807
R-squared  0.983212777934713
Adjusted R-squared  0.865702223477705
F-TEST (value)  8.36701675417961
F-TEST (DF numerator)  14
F-TEST (DF denominator)  2
p-value  0.111755354104154

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation  8.64030497087169
Sum Squared Residuals  149.30973997934

Source: https://freestatistics.org/blog/index.php?pk=187304&T=3

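All of these summary statistics can be read off summary(mylm); the module computes Multiple R as the square root of R-squared and the F-test p-value with 1 - pf(...), exactly as in its R code below. A minimal sketch, assuming mylm from above:

mysum <- summary(mylm)
sqrt(mysum$r.squared) #Multiple R
mysum$r.squared #R-squared
mysum$adj.r.squared #Adjusted R-squared
mysum$fstatistic #F value, DF numerator (14), DF denominator (2)
1 - pf(mysum$fstatistic[1], mysum$fstatistic[2], mysum$fstatistic[3]) #F-test p-value
mysum$sigma #Residual Standard Deviation
sum(resid(mylm)^2) #Sum Squared Residuals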

Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index  Actuals  Interpolation (Forecast)  Residuals (Prediction Error)
1  99.2  92.8874241148746  6.31257588512536
2  99  102.998590869975  -3.99859086997455
3  100  99.793930112751  0.206069887248988
4  111.6  110.067204185688  1.53279581431248
5  122.2  126.252850716712  -4.05285071671228
6  117.6  117.6  -5.12610787151146e-16
7  121.1  121.1  -2.90566182226115e-16
8  136  136  1.53523027623947e-16
9  154.2  154.2  3.75567632548979e-16
10  153.6  153.6  1.04170144732407e-15
11  158.5  158.5  -6.85215773010839e-17
12  140.6  140.6  -6.85215773010839e-17
13  136.2  142.512575885125  -6.31257588512536
14  168  164.001409130025  3.99859086997455
15  154.3  154.506069887249  -0.206069887248988
16  149  150.532795814312  -1.53279581431248
17  165.5  161.447149283288  4.05285071671228
Source: https://freestatistics.org/blog/index.php?pk=187304&T=4

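A note on the residual pattern: with 15 estimated parameters and only 17 observations, rows 6 through 11 are the only occurrences of their month, so each month's dummy forces that residual to zero (up to machine precision); row 12, the reference month, follows because the residuals must sum to zero; and rows 1-5 versus 13-17 share a month each, so their residuals come in equal and opposite pairs. The table itself can be rebuilt from the fitted model; a minimal sketch, assuming mylm and dat from above:

data.frame(Time = seq_len(nrow(dat)),
           Actuals = dat$Cons,
           Interpolation = fitted(mylm),
           Residuals = resid(mylm))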


Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1) #index of the dependent variable
x <- t(y) #y is supplied by the server: one row per variable, one column per observation
k <- length(x[1,]) #number of variables
n <- length(x[,1]) #number of observations
x1 <- cbind(x[,par1], x[,1:k!=par1]) #move the dependent variable to the first column
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { #note: 1:(n-1), not 1:n-1, which would start the loop at 0
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1 #M_i equals 1 for observations i, i+12, i+24, ... (month 12 is the reference)
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n)) #append a linear trend t = 1, 2, ..., n
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df)) #the first column of df (the dependent variable) is regressed on all remaining columns
(mysum <- summary(mylm))
if (n > n25) { #Goldfeld-Quandt heteroskedasticity tests are only computed for sufficiently long series
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
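
The script above expects a few objects that the server provides before it runs: a matrix y with one named row per variable, the parameter strings par1, par2 and par3 shown above, and the table.start/table.row.start/table.element/table.row.end/table.end/table.save and hyperlink helpers loaded from the 'createtable' file (not reproduced here). A minimal sketch of how the inputs would be set up locally, assuming the data frame df from the top of this page and a hypothetical file name for a local copy of the script:

y <- t(as.matrix(df)) #one row per variable (Cons, Inc, Price), one column per observation
par1 <- '1' #dependent variable = first row of y
par2 <- 'Include Monthly Dummies'
par3 <- 'Linear Trend'
#source('rwasp_multipleregression.R') #hypothetical local copy of the code above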