Author: The author of this computation has been verified.
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 07 Dec 2015 11:56:33 +0000
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2015/Dec/07/t1449489413w6wzmb82b5argk6.htm/, Retrieved Thu, 16 May 2024 11:39:54 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=285352, Retrieved Thu, 16 May 2024 11:39:54 +0000

Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 99
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [] [2015-12-03 16:40:34] [2e6b1bdc398efa0639617f5108875d85]
- R P   [Multiple Regression] [] [2015-12-04 10:36:36] [2e6b1bdc398efa0639617f5108875d85]
- R PD      [Multiple Regression] [] [2015-12-07 11:56:33] [417cd1fa2ccbc3df120e1b65b71e6aee] [Current]
Dataseries X:
191	221
189	219
184	214
179	210
175	207
171	206
179	217
191	231
195	234
195	233
193	228
193	226
195	227
193	225
187	219
181	215
176	210
169	206
174	215
185	228
186	229
182	222
178	215
178	212
179	211
178	208
174	205
171	201
168	198
167	198
175	210
187	224
191	226
188	222
185	216
185	215
187	215
188	214
186	211
183	207
179	203
176	200
183	209
198	223
203	225
198	216
192	206
191	203
194	203
194	201
192	197
188	192
182	187
175	184
178	194
181	203
171	197
164	191
159	182
160	175
163	163
159	155
148	151
139	156
129	154
124	153
136	167
146	177
143	171
141	169
135	160
134	151
135	139
134	130
136	126
142	130
142	127
135	122
140	129
146	135
155	142
170	156
167	157
166	165
160	170
156	169
156	162
160	148
156	143
150	146
157	175
158	181
167	178
189	166
197	161
199	164
193	173
188	174
186	167
190	156
186	148
181	150
190	174
189	181
192	183
201	178
200	176
206	184
208	193
202	192
190	182
171	163
163	157
167	167
195	205
208	219
208	214
197	198
189	183
192	184
199	192
202	196
200	194
191	185
190	181
180	184
194	206
196	210
199	208
200	197
199	189
205	190
207	191
211	190
210	187
208	184
201	183
186	184
177	203
168	208
173	205
181	195
185	189
186	188
189	190
186	190
181	190
182	193
176	185
165	173
176	176
174	170
168	163
165	170
162	171
170	173
179	171
178	162
169	152
160	142
151	136
159	146
191	179
195	191
184	181
162	170
152	161
162	168
188	180
202	182
209	176
204	164
193	154
191	160
202	189
204	196
206	186
211	171
214	169
224	181
224	198
222	202
219	196
218	183
213	173
213	175
229	198
225	203
220	197
212	191
204	182
204	172
202	158
195	147
186	143
175	146
170	147
171	152
196	177
202	184
200	174
191	162
186	157
186	155
193	159
193	158
188	156
185	157
182	156
180	158
194	173
204	179
216	172
233	169
241	168
243	172
241	180
233	182
228	182
225	181
219	178
217	178
235	196
237	199
238	192
235	187
234	184
239	184
248	188
248	183
247	176
246	168
240	163
233	166
242	189
239	195
238	192
238	189
238	187
240	187
249	190
251	187
253	179
251	168
246	160
247	161
260	177
260	182
259	176
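
The R module at the bottom of this page expects the data in a matrix y with one variable per row (its first step is x <- na.omit(t(y))). Below is a minimal sketch of how the two columns above could be loaded into that shape; the column names A (endogenous) and B are assumptions taken from the estimated equation further down, and only the first three observations are shown for brevity.

# Hypothetical loading step (not part of the module); names A and B are assumptions.
dat <- read.table(text = "
191 221
189 219
184 214
", col.names = c('A', 'B'))        # continue with the remaining rows of Dataseries X
y <- t(as.matrix(dat))             # one row per variable, as expected by x <- na.omit(t(y))
par1 <- '1'                        # endogenous series = first column (A)
par2 <- 'Include Monthly Dummies'; par3 <- 'Linear Trend'; par4 <- '6'; par5 <- '0'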




Summary of computational transaction
Raw Input:      view raw input (R code)
Raw Output:     view raw output of R engine
Computing time: 6 seconds
R Server:       'Herman Ole Andreas Wold' @ wold.wessa.net

Source: https://freestatistics.org/blog/index.php?pk=285352&T=0








Multiple Linear Regression - Estimated Regression Equation
A[t] = -1.20917 + 0.0517449 B[t] + 1.53453 `A(t-1)`[t] - 0.692133 `A(t-2)`[t] - 0.165818 `A(t-3)`[t] + 0.206888 `A(t-4)`[t] + 0.210478 `A(t-5)`[t] - 0.167137 `A(t-6)`[t] + 13.7482 M1[t] - 2.54363 M2[t] + 0.738746 M3[t] + 5.64068 M4[t] + 2.73565 M5[t] + 4.74152 M6[t] + 3.34283 M7[t] - 0.923448 M8[t] + 0.904754 M9[t] + 2.4259 M10[t] - 1.16005 M11[t] + 0.0285335 t + e[t]

Source: https://freestatistics.org/blog/index.php?pk=285352&T=1

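
For readers who want to reproduce a model of this general form without running the full module code at the bottom of the page, the sketch below uses base R only. It assumes numeric vectors A and B holding the two columns of Dataseries X (e.g. A <- dat$A; B <- dat$B from the loading sketch above). Note that the module numbers its monthly dummies and trend over the truncated (post-lag) sample, so the intercept, dummy, and trend coefficients can be labelled differently even though the fitted model is of the same form.

# Sketch (assumptions: numeric vectors A and B from Dataseries X); not the module's own code.
p     <- 6                                       # six lags of the endogenous series (par4 = 6)
n     <- length(A)
idx   <- (p + 1):n                               # observations usable after lagging
lags  <- sapply(1:p, function(j) A[idx - j])     # columns corresponding to A(t-1) ... A(t-6)
colnames(lags) <- paste0('A_lag', 1:p)
month <- factor(((idx - 1) %% 12) + 1)           # monthly dummies (one month as reference)
trend <- idx                                     # linear trend
fit   <- lm(A[idx] ~ B[idx] + lags + month + trend)
summary(fit)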







Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter   S.D.        T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   -1.209       3.063      -3.9470e-01                   0.6934           0.3467
B             +0.05174     0.02433    +2.1270e+00                   0.03454          0.01727
`A(t-1)`      +1.534       0.06718    +2.2840e+01                   2.566e-60        1.283e-60
`A(t-2)`      -0.6921      0.1225     -5.6530e+00                   4.806e-08        2.403e-08
`A(t-3)`      -0.1658      0.1302     -1.2730e+00                   0.2042           0.1021
`A(t-4)`      +0.2069      0.1303     +1.5880e+00                   0.1138           0.05688
`A(t-5)`      +0.2105      0.1226     +1.7170e+00                   0.08745          0.04373
`A(t-6)`      -0.1671      0.0665     -2.5130e+00                   0.01267          0.006333
M1            +13.75       1.568      +8.7670e+00                   4.766e-16        2.383e-16
M2            -2.544       1.908      -1.3330e+00                   0.1839           0.09196
M3            +0.7388      1.788      +4.1320e-01                   0.6799           0.3399
M4            +5.641       1.843      +3.0610e+00                   0.002474         0.001237
M5            +2.736       1.902      +1.4390e+00                   0.1517           0.07583
M6            +4.742       1.635      +2.9010e+00                   0.004097         0.002049
M7            +3.343       1.607      +2.0800e+00                   0.03864          0.01932
M8            -0.9234      1.58       -5.8440e-01                   0.5595           0.2798
M9            +0.9048      1.539      +5.8790e-01                   0.5572           0.2786
M10           +2.426       1.573      +1.5420e+00                   0.1245           0.06225
M11           -1.16        1.522      -7.6190e-01                   0.4469           0.2235
t             +0.02853     0.009512   +3.0000e+00                   0.00301          0.001505

Source: https://freestatistics.org/blog/index.php?pk=285352&T=2

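
The '1-tail p-value' column is simply half of the two-sided p-value produced by summary.lm (the module computes it as mysum$coefficients[i,4]/2; see the R code below). A quick way to reproduce both columns from any lm fit, such as the sketch above:

# Two-sided p-values from summary.lm and the halved ('1-tail') values the module reports.
coefs      <- summary(fit)$coefficients          # 'fit' as in the sketch above (assumption)
p_two_tail <- coefs[, 'Pr(>|t|)']
p_one_tail <- p_two_tail / 2
cbind(estimate = coefs[, 'Estimate'], std_error = coefs[, 'Std. Error'], p_two_tail, p_one_tail)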







Multiple Linear Regression - Regression Statistics
Multiple R                   0.9879
R-squared                    0.9759
Adjusted R-squared           0.9738
F-TEST (value)               474.5
F-TEST (DF numerator)        19
F-TEST (DF denominator)      223
p-value                      0
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation  4.664
Sum Squared Residuals        4852

Source: https://freestatistics.org/blog/index.php?pk=285352&T=3

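
All of the figures above come directly from summary.lm. In particular, the reported p-value of 0 is 1 - pf(474.5, 19, 223), which rounds to zero in double precision; the true tail probability is astronomically small rather than exactly zero. A sketch of how the block can be reproduced from an lm fit:

# Reproducing the 'Regression Statistics' block (sketch; 'fit' as assumed above).
s <- summary(fit)
multiple_R <- sqrt(s$r.squared)
r_squared  <- s$r.squared
adj_r_sq   <- s$adj.r.squared
f_value    <- s$fstatistic['value']
df_num     <- s$fstatistic['numdf']
df_den     <- s$fstatistic['dendf']
p_value    <- pf(f_value, df_num, df_den, lower.tail = FALSE)   # avoids the 1 - pf() rounding to 0
resid_sd   <- s$sigma
ss_resid   <- sum(residuals(fit)^2)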



Parameters (Session):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ; par4 = 6 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = Linear Trend ; par4 = 6 ; par5 = 0 ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
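# Parameter conventions used throughout this module (inferred from the code below):
#   par1 = column number of the endogenous series (defaults to 1)
#   par2 = seasonal dummy option ('Include Monthly Dummies' or 'Include Quarterly Dummies')
#   par3 = trend/differencing option ('Linear Trend', 'First Differences', 'Seasonal Differences (s=12)', ...)
#   par4 = number of (non-seasonal) lags of the endogenous series to include
#   par5 = number of seasonal (s=12) lags of the endogenous series to include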
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
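# Append par4 lagged copies of the endogenous series as extra regressors and drop the first par4 rows.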
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
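# Likewise append par5 seasonal lags (t-12, t-24, ...) of the endogenous series.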
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
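# Add seasonal dummy variables: 11 monthly (M1..M11) or 3 quarterly (Q1..Q3) indicators, with the last period as reference.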
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
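# Goldfeld-Quandt heteroskedasticity test at every admissible breakpoint; store p-values for the
# 'greater', 'two.sided' and 'less' alternatives and count significant two-sided tests at 1%, 5% and 10%.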
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
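# Diagnostic plots: actuals vs. interpolation, residuals (series, histogram, density, normal Q-Q,
# lag plot), residual ACF/PACF, standard lm diagnostics, and the Goldfeld-Quandt p-values.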
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
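# Build the HTML result tables with the site's helper functions (table.start, table.element, ...)
# loaded from the 'createtable' workspace.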
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}