Multiple Regression (archived computation at FreeStatistics.org)

Author: Unverified author
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 06 Jun 2012 09:46:11 -0400
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2012/Jun/06/t1338990511jza7gxn68aohjd4.htm/, Retrieved Sun, 28 Apr 2024 10:30:04 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=168734, Retrieved Sun, 28 Apr 2024 10:30:04 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 200
Family (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data):
-     [Multiple Regression] [15th bird enterin...] [2012-03-06 03:20:16] [74be16979710d4c4e7c6647856088456]
-    D  [Multiple Regression] [Reduced model ] [2012-03-06 15:35:32] [74be16979710d4c4e7c6647856088456]
-    D    [Multiple Regression] [Chimney swift ent...] [2012-03-07 21:49:25] [74be16979710d4c4e7c6647856088456]
- R  D        [Multiple Regression] [Final model] [2012-06-06 13:46:11] [d41d8cd98f00b204e9800998ecf8427e] [Current]
Dataseries X (columns: TIMEin, DATE, TEMP, RAIN):
1192	5	-4.3574	0
1196	6	-1.4534	1
1183	8	-3.9786	0
1210	9	-1.0745	0
1210	10	1.8295	0
1218	11	2.5114	0
1219	12	7.0821	0
1202	15	-3.0946	0
1195	16	-2.4128	0
1203	17	2.1579	0
1170	19	-0.9229	1
1189	20	4.2034	1
1199	21	2.6630	0
1196	22	4.4560	0
1189	23	-2.6400	0
1185	25	4.2792	0
1192	26	1.0722	0
1188	27	-1.0238	0
1176	28	-2.5641	0
1166	31	-7.7409	0
1176	32	-0.9479	0
1181	33	1.9561	0
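
For readers who want to rerun this analysis outside the FreeStatistics engine, a minimal stand-alone sketch is shown below. The column names TIMEin, DATE, TEMP and RAIN are taken from the estimated regression equation reported further down; 'dataseries.txt' is only a placeholder for wherever the tab-separated data block above has been saved.

# Stand-alone reproduction sketch (placeholder file name 'dataseries.txt').
dat <- read.table('dataseries.txt', header = FALSE, col.names = c('TIMEin', 'DATE', 'TEMP', 'RAIN'))
fit <- lm(TIMEin ~ DATE + TEMP + RAIN, data = dat)
summary(fit)  # coefficient estimates should match the OLS table below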




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 4 seconds
R Server: 'Gwilym Jenkins' @ jenkins.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 4 seconds \tabularnewline
R Server & 'Gwilym Jenkins' @ jenkins.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=168734&T=0

Multiple Linear Regression - Estimated Regression Equation
TIMEin[t] = 1216.47337957194 - 1.15627618132896 DATE[t] + 2.19776083284445 TEMP[t] - 15.4677464579001 RAIN[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
TIMEin[t] = 1216.47337957194 - 1.15627618132896 DATE[t] + 2.19776083284445 TEMP[t] - 15.4677464579001 RAIN[t] + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=168734&T=1
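
As a quick check, plugging the first observation (DATE = 5, TEMP = -4.3574, RAIN = 0) into this equation gives 1216.4734 - 1.1563*5 + 2.1978*(-4.3574) - 15.4677*0 ≈ 1201.12, which matches the first interpolated value (1201.115...) in the Actuals, Interpolation, and Residuals table below.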

Multiple Linear Regression - Ordinary Least Squares
Variable     Parameter          S.D.      T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)  1216.47337957194   3.812191  319.1009                    0               0
DATE         -1.15627618132896  0.17742   -6.5172                     4e-06           2e-06
TEMP         2.19776083284445   0.428965  5.1234                      7.1e-05         3.6e-05
RAIN         -15.4677464579001  4.389084  -3.5241                     0.002423        0.001212

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 1216.47337957194 & 3.812191 & 319.1009 & 0 & 0 \tabularnewline
DATE & -1.15627618132896 & 0.17742 & -6.5172 & 4e-06 & 2e-06 \tabularnewline
TEMP & 2.19776083284445 & 0.428965 & 5.1234 & 7.1e-05 & 3.6e-05 \tabularnewline
RAIN & -15.4677464579001 & 4.389084 & -3.5241 & 0.002423 & 0.001212 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=168734&T=2
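
The columns of this table follow directly from the coefficient matrix of the fitted model. A sketch of the arithmetic, assuming the fit object mylm created in the R code at the bottom of this page (or the stand-alone fit from the sketch near the data):

# Sketch: how the OLS table columns are obtained from the fitted model.
coefs <- summary(mylm)$coefficients                   # Estimate, Std. Error, t value, Pr(>|t|)
tstat <- coefs[, 'Estimate'] / coefs[, 'Std. Error']  # e.g. -1.1563 / 0.17742 = -6.5172 for DATE
p2 <- 2 * pt(-abs(tstat), df = mylm$df.residual)      # 2-tail p-values (18 residual df here)
p1 <- p2 / 2                                          # 1-tail p-values, as in the last column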

Multiple Linear Regression - Regression Statistics
Multiple R: 0.894341065391886
R-squared: 0.799845941246293
Adjusted R-squared: 0.766486931454009
F-TEST (value): 23.9769089738175
F-TEST (DF numerator): 3
F-TEST (DF denominator): 18
p-value: 1.6449488076109e-06
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 6.9272603232592
Sum Squared Residuals: 863.764840551621

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.894341065391886 \tabularnewline
R-squared & 0.799845941246293 \tabularnewline
Adjusted R-squared & 0.766486931454009 \tabularnewline
F-TEST (value) & 23.9769089738175 \tabularnewline
F-TEST (DF numerator) & 3 \tabularnewline
F-TEST (DF denominator) & 18 \tabularnewline
p-value & 1.6449488076109e-06 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 6.9272603232592 \tabularnewline
Sum Squared Residuals & 863.764840551621 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=168734&T=3
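
These summary statistics are internally consistent and can be re-derived from one another. A short sketch of the arithmetic (n = 22 observations, k = 3 regressors, so 18 residual degrees of freedom):

# Re-deriving the regression statistics from R-squared and the residual sum of squares.
n <- 22; k <- 3                           # observations and regressors (excluding intercept)
R2 <- 0.799845941246293
SSR <- 863.764840551621
sqrt(SSR / (n - k - 1))                   # residual standard deviation: 6.9273
1 - (1 - R2) * (n - 1) / (n - k - 1)      # adjusted R-squared: 0.76649
(R2 / k) / ((1 - R2) / (n - k - 1))       # F-statistic: 23.977
1 - pf(23.9769089738175, k, n - k - 1)    # p-value: 1.64e-06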

Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index  Actuals  Interpolation (Forecast)  Residuals (Prediction Error)
1              1192     1201.11547561226          -9.11547561225627
2              1196     1190.87375043161          5.1262495683922
3              1183     1198.47915887175          -15.4791588717511
4              1210     1203.70539992509          6.29460007491424
5              1210     1208.93142120234          1.06857879766291
6              1218     1209.27379813292          8.72620186707523
7              1219     1218.16282739028          0.837172609722057
8              1202     1192.32804617868          9.67195382131707
9              1195     1192.67020333319          2.32979666681268
10             1203     1201.55923259054          1.4407674094595
11             1170     1177.00807219616          -7.00807219615532
12             1189     1187.11817737224          1.88182262776312
13             1199     1198.04421686189          0.955783138105601
14             1196     1200.82852585386          -4.82852585385554
15             1189     1184.07693880266          4.92306119733765
16             1185     1196.97113319462          -11.9711331946218
17             1192     1188.76663802236          3.23336197763935
18             1188     1183.00385513539          4.99614486461028
19             1176     1178.46236794323          -2.46236794323045
20             1166     1163.61617111977          2.38382888022559
21             1176     1177.38928427596          -1.38928427595782
22             1181     1182.61530555321          -1.61530555320915

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 1192 & 1201.11547561226 & -9.11547561225627 \tabularnewline
2 & 1196 & 1190.87375043161 & 5.1262495683922 \tabularnewline
3 & 1183 & 1198.47915887175 & -15.4791588717511 \tabularnewline
4 & 1210 & 1203.70539992509 & 6.29460007491424 \tabularnewline
5 & 1210 & 1208.93142120234 & 1.06857879766291 \tabularnewline
6 & 1218 & 1209.27379813292 & 8.72620186707523 \tabularnewline
7 & 1219 & 1218.16282739028 & 0.837172609722057 \tabularnewline
8 & 1202 & 1192.32804617868 & 9.67195382131707 \tabularnewline
9 & 1195 & 1192.67020333319 & 2.32979666681268 \tabularnewline
10 & 1203 & 1201.55923259054 & 1.4407674094595 \tabularnewline
11 & 1170 & 1177.00807219616 & -7.00807219615532 \tabularnewline
12 & 1189 & 1187.11817737224 & 1.88182262776312 \tabularnewline
13 & 1199 & 1198.04421686189 & 0.955783138105601 \tabularnewline
14 & 1196 & 1200.82852585386 & -4.82852585385554 \tabularnewline
15 & 1189 & 1184.07693880266 & 4.92306119733765 \tabularnewline
16 & 1185 & 1196.97113319462 & -11.9711331946218 \tabularnewline
17 & 1192 & 1188.76663802236 & 3.23336197763935 \tabularnewline
18 & 1188 & 1183.00385513539 & 4.99614486461028 \tabularnewline
19 & 1176 & 1178.46236794323 & -2.46236794323045 \tabularnewline
20 & 1166 & 1163.61617111977 & 2.38382888022559 \tabularnewline
21 & 1176 & 1177.38928427596 & -1.38928427595782 \tabularnewline
22 & 1181 & 1182.61530555321 & -1.61530555320915 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=168734&T=4
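
The interpolation column is simply the fitted value for each observation, and the residual column is actual minus fitted. Assuming the stand-alone fit object from the sketch near the top of this page, the columns can be reproduced as follows:

interp <- fitted(fit)            # Interpolation (Forecast) column
res <- dat$TIMEin - interp       # Residuals (Prediction Error) column; identical to residuals(fit)
sum(res^2)                       # should reproduce the Sum Squared Residuals (about 863.76) above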


Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
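# Note: y (the data matrix), par1, par2 and par3 (the parameters listed above) are
# supplied by the FreeStatistics.org R engine before this script runs; they are not
# defined in the script itself.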
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
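# The Goldfeld-Quandt block below only runs when n > n25 (more than 25 observations);
# with the 22 observations in this dataset it is skipped.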
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
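# 'createtable' is a server-side helper file; it provides the table.start, table.row.*,
# table.element and table.save functions used below to build the tables shown above.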
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, mysum$coefficients[i,1], sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,mysum$coefficients[i,1])
a<-table.element(a, round(mysum$coefficients[i,2],6))
a<-table.element(a, round(mysum$coefficients[i,3],4))
a<-table.element(a, round(mysum$coefficients[i,4],6))
a<-table.element(a, round(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, sqrt(mysum$r.squared))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, mysum$r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, mysum$adj.r.squared)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, mysum$fstatistic[1])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[2])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, mysum$fstatistic[3])
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, 1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, mysum$sigma)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, sum(myerror*myerror))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,x[i])
a<-table.element(a,x[i]-mysum$resid[i])
a<-table.element(a,mysum$resid[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,gqarr[mypoint-kp3+1,1])
a<-table.element(a,gqarr[mypoint-kp3+1,2])
a<-table.element(a,gqarr[mypoint-kp3+1,3])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,numsignificant1)
a<-table.element(a,numsignificant1/numgqtests)
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,numsignificant5)
a<-table.element(a,numsignificant5/numgqtests)
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,numsignificant10)
a<-table.element(a,numsignificant10/numgqtests)
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}