Free Statistics


Author's title:
Author: *Unverified author*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 29 Oct 2015 04:49:40 +0000
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2015/Oct/29/t1446094453dvcpnmt4p201sp7.htm/, Retrieved Tue, 14 May 2024 22:33:14 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=283106, Retrieved Tue, 14 May 2024 22:33:14 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 160
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [r] [2015-10-29 04:49:40] [d41d8cd98f00b204e9800998ecf8427e] [Current]
Dataseries X (columns V1, V2, V3; par1 = 3 selects V3 as the dependent variable):
1 11 6 
1 8 0
1 5 2
1 14 8
1 19 11
1 6 4
1 10 13
1 6 1
1 11 8
1 3 0
0 16 13
0 13 10
0 11 18
0 9 5
0 21 23
0 16 12
0 12 5
0 12 16
0 7 1
0 12 20




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 4 seconds
R Server: 'George Udny Yule' @ yule.wessa.net

Source: https://freestatistics.org/blog/index.php?pk=283106&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=283106&T=0








Multiple Linear Regression - Estimated Regression Equation
V3[t] = 0.319826 - 3.6567 V1[t] + 0.928696 V2[t] + e[t]

Source: https://freestatistics.org/blog/index.php?pk=283106&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=283106&T=1
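As a quick consistency check, the fitted (interpolation) values reported further below can be recomputed directly from this equation. A minimal sketch for the first observation of Dataseries X (V1 = 1, V2 = 11), using the rounded coefficients as printed:

# Minimal sketch (not part of the module): recompute the fitted value for
# observation 1 of Dataseries X from the rounded coefficients printed above.
b0 <- 0.319826    # intercept
b1 <- -3.6567     # coefficient of V1
b2 <- 0.928696    # coefficient of V2
b0 + b1 * 1 + b2 * 11   # approx. 6.879, matching row 1 of the interpolation table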








Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter   S.D.     T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +0.3198     3.495    +9.1500e-02                   0.9282           0.4641
V1            -3.657      2.232    -1.6380e+00                   0.1198           0.0599
V2            +0.9287     0.2466   +3.7660e+00                   0.00154          0.0007701

Source: https://freestatistics.org/blog/index.php?pk=283106&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=283106&T=2
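The tail probabilities above should follow from the t-statistics and the 17 residual degrees of freedom (20 observations minus 3 estimated parameters, matching the F-TEST DF denominator reported below). A minimal sketch for the V2 row, assuming the usual Student-t reference distribution:

# Minimal sketch: recover the reported p-values for V2 from its t-statistic
# with n - k = 20 - 3 = 17 residual degrees of freedom.
t_v2 <- 3.766
df_resid <- 17
2 * pt(-abs(t_v2), df_resid)   # 2-tail p-value, approx. 0.00154
pt(-abs(t_v2), df_resid)       # 1-tail p-value, approx. 0.00077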








Multiple Linear Regression - Regression Statistics
Multiple R: 0.7768
R-squared: 0.6034
Adjusted R-squared: 0.5567
F-TEST (value): 12.93
F-TEST (DF numerator): 2
F-TEST (DF denominator): 17
p-value: 0.0003857

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 4.58
Sum Squared Residuals: 356.6

Source: https://freestatistics.org/blog/index.php?pk=283106&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=283106&T=3
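The F statistic, its p-value, and the residual standard deviation above are mutually consistent; a minimal sketch that recomputes them from the reported R-squared, degrees of freedom, and sum of squared residuals:

# Minimal sketch: cross-check the regression and residual statistics above.
r2 <- 0.6034
df_num <- 2        # F-TEST (DF numerator)
df_den <- 17       # F-TEST (DF denominator)
fval <- (r2 / df_num) / ((1 - r2) / df_den)
fval                          # F value, approx. 12.93
1 - pf(fval, df_num, df_den)  # p-value, approx. 0.00039
sqrt(356.6 / df_den)          # residual standard deviation, approx. 4.58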








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
 1               6         6.879                     -0.8788
 2               0         4.093                     -4.093
 3               2         1.307                      0.6934
 4               8         9.665                     -1.665
 5              11        14.31                      -3.308
 6               4         2.235                      1.765
 7              13         5.95                       7.05
 8               1         2.235                     -1.235
 9               8         6.879                      1.121
10               0        -0.5508                     0.5508
11              13        15.18                      -2.179
12              10        12.39                      -2.393
13              18        10.54                       7.465
14               5         8.678                     -3.678
15              23        19.82                       3.178
16              12        15.18                      -3.179
17               5        11.46                      -6.464
18              16        11.46                       4.536
19               1         6.821                     -5.821
20              20        11.46                       8.536

Source: https://freestatistics.org/blog/index.php?pk=283106&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=283106&T=4




Parameters (Session):
par1 = 3 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 3 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
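Before the full module code listed below, a minimal stand-alone sketch (not the module itself) that should reproduce the reported coefficients with a plain lm() call, assuming the three columns of Dataseries X are V1, V2 and V3 and noting that par2/par3 add no seasonal dummies and no trend here:

# Minimal sketch: refit the same model directly from Dataseries X.
df <- read.table(text = "
1 11 6
1 8 0
1 5 2
1 14 8
1 19 11
1 6 4
1 10 13
1 6 1
1 11 8
1 3 0
0 16 13
0 13 10
0 11 18
0 9 5
0 21 23
0 16 12
0 12 5
0 12 16
0 7 1
0 12 20", col.names = c("V1", "V2", "V3"))
summary(lm(V3 ~ V1 + V2, data = df))   # intercept approx. 0.32, V1 approx. -3.66, V2 approx. 0.93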
R code (references can be found in the software module):
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '3'
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
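# Optional transformations selected via par2/par3 (first differences, seasonal
# dummies, linear trend); with the parameters above, none of these apply.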
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
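# Goldfeld-Quandt tests over a range of breakpoints; only run when n > 25
# (here n = 20, so this block and the corresponding tables are skipped).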
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
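# Diagnostic plots: actuals vs. interpolation, residuals, histogram, density,
# normal Q-Q, lag plot, ACF and PACF, and the standard lm() diagnostics.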
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
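# Build the output tables shown above: estimated equation, OLS coefficients,
# regression and residual statistics, actuals/interpolation/residuals, and
# (only when n > 25) the Goldfeld-Quandt results.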
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}