Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_pairs.wasp
Title produced by software: Kendall tau Correlation Matrix
Date of computation: Fri, 02 Dec 2016 21:14:07 +0100
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/02/t14807100600p8nn07uu867qjd.htm/, Retrieved Tue, 07 May 2024 17:50:20 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=297600, Retrieved Tue, 07 May 2024 17:50:20 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 102
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Kendall tau Correlation Matrix] [] [2016-12-02 20:14:07] [219800a2f11ddd28e3280d87dbde8c8d] [Current]
Dataseries X (five tab-separated series: EC1, EC2, EC3, EC4, ITH; NA = missing value):
5	3	4	5	14
2	2	5	2	19
3	3	4	2	17
3	3	4	2	17
3	2	4	4	15
4	4	5	4	20
4	3	5	NA	15
2	2	5	3	19
5	4	5	2	15
4	2	5	4	15
2	2	5	2	19
4	4	4	4	NA
3	5	4	3	20
3	5	5	3	18
4	2	5	4	15
2	2	4	3	14
1	1	4	2	20
NA	5	NA	NA	NA
2	2	4	2	16
3	4	5	2	16
5	4	5	2	16
4	4	4	3	10
5	4	4	2	19
3	3	4	2	19
5	5	5	3	16
2	2	4	2	15
4	5	5	3	18
4	2	4	2	17
3	3	5	2	19
2	1	4	2	17
1	1	4	5	NA
2	2	3	3	19
5	1	5	4	20
4	4	4	3	5
3	3	4	3	19
2	3	5	3	16
1	2	4	2	15
3	2	5	4	16
3	3	5	3	18
3	1	5	2	16
5	3	4	3	15
2	2	4	4	17
2	2	4	3	NA
1	2	5	4	20
4	4	4	3	19
4	1	4	4	7
2	2	4	3	13
1	5	2	2	16
5	4	4	3	16
4	4	4	1	NA
4	4	5	2	18
4	2	5	3	18
2	2	5	3	16
2	2	4	2	17
3	2	4	3	19
2	1	4	2	16
3	5	5	2	19
4	5	5	2	13
3	3	4	2	16
2	2	5	2	13
2	2	5	2	12
1	2	4	2	17
3	2	5	3	17
4	5	5	3	17
4	5	5	4	16
4	3	5	3	16
3	3	3	3	14
5	4	5	4	16
4	1	4	2	13
1	1	3	1	16
1	1	5	3	14
5	5	5	4	20
5	4	3	4	12
3	1	4	4	13
2	2	4	2	18
4	3	5	2	14
4	2	5	1	19
4	2	5	2	18
4	5	5	2	14
5	5	5	3	18
4	2	5	2	19
4	4	4	3	15
4	4	4	4	14
2	1	4	2	17
1	1	5	2	19
1	2	4	1	13
5	4	5	4	19
5	5	5	3	18
3	2	5	4	20
2	2	2	2	15
4	3	4	3	15
2	1	5	5	15
3	4	4	3	20
1	1	4	1	15
5	5	5	3	19
4	4	5	3	18
2	1	4	2	18
2	3	5	1	15
1	1	5	3	20
4	2	5	2	17
2	1	5	2	12
3	1	5	3	18
1	3	4	3	19
2	2	5	3	20
3	2	4	3	NA
1	2	5	2	17
5	5	5	NA	15
4	3	4	1	16
1	2	5	4	18
4	4	5	3	18
1	3	5	2	14
4	2	3	3	15
2	2	5	3	12
3	4	3	3	17
3	1	4	2	14
3	4	4	3	18
3	3	5	2	17
3	5	4	3	17
2	4	5	2	20
2	3	5	3	16
4	4	5	4	14
2	3	4	3	15
5	5	4	3	18
1	1	5	2	20
3	2	4	3	17
3	4	5	2	17
3	4	5	2	17
4	5	3	2	17
3	2	5	2	15
3	3	4	NA	17
2	4	4	3	18
4	5	4	2	17
5	5	3	3	20
4	2	5	2	15
4	4	4	2	16
4	4	4	2	15
3	5	4	5	18
4	2	4	3	11
3	4	5	3	15
NA	1	5	1	18
1	2	5	3	20
2	2	5	2	19
1	1	4	3	14
4	4	4	3	16
5	3	5	3	15
4	4	5	3	17
3	1	4	2	18
2	4	5	4	20
1	2	5	2	17
3	3	5	1	18
4	3	5	2	15
4	5	5	4	16
1	5	5	4	11
5	5	5	4	15
3	4	3	3	18
NA	2	4	2	17
4	2	5	4	16
1	1	3	2	12
3	2	4	5	19
3	4	NA	2	18
4	2	5	3	15
4	3	2	2	17
5	5	5	3	19
1	1	3	3	18
NA	5	5	4	19
1	1	1	2	16
5	3	5	4	16
3	4	5	2	16
4	3	5	5	14
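The five columns of the data block above correspond to the series EC1, EC2, EC3, EC4 and ITH analysed below. A minimal sketch (not part of the original module) of how the block could be read into R, assuming it has been saved to a hypothetical file dataseries_x.txt:

# Hypothetical file name; the column names are taken from the correlation tables below.
dat <- read.table("dataseries_x.txt", sep = "\t", na.strings = "NA",
                  col.names = c("EC1", "EC2", "EC3", "EC4", "ITH"))
summary(dat)   # quick check of the five series and their missing values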




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 1 second
R Server: Big Analytics Cloud Computing Center
Source: https://freestatistics.org/blog/index.php?pk=297600&T=0


Correlations for all pairs of data series (method=kendall)
        EC1     EC2     EC3     EC4     ITH
EC1     1       0.448   0.113   0.233   -0.045
EC2     0.448   1       0.089   0.167   0.08
EC3     0.113   0.089   1       0.093   0.113
EC4     0.233   0.167   0.093   1       0.017
ITH     -0.045  0.08    0.113   0.017   1
Source: https://freestatistics.org/blog/index.php?pk=297600&T=1
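The matrix above should be reproducible, up to rounding, with base R alone. A minimal sketch, assuming the hypothetical data frame dat read in after the data block; the module removes every observation with a missing value before correlating, which corresponds to na.omit here:

# Kendall correlation matrix on complete cases only (listwise deletion),
# matching the module's y <- t(na.omit(t(y))) step.
round(cor(na.omit(dat), method = "kendall"), 3)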


Correlations for all pairs of data series with p-values
pair        Pearson r   Spearman rho   Kendall tau
EC1;EC2     0.5299      0.5315         0.4482
p-value     (0)         (0)            (0)
EC1;EC3     0.1555      0.1293         0.1133
p-value     (0.0526)    (0.1078)       (0.1032)
EC1;EC4     0.2687      0.2762         0.2332
p-value     (7e-04)     (5e-04)        (5e-04)
EC1;ITH     -0.0517     -0.0542        -0.0451
p-value     (0.5218)    (0.5013)       (0.4703)
EC2;EC3     0.0928      0.1023         0.0888
p-value     (0.2491)    (0.2037)       (0.2011)
EC2;EC4     0.1785      0.1989         0.167
p-value     (0.0257)    (0.0128)       (0.0129)
EC2;ITH     0.0933      0.0995         0.0802
p-value     (0.2467)    (0.2165)       (0.1986)
EC3;EC4     0.1273      0.1037         0.0928
p-value     (0.1134)    (0.1978)       (0.1976)
EC3;ITH     0.1302      0.1354         0.113
p-value     (0.1053)    (0.0919)       (0.0912)
EC4;ITH     -0.0287     0.0144         0.017
p-value     (0.7221)    (0.8588)       (0.7924)
Source: https://freestatistics.org/blog/index.php?pk=297600&T=2
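Each pair of rows above reports the correlation estimate and, in parentheses, its p-value (two-sided, the cor.test default) for the three methods. A minimal sketch, again assuming the hypothetical dat from above, that reproduces the EC1;EC2 row:

cc <- na.omit(dat)   # complete cases, as in the module
for (m in c("pearson", "spearman", "kendall")) {
  tst <- cor.test(cc$EC1, cc$EC2, method = m)   # warns about ties for spearman/kendall
  cat(m, ": estimate =", round(tst$estimate, 4),
      ", p-value =", round(tst$p.value, 4), "\n")
}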


Meta Analysis of Correlation Tests
Fraction of significant correlations (number significant divided by total number of correlations)
Type I error   Pearson r   Spearman rho   Kendall tau
0.01           0.2         0.2            0.2
0.02           0.2         0.3            0.3
0.03           0.3         0.3            0.3
0.04           0.3         0.3            0.3
0.05           0.3         0.3            0.3
0.06           0.4         0.3            0.3
0.07           0.4         0.3            0.3
0.08           0.4         0.3            0.3
0.09           0.4         0.3            0.3
0.10           0.4         0.4            0.4
Source: https://freestatistics.org/blog/index.php?pk=297600&T=3
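The table reports, for each nominal Type I error level from 0.01 to 0.10, the fraction of the ten pairwise correlation tests whose p-value falls below that level. A minimal sketch, assuming the hypothetical dat from above:

cc     <- na.omit(dat)                         # complete cases, as in the module
idx    <- combn(ncol(cc), 2)                   # the 10 distinct variable pairs
alphas <- seq(0.01, 0.10, by = 0.01)
for (m in c("pearson", "spearman", "kendall")) {
  pv <- apply(idx, 2, function(ij) cor.test(cc[[ij[1]]], cc[[ij[2]]], method = m)$p.value)
  cat(m, ":", round(sapply(alphas, function(a) mean(pv < a)), 2), "\n")
}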


Parameters (Session):
par1 = kendall ;
Parameters (R input):
par1 = kendall ;
R code (references can be found in the software module). The objects x, y and main, and the table.* helpers (table.start, table.element, table.row.start, table.row.end, table.end, table.save, loaded from the 'createtable' file) appear to be supplied by the hosting FreeStatistics R framework; the code treats y as holding the data series in rows and the observations in columns:
par1 <- 'kendall'
# lower panel for pairs(): despite its name, it prints the p-value (2 digits)
# of cor.test(x, y, method = par1) for each pair of series
panel.tau <- function(x, y, digits=2, prefix='', cex.cor)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(0, 1, 0, 1))
rr <- cor.test(x, y, method=par1)
r <- round(rr$p.value,2)
txt <- format(c(r, 0.123456789), digits=digits)[1]
txt <- paste(prefix, txt, sep='')
if(missing(cex.cor)) cex <- 0.5/strwidth(txt)
text(0.5, 0.5, txt, cex = cex)
}
# diagonal panel for pairs(): draws a rescaled histogram of each series
panel.hist <- function(x, ...)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(usr[1:2], 0, 1.5) )
h <- hist(x, plot = FALSE)
breaks <- h$breaks; nB <- length(breaks)
y <- h$counts; y <- y/max(y)
rect(breaks[-nB], 0, breaks[-1], y, col='grey', ...)
}
x <- na.omit(x)
# keep only observations (columns of y) that are complete across all series
y <- t(na.omit(t(y)))
# write the scatterplot matrix to a PNG file
bitmap(file='test1.png')
pairs(t(y),diag.panel=panel.hist, upper.panel=panel.smooth, lower.panel=panel.tau, main=main)
dev.off()
load(file='createtable')
n <- length(y[,1]) # number of data series (rows of y)
print(n)
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,paste('Correlations for all pairs of data series (method=',par1,')',sep=''),n+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,' ',header=TRUE)
for (i in 1:n) {
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
}
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
for (j in 1:n) {
r <- cor.test(y[i,],y[j,],method=par1)
a<-table.element(a,round(r$estimate,3))
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
ncorrs <- (n*n -n)/2 # number of distinct series pairs
mycorrs <- array(0, dim=c(10,3))
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Correlations for all pairs of data series with p-values',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'pair',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
cor.test(y[1,],y[2,],method=par1)
for (i in 1:(n-1))
{
for (j in (i+1):n)
{
a<-table.row.start(a)
dum <- paste(dimnames(t(x))[[2]][i],';',dimnames(t(x))[[2]][j],sep='')
a<-table.element(a,dum,header=TRUE)
rp <- cor.test(y[i,],y[j,],method='pearson')
a<-table.element(a,round(rp$estimate,4))
rs <- cor.test(y[i,],y[j,],method='spearman')
a<-table.element(a,round(rs$estimate,4))
rk <- cor.test(y[i,],y[j,],method='kendall')
a<-table.element(a,round(rk$estimate,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-value',header=T)
a<-table.element(a,paste('(',round(rp$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rs$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rk$p.value,4),')',sep=''))
a<-table.row.end(a)
# count, for each Type I error level 0.01 ... 0.10, how many pairwise tests are significant
for (iii in 1:10) {
iiid100 <- iii / 100
if (rp$p.value < iiid100) mycorrs[iii, 1] = mycorrs[iii, 1] + 1
if (rs$p.value < iiid100) mycorrs[iii, 2] = mycorrs[iii, 2] + 1
if (rk$p.value < iiid100) mycorrs[iii, 3] = mycorrs[iii, 3] + 1
}
}
}
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Correlation Tests',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Number of significant by total number of Correlations',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Type I error',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
# report the fraction of significant correlations at each Type I error level
for (iii in 1:10) {
iiid100 <- iii / 100
a<-table.row.start(a)
a<-table.element(a,round(iiid100,2),header=T)
a<-table.element(a,round(mycorrs[iii,1]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,2]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,3]/ncorrs,2))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
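
Outside the hosting framework, the panel functions defined above can be exercised with a plain pairs() call. A minimal standalone sketch, assuming the hypothetical dat from the data block and a local graphics device in place of the framework's bitmap():

par1 <- 'kendall'          # method used by panel.tau
png("pairs_kendall.png")   # local PNG device instead of bitmap()
pairs(na.omit(dat),
      diag.panel  = panel.hist,    # histograms on the diagonal
      upper.panel = panel.smooth,  # scatterplots with a lowess smoother
      lower.panel = panel.tau,     # p-values of the pairwise tests
      main        = 'Kendall tau Correlation Matrix')
dev.off()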