
Author: *The author of this computation has been verified*
R Software Module: rwasp_pairs.wasp
Title produced by software: Kendall tau Correlation Matrix
Date of computation: Wed, 07 Dec 2016 15:04:10 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/07/t1481119532wa1bpsqv4s6rgj5.htm/, Retrieved Tue, 07 May 2024 15:18:38 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=298125, Retrieved Tue, 07 May 2024 15:18:38 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords
Estimated Impact: 67
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Kendall tau Correlation Matrix] [Multivariate Kend...] [2016-12-07 14:04:10] [6b2845a830bced35782aaf33b6e68e42] [Current]
Dataseries X:
5	4	4	4	13
5	NA	4	4	16
4	3	3	2	17
4	3	3	3	15
5	4	4	3	16
5	3	4	3	16
5	4	2	3	18
5	4	2	4	16
5	2	2	4	17
5	1	2	4	17
4	4	3	2	17
5	4	3	2	15
5	4	5	4	16
5	5	4	5	14
4	4	3	4	16
5	1	4	4	17
3	4	4	2	16
5	2	NA	2	15
5	3	4	5	17
5	3	NA	4	16
NA	2	3	1	15
3	1	3	5	16
4	3	2	3	15
4	2	2	4	17
4	NA	3	4	14
5	4	3	2	16
4	4	3	4	15
5	2	4	2	16
4	3	4	3	16
5	4	3	4	13
4	4	4	4	15
4	4	3	4	17
4	3	4	4	15
5	4	3	4	13
5	4	3	4	17
5	4	3	5	15
5	4	3	4	14
2	3	2	4	14
4	3	5	3	18
4	4	3	4	15
4	2	1	4	17
5	3	2	3	13
5	4	2	2	16
5	4	3	5	15
4	3	2	4	15
4	2	3	3	16
5	3	5	4	15
5	3	4	4	13
4	3	2	3	17
4	3	4	4	18
5	3	3	4	18
5	3	3	4	11
5	3	2	4	14
4	5	3	5	13
5	4	2	4	15
5	NA	4	2	17
4	3	NA	4	16
4	4	3	5	15
5	4	1	2	17
5	1	1	3	16
4	4	3	4	16
4	3	NA	3	16
5	3	2	4	15
3	4	3	4	12
3	2	4	4	17
5	4	3	5	14
4	5	4	3	14
4	4	4	4	16
5	4	3	4	15
5	4	4	4	15
4	NA	4	4	14
5	4	3	4	13
4	2	3	4	18
4	4	5	4	15
4	2	2	4	16
5	5	4	4	14
4	5	3	3	15
4	2	3	3	17
4	4	3	2	16
4	3	4	2	10
4	3	4	2	16
2	3	NA	3	17
4	4	5	4	17
4	4	3	4	20
5	3	4	4	17
4	3	3	4	18
5	4	5	4	15
4	4	4	4	17
4	2	4	4	14
3	3	4	2	15
4	3	4	3	17
2	3	2	2	16
4	4	3	3	17
5	4	4	4	15
3	4	3	5	16
4	4	3	4	18
5	5	5	5	18
2	4	3	3	16
5	4	3	4	17
5	4	4	5	15
4	2	2	2	13
4	3	3	3	15
5	3	4	4	17
5	3	4	5	16
4	4	4	4	16
4	4	4	5	15
5	4	NA	5	16
5	4	4	5	16
5	3	3	4	14
4	3	3	4	15
5	3	3	4	12
4	2	NA	4	19
5	3	4	4	16
4	2	2	4	16
5	4	5	5	17
5	5	2	5	16
4	3	2	5	14
4	3	2	4	15
4	3	3	4	14
5	2	3	4	16
5	3	4	5	15
4	3	NA	4	17
4	3	4	4	15
5	4	3	4	16
5	4	4	4	16
4	3	4	2	15
4	4	3	4	15
4	1	3	2	11
4	5	5	4	16
5	4	4	3	18
5	3	3	5	13
4	5	3	2	11
NA	4	3	4	16
4	3	3	3	18
3	4	3	3	15
4	4	2	4	19
5	3	4	5	17
4	2	4	3	13
4	4	4	2	14
5	3	5	5	16
3	3	2	4	13
4	4	2	4	17
1	2	3	2	14
5	3	3	5	19
4	4	2	3	14
5	4	4	3	16
3	3	2	3	12
4	4	3	4	16
4	4	NA	4	16
4	3	3	4	15
4	2	3	4	12
5	4	4	4	15
5	2	2	4	17
5	3	5	5	14
5	4	4	3	15
4	3	3	NA	18
5	2	5	4	15
5	4	2	4	18
4	1	4	5	15
3	5	4	3	15
4	4	4	4	16
4	3	3	2	13
5	4	5	5	16
4	4	3	4	14
4	3	3	3	16
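
A minimal sketch of how the data series above could be read into R, assuming they are saved as a tab-delimited text file (the file name dataseries.txt is hypothetical; the column names are taken from the correlation tables below):

x <- read.table('dataseries.txt', sep='\t', na.strings='NA',
col.names=c('KVDD1','KVDD2','KVDD3','KVDD4','TVDCSUM'))
str(x)   # inspect the structure; rows containing NA values are kept at this stage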




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 1 second
R Server: Big Analytics Cloud Computing Center

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code) \tabularnewline
Raw Output & view raw output of R engine \tabularnewline
Computing time & 1 second \tabularnewline
R Server & Big Analytics Cloud Computing Center \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298125&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code)[/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine[/C][/ROW]
[ROW][C]Computing time[/C][C]1 second[/C][/ROW]
[ROW][C]R Server[/C][C]Big Analytics Cloud Computing Center[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=298125&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298125&T=0








Correlations for all pairs of data series (method=kendall)

          KVDD1    KVDD2    KVDD3    KVDD4    TVDCSUM
KVDD1     1        0.099    0.13     0.279    0.053
KVDD2     0.099    1        0.14     0.081    -0.026
KVDD3     0.13     0.14     1        0.128    0.016
KVDD4     0.279    0.081    0.128    1        0.021
TVDCSUM   0.053    -0.026   0.016    0.021    1
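
A minimal sketch of how this matrix could be reproduced in base R, assuming x is the data frame of the five series (as in the reading sketch above); use='complete.obs' applies listwise deletion, mirroring the na.omit step in the module code further below:

tau <- cor(x, method='kendall', use='complete.obs')
round(tau, 3)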

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series (method=kendall) \tabularnewline
  & KVDD1 & KVDD2 & KVDD3 & KVDD4 & TVDCSUM \tabularnewline
KVDD1 & 1 & 0.099 & 0.13 & 0.279 & 0.053 \tabularnewline
KVDD2 & 0.099 & 1 & 0.14 & 0.081 & -0.026 \tabularnewline
KVDD3 & 0.13 & 0.14 & 1 & 0.128 & 0.016 \tabularnewline
KVDD4 & 0.279 & 0.081 & 0.128 & 1 & 0.021 \tabularnewline
TVDCSUM & 0.053 & -0.026 & 0.016 & 0.021 & 1 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298125&T=1

[TABLE]
[ROW][C]Correlations for all pairs of data series (method=kendall)[/C][/ROW]
[ROW][C] [/C][C]KVDD1[/C][C]KVDD2[/C][C]KVDD3[/C][C]KVDD4[/C][C]TVDCSUM[/C][/ROW]
[ROW][C]KVDD1[/C][C]1[/C][C]0.099[/C][C]0.13[/C][C]0.279[/C][C]0.053[/C][/ROW]
[ROW][C]KVDD2[/C][C]0.099[/C][C]1[/C][C]0.14[/C][C]0.081[/C][C]-0.026[/C][/ROW]
[ROW][C]KVDD3[/C][C]0.13[/C][C]0.14[/C][C]1[/C][C]0.128[/C][C]0.016[/C][/ROW]
[ROW][C]KVDD4[/C][C]0.279[/C][C]0.081[/C][C]0.128[/C][C]1[/C][C]0.021[/C][/ROW]
[ROW][C]TVDCSUM[/C][C]0.053[/C][C]-0.026[/C][C]0.016[/C][C]0.021[/C][C]1[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=298125&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298125&T=1








Correlations for all pairs of data series with p-values

pair            Pearson r   Spearman rho   Kendall tau
KVDD1;KVDD2     0.1103      0.1098         0.0986
p-value         (0.1804)    (0.1824)       (0.1807)
KVDD1;KVDD3     0.1497      0.1451         0.1299
p-value         (0.0684)    (0.0775)       (0.0757)
KVDD1;KVDD4     0.3092      0.3071         0.2787
p-value         (1e-04)     (1e-04)        (2e-04)
KVDD1;TVDCSUM   0.0861      0.063          0.0526
p-value         (0.2964)    (0.4452)       (0.4519)
KVDD2;KVDD3     0.1926      0.1622         0.1402
p-value         (0.0186)    (0.0481)       (0.047)
KVDD2;KVDD4     0.0799      0.0935         0.0813
p-value         (0.3329)    (0.2565)       (0.2528)
KVDD2;TVDCSUM   -0.0069     -0.0349        -0.0264
p-value         (0.9335)    (0.6723)       (0.6953)
KVDD3;KVDD4     0.1625      0.1504         0.1283
p-value         (0.0477)    (0.0671)       (0.069)
KVDD3;TVDCSUM   0.0179      0.016          0.0164
p-value         (0.8287)    (0.846)        (0.8064)
KVDD4;TVDCSUM   0.1021      0.0252         0.0209
p-value         (0.2155)    (0.7605)       (0.7565)
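
A minimal sketch of how a single row of this table could be reproduced, assuming x is the data frame of series used above; the pair KVDD1;KVDD4 serves as the example. Note that cor.test() drops incomplete pairs on its own, whereas the module code below first removes every row containing a missing value, so results may differ slightly:

for (m in c('pearson','spearman','kendall')) {
rt <- cor.test(x$KVDD1, x$KVDD4, method=m)
cat(m, ': estimate =', round(rt$estimate,4), ' p-value =', round(rt$p.value,4), '\n')
}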

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series with p-values \tabularnewline
pair & Pearson r & Spearman rho & Kendall tau \tabularnewline
KVDD1;KVDD2 & 0.1103 & 0.1098 & 0.0986 \tabularnewline
p-value & (0.1804) & (0.1824) & (0.1807) \tabularnewline
KVDD1;KVDD3 & 0.1497 & 0.1451 & 0.1299 \tabularnewline
p-value & (0.0684) & (0.0775) & (0.0757) \tabularnewline
KVDD1;KVDD4 & 0.3092 & 0.3071 & 0.2787 \tabularnewline
p-value & (1e-04) & (1e-04) & (2e-04) \tabularnewline
KVDD1;TVDCSUM & 0.0861 & 0.063 & 0.0526 \tabularnewline
p-value & (0.2964) & (0.4452) & (0.4519) \tabularnewline
KVDD2;KVDD3 & 0.1926 & 0.1622 & 0.1402 \tabularnewline
p-value & (0.0186) & (0.0481) & (0.047) \tabularnewline
KVDD2;KVDD4 & 0.0799 & 0.0935 & 0.0813 \tabularnewline
p-value & (0.3329) & (0.2565) & (0.2528) \tabularnewline
KVDD2;TVDCSUM & -0.0069 & -0.0349 & -0.0264 \tabularnewline
p-value & (0.9335) & (0.6723) & (0.6953) \tabularnewline
KVDD3;KVDD4 & 0.1625 & 0.1504 & 0.1283 \tabularnewline
p-value & (0.0477) & (0.0671) & (0.069) \tabularnewline
KVDD3;TVDCSUM & 0.0179 & 0.016 & 0.0164 \tabularnewline
p-value & (0.8287) & (0.846) & (0.8064) \tabularnewline
KVDD4;TVDCSUM & 0.1021 & 0.0252 & 0.0209 \tabularnewline
p-value & (0.2155) & (0.7605) & (0.7565) \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298125&T=2

[TABLE]
[ROW][C]Correlations for all pairs of data series with p-values[/C][/ROW]
[ROW][C]pair[/C][C]Pearson r[/C][C]Spearman rho[/C][C]Kendall tau[/C][/ROW]
[ROW][C]KVDD1;KVDD2[/C][C]0.1103[/C][C]0.1098[/C][C]0.0986[/C][/ROW]
[ROW][C]p-value[/C][C](0.1804)[/C][C](0.1824)[/C][C](0.1807)[/C][/ROW]
[ROW][C]KVDD1;KVDD3[/C][C]0.1497[/C][C]0.1451[/C][C]0.1299[/C][/ROW]
[ROW][C]p-value[/C][C](0.0684)[/C][C](0.0775)[/C][C](0.0757)[/C][/ROW]
[ROW][C]KVDD1;KVDD4[/C][C]0.3092[/C][C]0.3071[/C][C]0.2787[/C][/ROW]
[ROW][C]p-value[/C][C](1e-04)[/C][C](1e-04)[/C][C](2e-04)[/C][/ROW]
[ROW][C]KVDD1;TVDCSUM[/C][C]0.0861[/C][C]0.063[/C][C]0.0526[/C][/ROW]
[ROW][C]p-value[/C][C](0.2964)[/C][C](0.4452)[/C][C](0.4519)[/C][/ROW]
[ROW][C]KVDD2;KVDD3[/C][C]0.1926[/C][C]0.1622[/C][C]0.1402[/C][/ROW]
[ROW][C]p-value[/C][C](0.0186)[/C][C](0.0481)[/C][C](0.047)[/C][/ROW]
[ROW][C]KVDD2;KVDD4[/C][C]0.0799[/C][C]0.0935[/C][C]0.0813[/C][/ROW]
[ROW][C]p-value[/C][C](0.3329)[/C][C](0.2565)[/C][C](0.2528)[/C][/ROW]
[ROW][C]KVDD2;TVDCSUM[/C][C]-0.0069[/C][C]-0.0349[/C][C]-0.0264[/C][/ROW]
[ROW][C]p-value[/C][C](0.9335)[/C][C](0.6723)[/C][C](0.6953)[/C][/ROW]
[ROW][C]KVDD3;KVDD4[/C][C]0.1625[/C][C]0.1504[/C][C]0.1283[/C][/ROW]
[ROW][C]p-value[/C][C](0.0477)[/C][C](0.0671)[/C][C](0.069)[/C][/ROW]
[ROW][C]KVDD3;TVDCSUM[/C][C]0.0179[/C][C]0.016[/C][C]0.0164[/C][/ROW]
[ROW][C]p-value[/C][C](0.8287)[/C][C](0.846)[/C][C](0.8064)[/C][/ROW]
[ROW][C]KVDD4;TVDCSUM[/C][C]0.1021[/C][C]0.0252[/C][C]0.0209[/C][/ROW]
[ROW][C]p-value[/C][C](0.2155)[/C][C](0.7605)[/C][C](0.7565)[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=298125&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298125&T=2








Meta Analysis of Correlation Tests
Number of significant by total number of Correlations

Type I error   Pearson r   Spearman rho   Kendall tau
0.01           0.1         0.1            0.1
0.02           0.2         0.1            0.1
0.03           0.2         0.1            0.1
0.04           0.2         0.1            0.1
0.05           0.3         0.2            0.2
0.06           0.3         0.2            0.2
0.07           0.4         0.3            0.3
0.08           0.4         0.4            0.4
0.09           0.4         0.4            0.4
0.1            0.4         0.4            0.4
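
A minimal sketch of how the meta-analysis works: for each nominal Type I error level, count the fraction of the ten pairwise tests whose p-value falls below it. The vector below holds the ten Kendall tau p-values listed in the table above:

pvals <- c(0.1807, 0.0757, 2e-04, 0.4519, 0.047, 0.2528, 0.6953, 0.069, 0.8064, 0.7565)
alpha <- seq(0.01, 0.10, by=0.01)
sapply(alpha, function(a) mean(pvals < a))   # matches the Kendall tau column above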

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Correlation Tests \tabularnewline
Number of significant by total number of Correlations \tabularnewline
Type I error & Pearson r & Spearman rho & Kendall tau \tabularnewline
0.01 & 0.1 & 0.1 & 0.1 \tabularnewline
0.02 & 0.2 & 0.1 & 0.1 \tabularnewline
0.03 & 0.2 & 0.1 & 0.1 \tabularnewline
0.04 & 0.2 & 0.1 & 0.1 \tabularnewline
0.05 & 0.3 & 0.2 & 0.2 \tabularnewline
0.06 & 0.3 & 0.2 & 0.2 \tabularnewline
0.07 & 0.4 & 0.3 & 0.3 \tabularnewline
0.08 & 0.4 & 0.4 & 0.4 \tabularnewline
0.09 & 0.4 & 0.4 & 0.4 \tabularnewline
0.1 & 0.4 & 0.4 & 0.4 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298125&T=3

[TABLE]
[ROW][C]Meta Analysis of Correlation Tests[/C][/ROW]
[ROW][C]Number of significant by total number of Correlations[/C][/ROW]
[ROW][C]Type I error[/C][C]Pearson r[/C][C]Spearman rho[/C][C]Kendall tau[/C][/ROW]
[ROW][C]0.01[/C][C]0.1[/C][C]0.1[/C][C]0.1[/C][/ROW]
[ROW][C]0.02[/C][C]0.2[/C][C]0.1[/C][C]0.1[/C][/ROW]
[ROW][C]0.03[/C][C]0.2[/C][C]0.1[/C][C]0.1[/C][/ROW]
[ROW][C]0.04[/C][C]0.2[/C][C]0.1[/C][C]0.1[/C][/ROW]
[ROW][C]0.05[/C][C]0.3[/C][C]0.2[/C][C]0.2[/C][/ROW]
[ROW][C]0.06[/C][C]0.3[/C][C]0.2[/C][C]0.2[/C][/ROW]
[ROW][C]0.07[/C][C]0.4[/C][C]0.3[/C][C]0.3[/C][/ROW]
[ROW][C]0.08[/C][C]0.4[/C][C]0.4[/C][C]0.4[/C][/ROW]
[ROW][C]0.09[/C][C]0.4[/C][C]0.4[/C][C]0.4[/C][/ROW]
[ROW][C]0.1[/C][C]0.4[/C][C]0.4[/C][C]0.4[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=298125&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298125&T=3




Parameters (Session):
par1 = 5 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = kendall ;
R code (references can be found in the software module):
# Lower-panel function for pairs(): runs the correlation test (method given by
# par1) for each pair of series and prints the rounded p-value in the panel.
panel.tau <- function(x, y, digits=2, prefix='', cex.cor)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(0, 1, 0, 1))
rr <- cor.test(x, y, method=par1)
r <- round(rr$p.value,2)
txt <- format(c(r, 0.123456789), digits=digits)[1]
txt <- paste(prefix, txt, sep='')
if(missing(cex.cor)) cex <- 0.5/strwidth(txt)
text(0.5, 0.5, txt, cex = cex)
}
# Diagonal-panel function for pairs(): draws a rescaled histogram of each series.
panel.hist <- function(x, ...)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(usr[1:2], 0, 1.5) )
h <- hist(x, plot = FALSE)
breaks <- h$breaks; nB <- length(breaks)
y <- h$counts; y <- y/max(y)
rect(breaks[-nB], 0, breaks[-1], y, col='grey', ...)
}
# Remove observations with missing values, then draw the scatterplot matrix:
# histograms on the diagonal, smoothed scatterplots above, p-values below.
x <- na.omit(x)
y <- t(na.omit(t(y)))
bitmap(file='test1.png')
pairs(t(y),diag.panel=panel.hist, upper.panel=panel.smooth, lower.panel=panel.tau, main=main)
dev.off()
load(file='createtable')
n <- length(y[,1])   # number of data series (the rows of y hold the variables)
print(n)
# Table 1: matrix of correlation estimates (selected method) for all pairs
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,paste('Correlations for all pairs of data series (method=',par1,')',sep=''),n+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,' ',header=TRUE)
for (i in 1:n) {
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
}
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
for (j in 1:n) {
r <- cor.test(y[i,],y[j,],method=par1)
a<-table.element(a,round(r$estimate,3))
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
# Meta-analysis bookkeeping: number of distinct pairs, and a 10x3 counter of
# significant tests per Type I error level (rows) and method (columns)
ncorrs <- (n*n -n)/2
mycorrs <- array(0, dim=c(10,3))
# Table 2: pairwise correlations (Pearson, Spearman, Kendall) with p-values
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Correlations for all pairs of data series with p-values',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'pair',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
cor.test(y[1,],y[2,],method=par1)
for (i in 1:(n-1))
{
for (j in (i+1):n)
{
a<-table.row.start(a)
dum <- paste(dimnames(t(x))[[2]][i],';',dimnames(t(x))[[2]][j],sep='')
a<-table.element(a,dum,header=TRUE)
rp <- cor.test(y[i,],y[j,],method='pearson')
a<-table.element(a,round(rp$estimate,4))
rs <- cor.test(y[i,],y[j,],method='spearman')
a<-table.element(a,round(rs$estimate,4))
rk <- cor.test(y[i,],y[j,],method='kendall')
a<-table.element(a,round(rk$estimate,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-value',header=T)
a<-table.element(a,paste('(',round(rp$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rs$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rk$p.value,4),')',sep=''))
a<-table.row.end(a)
# Tally tests that are significant at alpha = 0.01, 0.02, ..., 0.10
for (iii in 1:10) {
iiid100 <- iii / 100
if (rp$p.value < iiid100) mycorrs[iii, 1] = mycorrs[iii, 1] + 1
if (rs$p.value < iiid100) mycorrs[iii, 2] = mycorrs[iii, 2] + 1
if (rk$p.value < iiid100) mycorrs[iii, 3] = mycorrs[iii, 3] + 1
}
}
}
a<-table.end(a)
table.save(a,file='mytable1.tab')
# Table 3: meta-analysis - fraction of significant correlations per alpha level
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Correlation Tests',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Number of significant by total number of Correlations',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Type I error',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
for (iii in 1:10) {
iiid100 <- iii / 100
a<-table.row.start(a)
a<-table.element(a,round(iiid100,2),header=T)
a<-table.element(a,round(mycorrs[iii,1]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,2]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,3]/ncorrs,2))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')