Free Statistics

Author's title:
Author: the author of this computation has been verified
R Software Module: rwasp_pairs.wasp
Title produced by software: Kendall tau Correlation Matrix
Date of computation: Thu, 31 Jan 2019 17:29:27 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2019/Jan/31/t1548952189w3r1jj36l9lpvjy.htm/, Retrieved Sun, 05 May 2024 12:37:37 +0200

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 0
Data series X (one observation per row, three values per row; NA marks a missing value):
14 13 22
19 16 24
17 17 26
17 NA 21
15 NA 26
20 16 25
15 NA 21
19 NA 24
15 NA 27
15 17 28
19 17 23
NA 15 25
20 16 24
18 14 24
15 16 24
14 17 25
20 NA 25
NA NA NA
16 NA 25
16 NA 25
16 16 24
10 NA 26
19 16 26
19 NA 25
16 NA 26
15 NA 23
18 16 24
17 15 24
19 16 25
17 16 25
NA 13 24
19 15 28
20 17 27
5 NA NA
19 13 23
16 17 23
15 NA 24
16 14 24
18 14 22
16 18 25
15 NA 25
17 17 28
NA 13 22
20 16 28
19 15 25
7 15 24
13 NA 24
16 15 23
16 13 25
NA NA NA
18 17 26
18 NA 25
16 NA 27
17 11 26
19 14 23
16 13 25
19 NA 21
13 17 22
16 16 24
13 NA 25
12 17 27
17 16 24
17 16 26
17 16 21
16 15 27
16 12 22
14 17 23
16 14 24
13 14 25
16 16 24
14 NA 23
20 NA 28
12 NA NA
13 NA 24
18 NA 26
14 15 22
19 16 25
18 14 25
14 15 24
18 17 24
19 NA 26
15 10 21
14 NA 25
17 17 25
19 NA 26
13 20 25
19 17 26
18 18 27
20 NA 25
15 17 NA
15 14 20
15 NA 24
20 17 26
15 NA 25
19 17 25
18 NA 24
18 16 26
15 18 25
20 18 28
17 16 27
12 NA 25
18 NA 26
19 15 26
20 13 26
NA NA NA
17 NA 28
15 NA NA
16 NA 21
18 NA 25
18 16 25
14 NA 24
15 NA 24
12 NA 24
17 12 23
14 NA 23
18 16 24
17 16 24
17 NA 25
20 16 28
16 14 23
14 15 24
15 14 23
18 NA 24
20 15 25
17 NA 24
17 15 23
17 16 23
17 NA 25
15 NA 21
17 NA 22
18 11 19
17 NA 24
20 18 25
15 NA 21
16 11 22
15 NA 23
18 18 27
11 NA NA
15 15 26
18 19 29
20 17 28
19 NA 24
14 14 25
16 NA 25
15 13 22
17 17 25
18 14 26
20 19 26
17 14 24
18 NA 25
15 NA 19
16 16 25
11 16 23
15 15 25
18 12 25
17 NA 26
16 17 27
12 NA 24
19 NA 22
18 18 25
15 15 24
17 18 23
19 15 27
18 NA 24
19 NA 24
16 NA 21
16 16 25
16 NA 25
14 16 23
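The three columns correspond to the variables that appear in the correlation tables below, presumably ITHSUM, TVDC and SKEOUSUM in that order. As a minimal sketch, the block above could be read into R from a whitespace-separated text file; the file name dataseries.txt and the column order are assumptions, not part of the original computation:

# Hypothetical file holding the raw block above, one observation per line.
x <- read.table('dataseries.txt', header = FALSE, na.strings = 'NA',
                col.names = c('ITHSUM', 'TVDC', 'SKEOUSUM'))
str(x)      # three numeric columns, some values missing (NA)
summary(x)  # quick sanity check of ranges and NA counts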




Summary of computational transaction

Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 2 seconds
R Server: Big Analytics Cloud Computing Center


Correlations for all pairs of data series (method=pearson)

            ITHSUM   TVDC    SKEOUSUM
ITHSUM      1        0.115   0.332
TVDC        0.115    1       0.464
SKEOUSUM    0.332    0.464   1
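The R code further down removes every observation with a missing value in any series before computing this matrix, so the same numbers should be reproducible (up to rounding) with base R's cor() under listwise deletion. A minimal sketch, assuming the data frame x from the sketch above:

# Pearson correlation matrix on complete cases only, as in the module code.
round(cor(x, use = 'complete.obs', method = 'pearson'), 3)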


Correlations for all pairs of data series with p-values

pair               Pearson r   Spearman rho   Kendall tau
ITHSUM;TVDC        0.1152      0.1734         0.1337
  p-value          (0.2561)    (0.0861)       (0.0836)
ITHSUM;SKEOUSUM    0.3324      0.3949         0.3041
  p-value          (8e-04)     (1e-04)        (1e-04)
TVDC;SKEOUSUM      0.4643      0.4131         0.3292
  p-value          (0)         (0)            (0)
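Each cell pairs an estimate with the p-value of the corresponding cor.test() call. A minimal sketch for one pair, assuming the data frame x from the first sketch and the same complete-case treatment as the module (Spearman and Kendall will warn that exact p-values cannot be computed in the presence of ties; the archived computation is subject to the same limitation):

# Estimate and p-value for ITHSUM vs SKEOUSUM under all three methods.
xc <- na.omit(x)   # listwise deletion, as in the module code
for (m in c('pearson', 'spearman', 'kendall')) {
  ct <- cor.test(xc$ITHSUM, xc$SKEOUSUM, method = m)
  cat(m, ': estimate =', round(ct$estimate, 4),
      ', p-value =', round(ct$p.value, 4), '\n')
}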


Meta Analysis of Correlation Tests
Number of significant correlations divided by the total number of correlations

Type I error   Pearson r   Spearman rho   Kendall tau
0.01           0.67        0.67           0.67
0.02           0.67        0.67           0.67
0.03           0.67        0.67           0.67
0.04           0.67        0.67           0.67
0.05           0.67        0.67           0.67
0.06           0.67        0.67           0.67
0.07           0.67        0.67           0.67
0.08           0.67        0.67           0.67
0.09           0.67        1              1
0.1            0.67        1              1
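The table reports, for each Type I error level from 0.01 to 0.10, the fraction of the three pairwise tests whose p-value falls below that level; with three pairs the only possible values are 0, 0.33, 0.67 and 1. A sketch of the same tally, assuming the complete-case data frame xc from the previous sketch:

# Fraction of significant pairwise correlations per method and alpha level.
pairs_ij <- combn(names(xc), 2)                 # the three variable pairs
methods  <- c('pearson', 'spearman', 'kendall')
alphas   <- seq(0.01, 0.10, by = 0.01)
pvals <- sapply(methods, function(m)
  apply(pairs_ij, 2, function(p)
    cor.test(xc[[p[1]]], xc[[p[2]]], method = m)$p.value))
frac <- t(sapply(alphas, function(a) round(colMeans(pvals < a), 2)))
rownames(frac) <- alphas
frac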


Parameters (Session):
par1 = 1212121111111111211pearson ; par2 = TripleTripleTriple222Do not include Seasonal DummiesDo not include Seasonal DummiesDo not include Seasonal DummiesDo not include Seasonal DummiesDo not include Seasonal DummiesTripleDo not include Seasonal DummiesDo not include Seasonal DummiesInclude Seasonal Dummies ; par3 = additiveadditiveadditive333additiveadditiveNo Linear TrendNo Linear TrendNo Linear TrendadditiveNo Linear TrendNo Linear TrendLinear Trend ; par4 = 12TRUETRUETRUETRUETRUE1212121212 ; par5 = 12 ; par6 = 121212121212121212 ;
Parameters (R input):
par1 = pearson ;
R code (references can be found in the software module):
# Panel function for the lower triangle of the pairs plot: it prints the
# p-value of cor.test() (with the method given by par1) for each pair.
panel.tau <- function(x, y, digits=2, prefix='', cex.cor)
{
    usr <- par('usr'); on.exit(par(usr))
    par(usr = c(0, 1, 0, 1))
    rr <- cor.test(x, y, method=par1)
    r <- round(rr$p.value,2)
    txt <- format(c(r, 0.123456789), digits=digits)[1]
    txt <- paste(prefix, txt, sep='')
    if(missing(cex.cor)) cex <- 0.5/strwidth(txt)
    text(0.5, 0.5, txt, cex = cex)
}
# Panel function for the diagonal of the pairs plot: a scaled histogram
# of each series.
panel.hist <- function(x, ...)
{
    usr <- par('usr'); on.exit(par(usr))
    par(usr = c(usr[1:2], 0, 1.5) )
    h <- hist(x, plot = FALSE)
    breaks <- h$breaks; nB <- length(breaks)
    y <- h$counts; y <- y/max(y)
    rect(breaks[-nB], 0, breaks[-1], y, col='grey', ...)
}
# x and y are prepared by the module's calling environment; y holds the data
# series as rows, and na.omit(t(y)) drops every observation with a missing value.
x <- na.omit(x)
y <- t(na.omit(t(y)))
# Scatterplot matrix (one column of t(y) per series); 'main' is supplied by
# the calling environment.
bitmap(file='test1.png')
pairs(t(y),diag.panel=panel.hist, upper.panel=panel.smooth, lower.panel=panel.tau, main=main)
dev.off()
load(file='createtable')
n <- length(y[,1])   # number of data series
print(n)
# First table: the n x n correlation matrix for the selected method (par1),
# computed on the complete cases in y.
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,paste('Correlations for all pairs of data series (method=',par1,')',sep=''),n+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,' ',header=TRUE)
for (i in 1:n) {
    a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
}
a<-table.row.end(a)
for (i in 1:n) {
    a<-table.row.start(a)
    a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
    for (j in 1:n) {
        r <- cor.test(y[i,],y[j,],method=par1)
        a<-table.element(a,round(r$estimate,3))
    }
    a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
ncorrs <- (n*n -n)/2              # number of distinct pairs of series
mycorrs <- array(0, dim=c(10,3))  # significance counts per alpha level and method
# Second table: Pearson, Spearman and Kendall estimates with p-values for
# every pair of series; significant results are tallied in mycorrs.
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Correlations for all pairs of data series with p-values',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'pair',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
cor.test(y[1,],y[2,],method=par1)   # result not stored (no effect on the tables)
for (i in 1:(n-1))
{
    for (j in (i+1):n)
    {
        a<-table.row.start(a)
        dum <- paste(dimnames(t(x))[[2]][i],';',dimnames(t(x))[[2]][j],sep='')
        a<-table.element(a,dum,header=TRUE)
        rp <- cor.test(y[i,],y[j,],method='pearson')
        a<-table.element(a,round(rp$estimate,4))
        rs <- cor.test(y[i,],y[j,],method='spearman')
        a<-table.element(a,round(rs$estimate,4))
        rk <- cor.test(y[i,],y[j,],method='kendall')
        a<-table.element(a,round(rk$estimate,4))
        a<-table.row.end(a)
        a<-table.row.start(a)
        a<-table.element(a,'p-value',header=T)
        a<-table.element(a,paste('(',round(rp$p.value,4),')',sep=''))
        a<-table.element(a,paste('(',round(rs$p.value,4),')',sep=''))
        a<-table.element(a,paste('(',round(rk$p.value,4),')',sep=''))
        a<-table.row.end(a)
        # Tally how many tests are significant at alpha = 0.01, 0.02, ..., 0.10
        for (iii in 1:10) {
            iiid100 <- iii / 100
            if (rp$p.value < iiid100) mycorrs[iii, 1] = mycorrs[iii, 1] + 1
            if (rs$p.value < iiid100) mycorrs[iii, 2] = mycorrs[iii, 2] + 1
            if (rk$p.value < iiid100) mycorrs[iii, 3] = mycorrs[iii, 3] + 1
        }
    }
}
a<-table.end(a)
table.save(a,file='mytable1.tab')
# Third table: fraction of significant correlations (out of ncorrs pairs)
# per method, at Type I error levels 0.01 through 0.10.
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Correlation Tests',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Number of significant by total number of Correlations',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Type I error',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
for (iii in 1:10) {
    iiid100 <- iii / 100
    a<-table.row.start(a)
    a<-table.element(a,round(iiid100,2),header=T)
    a<-table.element(a,round(mycorrs[iii,1]/ncorrs,2))
    a<-table.element(a,round(mycorrs[iii,2]/ncorrs,2))
    a<-table.element(a,round(mycorrs[iii,3]/ncorrs,2))
    a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
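The table.start(), table.element() and related helpers, as well as the objects x, y, main and par1, are supplied by the FreeStatistics.org R server and are not part of base R, so the listing above will not run as-is outside that environment. A minimal standalone sketch of the plotting step, assuming the data frame x from the first sketch and the panel.tau and panel.hist functions defined above (png() stands in for the server-side bitmap() device; the method and title are assumptions):

# Standalone scatterplot matrix: histograms on the diagonal, smoothed
# scatterplots above the diagonal, cor.test p-values below it.
par1 <- 'pearson'                # correlation method read by panel.tau
xc   <- na.omit(x)               # listwise deletion, as in the module
png('pairs.png')
pairs(xc, diag.panel = panel.hist, upper.panel = panel.smooth,
      lower.panel = panel.tau, main = 'Correlation matrix (pairs plot)')
dev.off()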