Kendall tau Correlation Matrix

Author: The author of this computation has been verified.
R Software Module: rwasp_pairs.wasp
Title produced by software: Kendall tau Correlation Matrix
Date of computation: Sun, 14 Dec 2014 14:16:05 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/14/t1418567174ffizwhidq2ffk6r.htm/, Retrieved Thu, 16 May 2024 19:11:20 +0000
Alternate citation: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=267622, Retrieved Thu, 16 May 2024 19:11:20 +0000

Original text written by user:
Is private? No (this computation is public)
User-defined keywords:
Estimated Impact: 82
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-   [Kendall tau Correlation Matrix] [] [2014-12-14 14:16:05] [c7f962214140f976f2c4b1bb2571d9df] [Current]
Dataseries X:
166.06 159.81 3.40
154.50 147.75 4.80
146.87 147.13 6.50
145.10 140.33 8.50
143.32 142.87 13.60
137.03 139.67 15.70
132.42 135.35 18.80
130.71 136.32 19.20
128.60 129.27 12.90
130.39 126.81 14.40
138.43 137.17 6.20
154.74 150.94 2.40
184.35 173.71 4.60
163.39 156.68 7.10
149.06 146.84 7.80
147.10 144.17 9.90
138.06 134.81 13.90
134.13 135.13 17.10
139.87 131.45 17.80
133.68 133.77 18.30
125.47 134.87 14.70
137.03 140.90 10.50
140.50 136.57 8.60
157.13 155.52 4.40
159.55 160.16 2.30
160.36 158.04 2.80
156.48 148.42 8.80
153.03 150.70 10.70
138.03 135.26 12.80
139.70 134.63 19.30
138.23 132.23 19.50
145.68 132.55 20.30
139.90 134.13 15.30
142.06 136.94 7.90
145.77 141.73 8.30
171.19 165.68 4.50
171.61 162.48 3.20
150.21 145.86 5.00
144.65 142.19 6.60
140.33 137.30 11.10
129.61 131.71 13.40
130.40 133.67 16.30
128.13 133.81 17.40
125.35 127.48 18.90
129.73 128.10 15.80
136.84 134.32 11.70
137.80 135.83 6.40
153.00 151.87 2.90
165.03 158.87 4.70
172.25 163.86 2.40
177.06 158.58 7.00
142.10 140.13 10.60
136.16 136.87 12.80
135.87 134.20 17.70
119.84 126.19 18.20
119.84 122.52 16.50
126.13 124.20 16.20
133.58 133.87 13.90
132.27 136.53 6.60
153.77 148.90 3.60
161.90 151.19 1.40
155.11 151.29 2.60
156.55 149.06 4.30
138.47 138.80 8.80
130.16 134.77 14.50
133.20 135.43 16.80
152.71 141.19 22.70
121.87 126.77 15.70
129.57 126.43 18.20
127.52 131.00 14.20
132.90 134.00 9.10
143.10 138.13 5.90
154.94 151.06 7.00
166.86 158.61 6.20
147.10 144.03 7.80
142.97 139.57 14.30
127.77 128.74 14.60
131.43 127.20 17.30
126.84 125.90 17.10
123.10 122.06 17.00
127.80 127.23 13.90
133.23 135.13 10.30
148.90 144.83 6.70
143.45 134.94 3.90
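
The three columns above can be read straight into R to reproduce the tables that follow. A minimal sketch, assuming the data are saved to a plain-text file (here hypothetically named dataseries.txt) and that the columns correspond, in order, to the series SVt, SMt and Tt named in the correlation tables below:

# Read the 84 observations of the three series into a data frame.
# The column names SVt, SMt, Tt are assumed from the correlation tables.
X <- read.table("dataseries.txt", col.names = c("SVt", "SMt", "Tt"))
str(X)   # 84 obs. of 3 numeric variables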




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 1 seconds
R Server: 'Sir Ronald Aylmer Fisher' @ fisher.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 1 seconds \tabularnewline
R Server & 'Sir Ronald Aylmer Fisher' @ fisher.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=267622&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]1 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Sir Ronald Aylmer Fisher' @ fisher.wessa.net[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=267622&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=267622&T=0








Correlations for all pairs of data series (method=kendall)
        SVt     SMt     Tt
SVt     1       0.788   -0.543
SMt     0.788   1       -0.59
Tt      -0.543  -0.59   1

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series (method=kendall) \tabularnewline
  & SVt & SMt & Tt \tabularnewline
SVt & 1 & 0.788 & -0.543 \tabularnewline
SMt & 0.788 & 1 & -0.59 \tabularnewline
Tt & -0.543 & -0.59 & 1 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=267622&T=1

[TABLE]
[ROW][C]Correlations for all pairs of data series (method=kendall)[/C][/ROW]
[ROW][C] [/C][C]SVt[/C][C]SMt[/C][C]Tt[/C][/ROW]
[ROW][C]SVt[/C][C]1[/C][C]0.788[/C][C]-0.543[/C][/ROW]
[ROW][C]SMt[/C][C]0.788[/C][C]1[/C][C]-0.59[/C][/ROW]
[ROW][C]Tt[/C][C]-0.543[/C][C]-0.59[/C][C]1[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=267622&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=267622&T=1
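
The matrix above can be reproduced with base R alone. A minimal sketch, assuming the data frame X read in earlier (rounding to three decimals matches the table's precision):

# Kendall rank correlation matrix for all pairs of series
round(cor(X, method = "kendall"), 3)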








Correlations for all pairs of data series with p-values
pair       Pearson r   Spearman rho   Kendall tau
SVt;SMt    0.9569      0.9287         0.7882
p-value    (0)         (0)            (0)
SVt;Tt     -0.725      -0.7409        -0.5434
p-value    (0)         (0)            (0)
SMt;Tt     -0.7648     -0.7945        -0.5898
p-value    (0)         (0)            (0)

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series with p-values \tabularnewline
pair & Pearson r & Spearman rho & Kendall tau \tabularnewline
SVt;SMt & 0.9569 & 0.9287 & 0.7882 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
SVt;Tt & -0.725 & -0.7409 & -0.5434 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
SMt;Tt & -0.7648 & -0.7945 & -0.5898 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=267622&T=2

[TABLE]
[ROW][C]Correlations for all pairs of data series with p-values[/C][/ROW]
[ROW][C]pair[/C][C]Pearson r[/C][C]Spearman rho[/C][C]Kendall tau[/C][/ROW]
[ROW][C]SVt;SMt[/C][C]0.9569[/C][C]0.9287[/C][C]0.7882[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]SVt;Tt[/C][C]-0.725[/C][C]-0.7409[/C][C]-0.5434[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]SMt;Tt[/C][C]-0.7648[/C][C]-0.7945[/C][C]-0.5898[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=267622&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=267622&T=2
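
Each row of this table is the output of cor.test() with the corresponding method; the p-values shown as (0) are rounded to four decimals. A minimal sketch for the SVt;SMt pair, assuming the data frame X from above (ties in the data will trigger R's usual warnings about exact p-values for the rank-based tests):

# Correlation estimate and p-value for one pair under each method
cor.test(X$SVt, X$SMt, method = "pearson")$estimate    # about 0.9569 per the table
cor.test(X$SVt, X$SMt, method = "spearman")$estimate   # about 0.9287
cor.test(X$SVt, X$SMt, method = "kendall")$estimate    # about 0.7882
cor.test(X$SVt, X$SMt, method = "kendall")$p.value     # rounds to 0 at four decimals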








Meta Analysis of Correlation Tests
Number of significant by total number of Correlations
Type I error   Pearson r   Spearman rho   Kendall tau
0.01           1           1              1
0.02           1           1              1
0.03           1           1              1
0.04           1           1              1
0.05           1           1              1
0.06           1           1              1
0.07           1           1              1
0.08           1           1              1
0.09           1           1              1
0.1            1           1              1

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Correlation Tests \tabularnewline
Number of significant by total number of Correlations \tabularnewline
Type I error & Pearson r & Spearman rho & Kendall tau \tabularnewline
0.01 & 1 & 1 & 1 \tabularnewline
0.02 & 1 & 1 & 1 \tabularnewline
0.03 & 1 & 1 & 1 \tabularnewline
0.04 & 1 & 1 & 1 \tabularnewline
0.05 & 1 & 1 & 1 \tabularnewline
0.06 & 1 & 1 & 1 \tabularnewline
0.07 & 1 & 1 & 1 \tabularnewline
0.08 & 1 & 1 & 1 \tabularnewline
0.09 & 1 & 1 & 1 \tabularnewline
0.1 & 1 & 1 & 1 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=267622&T=3

[TABLE]
[ROW][C]Meta Analysis of Correlation Tests[/C][/ROW]
[ROW][C]Number of significant by total number of Correlations[/C][/ROW]
[ROW][C]Type I error[/C][C]Pearson r[/C][C]Spearman rho[/C][C]Kendall tau[/C][/ROW]
[ROW][C]0.01[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.02[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.03[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.04[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.05[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.06[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.07[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.08[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.09[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.1[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=267622&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=267622&T=3
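
The table reports, for each Type I error level from 0.01 to 0.10, the fraction of the three pairwise correlations whose p-value falls below that level; here every fraction is 1 because all p-values round to 0. A minimal sketch of the same tally for the Kendall tau column, assuming the data frame X from above:

# Fraction of pairwise Kendall correlations significant at each alpha level
pair_names <- combn(names(X), 2, simplify = FALSE)
pvals <- sapply(pair_names, function(p)
  cor.test(X[[p[1]]], X[[p[2]]], method = "kendall")$p.value)
alphas <- seq(0.01, 0.10, by = 0.01)
sapply(alphas, function(a) mean(pvals < a))   # 1 1 1 1 1 1 1 1 1 1 for these data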




Parameters (Session):
par1 = kendall ;
Parameters (R input):
par1 = kendall ;
R code (references can be found in the software module):
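# panel.tau: lower-panel function for pairs(); for each pair it runs cor.test()
# with the user-selected method (par1) and prints the rounded p-value in the panel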
panel.tau <- function(x, y, digits=2, prefix='', cex.cor)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(0, 1, 0, 1))
rr <- cor.test(x, y, method=par1)
r <- round(rr$p.value,2)
txt <- format(c(r, 0.123456789), digits=digits)[1]
txt <- paste(prefix, txt, sep='')
if(missing(cex.cor)) cex <- 0.5/strwidth(txt)
text(0.5, 0.5, txt, cex = cex)
}
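# panel.hist: diagonal-panel function that draws a rescaled histogram of each series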
panel.hist <- function(x, ...)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(usr[1:2], 0, 1.5) )
h <- hist(x, plot = FALSE)
breaks <- h$breaks; nB <- length(breaks)
y <- h$counts; y <- y/max(y)
rect(breaks[-nB], 0, breaks[-1], y, col='grey', ...)
}
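# scatterplot matrix: histograms on the diagonal, smoothed scatterplots in the
# upper triangle, correlation-test p-values in the lower triangle (written to test1.png)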
bitmap(file='test1.png')
pairs(t(y),diag.panel=panel.hist, upper.panel=panel.smooth, lower.panel=panel.tau, main=main)
dev.off()
load(file='createtable')
n <- length(y[,1])
n
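# first table: the n x n matrix of correlation estimates (cor.test with method par1)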
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,paste('Correlations for all pairs of data series (method=',par1,')',sep=''),n+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,' ',header=TRUE)
for (i in 1:n) {
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
}
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
for (j in 1:n) {
r <- cor.test(y[i,],y[j,],method=par1)
a<-table.element(a,round(r$estimate,3))
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
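# second table: Pearson, Spearman and Kendall estimates with p-values for every pair;
# mycorrs counts, per method, how many pairs are significant at alpha = 0.01, ..., 0.10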
ncorrs <- (n*n -n)/2
mycorrs <- array(0, dim=c(10,3))
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Correlations for all pairs of data series with p-values',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'pair',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
cor.test(y[1,],y[2,],method=par1)
for (i in 1:(n-1))
{
for (j in (i+1):n)
{
a<-table.row.start(a)
dum <- paste(dimnames(t(x))[[2]][i],';',dimnames(t(x))[[2]][j],sep='')
a<-table.element(a,dum,header=TRUE)
rp <- cor.test(y[i,],y[j,],method='pearson')
a<-table.element(a,round(rp$estimate,4))
rs <- cor.test(y[i,],y[j,],method='spearman')
a<-table.element(a,round(rs$estimate,4))
rk <- cor.test(y[i,],y[j,],method='kendall')
a<-table.element(a,round(rk$estimate,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-value',header=T)
a<-table.element(a,paste('(',round(rp$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rs$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rk$p.value,4),')',sep=''))
a<-table.row.end(a)
for (iii in 1:10) {
iiid100 <- iii / 100
if (rp$p.value < iiid100) mycorrs[iii, 1] = mycorrs[iii, 1] + 1
if (rs$p.value < iiid100) mycorrs[iii, 2] = mycorrs[iii, 2] + 1
if (rk$p.value < iiid100) mycorrs[iii, 3] = mycorrs[iii, 3] + 1
}
}
}
a<-table.end(a)
table.save(a,file='mytable1.tab')
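# third table: proportion of significant correlations (out of ncorrs pairs) at each Type I error level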
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Correlation Tests',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Number of significant by total number of Correlations',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Type I error',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
for (iii in 1:10) {
iiid100 <- iii / 100
a<-table.row.start(a)
a<-table.element(a,round(iiid100,2),header=T)
a<-table.element(a,round(mycorrs[iii,1]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,2]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,3]/ncorrs,2))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
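
The script above is written for the FreeStatistics.org R engine: par1, x, y and main are supplied by the module wrapper, the table.start/table.row.*/table.element/table.save helpers are loaded from the server-side 'createtable' file, and bitmap() requires Ghostscript. A minimal sketch of the objects the script expects before its first line runs, assuming the data frame X from earlier (this is a guess at the wrapper's setup, not part of the published module):

# Hypothetical stand-alone setup for the script above
par1 <- "kendall"                        # correlation method chosen by the user
y <- t(as.matrix(X))                     # series in rows, observations in columns
x <- y                                   # series names are read from dimnames(t(x))[[2]]
main <- "Kendall tau Correlation Matrix" # plot title passed to pairs()
# png('test1.png') can stand in for bitmap() when Ghostscript is unavailable;
# the table.* helpers still need to come from the server's 'createtable' file.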