Author: *The author of this computation has been verified*
R Software Module: rwasp_pairs.wasp
Title produced by software: Kendall tau Correlation Matrix
Date of computation: Thu, 03 Dec 2015 19:56:37 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2015/Dec/03/t14491728348t36o40z14atu4j.htm/, Retrieved Thu, 16 May 2024 15:32:51 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=285028, Retrieved Thu, 16 May 2024 15:32:51 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 81
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Kendall tau Rank Correlation] [rank correlatie b...] [2015-12-02 12:40:14] [170d69a18d8bca4f88fd44767421f945]
- R  D  [Kendall tau Rank Correlation] [Sterftegevallen- ...] [2015-12-02 17:25:58] [170d69a18d8bca4f88fd44767421f945]
- RMPD      [Kendall tau Correlation Matrix] [correlatie matrix...] [2015-12-03 19:56:37] [dde0885d390165e184eb3f8febefa56e] [Current]

Dataseries X:
8	9,100000381
9,300000191	8,699999809
7,5	7,199999809
8,899999619	8,899999619
10,19999981	8,300000191
8,300000191	10,89999962
8,800000191	10
8,800000191	9,100000381
10,69999981	8,699999809
11,69999981	7,599999905
8,5	10,80000019
8,300000191	9,5
8,199999809	8,800000191
7,900000095	9,5
10,30000019	8,699999809
7,400000095	11,19999981
9,600000381	9,699999809
9,300000191	9,600000381
10,60000038	9,100000381
9,699999809	9,199999809
11,60000038	8,300000191
8,100000381	8,399999619
9,800000191	9,399999619
7,400000095	9,800000191
9,399999619	10,39999962
11,19999981	9,899999619
9,100000381	9,199999809
10,5	10,30000019
11,89999962	8,899999619
8,399999619	9,600000381
5	10,30000019
9,800000191	10,39999962
9,800000191	9,699999809
10,80000019	9,600000381
10,10000038	10,69999981
10,89999962	10,30000019
9,199999809	10,69999981
8,300000191	9,600000381
7,300000191	10,5
9,399999619	7,699999809
9,399999619	10,19999981
9,800000191	9,899999619
3,599999905	8,399999619
8,399999619	10,39999962
10,80000019	9,199999809
10,10000038	13
9	8,800000191
10	9,199999809
11,30000019	7,800000191
11,30000019	8,199999809
12,80000019	7,400000095
10	10,39999962
6,699999809	8,899999619
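
The values above use a comma as decimal separator. As a minimal sketch of how this input could be read and correlated in R, assuming the two columns are saved to a hypothetical plain-text file dataseries.txt and named A and B (the module itself receives the data directly from the web form):

# Read the two comma-decimal columns into a data frame with columns A and B
z <- read.table('dataseries.txt', dec=',', col.names=c('A', 'B'))
# Kendall tau for the pair A;B, the quantity this module reports
cor(z$A, z$B, method='kendall')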




Summary of computational transaction
Raw Input       view raw input (R code)
Raw Output      view raw output of R engine
Computing time  1 seconds
R Server        'Sir Ronald Aylmer Fisher' @ fisher.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 1 seconds \tabularnewline
R Server & 'Sir Ronald Aylmer Fisher' @ fisher.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=285028&T=0


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=285028&T=0


Correlations for all pairs of data series (method=pearson)
        A        B
A       1       -0.172
B      -0.172    1

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series (method=pearson) \tabularnewline
  & A & B \tabularnewline
A & 1 & -0.172 \tabularnewline
B & -0.172 & 1 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=285028&T=1


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=285028&T=1
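
Since par1 = pearson, the matrix above is the ordinary product-moment correlation of the two series. A minimal sketch that should reproduce it, assuming the data frame z with columns A and B from the sketch under the data listing:

# 2 x 2 Pearson correlation matrix, rounded to 3 decimals as in the table above
round(cor(z, method='pearson'), 3)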


Correlations for all pairs of data series with p-values
pair      Pearson r   Spearman rho   Kendall tau
A;B       -0.172      -0.2144        -0.1489
p-value   (0.2181)    (0.1232)       (0.1223)

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series with p-values \tabularnewline
pair & Pearson r & Spearman rho & Kendall tau \tabularnewline
A;B & -0.172 & -0.2144 & -0.1489 \tabularnewline
p-value & (0.2181) & (0.1232) & (0.1223) \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=285028&T=2


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=285028&T=2
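
The coefficients and two-sided p-values in this table come from cor.test with each of the three methods, which is also what the module's R code further below does pair by pair. A minimal sketch, again assuming the data frame z from above:

# Estimate and p-value for each correlation measure of the pair A;B
# (cor.test may warn that exact p-values cannot be computed because of ties)
for (m in c('pearson', 'spearman', 'kendall')) {
  ct <- cor.test(z$A, z$B, method=m)
  cat(sprintf('%-8s estimate = %7.4f   p-value = %6.4f\n', m, ct$estimate, ct$p.value))
}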


Meta Analysis of Correlation Tests
Number of significant by total number of Correlations
Type I error   Pearson r   Spearman rho   Kendall tau
0.01           0           0              0
0.02           0           0              0
0.03           0           0              0
0.04           0           0              0
0.05           0           0              0
0.06           0           0              0
0.07           0           0              0
0.08           0           0              0
0.09           0           0              0
0.1            0           0              0

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Correlation Tests \tabularnewline
Number of significant by total number of Correlations \tabularnewline
Type I error & Pearson r & Spearman rho & Kendall tau \tabularnewline
0.01 & 0 & 0 & 0 \tabularnewline
0.02 & 0 & 0 & 0 \tabularnewline
0.03 & 0 & 0 & 0 \tabularnewline
0.04 & 0 & 0 & 0 \tabularnewline
0.05 & 0 & 0 & 0 \tabularnewline
0.06 & 0 & 0 & 0 \tabularnewline
0.07 & 0 & 0 & 0 \tabularnewline
0.08 & 0 & 0 & 0 \tabularnewline
0.09 & 0 & 0 & 0 \tabularnewline
0.1 & 0 & 0 & 0 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=285028&T=3


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=285028&T=3
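
The meta-analysis counts, for each Type I error level from 0.01 to 0.10, the share of pairwise tests whose p-value falls below that level; with a single pair (A;B) and p-values above 0.12, every entry is 0. A minimal sketch of the same bookkeeping, assuming the data frame z from above:

# p-value of each correlation test for the single pair A;B
pvals <- sapply(c('pearson', 'spearman', 'kendall'),
                function(m) cor.test(z$A, z$B, method=m)$p.value)
alphas <- seq(0.01, 0.10, by=0.01)
# Share of significant pairs per threshold and method (only one pair here, so 0 or 1)
sig <- outer(alphas, pvals, function(a, p) as.numeric(p < a))
dimnames(sig) <- list(alphas, names(pvals))
sig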

Parameters (Session):
par1 = pearson ;
Parameters (R input):
par1 = pearson ;
R code (references can be found in the software module):
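# panel.tau writes the rounded p-value of cor.test(x, y, method=par1) into the
# lower panels of the pairs() plot; panel.hist draws a scaled histogram of each
# series on the diagonal.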
panel.tau <- function(x, y, digits=2, prefix='', cex.cor)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(0, 1, 0, 1))
rr <- cor.test(x, y, method=par1)
r <- round(rr$p.value,2)
txt <- format(c(r, 0.123456789), digits=digits)[1]
txt <- paste(prefix, txt, sep='')
if(missing(cex.cor)) cex <- 0.5/strwidth(txt)
text(0.5, 0.5, txt, cex = cex)
}
panel.hist <- function(x, ...)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(usr[1:2], 0, 1.5) )
h <- hist(x, plot = FALSE)
breaks <- h$breaks; nB <- length(breaks)
y <- h$counts; y <- y/max(y)
rect(breaks[-nB], 0, breaks[-1], y, col='grey', ...)
}
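# x, y, main and par1 are supplied by the hosting server from the submitted data
# and parameters; the table.* helpers used below presumably come from the
# 'createtable' workspace loaded a few lines down.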
x <- na.omit(x)
y <- t(na.omit(t(y)))
bitmap(file='test1.png')
pairs(t(y),diag.panel=panel.hist, upper.panel=panel.smooth, lower.panel=panel.tau, main=main)
dev.off()
load(file='createtable')
n <- length(y[,1])
n
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,paste('Correlations for all pairs of data series (method=',par1,')',sep=''),n+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,' ',header=TRUE)
for (i in 1:n) {
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
}
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
for (j in 1:n) {
r <- cor.test(y[i,],y[j,],method=par1)
a<-table.element(a,round(r$estimate,3))
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
ncorrs <- (n*n -n)/2
mycorrs <- array(0, dim=c(10,3))
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Correlations for all pairs of data series with p-values',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'pair',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
cor.test(y[1,],y[2,],method=par1)
for (i in 1:(n-1))
{
for (j in (i+1):n)
{
a<-table.row.start(a)
dum <- paste(dimnames(t(x))[[2]][i],';',dimnames(t(x))[[2]][j],sep='')
a<-table.element(a,dum,header=TRUE)
rp <- cor.test(y[i,],y[j,],method='pearson')
a<-table.element(a,round(rp$estimate,4))
rs <- cor.test(y[i,],y[j,],method='spearman')
a<-table.element(a,round(rs$estimate,4))
rk <- cor.test(y[i,],y[j,],method='kendall')
a<-table.element(a,round(rk$estimate,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-value',header=T)
a<-table.element(a,paste('(',round(rp$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rs$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rk$p.value,4),')',sep=''))
a<-table.row.end(a)
for (iii in 1:10) {
iiid100 <- iii / 100
if (rp$p.value < iiid100) mycorrs[iii, 1] = mycorrs[iii, 1] + 1
if (rs$p.value < iiid100) mycorrs[iii, 2] = mycorrs[iii, 2] + 1
if (rk$p.value < iiid100) mycorrs[iii, 3] = mycorrs[iii, 3] + 1
}
}
}
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Correlation Tests',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Number of significant by total number of Correlations',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Type I error',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
for (iii in 1:10) {
iiid100 <- iii / 100
a<-table.row.start(a)
a<-table.element(a,round(iiid100,2),header=T)
a<-table.element(a,round(mycorrs[iii,1]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,2]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,3]/ncorrs,2))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
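
The script above depends on objects supplied by the server (x, y, main, the bitmap device and the table.* helpers). As a minimal self-contained sketch of just the scatter-plot matrix, assuming the data frame z with columns A and B from the earlier sketches and a hypothetical output file pairs.png; the panel functions are lightly adapted from the module:

par1 <- 'pearson'   # as in the Parameters section above
# Lower panels: rounded p-value of the chosen correlation test
panel.tau <- function(x, y, digits=2, prefix='', cex.cor, ...) {
  old <- par(usr = c(0, 1, 0, 1)); on.exit(par(old))
  rr <- cor.test(x, y, method=par1)
  txt <- paste(prefix, format(c(round(rr$p.value, 2), 0.123456789), digits=digits)[1], sep='')
  if (missing(cex.cor)) cex.cor <- 0.5 / strwidth(txt)
  text(0.5, 0.5, txt, cex = cex.cor)
}
# Diagonal panels: histogram of each series, rescaled to the panel height
panel.hist <- function(x, ...) {
  old <- par(usr = c(par('usr')[1:2], 0, 1.5)); on.exit(par(old))
  h <- hist(x, plot = FALSE)
  rect(h$breaks[-length(h$breaks)], 0, h$breaks[-1], h$counts / max(h$counts), col='grey', ...)
}
png('pairs.png')   # local device in place of the server's bitmap()
pairs(z, diag.panel = panel.hist, upper.panel = panel.smooth,
      lower.panel = panel.tau, main = 'Kendall tau Correlation Matrix')
dev.off()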