Statistical Computations at FreeStatistics.org

Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_pairs.wasp
Title produced by software: Kendall tau Correlation Matrix
Date of computation: Wed, 16 Oct 2013 11:03:29 -0400
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2013/Oct/16/t1381935840pb5uj1gjyakbd3n.htm/, Retrieved Sat, 27 Apr 2024 21:16:10 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=216152, Retrieved Sat, 27 Apr 2024 21:16:10 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 103
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Notched Boxplots] [] [2012-10-10 08:19:09] [b98453cac15ba1066b407e146608df68]
- RMP     [Kendall tau Correlation Matrix] [Workshop 2 Vraag 5] [2013-10-16 15:03:29] [964acef360ea028b18952da197fdc0b8] [Current]
Feedback Forum

Dataseries X:
1.14	0.88	0.85	0.97	0.88	0.86
-0.27	-0.28	-0.21	0.08	0.09	0.02
1.07	0.86	0.82	0.86	0.87	0.84
0.18	0.22	0.26	-0.12	-0.16	-0.22
0.92	0.79	0.78	0.52	0.6	0.62
1.15	0.91	0.88	0.87	0.84	0.81
0.59	0.64	0.62	0.75	0.82	0.81
1.05	0.86	0.83	0.78	0.83	0.8
-0.49	-0.53	-0.47	-0.66	-0.71	-0.68
0.98	0.88	0.84	0.92	0.83	0.8
0.74	0.81	0.78	0.65	0.72	0.71
0.51	0.6	0.62	0.41	0.56	0.57
0.6	0.64	0.61	0.47	0.53	0.52
0.42	0.53	0.53	0.4	0.46	0.45
0.72	0.69	0.66	0.48	0.58	0.57
0.35	0.53	0.49	0.21	0.31	0.32
0.58	0.7	0.67	0.65	0.78	0.78
0.62	0.72	0.71	0.59	0.73	0.78
0.54	0.63	0.63	0.34	0.43	0.4
0.08	0.09	0.13	-0.05	-0.06	-0.02
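
A minimal sketch of reading this block into R follows; the file name and the column labels are assumptions (the six tab-separated columns are taken to be the series mean0, sum0, count0, mean1, sum1 and count1 that appear in the correlation tables below).

# Assumed: the 20 x 6 block above has been saved as 'dataseries_x.txt' (tab-separated, no header).
x <- read.table('dataseries_x.txt', header = FALSE, sep = '\t')
colnames(x) <- c('mean0', 'sum0', 'count0', 'mean1', 'sum1', 'count1')  # assumed labels
str(x)  # 20 observations of 6 numeric variables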




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 1 seconds
R Server: 'Sir Maurice George Kendall' @ kendall.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 1 seconds \tabularnewline
R Server & 'Sir Maurice George Kendall' @ kendall.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=216152&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]1 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Sir Maurice George Kendall' @ kendall.wessa.net[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=216152&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=216152&T=0


Correlations for all pairs of data series (method=pearson)
	mean0	sum0	count0	mean1	sum1	count1
mean0	1	0.959	0.959	0.924	0.901	0.898
sum0	0.959	1	0.999	0.913	0.927	0.93
count0	0.959	0.999	1	0.912	0.926	0.929
mean1	0.924	0.913	0.912	1	0.985	0.977
sum1	0.901	0.927	0.926	0.985	1	0.997
count1	0.898	0.93	0.929	0.977	0.997	1

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series (method=pearson) \tabularnewline
  & mean0 & sum0 & count0 & mean1 & sum1 & count1 \tabularnewline
mean0 & 1 & 0.959 & 0.959 & 0.924 & 0.901 & 0.898 \tabularnewline
sum0 & 0.959 & 1 & 0.999 & 0.913 & 0.927 & 0.93 \tabularnewline
count0 & 0.959 & 0.999 & 1 & 0.912 & 0.926 & 0.929 \tabularnewline
mean1 & 0.924 & 0.913 & 0.912 & 1 & 0.985 & 0.977 \tabularnewline
sum1 & 0.901 & 0.927 & 0.926 & 0.985 & 1 & 0.997 \tabularnewline
count1 & 0.898 & 0.93 & 0.929 & 0.977 & 0.997 & 1 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=216152&T=1

[TABLE]
[ROW][C]Correlations for all pairs of data series (method=pearson)[/C][/ROW]
[ROW][C] [/C][C]mean0[/C][C]sum0[/C][C]count0[/C][C]mean1[/C][C]sum1[/C][C]count1[/C][/ROW]
[ROW][C]mean0[/C][C]1[/C][C]0.959[/C][C]0.959[/C][C]0.924[/C][C]0.901[/C][C]0.898[/C][/ROW]
[ROW][C]sum0[/C][C]0.959[/C][C]1[/C][C]0.999[/C][C]0.913[/C][C]0.927[/C][C]0.93[/C][/ROW]
[ROW][C]count0[/C][C]0.959[/C][C]0.999[/C][C]1[/C][C]0.912[/C][C]0.926[/C][C]0.929[/C][/ROW]
[ROW][C]mean1[/C][C]0.924[/C][C]0.913[/C][C]0.912[/C][C]1[/C][C]0.985[/C][C]0.977[/C][/ROW]
[ROW][C]sum1[/C][C]0.901[/C][C]0.927[/C][C]0.926[/C][C]0.985[/C][C]1[/C][C]0.997[/C][/ROW]
[ROW][C]count1[/C][C]0.898[/C][C]0.93[/C][C]0.929[/C][C]0.977[/C][C]0.997[/C][C]1[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=216152&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=216152&T=1
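
The rounded matrix above can also be reproduced with base R's cor(); a sketch, assuming the data frame x built in the reading sketch near the top of this page. Swapping method = 'pearson' for 'spearman' or 'kendall' yields the corresponding rank-based matrices.

round(cor(x, method = 'pearson'), 3)  # 6 x 6 matrix matching the table above
round(cor(x, method = 'kendall'), 3)  # Kendall tau alternative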


Correlations for all pairs of data series with p-values
pair	Pearson r	Spearman rho	Kendall tau
mean0;sum0	0.9592	0.9789	0.9149
p-value	(0)	(0)	(0)
mean0;count0	0.9593	0.9646	0.8783
p-value	(0)	(0)	(0)
mean0;mean1	0.9243	0.9214	0.7757
p-value	(0)	(0)	(0)
mean0;sum1	0.901	0.9154	0.7652
p-value	(0)	(0)	(0)
mean0;count1	0.8977	0.887	0.7447
p-value	(0)	(0)	(0)
sum0;count0	0.9991	0.9883	0.9519
p-value	(0)	(0)	(0)
sum0;mean1	0.9133	0.9484	0.8267
p-value	(0)	(0)	(0)
sum0;sum1	0.9271	0.9318	0.784
p-value	(0)	(0)	(0)
sum0;count1	0.9297	0.8986	0.7634
p-value	(0)	(0)	(0)
count0;mean1	0.9123	0.927	0.7905
p-value	(0)	(0)	(0)
count0;sum1	0.9264	0.9183	0.7692
p-value	(0)	(0)	(0)
count0;count1	0.9286	0.8824	0.7487
p-value	(0)	(0)	(0)
mean1;sum1	0.9847	0.988	0.9418
p-value	(0)	(0)	(0)
mean1;count1	0.9768	0.9766	0.912
p-value	(0)	(0)	(0)
sum1;count1	0.9974	0.9928	0.9707
p-value	(0)	(0)	(0)

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series with p-values \tabularnewline
pair & Pearson r & Spearman rho & Kendall tau \tabularnewline
mean0;sum0 & 0.9592 & 0.9789 & 0.9149 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
mean0;count0 & 0.9593 & 0.9646 & 0.8783 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
mean0;mean1 & 0.9243 & 0.9214 & 0.7757 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
mean0;sum1 & 0.901 & 0.9154 & 0.7652 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
mean0;count1 & 0.8977 & 0.887 & 0.7447 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
sum0;count0 & 0.9991 & 0.9883 & 0.9519 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
sum0;mean1 & 0.9133 & 0.9484 & 0.8267 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
sum0;sum1 & 0.9271 & 0.9318 & 0.784 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
sum0;count1 & 0.9297 & 0.8986 & 0.7634 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
count0;mean1 & 0.9123 & 0.927 & 0.7905 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
count0;sum1 & 0.9264 & 0.9183 & 0.7692 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
count0;count1 & 0.9286 & 0.8824 & 0.7487 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
mean1;sum1 & 0.9847 & 0.988 & 0.9418 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
mean1;count1 & 0.9768 & 0.9766 & 0.912 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
sum1;count1 & 0.9974 & 0.9928 & 0.9707 \tabularnewline
p-value & (0) & (0) & (0) \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=216152&T=2

[TABLE]
[ROW][C]Correlations for all pairs of data series with p-values[/C][/ROW]
[ROW][C]pair[/C][C]Pearson r[/C][C]Spearman rho[/C][C]Kendall tau[/C][/ROW]
[ROW][C]mean0;sum0[/C][C]0.9592[/C][C]0.9789[/C][C]0.9149[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]mean0;count0[/C][C]0.9593[/C][C]0.9646[/C][C]0.8783[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]mean0;mean1[/C][C]0.9243[/C][C]0.9214[/C][C]0.7757[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]mean0;sum1[/C][C]0.901[/C][C]0.9154[/C][C]0.7652[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]mean0;count1[/C][C]0.8977[/C][C]0.887[/C][C]0.7447[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]sum0;count0[/C][C]0.9991[/C][C]0.9883[/C][C]0.9519[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]sum0;mean1[/C][C]0.9133[/C][C]0.9484[/C][C]0.8267[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]sum0;sum1[/C][C]0.9271[/C][C]0.9318[/C][C]0.784[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]sum0;count1[/C][C]0.9297[/C][C]0.8986[/C][C]0.7634[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]count0;mean1[/C][C]0.9123[/C][C]0.927[/C][C]0.7905[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]count0;sum1[/C][C]0.9264[/C][C]0.9183[/C][C]0.7692[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]count0;count1[/C][C]0.9286[/C][C]0.8824[/C][C]0.7487[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]mean1;sum1[/C][C]0.9847[/C][C]0.988[/C][C]0.9418[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]mean1;count1[/C][C]0.9768[/C][C]0.9766[/C][C]0.912[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[ROW][C]sum1;count1[/C][C]0.9974[/C][C]0.9928[/C][C]0.9707[/C][/ROW]
[ROW][C]p-value[/C][C](0)[/C][C](0)[/C][C](0)[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=216152&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=216152&T=2
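
Each pair of rows above comes from one cor.test call per method (estimate first, p-value in parentheses). A sketch for a single pair, again assuming the data frame x defined earlier; exact = FALSE is added only to suppress the tie warnings of the exact rank tests and does not change the estimates.

rp <- cor.test(x$mean0, x$sum0, method = 'pearson')
rs <- cor.test(x$mean0, x$sum0, method = 'spearman', exact = FALSE)
rk <- cor.test(x$mean0, x$sum0, method = 'kendall', exact = FALSE)
round(c(rp$estimate, rs$estimate, rk$estimate), 4)  # should match the mean0;sum0 row above
round(c(rp$p.value, rs$p.value, rk$p.value), 4)     # reported as (0) above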


Meta Analysis of Correlation Tests
Number of significant by total number of Correlations
Type I error	Pearson r	Spearman rho	Kendall tau
0.01	1	1	1
0.02	1	1	1
0.03	1	1	1
0.04	1	1	1
0.05	1	1	1
0.06	1	1	1
0.07	1	1	1
0.08	1	1	1
0.09	1	1	1
0.1	1	1	1

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Correlation Tests \tabularnewline
Number of significant by total number of Correlations \tabularnewline
Type I error & Pearson r & Spearman rho & Kendall tau \tabularnewline
0.01 & 1 & 1 & 1 \tabularnewline
0.02 & 1 & 1 & 1 \tabularnewline
0.03 & 1 & 1 & 1 \tabularnewline
0.04 & 1 & 1 & 1 \tabularnewline
0.05 & 1 & 1 & 1 \tabularnewline
0.06 & 1 & 1 & 1 \tabularnewline
0.07 & 1 & 1 & 1 \tabularnewline
0.08 & 1 & 1 & 1 \tabularnewline
0.09 & 1 & 1 & 1 \tabularnewline
0.1 & 1 & 1 & 1 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=216152&T=3

[TABLE]
[ROW][C]Meta Analysis of Correlation Tests[/C][/ROW]
[ROW][C]Number of significant by total number of Correlations[/C][/ROW]
[ROW][C]Type I error[/C][C]Pearson r[/C][C]Spearman rho[/C][C]Kendall tau[/C][/ROW]
[ROW][C]0.01[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.02[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.03[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.04[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.05[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.06[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.07[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.08[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.09[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[ROW][C]0.1[/C][C]1[/C][C]1[/C][C]1[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=216152&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=216152&T=3
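
The table reports, for each nominal Type I error level from 0.01 to 0.10, the fraction of the 15 pairwise tests whose p-value falls below that level; with every p-value effectively zero here, every fraction is 1. A compact sketch of the same tally, assuming a hypothetical 15 x 3 matrix pvals of p-values (rows = pairs; columns = Pearson, Spearman, Kendall):

alpha <- seq(0.01, 0.10, by = 0.01)
# fraction of pairwise tests significant at each Type I error level, per method
signif_frac <- t(sapply(alpha, function(a) colMeans(pvals < a)))
rownames(signif_frac) <- alpha
round(signif_frac, 2)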


Parameters (Session):
par1 = pearson ;
Parameters (R input):
par1 = pearson ;
R code (references can be found in the software module):
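# Panel function for the lower triangle of the pairs plot: runs cor.test with the
# user-selected method (par1) and prints the rounded p-value in the panel.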
panel.tau <- function(x, y, digits=2, prefix='', cex.cor)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(0, 1, 0, 1))
rr <- cor.test(x, y, method=par1)
r <- round(rr$p.value,2)
txt <- format(c(r, 0.123456789), digits=digits)[1]
txt <- paste(prefix, txt, sep='')
if(missing(cex.cor)) cex <- 0.5/strwidth(txt)
text(0.5, 0.5, txt, cex = cex)
}
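# Panel function for the diagonal: draws a rescaled histogram of each series.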
panel.hist <- function(x, ...)
{
usr <- par('usr'); on.exit(par(usr))
par(usr = c(usr[1:2], 0, 1.5) )
h <- hist(x, plot = FALSE)
breaks <- h$breaks; nB <- length(breaks)
y <- h$counts; y <- y/max(y)
rect(breaks[-nB], 0, breaks[-1], y, col='grey', ...)
}
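# Write the scatterplot matrix to test1.png: histograms on the diagonal, smoothed
# scatterplots in the upper panels, p-values (from panel.tau) in the lower panels.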
bitmap(file='test1.png')
pairs(t(y),diag.panel=panel.hist, upper.panel=panel.smooth, lower.panel=panel.tau, main=main)
dev.off()
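# 'createtable' restores the table-building helpers (table.start, table.row.start,
# table.element, table.row.end, table.end, table.save) used below; y holds the data
# series as rows, so n is the number of series.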
load(file='createtable')
n <- length(y[,1])
n
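# Build the n x n correlation matrix table (method = par1), one cor.test per pair of series.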
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,paste('Correlations for all pairs of data series (method=',par1,')',sep=''),n+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,' ',header=TRUE)
for (i in 1:n) {
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
}
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
for (j in 1:n) {
r <- cor.test(y[i,],y[j,],method=par1)
a<-table.element(a,round(r$estimate,3))
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
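# Build the table of all pairwise correlations with p-values, computing the
# Pearson, Spearman and Kendall estimates for each of the ncorrs pairs.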
ncorrs <- (n*n -n)/2
mycorrs <- array(0, dim=c(10,3))
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Correlations for all pairs of data series with p-values',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'pair',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
cor.test(y[1,],y[2,],method=par1)
for (i in 1:(n-1))
{
for (j in (i+1):n)
{
a<-table.row.start(a)
dum <- paste(dimnames(t(x))[[2]][i],';',dimnames(t(x))[[2]][j],sep='')
a<-table.element(a,dum,header=TRUE)
rp <- cor.test(y[i,],y[j,],method='pearson')
a<-table.element(a,round(rp$estimate,4))
rs <- cor.test(y[i,],y[j,],method='spearman')
a<-table.element(a,round(rs$estimate,4))
rk <- cor.test(y[i,],y[j,],method='kendall')
a<-table.element(a,round(rk$estimate,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-value',header=T)
a<-table.element(a,paste('(',round(rp$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rs$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rk$p.value,4),')',sep=''))
a<-table.row.end(a)
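# Tally how many pairwise tests are significant at each Type I error level 0.01, 0.02, ..., 0.10.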
for (iii in 1:10) {
iiid100 <- iii / 100
if (rp$p.value < iiid100) mycorrs[iii, 1] = mycorrs[iii, 1] + 1
if (rs$p.value < iiid100) mycorrs[iii, 2] = mycorrs[iii, 2] + 1
if (rk$p.value < iiid100) mycorrs[iii, 3] = mycorrs[iii, 3] + 1
}
}
}
a<-table.end(a)
table.save(a,file='mytable1.tab')
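# Build the meta-analysis table: the fraction of significant correlations
# (out of ncorrs pairs) at each Type I error level, for each correlation method.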
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Correlation Tests',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Number of significant by total number of Correlations',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Type I error',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
for (iii in 1:10) {
iiid100 <- iii / 100
a<-table.row.start(a)
a<-table.element(a,round(iiid100,2),header=T)
a<-table.element(a,round(mycorrs[iii,1]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,2]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,3]/ncorrs,2))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')