Free Statistics

R Software Module: rwasp_pairs.wasp
Title produced by software: Kendall tau Correlation Matrix
Author: the author of this computation has been verified
Date of computation: Sun, 27 Oct 2013 09:28:29 -0400

Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2013/Oct/27/t1382880522na0r7hc2vs11po6.htm/, Retrieved Mon, 29 Apr 2024 08:05:04 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=219858, Retrieved Mon, 29 Apr 2024 08:05:04 +0000
Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 68
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Wilcoxon-Mann-Whitney Test] [Reddy-Moores Plac...] [2013-10-22 12:31:16] [34296d8f7657c52ed60d5bff9133afec]
- R     [Wilcoxon-Mann-Whitney Test] [] [2013-10-24 16:47:11] [b13e9c596812cd07dfc230f1f052ed59]
- RMP       [Kendall tau Correlation Matrix] [] [2013-10-27 13:28:29] [e9446f9705f6e3624e662c6dcffccced] [Current]
Dataseries X (two columns: pment, nopment):
70.80	66.67
69.60	66.33
69.87	64.33
67.47	64.00
67.60	63.33
67.13	61.33
66.27	64.67
66.73	63.00
68.07	60.67
67.80	63.67
64.80	60.67
64.60	61.67
64.20	62.33
64.20	60.33
63.67	59.67
61.00	60.33
59.67	59.33
59.67	58.67
59.80	58.67
60.73	59.33
59.40	57.33
58.07	59.33
57.47	56.00
70.73	53.67
72.87	58.67
66.00	49.33
66.07	71.33
66.00	70.33
66.27	69.00
64.00	66.00
63.67	66.00
63.73	63.33
63.33	65.33
63.53	64.33
63.53	64.00
62.87	61.67
59.53	63.67
62.80	64.67
60.80	61.67
59.80	62.00
56.67	61.33
57.67	63.67
58.40	61.33
55.47	62.33
56.20	59.67
71.53	59.33
68.67	61.67
65.67	58.67
66.73	58.00
67.33	56.67
66.73	59.67
66.87	58.00
65.80	57.00
64.73	57.67
65.47	58.67
63.60	55.33
64.07	56.00
64.67	55.67
63.73	53.33
62.53	53.67
61.93	51.00
62.67	47.00
62.80	4.33
61.33	70.00
62.60	68.67
59.13	67.67
61.27	66.00
59.47	65.67
57.87	65.67
59.73	63.67
61.40	63.67
58.80	64.00
58.33	62.00
57.47	62.00
57.13	61.67
55.00	61.67
51.53	63.33
72.73	61.00
73.00	62.33
70.80	60.33
70.07	60.33
71.67	60.67
71.07	57.67
70.67	58.33
70.73	58.00
70.73	57.33
68.60	56.67
69.60	58.00
66.47	55.33
67.07	55.67
68.67	54.67
66.93	56.33
65.93	55.00
68.87	55.00
66.53	54.67
65.80	54.33
66.60	49.00
66.00	48.33
65.00	49.67
66.80	43.67
65.60	6.33
66.00	3.00
65.67	66.67
64.67	67.33
65.07	65.33
64.67	66.00
65.07	65.67
65.20	66.67
64.87	65.67
63.47	65.00
62.60	64.67
64.07	66.67
63.73	63.67
64.67	63.33
61.60	63.67
61.60	63.33
60.47	63.67
61.27	63.00
63.00	61.67
61.47	61.33
60.87	60.67
61.67	60.00
62.87	61.67
62.40	61.33
59.73	58.67
60.13	60.33
58.80	59.67
59.60	59.33
58.93	59.67
60.13	61.00
58.20	61.00
58.27	60.00
58.27	60.00
55.07	58.67
53.87	58.33
52.33	58.00
47.20	56.33
37.93	54.67
72.73	55.33
70.07	54.00
70.67	52.67
72.07	44.00
68.80	65.67
68.80	65.00
67.47	66.33
66.73	64.00
66.53	62.33
66.00	61.33
67.60	63.00
66.00	63.67
66.00	62.00
66.53	61.33
65.80	64.67
64.27	62.67
64.67	64.00
64.60	61.00
64.13	60.67
65.47	59.67
62.93	60.33
63.53	56.67
62.13	56.67
63.87	54.33
64.67	51.00
63.33	51.00
63.13	47.00
62.80	68.00
62.40	65.00
62.40	64.00
62.60	64.00
61.47	64.00
62.20	62.00
63.00	61.00
61.80	60.00
59.73	60.00
60.33	62.00
60.13	60.00
59.53	59.00
59.00	61.00
55.93	60.00
41.87	60.00
36.33	58.00
71.67	58.00
71.47	60.00
70.47	58.00
69.53	59.00
70.73	56.00
69.93	54.00
68.73	51.00
67.53	47.00
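For readers who want to re-run the analysis outside the hosted R module, the two whitespace-separated columns above (named pment and nopment in the output tables below) can be read into numeric lists. A stdlib-only Python sketch on the first three rows:

```python
# Hypothetical stand-alone parser for the two-column series (first 3 rows shown);
# the names pment/nopment are taken from the module's output tables.
raw = """70.80 66.67
69.60 66.33
69.87 64.33"""

pment, nopment = [], []
for line in raw.splitlines():
    a, b = line.split()  # split on any whitespace (tabs or spaces)
    pment.append(float(a))
    nopment.append(float(b))

print(pment)    # [70.8, 69.6, 69.87]
print(nopment)  # [66.67, 66.33, 64.33]
```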




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 2 seconds
R Server: 'George Udny Yule' @ yule.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 2 seconds \tabularnewline
R Server & 'George Udny Yule' @ yule.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=219858&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]2 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'George Udny Yule' @ yule.wessa.net[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=219858&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=219858&T=0









Correlations for all pairs of data series (method=pearson)
 	pment	nopment
pment	1	-0.076
nopment	-0.076	1

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series (method=pearson) \tabularnewline
  & pment & nopment \tabularnewline
pment & 1 & -0.076 \tabularnewline
nopment & -0.076 & 1 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=219858&T=1

[TABLE]
[ROW][C]Correlations for all pairs of data series (method=pearson)[/C][/ROW]
[ROW][C] [/C][C]pment[/C][C]nopment[/C][/ROW]
[ROW][C]pment[/C][C]1[/C][C]-0.076[/C][/ROW]
[ROW][C]nopment[/C][C]-0.076[/C][C]1[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=219858&T=1
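The matrix above reports the sample Pearson correlation (r = -0.076 between pment and nopment). As an illustration of the formula only, a stdlib-only Python sketch on toy data, not the series above:

```python
import math

def pearson_r(x, y):
    # Sample Pearson correlation: covariance divided by the product of SDs.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Perfectly linear pairs give r = +1 / r = -1.
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0
print(pearson_r([1, 2, 3], [3, 2, 1]))        # -1.0
```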

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=219858&T=1









Correlations for all pairs of data series with p-values
pair	Pearson r	Spearman rho	Kendall tau
pment;nopment	-0.0757	-0.1446	-0.0736
p-value	(0.3003)	(0.0471)	(0.1377)

\begin{tabular}{lllllllll}
\hline
Correlations for all pairs of data series with p-values \tabularnewline
pair & Pearson r & Spearman rho & Kendall tau \tabularnewline
pment;nopment & -0.0757 & -0.1446 & -0.0736 \tabularnewline
p-value & (0.3003) & (0.0471) & (0.1377) \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=219858&T=2

[TABLE]
[ROW][C]Correlations for all pairs of data series with p-values[/C][/ROW]
[ROW][C]pair[/C][C]Pearson r[/C][C]Spearman rho[/C][C]Kendall tau[/C][/ROW]
[ROW][C]pment;nopment[/C][C]-0.0757[/C][C]-0.1446[/C][C]-0.0736[/C][/ROW]
[ROW][C]p-value[/C][C](0.3003)[/C][C](0.0471)[/C][C](0.1377)[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=219858&T=2
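The Kendall tau in this table comes from R's cor.test(method='kendall'), which corrects for ties; the tau-b variant is the common tie-corrected form. A stdlib-only Python sketch of tau-b via pairwise concordance counts (toy data for illustration):

```python
import math

def kendall_tau_b(x, y):
    # Tau-b: (concordant - discordant) pairs, with a tie correction in the
    # denominator (the common tie-corrected variant of Kendall's tau).
    n = len(x)
    concordant = discordant = tied_x = tied_y = 0
    for i in range(n):
        for j in range(i + 1, n):
            dx, dy = x[i] - x[j], y[i] - y[j]
            if dx == 0:
                tied_x += 1
            if dy == 0:
                tied_y += 1
            if dx * dy > 0:
                concordant += 1
            elif dx * dy < 0:
                discordant += 1
    n0 = n * (n - 1) // 2
    return (concordant - discordant) / math.sqrt((n0 - tied_x) * (n0 - tied_y))

print(kendall_tau_b([1, 2, 3, 4], [1, 2, 3, 4]))  # 1.0
print(kendall_tau_b([1, 2, 3, 4], [4, 3, 2, 1]))  # -1.0
```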

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=219858&T=2









Meta Analysis of Correlation Tests
Number of significant by total number of Correlations
Type I error	Pearson r	Spearman rho	Kendall tau
0.01	0	0	0
0.02	0	0	0
0.03	0	0	0
0.04	0	0	0
0.05	0	1	0
0.06	0	1	0
0.07	0	1	0
0.08	0	1	0
0.09	0	1	0
0.1	0	1	0

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Correlation Tests \tabularnewline
Number of significant by total number of Correlations \tabularnewline
Type I error & Pearson r & Spearman rho & Kendall tau \tabularnewline
0.01 & 0 & 0 & 0 \tabularnewline
0.02 & 0 & 0 & 0 \tabularnewline
0.03 & 0 & 0 & 0 \tabularnewline
0.04 & 0 & 0 & 0 \tabularnewline
0.05 & 0 & 1 & 0 \tabularnewline
0.06 & 0 & 1 & 0 \tabularnewline
0.07 & 0 & 1 & 0 \tabularnewline
0.08 & 0 & 1 & 0 \tabularnewline
0.09 & 0 & 1 & 0 \tabularnewline
0.1 & 0 & 1 & 0 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=219858&T=3

[TABLE]
[ROW][C]Meta Analysis of Correlation Tests[/C][/ROW]
[ROW][C]Number of significant by total number of Correlations[/C][/ROW]
[ROW][C]Type I error[/C][C]Pearson r[/C][C]Spearman rho[/C][C]Kendall tau[/C][/ROW]
[ROW][C]0.01[/C][C]0[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]0.02[/C][C]0[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]0.03[/C][C]0[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]0.04[/C][C]0[/C][C]0[/C][C]0[/C][/ROW]
[ROW][C]0.05[/C][C]0[/C][C]1[/C][C]0[/C][/ROW]
[ROW][C]0.06[/C][C]0[/C][C]1[/C][C]0[/C][/ROW]
[ROW][C]0.07[/C][C]0[/C][C]1[/C][C]0[/C][/ROW]
[ROW][C]0.08[/C][C]0[/C][C]1[/C][C]0[/C][/ROW]
[ROW][C]0.09[/C][C]0[/C][C]1[/C][C]0[/C][/ROW]
[ROW][C]0.1[/C][C]0[/C][C]1[/C][C]0[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=219858&T=3
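The meta-analysis table tallies, for each Type I error level, the fraction of pairwise correlation tests that reject. With a single pair each cell is 0 or 1. A stdlib-only Python sketch using the p-values reported in the table above:

```python
# The three p-values from the pairwise table above (Pearson, Spearman, Kendall).
p_values = {"Pearson r": 0.3003, "Spearman rho": 0.0471, "Kendall tau": 0.1377}

rows = []
for i in range(1, 11):
    alpha = i / 100  # Type I error levels 0.01, 0.02, ..., 0.10
    rows.append((alpha, tuple(int(p < alpha) for p in p_values.values())))

for alpha, counts in rows:
    print(alpha, counts)
# At alpha = 0.05 only Spearman rho rejects: (0.05, (0, 1, 0))
```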

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=219858&T=3





Parameters (Session):
par1 = pearson ;
Parameters (R input):
par1 = pearson ;
R code (references can be found in the software module):
panel.tau <- function(x, y, digits=2, prefix='', cex.cor)
{
# lower panel of the pairs plot: prints the p-value of cor.test (method = par1)
usr <- par('usr'); on.exit(par(usr))
par(usr = c(0, 1, 0, 1))
rr <- cor.test(x, y, method=par1)
p <- round(rr$p.value, digits)
txt <- format(c(p, 0.123456789), digits=digits)[1]
txt <- paste(prefix, txt, sep='')
if(missing(cex.cor)) cex.cor <- 0.5/strwidth(txt)
text(0.5, 0.5, txt, cex = cex.cor)
}
panel.hist <- function(x, ...)
{
# diagonal panel: histogram of each series, rescaled to the panel height
usr <- par('usr'); on.exit(par(usr))
usr <- par('usr'); on.exit(par(usr))
par(usr = c(usr[1:2], 0, 1.5) )
h <- hist(x, plot = FALSE)
breaks <- h$breaks; nB <- length(breaks)
y <- h$counts; y <- y/max(y)
rect(breaks[-nB], 0, breaks[-1], y, col='grey', ...)
}
bitmap(file='test1.png')
# y (data matrix, one series per row) and main are supplied by the server harness
pairs(t(y), diag.panel=panel.hist, upper.panel=panel.smooth, lower.panel=panel.tau, main=main)
dev.off()
load(file='createtable') # provides the table.start/table.row.start/table.element helpers
n <- length(y[,1]) # number of data series (y holds one series per row)
n
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,paste('Correlations for all pairs of data series (method=',par1,')',sep=''),n+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,' ',header=TRUE)
for (i in 1:n) {
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
}
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,dimnames(t(x))[[2]][i],header=TRUE)
for (j in 1:n) {
r <- cor.test(y[i,],y[j,],method=par1)
a<-table.element(a,round(r$estimate,3))
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
ncorrs <- (n*n -n)/2
mycorrs <- array(0, dim=c(10,3))
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Correlations for all pairs of data series with p-values',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'pair',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
cor.test(y[1,],y[2,],method=par1) # echoed to the raw output
for (i in 1:(n-1))
{
for (j in (i+1):n)
{
a<-table.row.start(a)
dum <- paste(dimnames(t(x))[[2]][i],';',dimnames(t(x))[[2]][j],sep='')
a<-table.element(a,dum,header=TRUE)
rp <- cor.test(y[i,],y[j,],method='pearson')
a<-table.element(a,round(rp$estimate,4))
rs <- cor.test(y[i,],y[j,],method='spearman')
a<-table.element(a,round(rs$estimate,4))
rk <- cor.test(y[i,],y[j,],method='kendall')
a<-table.element(a,round(rk$estimate,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-value',header=T)
a<-table.element(a,paste('(',round(rp$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rs$p.value,4),')',sep=''))
a<-table.element(a,paste('(',round(rk$p.value,4),')',sep=''))
a<-table.row.end(a)
for (iii in 1:10) {
iiid100 <- iii / 100
if (rp$p.value < iiid100) mycorrs[iii, 1] = mycorrs[iii, 1] + 1
if (rs$p.value < iiid100) mycorrs[iii, 2] = mycorrs[iii, 2] + 1
if (rk$p.value < iiid100) mycorrs[iii, 3] = mycorrs[iii, 3] + 1
}
}
}
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Correlation Tests',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Number of significant by total number of Correlations',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Type I error',1,TRUE)
a<-table.element(a,'Pearson r',1,TRUE)
a<-table.element(a,'Spearman rho',1,TRUE)
a<-table.element(a,'Kendall tau',1,TRUE)
a<-table.row.end(a)
for (iii in 1:10) {
iiid100 <- iii / 100
a<-table.row.start(a)
a<-table.element(a,round(iiid100,2),header=T)
a<-table.element(a,round(mycorrs[iii,1]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,2]/ncorrs,2))
a<-table.element(a,round(mycorrs[iii,3]/ncorrs,2))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')