Dependence in dimension 2 is difficult. But one has to admit that dimension 2 is much simpler than dimension 3! I recently rediscovered a nice paper, Langford, Schwertman & Owens (2001), on the transitivity of the property of being positively correlated (which inspired the odd title of this post). More recently, Castro Sotos, Vanhoof, Van Den Noortgate & Onghena (2001) conducted a study which confirmed that there are strong misconceptions about correlation (and I guess not only because probabilistic reasoning is extremely weak, as mentioned in Stock & Gross (1989)) and association (as already stated in Estepa & Batanero (1996), or in Batanero, Estepa, Godino & Green (1996)). My understanding is that it is possible to obtain almost anything… even counterintuitive results. For instance, if we want to mix independence and comonotonicity (i.e. perfect positive dependence), most of the theorems you might think of are probably incorrect. Consider the following result (based on some old examples I used in my courses 5 or 6 years ago, see e.g. here):
“If X and Y are comonotonic, and if Y and Z are comonotonic, then X and Z are comonotonic”
Well, this result seems intuitive, and probably valid. But it is not. Consider the following triplet:
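As an illustration (a sketch of one possible such triplet, my own construction rather than necessarily the one displayed in the original table), take X and Z independent Bernoulli(1/2) and set Y = X + Z:

# one possible triplet: X and Z independent Bernoulli(1/2), and Y = X + Z
# the support has four points, each with probability 1/4
triplet = data.frame(X = c(0,0,1,1), Y = c(0,1,1,2), Z = c(0,1,0,1), p = rep(1/4,4))
# two discrete variables are comonotonic iff no pair of support points (with
# positive probability) moves in opposite directions, i.e. (u_i-u_j)(v_i-v_j) >= 0
is.comonotonic = function(u, v){ all(outer(u,u,"-") * outer(v,v,"-") >= 0) }
is.comonotonic(triplet$X, triplet$Y)  # TRUE
is.comonotonic(triplet$Y, triplet$Z)  # TRUE
is.comonotonic(triplet$X, triplet$Z)  # FALSE
# X and Z are actually independent: the joint distribution factorizes
table(triplet$X, triplet$Z)/4         # each cell has probability 1/4 = 1/2 * 1/2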
Here, X and Y are comonotonic, so are Y and Z, but X and Z are independent… Weird, isn't it? Another one?
“If X and Y are comonotonic, and if Y and Z are independent, then X and Z are independent”
Again, even if it is intuitive, it is not correct… Consider for instance the following three-dimensional distribution:
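Again as a sketch (my own illustrative construction, not necessarily the distribution shown in the original table), take Y and Z independent Bernoulli(1/2) and set X = Y + Z:

# Y and Z independent Bernoulli(1/2), and X = Y + Z, each point with probability 1/4
triplet2 = data.frame(X = c(0,1,1,2), Y = c(0,0,1,1), Z = c(0,1,0,1), p = rep(1/4,4))
is.comonotonic = function(u, v){ all(outer(u,u,"-") * outer(v,v,"-") >= 0) }  # same helper as above
is.comonotonic(triplet2$X, triplet2$Y)  # TRUE
table(triplet2$Y, triplet2$Z)/4         # factorizes: Y and Z are independent
is.comonotonic(triplet2$X, triplet2$Z)  # TRUE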
In that case, X and Y are comonotonic, while Y and Z are independent, but X and Z are comonotonic (perfect positive dependence). So obviously, we should be able to construct any kind of counterexample to almost any result we might think of as intuitive.
To be honest, the problem with intuition is that it usually comes from the Gaussian case, and from the perception that dependence is related to correlation, i.e. Pearson's linear correlation. Consider the case of a three-dimensional random vector (X,Y,Z), with correlation matrix

     1         r(X,Y)    r(X,Z)
     r(X,Y)    1         r(Y,Z)
     r(X,Z)    r(Y,Z)    1
Langford, Schwertman & Owens (2001) claim (in Theorem 3) that the correlations have to satisfy a constraint of the form

r(X,Y)·r(Y,Z) − √[(1 − r(X,Y)²)(1 − r(Y,Z)²)] ≤ r(X,Z) ≤ r(X,Y)·r(Y,Z) + √[(1 − r(X,Y)²)(1 − r(Y,Z)²)]
But is that a sufficient and necessary condition? Since I am extremely lazy, let us run some numerical calculations to visualize the possible values of r(Y,Z), given r(X,Y) and r(X,Z) (by symmetry, the same analysis applies to any of the three correlations given the other two).
U = seq(-1, 1, by=.1)
V = seq(-1, 1, by=.001)
# upper bound for r(Y,Z): the largest c such that the correlation matrix
# with entries r(X,Y)=a, r(X,Z)=b, r(Y,Z)=c remains positive definite
FSUP = function(a,b){
  DF = function(c){ min(eigen(matrix(c(1,a,b,a,1,c,b,c,1),3,3))$values) }
  V[max(which(Vectorize(DF)(V)>0))]
}
# lower bound for r(Y,Z): the smallest such c
FINF = function(a,b){
  DF = function(c){ min(eigen(matrix(c(1,a,b,a,1,c,b,c,1),3,3))$values) }
  V[min(which(Vectorize(DF)(V)>0))]
}
MSUP = outer(U, U, Vectorize(FSUP))
MINF = outer(U, U, Vectorize(FINF))
library(RColorBrewer)
clr = rev(brewer.pal(6, "RdBu"))
# drop the boundary values -1 and +1 before plotting
U = U[2:20]
MSUP = MSUP[2:20, 2:20]
MINF = MINF[2:20, 2:20]
persp(U, U, MSUP, col="green", shade=TRUE)
image(U, U, MSUP, breaks=((-3):3)/3, col=clr)
persp(U, U, MINF, col="green", shade=TRUE)
image(U, U, MINF, breaks=((-3):3)/3, col=clr)

Here, we can derive the lower and the upper bound for r(Y,Z), as a function of r(X,Y), when r(X,Z) is held fixed, say at -0.7:
V = seq(-1, 1, by=.001)
U = seq(-1, 1, by=.1)
U = U[2:(length(U)-1)]
V = V[2:(length(V)-1)]
# avoid the degenerate cases r = -1 and r = +1
U = c(-.9999, U, .9999)
V = c(-.99999, V, .99999)
# bounds for r(Y,Z) as a function of a = r(X,Y), with r(X,Z) = -0.7
FSUP = function(a){
  DF = function(c){ min(eigen(matrix(c(1,a,-.7,a,1,c,-.7,c,1),3,3))$values) }
  V[max(which(Vectorize(DF)(V)>0))]
}
FINF = function(a){
  DF = function(c){ min(eigen(matrix(c(1,a,-.7,a,1,c,-.7,c,1),3,3))$values) }
  V[min(which(Vectorize(DF)(V)>0))]
}
VS = Vectorize(FSUP)(U)
VI = Vectorize(FINF)(U)
# plot the admissible band for r(Y,Z)
plot(c(U,U), c(VS,VI), col="white")
polygon(c(U,rev(U)), c(VS,rev(VI)), col="yellow", border=NA)
lines(U, VS, lwd=2, col="red")
lines(U, VI, lwd=2, col="red")
The interpretation is that if r(X,Y) is close to ±1, then r(Y,Z) is almost uniquely determined (it has to be close to ∓0.7), while if r(X,Y) is close to 0, r(Y,Z) can take values anywhere in a much wider interval, roughly (−0.71, 0.71).
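As a quick sanity check (a sketch of my own, overlaid on the previous plot), the closed-form bounds r(X,Y)·r(X,Z) ± √[(1 − r(X,Y)²)(1 − r(X,Z)²)] should coincide with the numerical red curves:

a = seq(-.99, .99, by=.01)
b = -.7
# closed-form bounds for r(Y,Z): a*b +/- sqrt((1-a^2)*(1-b^2))
upper = a*b + sqrt((1-a^2)*(1-b^2))
lower = a*b - sqrt((1-a^2)*(1-b^2))
# overlay (dashed) on the previous plot: the curves should sit on the red bounds
lines(a, upper, lty=2)
lines(a, lower, lty=2)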