Binary heatmap in R still displays gradient

I am trying to build a heat map for a binary data matrix (11 x ~ 1500) in R.

 heatmap(y, col = hmcols) 

the matrix "y" looks like this:

       [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10] [,11] [,12] [,13]
  [1,]    0    0    0    0    1    1    1    1    1     1     1     1     1
  [2,]    0    0    1    0    0    1    0    0    0     0     0     0     1
  [3,]    0    0    0    0    0    1    1    1    1     0     0     1     1
 ...etc...

I use the default distance and clustering functions, but for some reason my heatmap displays a color gradient. I also tried the binary distance function, but a similar gradient appears. Is the apparent lack of similarity between samples due to the distances between them? Here is an image of the heat map:

https://www.dropbox.com/s/jz1r41lhnrkisvz/Rplots.pdf

I suspect this is because I don't understand how the default distance and clustering functions reorder the data. How can I interpret these results?

1 answer

As @Joran points out, the scale argument is the one to use: heatmap() defaults to scale = "row", which standardizes each row before plotting and thereby turns 0/1 values into a range of intermediate values, rendered as a gradient. Set scale = "none" to plot the raw values.
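To see why the default produces a gradient, here is a minimal sketch (the row values are made up for illustration) of what row scaling does to a binary row:

```r
# heatmap() defaults to scale = "row": each row is centered and divided
# by its standard deviation before colors are assigned. Applied to a 0/1
# row this yields intermediate values, which the palette renders as a
# gradient rather than two flat colors.
row <- c(0, 0, 0, 1, 1, 1, 1, 1)   # a made-up binary row
scaled <- as.vector(scale(row))    # the values heatmap() would color by default
print(scaled)                      # no longer just 0 and 1
```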

(Note: I reduced the dimensions and generated random data, since you did not provide your complete data set.)

Colors are chosen by col ; if you want plain black and white, use col = c("black", "white") , but you can also make things more interesting:

 x <- matrix(sample(c(0, 1), 15 * 15, replace = TRUE), nrow = 15)
 heatmap(x, scale = "none", Rowv = NA, Colv = NA,
         col = cm.colors(2), main = "HeatMap Example")
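Since the question also mentions the binary distance, here is a sketch (again with random stand-in data) of a black-and-white version that keeps the clustering but drives it with dist(..., method = "binary"):

```r
# Black-and-white binary heatmap with binary-distance clustering.
# scale = "none" keeps the raw 0/1 values, so only two colors appear;
# distfun supplies the binary (Jaccard-style) distance for the
# row/column dendrograms instead of the default Euclidean distance.
set.seed(1)  # arbitrary seed so the random stand-in data is reproducible
x <- matrix(sample(c(0, 1), 15 * 15, replace = TRUE), nrow = 15)
heatmap(x,
        scale = "none",
        distfun = function(m) dist(m, method = "binary"),
        col = c("white", "black"),
        main = "Binary HeatMap")
```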



Source: https://habr.com/ru/post/953808/

