Wednesday, October 24, 2012

Nonnegative Matrix Factorization and Recommender Systems

Albert Au Yeung provides a very nice tutorial on non-negative matrix factorization, along with an implementation in Python. What follows is based loosely on his approach. Suppose we have the following matrix of five users' ratings on four movies (the same values used in the R code below):

           movie1  movie2  movie3  movie4
user1         5       4       1       1
user2         4       5       1       1
user3         1       1       5       5
user4         1       1       4       5
user5         1       1       5       4

If we use the information above to form a matrix R, it can be decomposed into two matrices W and H such that R ~ WH,

where R is the n x p matrix of users and ratings
            W is the n x r user feature matrix
            H is the r x p movie feature matrix

Similar to principal components analysis, the columns of W can be interpreted as latent user features, while the columns of H’ can be interpreted as latent movie features. This factorization allows us to classify or cluster user types and movie types based on these latent factors.
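As a quick numeric illustration of these dimensions, here is a made-up example in base R; the W and H values below are invented to show the structure, not output from any fitting routine:

```r
# Hypothetical W (5 x 2) and H (2 x 4); values are illustrative only.
# Users 1-2 load on feature 1, users 3-5 on feature 2.
W <- matrix(c(0.9, 0.8, 0.1, 0.1, 0.1,    # feature 1 loadings
              0.1, 0.1, 0.9, 0.8, 0.9),   # feature 2 loadings
            nrow = 5, ncol = 2)
# Movies 1-2 belong to feature 1, movies 3-4 to feature 2.
H <- matrix(c(5, 1,
              5, 1,
              1, 5,
              1, 5), nrow = 2, ncol = 4)
R.hat <- W %*% H   # (n x r) %*% (r x p) gives the n x p ratings matrix
dim(R.hat)         # 5 4
round(R.hat, 1)    # reproduces the high/low block structure of the ratings
```

Multiplying the two low-rank factors back together reproduces the block structure of the original ratings, which is exactly what nmf() estimates from data.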

For example, using the nmf function from the NMF package in R, we can decompose the matrix R above and examine the column vectors of H’ (t(h) in the code below).

We can see that the first column vector ‘loads’ heavily on the ‘military’ movies, while the second ‘loads’ more heavily on the ‘western’-themed movies. These vectors form a ‘feature space’ for movie types. Each movie can be visualized in this space as a member of the cluster associated with its respective latent feature.

If a new user gives a high recommendation to a movie belonging to one of the clusters created by the matrix factorization, other movies belonging to the same cluster can be recommended.
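That recommendation step can be sketched in base R: assign each movie to the latent feature on which it loads most heavily, then recommend other movies from the same cluster as a highly rated one. The H loadings and movie names below are illustrative stand-ins, not output from nmf():

```r
# Illustrative H: movies 1-2 load on feature 1, movies 3-4 on feature 2
H <- matrix(c(4.8, 0.2,
              4.5, 0.3,
              0.1, 4.9,
              0.2, 4.6), nrow = 2, ncol = 4)
movies  <- c("movie1", "movie2", "movie3", "movie4")
cluster <- apply(H, 2, which.max)   # feature with the largest loading per movie
names(cluster) <- movies

# a new user rates movie1 highly; recommend other movies in its cluster
liked <- "movie1"
recs  <- movies[cluster == cluster[liked] & movies != liked]
recs   # "movie2"
```

This is the simplest possible use of the factorization; it recommends within a cluster but does not predict numeric ratings.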


References:

Yehuda Koren (Yahoo Research), Robert Bell and Chris Volinsky (AT&T Labs—Research), "Matrix Factorization Techniques for Recommender Systems," IEEE Computer Society, 2009.

Albert Au Yeung, "Matrix Factorisation: A Simple Tutorial and Implementation in Python."

R Code:

#   ------------------------------------------------------------------
#  | PROGRAM NAME: R nmf example
#  | DATE: 10/20/12
#  | PROJECT FILE: /Users/wkuuser/Desktop/Briefcase/R Programs
#  |----------------------------------------------------------------
#  | PURPOSE: very basic example of a recommender system based on
#  | non-negative matrix factorization
#  |
#  |------------------------------------------------------------------

library(NMF) # provides nmf(), basis(), coef(), fitted()

# R ~ WH
# R is an n x p matrix of users and ratings
# W = n x r user feature matrix
# H = r x p movie feature matrix

# ratings for 5 users on 4 movies
x1 <- c(5,4,1,1)
x2 <- c(4,5,1,1)
x3 <- c(1,1,5,5)
x4 <- c(1,1,4,5)
x5 <- c(1,1,5,4)
R <- as.matrix(rbind(x1,x2,x3,x4,x5)) # n = 5 rows, p = 4 columns

res <- nmf(R, 4, "lee") # Lee & Seung multiplicative-update method
V.hat <- fitted(res)
print(V.hat) # estimated target matrix

w <- basis(res) # W, the user feature matrix
dim(w) # n x r (n = 5, r = 4)
h <- coef(res) # H, the movie feature matrix
dim(h) # r x p (r = 4, p = 4)

# recommender system via clustering based on the vectors in H
movies <- data.frame(t(h))
features <- cbind(movies$X1, movies$X2)
plot(features) # plot each movie in the space of the first two latent features
title("Movie Feature Plot")


  1. Hi,
    I think that package 'NMF' is no longer on CRAN, but there is 'NMFN'.

    So the following would be more or less the same:

    > library(NMFN)
    > res <- nnmf(R, 4)
    > w<- res$W
    > h<- res$H

    1. I think you are correct. I was somehow able to download it on my Mac and use it, and it still seems to work; I'm not sure why. I also found some documentation and examples using it. It's good to know there is another package out there, though. What I'd really like to do when I get time is implement this in R (or SAS/IML) the way Albert Au Yeung did in Python, just to get my hands a little dirtier. Not sure when I will have time, though. Thanks for your comment!

    2. Hi Matt,

      I programmed the same thing for a lab session for my students, but does this R package handle missing values in the matrix you feed the NMF? Sparsity is the typical situation for recommendation systems, and we can't just fill the holes with 0s, which NMF would interpret as "bullshit film". There are Python packages that handle this, but I couldn't get hold of an R package.
      I also use it on a Mac.

      Thanks for any clue!

    3. I mean, with the factorization of your matrix you can identify the latent structure and perform the "understanding" task on the data set, but I guess not the prediction task - is that correct?

  2. Marc, you are correct. It's really about interpretation, and it does not seem to handle missing data. That is one reason I want to recode Yeung's Python in R or SAS IML if I can find the time, because if you read his example that I link to, it appears to do the things we are both really interested in.
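As a follow-up to that plan, here is a rough base-R sketch of the kind of factorization Yeung's Python tutorial performs: regularized gradient descent over only the observed cells, so missing ratings (NA) are skipped during fitting and filled in by the reconstruction. The function name, learning rate, and iteration count are illustrative choices, and, like Yeung's version, it does not strictly enforce nonnegativity:

```r
# Sketch: factorize a ratings matrix with missing entries (NA) by
# regularized gradient descent on the observed cells only, in the
# spirit of Yeung's Python tutorial. Names and settings are illustrative.
nmf_missing <- function(R, r = 2, steps = 5000, alpha = 0.0002, beta = 0.02) {
  set.seed(1)                               # reproducible random start
  W <- matrix(runif(nrow(R) * r), nrow(R), r)
  H <- matrix(runif(r * ncol(R)), r, ncol(R))
  obs <- which(!is.na(R), arr.ind = TRUE)   # indices of observed ratings
  for (step in 1:steps) {
    for (k in seq_len(nrow(obs))) {
      i <- obs[k, 1]; j <- obs[k, 2]
      e <- R[i, j] - sum(W[i, ] * H[, j])   # error on this observed cell
      W[i, ] <- W[i, ] + alpha * (2 * e * H[, j] - beta * W[i, ])
      H[, j] <- H[, j] + alpha * (2 * e * W[i, ] - beta * H[, j])
    }
  }
  list(W = W, H = H, R.hat = W %*% H)
}

R <- rbind(c(5, 4, 1, 1),
           c(4, 5, 1, NA),   # one missing rating to be predicted
           c(1, 1, 5, 5),
           c(1, 1, 4, 5),
           c(1, 1, 5, 4))
fit <- nmf_missing(R)
round(fit$R.hat, 1)   # the (2, 4) cell now holds a predicted rating
```

The NA cell never enters the error term, so it does not drag the fit toward zero; its value in R.hat is the model's prediction from the latent structure of the observed ratings.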