com.gregdennis.drej
Class Regression

java.lang.Object
  extended by com.gregdennis.drej.Regression

public final class Regression
extends java.lang.Object

Performs regularized least-squares regression, a technique also known (in the classification setting) as regularized least-squares classification (RLSC).

Author:
Greg Dennis (gdennis@mit.edu)

Method Summary
static GMatrix kernelMatrix(GMatrix data, Kernel kernel)
          Returns a kernel matrix for the specified data matrix (each column contains a data point) and the specified kernel.
static Representer solve(GMatrix data, GVector values, Kernel kernel, double lambda)
          Performs a least squares regression for the specified data matrix (one data point in each column), and returns a representer function fit to the data.
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Method Detail

solve

public static Representer solve(GMatrix data,
                                GVector values,
                                Kernel kernel,
                                double lambda)
Performs a least-squares regression for the specified data matrix (one data point in each column) and returns a representer function fit to the data. The data point in column i is assumed to have the value given by the ith element of the values vector. The specified kernel is used for the regression, and the specified lambda is the penalty factor on the complexity of the solution.

Given the kernel matrix K, the identity matrix I, and the values vector y, the returned representer function has the following vector c of coefficients:

c = (K + λI)⁻¹y


kernelMatrix

public static GMatrix kernelMatrix(GMatrix data,
                                   Kernel kernel)
Returns a kernel matrix for the specified data matrix (each column contains a data point) and the specified kernel. The element (i, j) of the returned matrix is the kernel evaluated on the data points in columns i and j of the data matrix.