# LWPR

Locally Weighted Projection Regression

Locally Weighted Projection Regression (LWPR) is an algorithm that achieves **nonlinear function approximation** in **high-dimensional spaces** with **redundant and irrelevant** input dimensions. At its core, it uses **locally linear models**, spanned by a small number of univariate regressions along selected directions in input space. A locally weighted variant of Partial Least Squares (PLS) performs the **dimensionality reduction**. This nonparametric local learning system

- learns rapidly with second-order learning methods based on **incremental training**,
- uses statistically sound stochastic cross validation to learn,
- adjusts its weighting kernels based on local information only,
- has a computational complexity that is linear in the number of inputs, and
- can deal with a large number of (possibly redundant) inputs,

as shown in evaluations with up to 50-dimensional data sets. To our knowledge, this is the first truly incremental, spatially localized learning method to combine all these properties.
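The prediction scheme sketched above can be illustrated in a few lines of NumPy. This is a conceptual sketch, not the library's code or API: each receptive field is a hypothetical tuple of a center, a Gaussian distance metric, and a local linear model, and the global prediction is the activation-weighted average of the local predictions.

```python
import numpy as np

def gaussian_weight(x, center, D):
    """Receptive-field activation w = exp(-0.5 (x-c)^T D (x-c))."""
    d = x - center
    return np.exp(-0.5 * d @ D @ d)

def lwpr_style_predict(x, fields):
    """Combine local linear predictions, weighted by receptive-field activation."""
    num, den = 0.0, 0.0
    for center, D, beta, b0 in fields:
        w = gaussian_weight(x, center, D)
        y_local = b0 + beta @ (x - center)  # local linear model around the center
        num += w * y_local
        den += w
    return num / den if den > 0 else 0.0

# two hypothetical receptive fields on a 2-D input space
fields = [
    (np.array([0.0, 0.0]), np.eye(2), np.array([1.0, 0.0]), 0.0),
    (np.array([1.0, 1.0]), np.eye(2), np.array([0.0, 1.0]), 1.0),
]
print(lwpr_style_predict(np.array([0.0, 0.0]), fields))  # → 0.0
```

In the full algorithm the local models are not fit directly in input space; each one regresses along a few PLS projection directions, which is what keeps the cost linear in the number of inputs.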

**A good reference for the algorithm:**

Sethu Vijayakumar, Aaron D'Souza and Stefan Schaal: **Incremental Online Learning in High Dimensions**. Neural Computation, vol. 17, no. 12, pp. 2602-2634 (2005).

**A good guide to practical usage:**

Stefan Klanke, Sethu Vijayakumar and Stefan Schaal: **A library for Locally Weighted Projection Regression**. Journal of Machine Learning Research (JMLR), vol. 9, pp. 623-626 (2008).

**Supplementary Documentation**

### When to use LWPR (and when not)

LWPR is particularly suited for the following regression problems:

- The function to be learnt is **non-linear**. Otherwise, having multiple local models is a waste of resources, and you should rather use ordinary linear regression, or (global) PLS in the case of high-dimensional or irrelevant inputs.
- There are **large amounts of training data**. If you desire good generalization from relatively few samples (say, fewer than 2000), you are probably better off with Gaussian Process (GP) regression. LWPR needs that much data to properly detect the local dimensionality (the number of projection directions) and the scale on which the regression function is locally linear. If the training set is small, the samples need to be presented to LWPR multiple times in random order.
- Your application requires **incremental, online** training. If you can afford to collect the data beforehand, and the time required for batch learning is not critical, LWPR loses its edge over SVM regression or (sparse) GP regression. Compared to global function approximators such as multi-layer neural networks, LWPR has the tremendous advantage that its local models learn **independently** and without interference.
- The input space is **high-dimensional**, but the data lies on low-dimensional manifolds. LWPR places local models only where they are needed, and can detect the local dimensionality through PLS, yielding robust estimates of the regression coefficients. The latter feature sets LWPR apart from earlier (but otherwise similar) algorithms such as Receptive Field Weighted Regression.
- The model may require **adaptation** because the target mapping may **change over time**. This suits LWPR very well, since a built-in forgetting factor can be tuned to match the expected time scale of such changes. Adaptation then usually happens quite fast: the overall placement of receptive fields, their sizes, and the local PLS directions of a well-trained model can often be kept, while the regression parameters are re-adjusted.
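The forgetting-factor idea in the last point can be sketched with a toy incremental regression in pure NumPy. This is not LWPR's actual update rule, just an illustration of the mechanism: sufficient statistics of a univariate linear fit are exponentially discounted by a factor `lam` before each update, so the fit tracks a drifting target function. All names here are illustrative.

```python
import numpy as np

def incremental_lr_update(stats, x, y, lam=0.999):
    """Update exponentially discounted sufficient statistics for a
    univariate linear fit y ~ a*x + b. With lam < 1, old samples fade out
    on a time scale of roughly 1/(1-lam) updates."""
    for k in ("n", "sx", "sy", "sxx", "sxy"):
        stats[k] *= lam                       # forget a little of the past
    stats["n"] += 1.0
    stats["sx"] += x
    stats["sy"] += y
    stats["sxx"] += x * x
    stats["sxy"] += x * y
    return stats

def fit(stats):
    """Recover slope and intercept from the discounted statistics."""
    n, sx, sy, sxx, sxy = (stats[k] for k in ("n", "sx", "sy", "sxx", "sxy"))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

stats = dict(n=0.0, sx=0.0, sy=0.0, sxx=0.0, sxy=0.0)
rng = np.random.default_rng(0)
# the target slope switches from 2 to -1 halfway through the stream
for t in range(4000):
    x = rng.uniform(-1.0, 1.0)
    slope = 2.0 if t < 2000 else -1.0
    incremental_lr_update(stats, x, slope * x, lam=0.995)

a, b = fit(stats)  # a tracks the recent slope of -1; the old slope has faded out
```

With `lam = 0.995` the effective memory is on the order of 200 samples, so after the switch the fit quickly converges to the new slope; LWPR applies the same discounting idea inside each receptive field.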

### Our Implementation

We have implemented the LWPR algorithm in plain ANSI C, with wrappers and bindings for C++, Matlab/Octave, and Python (using NumPy). LWPR models can be stored in platform-dependent binary files or, **optionally**, in platform-independent, human-readable XML files. The latter functionality relies on the Expat XML parser as its only dependency.

LWPR models are fully interchangeable between the different implementations: you could train a regression model in Matlab, store it to a file, and load it from a Python script or C++ program to calculate predictions. Equally, you could train a model on real robot data collected online in C/C++, and later inspect the LWPR model comfortably within Matlab.

### Documentation

The Matlab and Python implementations contain documentation in their "native" format (accessible through the help command in either environment). The C library and its C++ wrapper contain Doxygen-style documentation, which you can browse online.

### Download and Installation

The library is freely available under the terms of the LGPL (with an exception that allows for static linking).