Radial basis function
From Wikipedia, the free encyclopedia
A radial basis function (RBF) is a real-valued function whose value depends only on the distance from the origin, so that φ(x) = φ(‖x‖); or alternatively on the distance from some other point c, called a center, so that φ(x, c) = φ(‖x − c‖). Any function φ that satisfies the property φ(x) = φ(‖x‖) is a radial function. The norm is usually Euclidean distance, although other distance functions are also possible. For example, by using the Lukaszyk-Karmowski metric, it is possible for some radial functions to avoid problems with ill conditioning of the matrix solved to determine the coefficients w_i (see below), since this metric is always greater than zero.
Sums of radial basis functions are typically used to approximate given functions. This approximation process can also be interpreted as a simple kind of neural network.
Contents
1 RBF types
2 Approximation
3 RBF network
4 References
RBF types

Commonly used types of radial basis functions include (writing r = ‖x − c_i‖):

Gaussian: φ(r) = exp(−(εr)²)
Multiquadric: φ(r) = √(1 + (εr)²)
Inverse quadratic: φ(r) = 1 / (1 + (εr)²)
Inverse multiquadric: φ(r) = 1 / √(1 + (εr)²)
Polyharmonic spline: φ(r) = r^k for k = 1, 3, 5, …; φ(r) = r^k ln(r) for k = 2, 4, 6, …
Thin plate spline (a special polyharmonic spline): φ(r) = r² ln(r)
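As a concrete sketch, a few of these kernels can be written in NumPy. The function names, the shape parameter eps, and the handling of r = 0 in the thin plate spline are illustrative choices, not prescribed by the article:

```python
import numpy as np

# A sketch of a few common RBF kernels phi(r); eps is the shape
# parameter (illustrative default of 1.0).
def gaussian(r, eps=1.0):
    return np.exp(-(eps * r) ** 2)

def multiquadric(r, eps=1.0):
    return np.sqrt(1.0 + (eps * r) ** 2)

def inverse_quadratic(r, eps=1.0):
    return 1.0 / (1.0 + (eps * r) ** 2)

def thin_plate_spline(r):
    # phi(r) = r^2 * ln(r); the limit at r = 0 is 0, so define it as 0 there
    r = np.asarray(r, dtype=float)
    safe = np.where(r > 0.0, r, 1.0)  # avoid log(0) warnings
    return np.where(r > 0.0, r**2 * np.log(safe), 0.0)
```

All four accept scalars or NumPy arrays of distances r ≥ 0.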
Approximation

Radial basis functions are typically used to build up function approximations of the form

y(x) = Σ_{i=1}^{N} w_i φ(‖x − c_i‖),
where the approximating function y(x) is represented as a sum of N radial basis functions, each associated with a different center c_i and weighted by an appropriate coefficient w_i. The weights w_i can be estimated using the matrix methods of linear least squares, because the approximating function is linear in the weights. Approximation schemes of this kind have been particularly used in time series prediction and control of nonlinear systems exhibiting sufficiently simple chaotic behaviour, and in 3D reconstruction in computer graphics (for example, hierarchical RBF).
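A minimal sketch of this least-squares fit in NumPy, assuming one input dimension, a Gaussian basis, and an illustrative target function (sin); none of these specific choices come from the article:

```python
import numpy as np

def design_matrix(x, centers, eps=1.0):
    # Phi[j, i] = phi(|x_j - c_i|) with a Gaussian basis
    # (basis choice and eps are illustrative assumptions).
    return np.exp(-(eps * np.abs(x[:, None] - centers[None, :])) ** 2)

def fit_rbf_weights(x, y, centers, eps=1.0):
    # Linear least squares: the model is linear in the weights w_i.
    Phi = design_matrix(x, centers, eps)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

# Usage: approximate sin(x) on [0, 2*pi] with 10 centers
x = np.linspace(0.0, 2.0 * np.pi, 50)
y = np.sin(x)
centers = np.linspace(0.0, 2.0 * np.pi, 10)
w = fit_rbf_weights(x, y, centers)
max_err = np.max(np.abs(design_matrix(x, centers) @ w - y))
```

Because the model is linear in w_i, the fit reduces to a single overdetermined linear system rather than a nonlinear optimization.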
RBF network
See also: radial basis function network
Figure: Two unnormalized Gaussian radial basis functions in one input dimension. The basis function centers are located at c_1 = 0.75 and c_2 = 3.25.
The sum can also be interpreted as a rather simple single-layer type of artificial neural network called a radial basis function network, with the radial basis functions taking on the role of the activation functions of the network. It can be shown that any continuous function on a compact interval can in principle be interpolated with arbitrary accuracy by a sum of this form, if a sufficiently large number N of radial basis functions is used. The approximant y(x) is differentiable with respect to the weights w_i. The weights could thus be learned using any of the standard iterative methods for neural networks.
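Since y(x) is differentiable (indeed linear) in the weights, plain gradient descent on the squared error is one such iterative method. A sketch under assumed settings (Gaussian basis, target x², step size tied to the spectral norm of the design matrix; all illustrative):

```python
import numpy as np

# Gradient descent on the squared error L(w) = ||Phi @ w - t||^2,
# whose gradient is 2 * Phi.T @ (Phi @ w - t). All settings here
# (target, centers, eps, step size) are illustrative assumptions.
x = np.linspace(-1.0, 1.0, 40)
t = x**2                                   # target values
centers = np.linspace(-1.0, 1.0, 8)
eps = 2.0
Phi = np.exp(-(eps * (x[:, None] - centers[None, :])) ** 2)

w = np.zeros(len(centers))
lr = 0.5 / np.linalg.norm(Phi.T @ Phi, 2)  # step size below the stability limit
for _ in range(5000):
    w -= lr * 2.0 * Phi.T @ (Phi @ w - t)

mse = np.mean((Phi @ w - t) ** 2)
```

For this linear-in-the-weights model, gradient descent and linear least squares target the same minimum; iterative training mainly matters when the network is extended with nonlinear parameters such as trainable centers.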
References

^ Lukaszyk, S. (2004). A new concept of probability metric and its applications in approximation of scattered data sets. Computational Mechanics, 33, 299-304.
Buhmann, Martin D. (2003). Radial Basis Functions: Theory and Implementations. Cambridge University Press. ISBN 978-0-521-63338-3.