


Kernel functions for machine learning

KernelFunctions.jl is a general purpose kernel package. It provides a flexible framework for creating kernel functions and manipulating them, and an extensive collection of implementations. The main goals of this package are:

  • Flexibility: composing and combining kernels should be fluid and easy, without anything breaking, via a user-friendly API.
  • Plug-and-play: the package is model-agnostic; slotting kernels in before or after other pipeline steps should be straightforward, and the package should interoperate well with generic parameter-handling packages such as ParameterHandling.jl and FluxML's Functors.jl.
  • Automatic Differentiation compatibility: all kernel functions which ought to be differentiable using AD packages like ForwardDiff.jl or Zygote.jl should be.
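As a brief sketch of the AD goal above (an illustrative example, not taken from the package docs), a kernel's hyperparameters can be differentiated through directly. Here ForwardDiff.jl differentiates a lengthscale-parametrised squared-exponential kernel with respect to its lengthscale; the particular input values are arbitrary:

```julia
using KernelFunctions
using ForwardDiff

# Derivative of k(1.0, 0.5) with respect to the lengthscale l, at l = 1.0.
# with_lengthscale(k, l) rescales the kernel's inputs by 1/l.
dk_dl = ForwardDiff.derivative(1.0) do l
    k = with_lengthscale(SqExponentialKernel(), l)
    k(1.0, 0.5)
end
```

The same closure works with reverse-mode packages such as Zygote.jl, which is what makes kernels usable inside larger differentiable models.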


```julia
using KernelFunctions
using Plots  # for the heatmap visualisation below

x = range(-3.0, 3.0; length=100)

# A simple standardised squared-exponential / exponentiated-quadratic kernel.
k₁ = SqExponentialKernel()
K₁ = kernelmatrix(k₁, x)

# Set a function transformation on the data.
k₂ = Matern32Kernel() ∘ FunctionTransform(sin)
K₂ = kernelmatrix(k₂, x)

# Set a matrix premultiplication on the data.
k₃ = PolynomialKernel(; c=2.0, degree=2) ∘ LinearTransform(randn(4, 1))
K₃ = kernelmatrix(k₃, x)

# Sum and scale kernels.
k₄ = 0.5 * SqExponentialKernel() * LinearKernel(; c=0.5) + 0.4 * k₂
K₄ = kernelmatrix(k₄, x)

plot(
    heatmap.([K₁, K₂, K₃, K₄]; yflip=true, colorbar=false)...;
    layout=(2, 2), title=["K₁" "K₂" "K₃" "K₄"],
)
```
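Besides square kernel matrices over a single input collection, `kernelmatrix` also accepts two collections to produce a cross-covariance matrix, and `kernelmatrix_diag` computes just the diagonal without forming the full matrix. A short illustration with arbitrary inputs:

```julia
using KernelFunctions

x = range(-3.0, 3.0; length=100)
y = range(0.0, 1.0; length=50)

k = SqExponentialKernel()

# Cross-covariance between two input collections: a 100×50 matrix.
Kxy = kernelmatrix(k, x, y)

# Diagonal of kernelmatrix(k, x); for this kernel every entry is k(xᵢ, xᵢ) = 1.
Kd = kernelmatrix_diag(k, x)
```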

Related Work

This package replaces the now-defunct MLKernels.jl. It incorporates lots of excellent existing work from packages such as GaussianProcesses.jl, and is used in downstream packages such as AbstractGPs.jl, ApproximateGPs.jl, Stheno.jl, and AugmentedGaussianProcesses.jl.

See the JuliaGaussianProcesses Github organisation and website for more information.


If you notice a problem, or would like to contribute by adding more kernel functions or features, please submit an issue or open a PR (see the ColPrac contribution guidelines).