KernelFunctions.jl


Kernel functions for machine learning

KernelFunctions.jl provides a flexible framework for defining kernel functions, and an extensive collection of implementations.

The aim is to make the API as model-agnostic as possible while still being user-friendly, and to interoperate well with generic packages for handling parameters like ParameterHandling.jl and FluxML's Functors.jl.

Where appropriate, kernels are AD-compatible.
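
For instance, gradients of kernel matrices with respect to kernel parameters can typically be taken with a reverse-mode AD package. Below is a minimal sketch assuming Zygote as the AD backend; the scalar loss is purely illustrative.

using KernelFunctions, Zygote

x = rand(10)

# Illustrative scalar loss: sum of the kernel matrix entries for a
# squared-exponential kernel with inverse lengthscale ℓ.
loss(ℓ) = sum(kernelmatrix(SqExponentialKernel() ∘ ScaleTransform(ℓ), x))

# Gradient of the loss with respect to ℓ via reverse-mode AD.
dℓ = only(Zygote.gradient(loss, 2.0))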

Examples

using KernelFunctions, Plots

x = range(-3.0, 3.0; length=100)

# A simple standardised squared-exponential / exponentiated-quadratic kernel.
k₁ = SqExponentialKernel()
K₁ = kernelmatrix(k₁, x)

# Set a function transformation on the data
k₂ = Matern32Kernel() ∘ FunctionTransform(sin)
K₂ = kernelmatrix(k₂, x)

# Set a matrix premultiplication on the data
k₃ = PolynomialKernel(; c=2.0, degree=2) ∘ LinearTransform(randn(4, 1))
K₃ = kernelmatrix(k₃, x)

# Scale, multiply, and add kernels
k₄ = 0.5 * SqExponentialKernel() * LinearKernel(; c=0.5) + 0.4 * k₂
K₄ = kernelmatrix(k₄, x)

plot(
    heatmap.([K₁, K₂, K₃, K₄]; yflip=true, colorbar=false)...;
    layout=(2, 2), title=["K₁" "K₂" "K₃" "K₄"],
)
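
Kernels also accept multi-dimensional inputs and pairs of input collections. The brief sketch below uses the ColVecs wrapper (columns treated as observations); the input matrices are arbitrary and purely for illustration.

# Treat the columns of a matrix as 20 (resp. 30) observations in 5 dimensions.
X = ColVecs(randn(5, 20))
Y = ColVecs(randn(5, 30))

k = SqExponentialKernel()
K   = kernelmatrix(k, X)     # 20×20 kernel matrix
Kxy = kernelmatrix(k, X, Y)  # 20×30 cross-kernel matrix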

Related Work

This package was directly inspired by the MLKernels.jl package.

See the JuliaGaussianProcesses GitHub organisation and website for more related packages.

Issues/Contributing

If you notice a problem, or would like to contribute by adding more kernel functions or features, please submit an issue or open a PR (see the ColPrac contribution guidelines).
