Fractional Kolmogorov-Arnold Networks (fKANs)

Alireza Afzal Aghaei
3 min read · Jun 12, 2024

The world of neural network design has been revolutionized by the advent of Kolmogorov-Arnold Networks (KANs), known for their interpretability and precision. Today, we delve into an exciting new development in this field: the Fractional Kolmogorov-Arnold Network (fKAN).

Understanding Kolmogorov-Arnold Networks

KANs are based on the Kolmogorov-Arnold representation theorem, which states that a multivariate continuous function can be expressed as a finite composition of continuous functions of a single variable and the binary operation of addition. Mathematically:

Kolmogorov-Arnold approximation
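Written out, the representation takes the standard form:

```latex
f(x_1, \dots, x_n) = \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right)
```

where each inner function φ_{q,p} and each outer function Φ_q is continuous and univariate; KANs make these univariate functions learnable.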

Fractional Kolmogorov-Arnold Networks (fKANs)

The fKAN proposes using fractional basis functions instead of naive ones. Mathematically, for a given fractional order γ:

fractional KAN

Fractional Basis Functions

In the paper fKAN: Fractional Kolmogorov-Arnold Networks with Trainable Jacobi Basis Functions, the fractional order of orthogonal Jacobi functions is explored as the potential fractional basis function for KAN approximation. These functions are defined as:

Fractional Jacobi functions
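As an illustrative sketch (not necessarily the paper's exact definition), a fractional Jacobi function can be built from the standard three-term Jacobi recurrence composed with a fractional power map x ↦ x^γ on [0, 1]:

```python
# Illustrative sketch only: evaluate the degree-n Jacobi polynomial
# P_n^{(alpha, beta)} via the standard three-term recurrence, then
# compose with the power map x -> x**gamma to get a "fractional" version.
def jacobi(n, alpha, beta, z):
    if n == 0:
        return 1.0
    p_prev = 1.0
    p = 0.5 * (alpha - beta) + 0.5 * (alpha + beta + 2.0) * z
    for k in range(2, n + 1):
        a = 2.0 * k * (k + alpha + beta) * (2.0 * k + alpha + beta - 2.0)
        b = (2.0 * k + alpha + beta - 1.0) * (alpha**2 - beta**2)
        c = ((2.0 * k + alpha + beta - 1.0) * (2.0 * k + alpha + beta)
             * (2.0 * k + alpha + beta - 2.0))
        d = 2.0 * (k + alpha - 1.0) * (k + beta - 1.0) * (2.0 * k + alpha + beta)
        p_prev, p = p, ((b + c * z) * p - d * p_prev) / a
    return p

def fractional_jacobi(n, alpha, beta, gamma, x):
    # x in [0, 1]: apply the fractional power, then shift to [-1, 1]
    return jacobi(n, alpha, beta, 2.0 * x**gamma - 1.0)
```

For α = β = 0 this reduces to the Legendre polynomials, which is a convenient sanity check on the recurrence.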

Then, the fKAN basis functions (or Fractional Jacobi Neural Blocks) are formulated as:

Fractional Jacobi Neural Block

Here, ELU stands for Exponential Linear Unit, σ represents the sigmoid function, and γ, α, and β are trainable weights.
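A plausible reading of these constraints (sketched below with hypothetical names, not the library's internals): the sigmoid keeps the fractional order γ inside (0, 1), while the ELU keeps α and β above −1, the validity condition for Jacobi parameters:

```python
import math

# Hypothetical sketch of how the trainable weights might be constrained;
# the actual fkan implementation may differ.
def sigmoid(w):
    return 1.0 / (1.0 + math.exp(-w))

def elu(w, a=1.0):
    return w if w > 0 else a * (math.exp(w) - 1.0)

def constrained_params(w_gamma, w_alpha, w_beta):
    gamma = sigmoid(w_gamma)   # fractional order in (0, 1)
    alpha = elu(w_alpha)       # > -1, a valid Jacobi parameter
    beta = elu(w_beta)         # > -1
    return gamma, alpha, beta
```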

How to use it?

The fKAN has a GitHub repo as well as a PyPI page, so it can be installed with the pip Python package manager:

pip install fkan

and can be easily integrated into any deep architecture. For example, in a Keras Sequential model:

from tensorflow import keras
from tensorflow.keras import layers
from fkan.tensorflow import FractionalJacobiNeuralBlock as fJNB

model = keras.Sequential(
    [
        layers.InputLayer(input_shape=input_shape),  # input_shape defined by your data
        layers.Conv2D(32, kernel_size=(3, 3)),
        fJNB(3),  # fractional Jacobi block of degree 3 as the activation
        layers.MaxPooling2D(pool_size=(2, 2)),
        layers.Flatten(),
        layers.Dropout(0.5),
        layers.Dense(16),
        fJNB(2),
        layers.Dense(num_classes, activation="softmax"),  # num_classes defined by your task
    ]
)

or with the PyTorch Sequential API:

import torch.nn as nn
from fkan.torch import FractionalJacobiNeuralBlock as fJNB

model = nn.Sequential(
    nn.Linear(1, 16),
    fJNB(3),
    nn.Linear(16, 32),
    fJNB(6),
    nn.Linear(32, 1),
)
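Either model then trains like any other network. A minimal toy regression loop might look like the following; note that nn.Tanh() stands in for fJNB here so the sketch runs even without fkan installed:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in model mirroring the sequential example above;
# substitute fJNB(...) for nn.Tanh() when fkan is available.
model = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

x = torch.linspace(-1, 1, 128).reshape(-1, 1)
y = torch.sin(3 * x)  # toy regression target

for epoch in range(300):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```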

Results

The fKAN achieves more accurate predictions in less training time than standard KANs.

fKAN vs other alternatives

It also demonstrates excellent accuracy in image classification, image denoising, and sentiment analysis tasks:

MNIST classification using fKAN
Image denoising using fKAN
IMDB sentiment analysis using fKAN

Physics-informed fractional KANs

The fKAN is also capable of solving physics-informed neural network (PINN) tasks, simulating ordinary, partial, and even fractional differential equations.

Simulation of standard Lane-Emden singular second-order ordinary differential equation
Simulation results of Burgers partial differential equations using fKAN
Simulation results of a Caputo fractional delay differential equation using fKAN
