├── eq1.png
├── eq2.png
├── kernel_activation_functions.png
├── kernel_activation_functions_2D.png
├── tensorflow
│   ├── README
│   ├── demo_kaf_feedforward.py
│   ├── demo_kaf_convolutional.py
│   └── kafnets.py
├── keras
│   ├── README.md
│   ├── demo_kaf_feedforward.py
│   ├── demo_kaf_convolutional.py
│   └── kafnets.py
├── .gitignore
├── pytorch
│   ├── README.md
│   ├── demo_kaf_feedforward.py
│   ├── demo_kaf_convolutional.py
│   └── kafnets.py
├── autograd
│   ├── README.md
│   ├── kafnets.py
│   └── demo_kaf_regression.py
├── LICENSE
└── README.md

/eq1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ispamm/kernel-activation-functions/HEAD/eq1.png
--------------------------------------------------------------------------------

/eq2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ispamm/kernel-activation-functions/HEAD/eq2.png
--------------------------------------------------------------------------------

/kernel_activation_functions.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ispamm/kernel-activation-functions/HEAD/kernel_activation_functions.png
--------------------------------------------------------------------------------

/kernel_activation_functions_2D.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ispamm/kernel-activation-functions/HEAD/kernel_activation_functions_2D.png
--------------------------------------------------------------------------------

/tensorflow/README:
--------------------------------------------------------------------------------
## Kernel activation functions (TensorFlow)

The *kafnets* module provides the layers for defining KAFs, both for feedforward networks and for convolutional networks (set the flag 'conv' during initialization).
The code includes two demos showcasing these layers with TensorFlow layers.

## Requirements

* tensorflow = 1.13.1
* numpy = 1.15.4
--------------------------------------------------------------------------------

/keras/README.md:
--------------------------------------------------------------------------------
## Kernel activation functions (Keras)

The *kafnets* module provides the layers for defining KAFs, both for feedforward networks and for convolutional networks (set the flag 'conv' during initialization).
The code includes two demos showcasing these layers with the Keras Sequential model; a minimal usage sketch follows below.

## Requirements

* keras = 2.2.4
* numpy = 1.15.4
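To make the 'conv' flag concrete, here is a minimal sketch of the two usage modes, assembled from the demo scripts included later in this repository:

```python
# Minimal sketch of the two usage modes of the KAF layer, based on the
# demo scripts in this repository.
from keras.models import Sequential
from keras.layers import Dense, Conv2D

from kafnets import KAF

# Feedforward usage: one learnable activation function per hidden unit
model = Sequential([
    Dense(20, input_shape=(13,)),
    KAF(20),
    Dense(1),
])

# Convolutional usage: pass conv=True at initialization
conv_model = Sequential([
    Conv2D(32, (3, 3), input_shape=(28, 28, 1)),
    KAF(32, conv=True),
])
```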
--------------------------------------------------------------------------------

/.gitignore:
--------------------------------------------------------------------------------
*__pycache__*

\.idea/

pytorch/data/MNIST/raw/train-labels-idx1-ubyte

pytorch/data/MNIST/processed/test\.pt

pytorch/data/MNIST/processed/training\.pt

pytorch/data/MNIST/raw/t10k-images-idx3-ubyte

pytorch/data/MNIST/raw/t10k-labels-idx1-ubyte

pytorch/data/MNIST/raw/train-images-idx3-ubyte

tensorflow/MNIST_data/

tensorflow/logs/
--------------------------------------------------------------------------------

/pytorch/README.md:
--------------------------------------------------------------------------------
## Kernel activation functions (PyTorch)

The *kafnets* module provides the layers for defining KAFs, both for feedforward networks and for convolutional networks (set the flag 'conv' during initialization).
The code includes two demos showcasing these layers with the PyTorch Sequential container.

## Requirements

* pytorch = 1.0.1
* numpy = 1.15.4
* tqdm = 4.28.1 (for demo_kaf_convolutional.py)
* scikit-learn = 0.20.1 (for demo_kaf_feedforward.py)
--------------------------------------------------------------------------------

/autograd/README.md:
--------------------------------------------------------------------------------
## Kernel activation functions (Autograd)

The code adapts the following example:
https://github.com/HIPS/autograd/blob/master/examples/neural_net.py

The *kafnets* module contains the code to initialize and run neural networks with KAF activation functions.
Note that this is a regression example, and KAFs are used in the output layer as well.
For classification, consider replacing the final layer with a standard softmax, as sketched below.

## Requirements

* autograd = 1.2
* scikit-learn = 0.20.1 (for demo_kaf_regression.py)
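A hypothetical sketch of that change (not part of the repo; the helpers `softmax` and `predict_kaf_classifier`, and the readout weights `W_out`/`b_out`, are illustrative names):

```python
# Hypothetical classification head (illustration only): reuse the KAF
# network as a feature extractor, then apply a linear readout + softmax.
import autograd.numpy as np
from kafnets import predict_kaf_nn

def softmax(z):
    z = z - np.max(z, axis=1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / np.sum(e, axis=1, keepdims=True)

def predict_kaf_classifier(w, X, info, W_out, b_out):
    H = predict_kaf_nn(w, X, info)             # KAF layers as in the repo
    return softmax(np.dot(H, W_out) + b_out)   # class probabilities
```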
5 | """ 6 | 7 | # Keras imports 8 | from keras import datasets 9 | from keras.models import Sequential 10 | from keras.layers import Dense 11 | 12 | # Custom imports 13 | from kafnets import KAF 14 | 15 | # Load Breast Cancer dataset 16 | (X_train, y_train), (X_test, y_test) = datasets.boston_housing.load_data() 17 | 18 | # Initialize a KAF neural network 19 | kafnet = Sequential([ 20 | Dense(20, input_shape=(13,)), 21 | KAF(20), 22 | Dense(1), 23 | ]) 24 | 25 | #Uncomment to use KAF with Softplus kernel 26 | #kafnet = Sequential([ 27 | # Dense(20, input_shape=(13,)), 28 | # KAF(20, kernel='softplus', D=5), 29 | # Dense(1), 30 | #]) 31 | 32 | # Training 33 | kafnet.compile(optimizer='adam', loss='mse') 34 | kafnet.summary() 35 | kafnet.fit(X_train, y_train, epochs=250, batch_size=32, verbose=0) 36 | 37 | # Evaluation 38 | print('Final error is: ' + str(kafnet.evaluate(X_test, y_test, batch_size=64))) -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2017 Ispamm Laboratory 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /autograd/kafnets.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | 3 | import autograd.numpy as np 4 | 5 | def init_kaf_nn(layer_sizes, scale=0.01, rs=np.random.RandomState(0), dict_size=20, boundary=3.0): 6 | """ 7 | Initialize the parameters of a KAF feedforward network. 8 | - dict_size: the size of the dictionary for every neuron. 9 | - boundary: the boundary for the activation functions. 10 | """ 11 | 12 | # Initialize the dictionary 13 | D = np.linspace(-boundary, boundary, dict_size).reshape(-1, 1) 14 | 15 | # Rule of thumb for gamma 16 | interval = D[1,0] - D[0,0]; 17 | gamma = 0.5/np.square(2*interval) 18 | D = D.reshape(1, 1, -1) 19 | 20 | # Initialize a list of parameters for the layer 21 | w = [(rs.randn(insize, outsize) * scale, # Weight matrix 22 | rs.randn(outsize) * scale, # Bias vector 23 | rs.randn(1, outsize, dict_size) * 0.5) # Mixing coefficients 24 | for insize, outsize in zip(layer_sizes[:-1], layer_sizes[1:])] 25 | 26 | return w, (D, gamma) 27 | 28 | def predict_kaf_nn(w, X, info): 29 | """ 30 | Compute the outputs of a KAF feedforward network. 
31 | """ 32 | 33 | D, gamma = info 34 | for W, b, alpha in w: 35 | outputs = np.dot(X, W) + b 36 | K = gauss_kernel(outputs, D, gamma) 37 | X = np.sum(K*alpha, axis=2) 38 | return X 39 | 40 | def gauss_kernel(X, D, gamma=1.0): 41 | """ 42 | Compute the 1D Gaussian kernel between all elements of a 43 | NxH matrix and a fixed L-dimensional dictionary, resulting in a NxHxL matrix of kernel 44 | values. 45 | """ 46 | return np.exp(- gamma*np.square(X.reshape(-1, X.shape[1], 1) - D)) -------------------------------------------------------------------------------- /tensorflow/demo_kaf_feedforward.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | 3 | """ 4 | Simple demo using kernel activation functions on a basic regression dataset. 5 | """ 6 | 7 | # Import TensorFlow 8 | import numpy as np 9 | import tensorflow as tf 10 | import tensorflow.contrib.eager as tfe 11 | tf.enable_eager_execution() 12 | 13 | # Keras imports 14 | from tensorflow.keras import datasets 15 | from tensorflow.keras.models import Sequential 16 | from tensorflow.keras.layers import Dense 17 | 18 | # Custom imports 19 | from kafnets import KAF 20 | import tqdm 21 | 22 | # Load Breast Cancer dataset 23 | (X_train, y_train), (X_test, y_test) = datasets.boston_housing.load_data() 24 | 25 | # Initialize a KAF neural network 26 | kafnet = Sequential([ 27 | Dense(20, input_shape=(13,)), 28 | KAF(20), 29 | Dense(1), 30 | ]) 31 | 32 | #Uncomment to use KAF with Softplus kernel 33 | #kafnet = Sequential([ 34 | # Dense(20, input_shape=(13,)), 35 | # KAF(20, kernel='softplus', D=5), 36 | # Dense(1), 37 | #]) 38 | 39 | # Use tf.data DataLoader 40 | train_data = tf.data.Dataset.from_tensor_slices((X_train.astype(np.float32), y_train.reshape(-1, 1))) 41 | test_data = tf.data.Dataset.from_tensor_slices((X_test.astype(np.float32), y_test.astype(np.float32).reshape(-1, 1))) 42 | 43 | # Optimizer 44 | opt = tf.train.AdamOptimizer() 45 | 46 | # Training 47 | for e in tqdm.trange(300, desc='Training'): 48 | 49 | for xb, yb in train_data.shuffle(1000).batch(32): 50 | 51 | with tfe.GradientTape() as tape: 52 | loss = tf.losses.mean_squared_error(yb, kafnet(xb)) 53 | g = tape.gradient(loss, kafnet.variables) 54 | opt.apply_gradients(zip(g, kafnet.variables)) 55 | 56 | # Evaluation 57 | err = tfe.metrics.Mean() 58 | for xb, yb in test_data.batch(32): 59 | err((yb - kafnet(xb))**2) 60 | print('Final error is: ' + str(err.result())) 61 | -------------------------------------------------------------------------------- /keras/demo_kaf_convolutional.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | 3 | """ 4 | Simple demo using kernel activation functions with convolutional networks on the MNIST dataset. 
5 | """ 6 | 7 | # Keras imports 8 | from keras import datasets 9 | from keras.models import Sequential 10 | from keras.layers import Dense, Conv2D, MaxPooling2D, Flatten 11 | from keras.utils import to_categorical 12 | import keras.backend as K 13 | 14 | # Custom imports 15 | from kafnets import KAF 16 | 17 | # Load Breast Cancer dataset 18 | (X_train, y_train), (X_test, y_test) = datasets.mnist.load_data() 19 | 20 | # Preprocessing is taken from here: 21 | # https://github.com/keras-team/keras/blob/master/examples/mnist_cnn.py 22 | if K.image_data_format() == 'channels_first': 23 | X_train = X_train.reshape(X_train.shape[0], 1, 28, 28) 24 | X_test = X_test.reshape(X_test.shape[0], 1, 28, 28) 25 | else: 26 | X_train = X_train.reshape(X_train.shape[0], 28, 28, 1) 27 | X_test = X_test.reshape(X_test.shape[0], 28, 28, 1) 28 | 29 | X_train = X_train.astype('float32') 30 | X_test = X_test.astype('float32') 31 | X_train /= 255 32 | X_test /= 255 33 | 34 | # convert class vectors to binary class matrices 35 | y_train = to_categorical(y_train, 10) 36 | y_test = to_categorical(y_test, 10) 37 | 38 | # Initialize a KAF neural network 39 | kafnet = Sequential() 40 | kafnet.add(Conv2D(32, (3, 3), input_shape=(28, 28, 1))) 41 | kafnet.add(KAF(32, conv=True)) 42 | kafnet.add(Conv2D(32, (3, 3))) 43 | kafnet.add(KAF(32, conv=True)) 44 | kafnet.add(MaxPooling2D(pool_size=(2, 2))) 45 | kafnet.add(Flatten()) 46 | kafnet.add(Dense(100)) 47 | kafnet.add(KAF(100)) 48 | kafnet.add(Dense(10, activation='softmax')) 49 | 50 | # Training 51 | kafnet.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy']) 52 | kafnet.summary() 53 | kafnet.fit(X_train, y_train, epochs=5, batch_size=32, verbose=1) 54 | 55 | # Evaluation 56 | print('Final accuracy is: ' + str(kafnet.evaluate(X_test, y_test, batch_size=64)[1])) -------------------------------------------------------------------------------- /autograd/demo_kaf_regression.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | 3 | # Imports from Python libraries 4 | import autograd.numpy as np 5 | from autograd import grad 6 | from autograd.misc.optimizers import adam 7 | from sklearn import datasets, preprocessing, model_selection 8 | 9 | # Custom imports 10 | from kafnets import init_kaf_nn, predict_kaf_nn 11 | 12 | # Extends this example: 13 | # https://github.com/HIPS/autograd/blob/master/examples/neural_net.py 14 | 15 | # Set seed for PRNG 16 | np.random.seed(1) 17 | 18 | # Size of the neural network's layers 19 | layers = [13, 10, 1] 20 | 21 | # Batch size 22 | B = 40 23 | 24 | # Load Boston dataset 25 | data = datasets.load_boston() 26 | X = preprocessing.MinMaxScaler(feature_range=(-1, +1)).fit_transform(data['data']) 27 | y = preprocessing.MinMaxScaler(feature_range=(-0.9, +0.9)).fit_transform(data['target'].reshape(-1, 1)) 28 | (X_train, X_test, y_train, y_test) = model_selection.train_test_split(X, y, test_size=0.25) 29 | 30 | # Initialize KAF neural network 31 | w, info = init_kaf_nn(layers) 32 | predict_fcn = lambda w, inputs: predict_kaf_nn(w, inputs, info) 33 | 34 | # Loss function (MSE) 35 | def loss_fcn(params, inputs, targets): 36 | return np.mean(np.square(predict_fcn(params, inputs) - targets)) 37 | 38 | # Iterator over mini-batches 39 | num_batches = int(np.ceil(X_train.shape[0] / B)) 40 | def batch_indices(iter): 41 | idx = iter % num_batches 42 | return slice(idx * B, (idx+1) * B) 43 | 44 | # Define training objective 45 | def objective(params, iter): 46 | idx = 
--------------------------------------------------------------------------------

/tensorflow/demo_kaf_convolutional.py:
--------------------------------------------------------------------------------
# -*- coding: utf-8 -*-

"""
Simple demo using kernel activation functions with convolutional networks on the MNIST dataset.
"""

# Import TensorFlow
import numpy as np
import tensorflow as tf
import tensorflow.contrib.eager as tfe
tf.enable_eager_execution()

# Keras imports
from tensorflow.keras import datasets
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Conv2D, MaxPooling2D, Flatten

# Custom imports
from kafnets import KAF
import tqdm

# Load MNIST dataset
(X_train, y_train), (X_test, y_test) = datasets.mnist.load_data()

# Preprocessing is taken from here:
# https://github.com/keras-team/keras/blob/master/examples/mnist_cnn.py
X_train = X_train.reshape(X_train.shape[0], 28, 28, 1)
X_test = X_test.reshape(X_test.shape[0], 28, 28, 1)

X_train = X_train.astype('float32')
X_test = X_test.astype('float32')
X_train /= 255
X_test /= 255


# Initialize a KAF neural network
kafnet = Sequential()
kafnet.add(Conv2D(32, (3, 3), input_shape=(28, 28, 1)))
kafnet.add(KAF(32, conv=True))
kafnet.add(Conv2D(32, (3, 3)))
kafnet.add(KAF(32, conv=True))
kafnet.add(MaxPooling2D(pool_size=(2, 2)))
kafnet.add(Flatten())
kafnet.add(Dense(100))
kafnet.add(KAF(100))
kafnet.add(Dense(10))  # Outputs logits; the softmax is applied inside the loss below

# Use tf.data for batching
train_data = tf.data.Dataset.from_tensor_slices((X_train.astype(np.float32), y_train.astype(np.int64)))
test_data = tf.data.Dataset.from_tensor_slices((X_test.astype(np.float32), y_test.astype(np.int64)))

# Optimizer
opt = tf.train.AdamOptimizer()

# Training
for e in tqdm.trange(5, desc='Training'):

    for xb, yb in train_data.shuffle(1000).batch(32):

        with tfe.GradientTape() as tape:
            loss = tf.losses.sparse_softmax_cross_entropy(yb, kafnet(xb))
        g = tape.gradient(loss, kafnet.variables)
        opt.apply_gradients(zip(g, kafnet.variables))

    # Evaluation
    acc = tfe.metrics.Accuracy()
    for xb, yb in test_data.batch(32):
        acc(yb, tf.argmax(kafnet(xb), axis=1))
    tqdm.tqdm.write('Test accuracy after epoch {} is: '.format(e + 1) + str(acc.result()))
5 | """ 6 | 7 | # Imports from Python libraries 8 | import numpy as np 9 | from sklearn import datasets, preprocessing, model_selection 10 | 11 | # PyTorch imports 12 | import torch 13 | from torch.utils.data import TensorDataset 14 | from torch.utils.data import DataLoader 15 | 16 | # Custom imports 17 | from kafnets import KAF 18 | 19 | # Set seed for PRNG 20 | np.random.seed(1) 21 | torch.manual_seed(1) 22 | 23 | # Batch size 24 | B = 40 25 | 26 | # Load Breast Cancer dataset 27 | data = datasets.load_breast_cancer() 28 | X = preprocessing.MinMaxScaler(feature_range=(-1, +1)).fit_transform(data['data']).astype(np.float32) 29 | (X_train, X_test, y_train, y_test) = model_selection.train_test_split(X, data['target'].astype(np.float32).reshape(-1, 1), test_size=0.25) 30 | 31 | # Load in PyTorch data loader 32 | data_train = DataLoader(TensorDataset(torch.from_numpy(X_train), torch.from_numpy(y_train)), shuffle=True, batch_size=64) 33 | data_test = DataLoader(TensorDataset(torch.from_numpy(X_test), torch.from_numpy(y_test)), batch_size=100) 34 | 35 | # Initialize a KAF neural network 36 | kafnet = torch.nn.Sequential( 37 | torch.nn.Linear(30, 20), 38 | KAF(20), 39 | torch.nn.Linear(20, 1), 40 | ) 41 | 42 | # Uncomment to use KAF with custom initialization 43 | #kafnet = torch.nn.Sequential( 44 | # torch.nn.Linear(30, 20), 45 | # KAF(20, init_fcn=np.tanh), 46 | # torch.nn.Linear(20, 1), 47 | #) 48 | 49 | #Uncomment to use KAF with Softplus kernel 50 | #kafnet = torch.nn.Sequential( 51 | # torch.nn.Linear(30, 20), 52 | # KAF(20, kernel='softplus'), 53 | # torch.nn.Linear(20, 1), 54 | #) 55 | 56 | # Reset parameters 57 | for m in kafnet: 58 | if len(m._parameters) > 0: 59 | m.reset_parameters() 60 | 61 | print('Training: **KAFNET**', flush=True) 62 | 63 | # Loss function 64 | loss_fn = torch.nn.BCEWithLogitsLoss() 65 | 66 | # Build optimizer 67 | optimizer = torch.optim.Adam(kafnet.parameters(), weight_decay=1e-4) 68 | 69 | for idx_epoch in range(100): 70 | 71 | kafnet.train() 72 | 73 | for _, (X_batch, y_batch) in enumerate(data_train): 74 | 75 | # Forward pass: compute predicted y by passing x to the model. 76 | y_pred = kafnet(X_batch) 77 | 78 | # Compute loss. 79 | loss = loss_fn(y_pred, y_batch) 80 | 81 | # Zeroes out all gradients 82 | optimizer.zero_grad() 83 | 84 | # Backward pass 85 | loss.backward() 86 | 87 | # Update parameters 88 | optimizer.step() 89 | 90 | with torch.no_grad(): 91 | # Compute final test score 92 | print('Computing test score for: **KAFNET**', flush=True) 93 | kafnet.eval() 94 | acc = 0 95 | for _, (X_batch, y_batch) in enumerate(data_test): 96 | acc += np.sum(y_batch.numpy() == np.round(torch.sigmoid(kafnet(X_batch)).numpy())) 97 | print('Final score on test set: ', acc/data_test.dataset.__len__()) 98 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Kernel Activation Functions 2 | 3 | This repository contains several implementations of the kernel activation functions (KAFs) described in the following paper ([link to the preprint](https://arxiv.org/abs/1707.04035)): 4 | 5 | Scardapane, S., Van Vaerenbergh, S., Totaro, S. and Uncini, A., 2019. 6 | Kafnets: Kernel-based non-parametric activation functions for neural networks. 7 | Neural Networks, 110, pp.19-32. 
--------------------------------------------------------------------------------

/README.md:
--------------------------------------------------------------------------------
# Kernel Activation Functions

This repository contains several implementations of the kernel activation functions (KAFs) described in the following paper ([link to the preprint](https://arxiv.org/abs/1707.04035)):

Scardapane, S., Van Vaerenbergh, S., Totaro, S. and Uncini, A., 2019.
Kafnets: Kernel-based non-parametric activation functions for neural networks.
Neural Networks, 110, pp. 19-32.

## Available implementations

We currently provide the following stable implementations:

* [PyTorch](/pytorch): feedforward and convolutional layers, three kernels (Gaussian/ReLU/Softplus), with random initialization or initialization by kernel ridge regression.
* [Keras](/keras): same functionality as the PyTorch implementation.
* [TensorFlow](/tensorflow/): similar to the Keras implementation, but built on the internal tf.keras.Layer class, with eager execution in the demos.
* [Autograd](/autograd): only feedforward layers with a Gaussian kernel and random initialization.

More information on each implementation is given in the corresponding folder. The code should be relatively easy to plug into other architectures or projects.

## What is a KAF?

Most neural networks work by interleaving linear projections and simple (fixed) activation functions, like the ReLU function:
![ReLU equation](eq1.png)

A KAF instead makes each activation function learnable, modeling it as a kernel expansion over a small dictionary of fixed points, whose mixing coefficients are adapted during training:

![KAF equation](eq2.png)
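Written out (a reconstruction of the two equations from the paper and the autograd implementation, shown here since the images above may not render everywhere):

```latex
% Fixed activation (eq1): the ReLU
g(s) = \max(0, s)

% Learned activation (eq2): the KAF, a kernel expansion over a dictionary
% d_1, ..., d_D with trainable mixing coefficients alpha_i and a Gaussian
% kernel whose bandwidth gamma follows the rule of thumb in kafnets.py
\mathrm{KAF}(s) = \sum_{i=1}^{D} \alpha_i \, \kappa(s, d_i),
\qquad
\kappa(s, d_i) = \exp\bigl(-\gamma (s - d_i)^2\bigr)
```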
![Examples of learned kernel activation functions](kernel_activation_functions.png)

Fig. 1. Examples of kernel activation functions learned on the Sensorless data set. The KAF after initialization is shown with a dashed red line, while the final KAF is shown with a solid green line. As a reference, the distribution of activation values after training is shown in light blue.

![Examples of learned two-dimensional kernel activation functions](kernel_activation_functions_2D.png)

Fig. 2. Examples of two-dimensional kernel activation functions learned on the Sensorless data set.