├── README.md
├── test.py
└── time_embedding_layer.py

/README.md:
--------------------------------------------------------------------------------
# Implementation of an RNN time-embedding layer from the paper "Time-Dependent Representation for Neural Event Sequence Prediction"

This is an implementation of the best-performing approach to embedding time into RNNs. It is useful for cases where your stream of RNN steps contains timestamp information and those timestamps are non-equally spaced. This approach performs better than simply adding the timestamp as a single feature.

You can read a detailed explanation in [my blog post](https://fridayexperiment.com/how-to-encode-time-property-in-recurrent-neutral-networks/).

References:

1. [paper - Time-Dependent Representation for Neural Event Sequence Prediction](https://arxiv.org/abs/1708.00065)
--------------------------------------------------------------------------------
/test.py:
--------------------------------------------------------------------------------
import tensorflow as tf
from time_embedding_layer import TimeEmbedding


class ModelTest(tf.keras.layers.Layer):
    def __init__(self, **kwargs):
        super(ModelTest, self).__init__(**kwargs)
        self.time_emb = TimeEmbedding(hidden_embedding_size=20, output_dim=64)

    def call(self, input1):
        emb = self.time_emb(input1)
        return emb


def model_generator2():
    # Each example is a sequence of 20 scalar timestamps.
    input1 = tf.keras.layers.Input(shape=(20,))

    result = ModelTest()(input1)
    model = tf.keras.models.Model(input1, result)
    return model


model = model_generator2()
model.compile(loss='mean_squared_error',
              optimizer=tf.keras.optimizers.Adam(1e-4))
model.summary()

# A batch of 16 sequences of 20 timestamps each should yield one
# 64-dimensional embedding per timestamp.
results = model(tf.random.uniform([16, 20]))
assert results.shape == [16, 20, 64]
--------------------------------------------------------------------------------
/time_embedding_layer.py:
--------------------------------------------------------------------------------
import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Layer


class TimeEmbedding(Layer):
    """Learned time embedding from "Time-Dependent Representation for Neural
    Event Sequence Prediction" (https://arxiv.org/abs/1708.00065).

    Each scalar timestamp is projected onto `hidden_embedding_size` learned
    components, normalized with a softmax into a soft one-hot vector, and
    then mapped to a dense `output_dim`-dimensional embedding.
    """

    def __init__(self, hidden_embedding_size, output_dim, **kwargs):
        super(TimeEmbedding, self).__init__(**kwargs)
        self.output_dim = output_dim
        self.hidden_embedding_size = hidden_embedding_size

    def build(self, input_shape):
        self.emb_weights = self.add_weight(name='weights', shape=(self.hidden_embedding_size,),
                                           initializer='uniform', trainable=True)
        self.emb_biases = self.add_weight(name='biases', shape=(self.hidden_embedding_size,),
                                          initializer='uniform', trainable=True)
        self.emb_final = self.add_weight(name='embedding_matrix',
                                         shape=(self.hidden_embedding_size, self.output_dim),
                                         initializer='uniform', trainable=True)

    def call(self, x):
        # (batch, seq) -> (batch, seq, 1) so each scalar timestamp broadcasts
        # against the hidden_embedding_size weight and bias vectors.
        x = K.expand_dims(x)
        # Soft one-hot representation of the timestamp over the learned components.
        x = tf.keras.activations.softmax(x * self.emb_weights + self.emb_biases)
        # Project the soft one-hot vector into the output embedding space:
        # (batch, seq, hidden) x (hidden, output_dim) -> (batch, seq, output_dim).
        x = tf.einsum('bsv,vi->bsi', x, self.emb_final)
        return x

    def get_config(self):
        config = super(TimeEmbedding, self).get_config()
        # Keys must match the constructor arguments so the layer can be
        # re-created from its config.
        config.update({'output_dim': self.output_dim,
                       'hidden_embedding_size': self.hidden_embedding_size})
        return config
--------------------------------------------------------------------------------
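
Usage sketch
--------------------------------------------------------------------------------
The paper uses this kind of time representation alongside the event representation in an RNN sequence model. As a minimal sketch of how the layer could be wired up (the vocabulary size, layer widths, and prediction head below are illustrative assumptions, not part of this repository or prescribed by the paper):

```python
import tensorflow as tf
from time_embedding_layer import TimeEmbedding

# Hypothetical sizes, chosen only for illustration.
VOCAB_SIZE = 1000     # number of distinct event types
SEQ_LEN = 20          # events per sequence
TIME_EMB_HIDDEN = 20  # soft one-hot components inside TimeEmbedding
EMB_DIM = 64          # embedding width for both events and time

# Two parallel inputs: event ids and their (non-equally spaced) timestamps.
event_ids = tf.keras.layers.Input(shape=(SEQ_LEN,), name='event_ids')
timestamps = tf.keras.layers.Input(shape=(SEQ_LEN,), name='timestamps')

event_emb = tf.keras.layers.Embedding(VOCAB_SIZE, EMB_DIM)(event_ids)
time_emb = TimeEmbedding(TIME_EMB_HIDDEN, EMB_DIM)(timestamps)

# Concatenate the event and time representations at every step and let
# an LSTM consume the combined sequence to predict the next event type.
features = tf.keras.layers.Concatenate(axis=-1)([event_emb, time_emb])
hidden = tf.keras.layers.LSTM(128)(features)
logits = tf.keras.layers.Dense(VOCAB_SIZE)(hidden)

model = tf.keras.models.Model([event_ids, timestamps], logits)
model.compile(loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              optimizer=tf.keras.optimizers.Adam(1e-4))
model.summary()
```

Because `TimeEmbedding` outputs a dense vector per step, it can be concatenated with any per-step features; concatenation here is one simple choice, not the only way to combine the two representations.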