Custom Auto Encoder Model

As documented, the number of layers in each of our three models is pre-defined. However, you can also create a custom autoencoder model that suits your dataset.

The Custom Autoencoder Model feature lets you tailor the autoencoder architecture to your specific requirements. By specifying the number of layers, and the number of nodes in each layer, for both the encoder and the decoder, you can craft a personalized autoencoder and fine-tune its architecture to the intricacies of your data and the desired encoding-decoding outcomes.

from sdgne.datagenerator.autoencoder import AutoEncoderModel

# Number of nodes in each encoder layer, in the bottleneck, and in each decoder layer
encoder_dense_layers = [64, 32, 16]
bottle_neck = 14
decoder_dense_layers = [32, 64]
decoder_activation = 'tanh'

synthesizer = AutoEncoderModel()

# dataset is a pandas DataFrame holding the original minority and majority data
model = synthesizer.build_model(dataset, encoder_dense_layers,
                                bottle_neck, decoder_dense_layers,
                                decoder_activation)

Importing the AutoEncoderModel class

from sdgne.datagenerator.autoencoder import AutoEncoderModel

Creating a synthesizer

synthesizer = AutoEncoderModel()

Returns

An instance of class AutoEncoderModel.

Creating the custom model

model = synthesizer.build_model(dataset, encoder_dense_layers, 
                  bottle_neck, decoder_dense_layers, 
                  decoder_activation)
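
Since build_model returns a standard Keras model (see Returns below), you can inspect the resulting layer stack before training:

model.summary()   # prints the layer-by-layer architecture and parameter counts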

Parameters

dataset

required

pd.DataFrame

Represents a pandas data frame containing both the original minority data and the original majority data.

encoder_dense_layers

required

list[int]

Represents a list where each integer is the number of nodes in the corresponding encoder layer.

bottle_neck

required

integer

Represents the number of nodes in the bottleneck layer.

decoder_dense_layers

required

list[int]

Represents a list where each integer is the number of nodes in the corresponding decoder layer.

decoder_activation

required

string

Represents the activation function to be used at the end of the decoder.

Returns

A Keras model representing the custom autoencoder.
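
The exact layer stack assembled by build_model is internal to sdgne, but the parameters above map conceptually onto a standard dense autoencoder. The sketch below illustrates that mapping in plain Keras; the hidden-layer activation ('relu') and the exact layer ordering are assumptions made for illustration, not the library's actual implementation.

# Illustration only: a plain-Keras dense autoencoder built from the same
# shape parameters that build_model accepts. sdgne's internal layers may differ.
from tensorflow.keras import layers, models

input_dim = dataset.shape[1]                   # one input node per feature column
inputs = layers.Input(shape=(input_dim,))

x = inputs
for units in encoder_dense_layers:             # encoder stack: 64 -> 32 -> 16
    x = layers.Dense(units, activation='relu')(x)   # 'relu' is an assumed choice

x = layers.Dense(bottle_neck, activation='relu')(x)  # bottleneck: 14 nodes

for units in decoder_dense_layers:             # decoder stack: 32 -> 64
    x = layers.Dense(units, activation='relu')(x)

# The final layer maps back to the input dimensionality using decoder_activation
outputs = layers.Dense(input_dim, activation=decoder_activation)(x)

autoencoder_sketch = models.Model(inputs, outputs)
autoencoder_sketch.summary()

This is only a conceptual picture; in practice you would train the model returned by build_model rather than constructing one yourself.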

Further Usage Example

Below is sample code for compiling and training the model. Because an autoencoder learns to reconstruct its input, minority_df is passed as both the input and the target.

# Train the autoencoder to reconstruct the minority records
model.compile(optimizer='adam', loss='mse')
history = model.fit(minority_df, minority_df, epochs=epochs, batch_size=batch_size)
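
After training, one common way to obtain synthetic minority records is to run the (optionally noise-perturbed) minority data back through the autoencoder and keep the reconstructions. The snippet below is a minimal sketch of that idea using only standard NumPy and Keras calls; the noise scale is arbitrary, and sdgne may provide its own sampling utilities, so treat this purely as an illustration.

import numpy as np

# Perturb the minority records slightly so the reconstructions are not
# exact copies (the 0.01 noise scale is an arbitrary illustrative choice).
noisy_minority = minority_df.values + np.random.normal(0, 0.01, minority_df.shape)

# Reconstruct the perturbed records with the trained autoencoder and use
# the outputs as candidate synthetic minority samples.
synthetic_minority = model.predict(noisy_minority)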
