Train models against your custom classifier
By default, Pesidious trains its models against a bundled sample classifier, so the generated malware mutations learn to evade that classifier. You can plug your own classifier into the tool so that the models are trained to evade it instead. The steps below describe how to add your own classifier.
Step 1: Create a config file
Create a config file called `pesidious.config` in the home directory of your machine.
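For example, the file can be created like this (a minimal sketch, assuming the file should sit directly in your home directory):

```python
from pathlib import Path

# Create an empty pesidious.config in the home directory if it does not exist yet.
(Path.home() / "pesidious.config").touch(exist_ok=True)
```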
Step 2: Populate the config file
Add the path to your locally saved model (a pickle file), or the connection details for your remote model:
```ini
[Local-AI-model]
saved_model = /path/to/your/saved/model
Threshold = <threshold for classification>

[Remote-AI-model]
URL =
Username =
Password =
Version =
Threshold =
```
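For reference, here is a minimal sketch of how these values could be read with Python's `configparser`. The config path (`~/pesidious.config`), the section and key names, and the use of `pickle` for the saved model come from the template above; the helper name `load_local_model` is purely illustrative and not part of the tool.

```python
import configparser
import pickle
from pathlib import Path

def load_local_model(config_path=Path.home() / "pesidious.config"):
    """Load the locally saved classifier and its decision threshold
    from pesidious.config (sketch only; adapt to your setup)."""
    config = configparser.ConfigParser()
    config.read(config_path)

    section = config["Local-AI-model"]
    with open(section["saved_model"], "rb") as f:
        local_model = pickle.load(f)   # pickled (e.g. scikit-learn style) model
    threshold = float(section["Threshold"])
    return local_model, threshold
```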
Step 3: Modify the tool to query your model
Modify the function `get_score_local(bytez)` in `gym_malware/envs/utils/interface.py` so that it returns the score from your classifier:
```python
def get_score_local(bytez):
    # local_model = model loaded from the saved_model path in pesidious.config
    score = local_model.predict( )  # modify this line to get the score from your model
    return score
```
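As a concrete, hypothetical example, a completed `get_score_local` for a scikit-learn style binary classifier might look like the sketch below. It reuses the `load_local_model` helper sketched above and assumes your model exposes `predict_proba` on a feature vector extracted from the raw PE bytes; `extract_features` is a placeholder you would replace with whatever feature extraction your classifier was trained on.

```python
# Sketch only: adapt to the features and API your own classifier uses.
local_model, threshold = load_local_model()

def extract_features(bytez):
    # Placeholder: replace with the feature extraction your model expects.
    raise NotImplementedError

def get_score_local(bytez):
    features = extract_features(bytez)
    # Probability that the sample is malicious (class 1) for a binary classifier.
    score = local_model.predict_proba([features])[0][1]
    return score

def is_detected(bytez):
    # A mutation counts as evasive when its score falls below the configured threshold.
    return get_score_local(bytez) >= threshold
```

With this in place, the training loop can treat any sample whose score drops below `Threshold` as a successful evasion of your classifier.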