Install zoofs
In [25]:
!pip install zoofs
Prepare the data
In [26]:
from sklearn.datasets import load_breast_cancer
import pandas as pd
data = load_breast_cancer()
In [27]:
X_train = pd.DataFrame(data['data'], columns=data['feature_names'])
y_train = pd.Series(data['target'])
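As an aside: later in this notebook the training set is also passed in as the validation set. A minimal sketch of carving out a genuine holdout split instead, using scikit-learn's `train_test_split` (this split is an assumption for illustration, not part of the original notebook):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
import pandas as pd

data = load_breast_cancer()
X = pd.DataFrame(data['data'], columns=data['feature_names'])
y = pd.Series(data['target'])

# 80/20 split; fixing random_state makes the split reproducible
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=42)
```

The `(X_tr, y_tr, X_val, y_val)` pairs can then be passed to the optimizer's `fit` call in place of repeating the training set, so the objective is scored on unseen rows.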
Load an algorithm
In [28]:
from zoofs import HarrisHawkOptimization
Prepare an objective function, and run the algorithm
In [29]:
from sklearn.metrics import log_loss

# Define your own objective function. It must accept five parameters:
# the model, X_train, y_train, X_valid, y_valid.
# Fit the model inside it and return the objective value.
def objective_function_topass(model, X_train, y_train, X_valid, y_valid):
    model.fit(X_train, y_train)
    P = log_loss(y_valid, model.predict_proba(X_valid))
    return P
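Any scalar metric can serve as the objective. A hedged sketch (an assumption for illustration, not from the original notebook) of an alternative objective that scores ROC AUC instead of log loss; since a higher AUC is better, the optimizer would be constructed with `minimize=False` when using it:

```python
from sklearn.metrics import roc_auc_score

# Alternative objective: returns ROC AUC on the validation set.
# Same five-parameter signature as objective_function_topass above.
def auc_objective(model, X_train, y_train, X_valid, y_valid):
    model.fit(X_train, y_train)
    # predict_proba returns one column per class; take the positive class
    return roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1])
```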
# import an algorithm
from zoofs import HarrisHawkOptimization

# create an instance of the algorithm
algo_object = HarrisHawkOptimization(objective_function_topass, n_iteration=20,
                                     population_size=20, minimize=True)

import lightgbm as lgb
lgb_model = lgb.LGBMClassifier()

# run the feature-selection search
# (note: the training set is also passed as the validation set here)
algo_object.fit(lgb_model, X_train, y_train, X_train, y_train, verbose=True)
              Best value of metric across iteration   Best value of metric across population
Iteration 0   0.000834507195418593                    0.000834507195418593
Iteration 1   0.0006783272710830701                   0.0006783272710830701
Iteration 2   0.0006783272710830701                   0.0006783272710830701
Iteration 3   0.0007444757214734708                   0.0006783272710830701
Iteration 4   0.000698228318389392                    0.0006783272710830701
Iteration 5   0.0007875445320993873                   0.0006783272710830701
Iteration 6   0.0006751479047728708                   0.0006751479047728708
Iteration 7   0.0006751479047728708                   0.0006751479047728708
Iteration 8   0.0006933743957114641                   0.0006751479047728708
Iteration 9   0.0006933743957114641                   0.0006751479047728708
Iteration 10  0.000685477824521282                    0.0006751479047728708
Iteration 11  0.0006563435418546247                   0.0006563435418546247
Iteration 12  0.0006933743957114641                   0.0006563435418546247
Iteration 13  0.0006933743957114641                   0.0006563435418546247
Iteration 14  0.0006965898526582949                   0.0006563435418546247
Iteration 15  0.0006261969431135148                   0.0006261969431135148
Iteration 16  0.0006261969431135148                   0.0006261969431135148
Iteration 17  0.0006261969431135148                   0.0006261969431135148
Iteration 18  0.0006261969431135148                   0.0006261969431135148
Iteration 19  0.0006308051888594318                   0.0006261969431135148
Out[29]:
['mean radius', 'mean texture', 'mean perimeter', 'mean area', 'mean smoothness', 'mean concavity', 'mean concave points', 'mean symmetry', 'mean fractal dimension', 'radius error', 'area error', 'smoothness error', 'symmetry error', 'worst radius', 'worst perimeter', 'worst area', 'worst smoothness', 'worst compactness', 'worst concave points', 'worst symmetry']
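The returned value is the list of selected feature names. A minimal sketch of what you would typically do next: subset the training data to those columns and retrain. The short `selected` list below is a stand-in for the full list returned above, used only for illustration:

```python
import pandas as pd
from sklearn.datasets import load_breast_cancer

data = load_breast_cancer()
X_train = pd.DataFrame(data['data'], columns=data['feature_names'])

# stand-in for the list returned by algo_object.fit(...)
selected = ['mean radius', 'mean texture', 'worst area']

# keep only the selected columns for the final model
X_selected = X_train[selected]
```

Retraining `lgb_model` on `X_selected` then yields a model that only depends on the chosen features.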
Plot the objective history
In [30]:
algo_object.plot_history()