Computer Vision News - May 2021

Use Stratified K-fold Cross Validation

Another technique to deal with imbalance in the dataset, which also focuses on the input data by carefully alternating the training and validation subsets, is stratified k-fold cross validation. This is a slightly different version of traditional k-fold cross validation: it ensures that the folds are made by preserving the percentage of samples of each class found in the full dataset.

from sklearn.model_selection import StratifiedKFold
from tensorflow.keras.callbacks import (ModelCheckpoint, EarlyStopping,
                                        ReduceLROnPlateau)

# Assign inputs and targets
inputs = X
targets = y

# K-fold Cross Validation parameters
fold_no = 1
num_folds = 5
no_epochs = 500
no_classes = 3

# StratifiedKFold preserves the class proportions in every fold
skf = StratifiedKFold(n_splits=num_folds, random_state=7, shuffle=True)

save_dir = "./archive/model/logs/"

for train, test in skf.split(inputs, targets):
    print(inputs[train].shape, '/', inputs[test].shape)

    # CREATE NEW MODEL
    model = create_model(no_epochs, no_classes)

    # CREATE CALLBACKS
    checkpoint = ModelCheckpoint(save_dir + 'model_' + str(fold_no) + '.h5',
                                 monitor='val_accuracy', verbose=1,
                                 save_best_only=True, mode='max')
    earlyStopping = EarlyStopping(monitor='val_loss', patience=100,
                                  verbose=0, mode='min')
    reduce_lr_loss = ReduceLROnPlateau(monitor='val_loss', factor=0.8,
                                       patience=200, verbose=1,
                                       min_delta=1e-3, mode='min')
    callbacks_list = [checkpoint, reduce_lr_loss, earlyStopping]

    # FIT THE MODEL
    history_model = model.fit(inputs[train], targets[train],
                              epochs=no_epochs,
                              callbacks=callbacks_list,
                              validation_data=(inputs[test], targets[test]))

    fold_no += 1
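To see concretely what stratification buys you, here is a minimal sketch, assuming a toy imbalanced label array (90% class 0, 10% class 1) rather than the article's actual dataset. Plain KFold lets the minority-class count drift from fold to fold, while StratifiedKFold keeps it fixed:

import numpy as np
from sklearn.model_selection import KFold, StratifiedKFold

# Toy imbalanced data: 90 samples of class 0, 10 samples of class 1
X = np.arange(100).reshape(-1, 1)
y = np.array([0] * 90 + [1] * 10)

splitters = [
    ("KFold", KFold(n_splits=5, shuffle=True, random_state=7)),
    ("StratifiedKFold", StratifiedKFold(n_splits=5, shuffle=True, random_state=7)),
]

for name, splitter in splitters:
    print(name)
    for train, test in splitter.split(X, y):
        # np.bincount shows how many samples of each class land in the fold
        print('  test fold class counts:', np.bincount(y[test], minlength=2))

With plain KFold the minority class may end up over- or under-represented in some validation folds, which makes the per-fold validation scores noisy; StratifiedKFold yields exactly 18 and 2 samples per fold here, so every fold evaluates the model on the same class ratio as the full dataset.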
