Week 4: Coding Period
Coding Period Update

This week my progress was slower than in previous weeks. At the start of the week I tried retraining my models, but this time I reduced the dataset to roughly half of its original size, balancing the positive and negative samples before retraining. Unfortunately, these efforts did not pay off: even after altering the hyperparameters, the models performed worse, with lower accuracies than on the imbalanced dataset.

Having lost that time, I quickly moved on to my next task: pruning and int8 quantization. I migrated my notebook from Kaggle to the JupyterHub hosted on the PLHI server, where most of the issues I had faced on Kaggle and Colab were resolved. I successfully pruned my models there, experimenting with quite a few parameters and running pruning on both DenseNet and Inception. Next I attempted int8 quantization, including quantization of the pruned models. Unfortunately, I ran into a lot of errors in this...
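For anyone curious what the balancing step looked like conceptually: the idea was to undersample the majority class so positives and negatives occur equally often. This is a minimal, stdlib-only sketch (the function name and list-based data layout are illustrative, not from my actual notebook):

```python
import random

def balance_binary(samples, labels, seed=0):
    """Undersample the majority class so positive (1) and negative (0)
    samples appear in equal numbers, then shuffle the result."""
    rng = random.Random(seed)
    pos = [s for s, y in zip(samples, labels) if y == 1]
    neg = [s for s, y in zip(samples, labels) if y == 0]
    n = min(len(pos), len(neg))           # size of the smaller class
    pairs = [(s, 1) for s in rng.sample(pos, n)] + \
            [(s, 0) for s in rng.sample(neg, n)]
    rng.shuffle(pairs)
    xs, ys = zip(*pairs)
    return list(xs), list(ys)
```

Note that undersampling like this is exactly why the dataset shrank to about half its size: the surplus majority-class samples are simply discarded.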
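The pruning experiments were magnitude-based, i.e. the weights smallest in absolute value get zeroed out. A toy sketch of that idea on a flat weight list (real toolkits like TensorFlow Model Optimization apply this per layer, with sparsity schedules; this helper is purely illustrative):

```python
def prune_magnitude(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of `weights`.

    E.g. sparsity=0.5 keeps only the larger half of the weights.
    """
    flat = sorted(abs(w) for w in weights)
    k = int(len(flat) * sparsity)         # number of weights to drop
    if k == 0:
        return list(weights)
    threshold = flat[k - 1]               # largest magnitude still pruned
    return [0.0 if abs(w) <= threshold else w for w in weights]
```

The sparsity level is one of the parameters I experimented with on DenseNet and Inception.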
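Int8 quantization, conceptually, maps float weights onto 8-bit integers plus a scale factor, so each weight is stored as `scale * q` with `q` in [-127, 127]. A minimal sketch of symmetric per-tensor quantization (the actual conversion in my notebook goes through the framework's converter, not hand-rolled code like this):

```python
def quantize_int8(weights):
    """Symmetric per-tensor int8 quantization: w ~= scale * q."""
    max_abs = max(abs(w) for w in weights) or 1.0
    scale = max_abs / 127.0               # map the largest weight to +/-127
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values."""
    return [scale * v for v in q]
```

The round trip is lossy, which is why quantizing an already-pruned model needs care: accuracy can drop further on top of the pruning loss.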