Week 4 : Coding Period

Coding Period Update

This week my progress was slower than in previous weeks. At the start of the week I tried retraining my models, this time after reducing the dataset to roughly half its original size and balancing the positive and negative samples. Unfortunately, these efforts did not pay off: even after tuning the hyperparameters, the models performed poorly, with lower accuracies than on the imbalanced dataset. Rather than sink more time into this, I moved on to my next task: pruning and int8 quantization.
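For readers curious what the balancing step looks like, here is a minimal sketch of downsampling the majority class in a binary dataset. The names (`balance_binary`, `samples`, `labels`) are illustrative, not my actual training code, which works on image data:

```python
import random

def balance_binary(samples, labels, seed=0):
    """Downsample the majority class so positive (1) and negative (0)
    samples end up in equal numbers, then shuffle the result."""
    rng = random.Random(seed)
    pos = [s for s, y in zip(samples, labels) if y == 1]
    neg = [s for s, y in zip(samples, labels) if y == 0]
    n = min(len(pos), len(neg))
    balanced = [(s, 1) for s in rng.sample(pos, n)] + \
               [(s, 0) for s in rng.sample(neg, n)]
    rng.shuffle(balanced)
    return balanced
```

Note that downsampling throws data away, which is one plausible reason accuracy dropped compared to training on the full imbalanced set.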

I migrated my notebook from Kaggle to JupyterHub hosted on the PLHI server. Pruning worked well there: most of the issues I had faced on Kaggle and Colab were resolved, and I experimented with several parameter settings while pruning both DenseNet and Inception. I then attempted int8 quantization, including quantization of the pruned models, but ran into a series of errors because the TFLite converters could not access the required binaries. My plan is to shift this work back to Colab or Kaggle (whichever produces outputs first) and then quantize the pruned models as well as run full int8 quantization.
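The actual pruning runs use the TensorFlow toolchain on DenseNet and Inception, but the core idea, magnitude pruning, is simple enough to sketch in NumPy: zero out the smallest-magnitude weights until a target fraction of the tensor is sparse. This is an illustration of the technique, not the code I ran on the server:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries of `weights` so that
    roughly a `sparsity` fraction of the tensor becomes zero."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask
```

In practice the framework applies such a mask gradually over training steps rather than in one shot, so the network can recover accuracy as weights are removed.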

Along with this, I am now preparing for my next task. This time, instead of getting stuck in the loop of training models from scratch, I plan to pick up SOTA models and quantize them. This should be quicker, more efficient, and better aligned with the purpose of this project. I hope to pick up the pace next week to make up for this week's setbacks.
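Since int8 quantization comes up repeatedly above, here is the arithmetic behind it, sketched in NumPy under simple assumptions (per-tensor, asymmetric scheme; TFLite handles this internally and with more care, e.g. per-channel scales):

```python
import numpy as np

def quantize_int8(x):
    """Affine int8 quantization: approximate x as scale * (q - zero_point),
    with q stored as int8. Assumes x has a nonzero value range."""
    qmin, qmax = -128, 127
    scale = (x.max() - x.min()) / (qmax - qmin)
    zero_point = int(round(qmin - x.min() / scale))
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.int8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Map int8 values back to (approximate) floats."""
    return scale * (q.astype(np.float32) - zero_point)
```

The round trip loses at most about half a quantization step per value, which is why int8 models can stay close to float accuracy while shrinking storage roughly fourfold.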

Till then, keep coding and persisting!
See you next week.

