Posts

Week 4 : Coding Period

Coding Period Update This week my progress was slower than in previous weeks. At the beginning of the week I tried to retrain my models, this time on a dataset reduced to almost half its original size, with the positive and negative samples balanced. These efforts went to waste: even after altering the hyperparameters, the models performed worse, giving lower accuracies than on the imbalanced dataset. Having lost that time, I quickly moved on to my next tasks of pruning and int8 quantization. I migrated my notebook from Kaggle to a Jupyter Hub hosted on the PLHI server, where most of the issues I had faced on Kaggle and Colab disappeared, and I successfully pruned my models there (a sketch of the setup follows below). I experimented with quite a few parameters and ran pruning on DenseNet as well as Inception. I then tried int8 quantization, including quantization of the pruned models. Unfortunately, I ran into a lot of errors in this...
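For anyone curious about the setup, here is a minimal sketch of a pruning-plus-int8 pipeline of the kind described above, using the TensorFlow Model Optimization toolkit and the TFLite converter. The file paths, sparsity schedule, and dataset directory are hypothetical placeholders, not my exact configuration.

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Hypothetical: the trained classifier and a tf.data training pipeline.
base_model = tf.keras.models.load_model("densenet201_pneumonia.h5")
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=(224, 224), batch_size=32, label_mode="binary")

# Magnitude pruning: ramp sparsity from 0% to 50% over 1000 steps.
schedule = tfmot.sparsity.keras.PolynomialDecay(
    initial_sparsity=0.0, final_sparsity=0.5, begin_step=0, end_step=1000)
pruned = tfmot.sparsity.keras.prune_low_magnitude(
    base_model, pruning_schedule=schedule)
pruned.compile(optimizer="adam", loss="binary_crossentropy",
               metrics=["accuracy"])
# The UpdatePruningStep callback is required for the sparsity schedule to advance.
pruned.fit(train_ds, epochs=2,
           callbacks=[tfmot.sparsity.keras.UpdatePruningStep()])

# Strip the pruning wrappers before conversion.
final_model = tfmot.sparsity.keras.strip_pruning(pruned)

# Full-integer (int8) quantization needs a representative dataset so the
# converter can calibrate activation ranges.
def representative_data_gen():
    for images, _ in train_ds.take(100):
        yield [tf.cast(images, tf.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(final_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

with open("densenet201_pruned_int8.tflite", "wb") as f:
    f.write(converter.convert())
```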

Week 3 : Coding Period

What did I do this week? I fine-tuned my DenseNet201 and InceptionV3 models until I could maximize their accuracy. I used class weighting for this purpose and achieved around 85% accuracy with both models. For both models, I calculated all the performance metrics: accuracy, loss, confusion matrix, precision, recall, F1 score, and ROC curve. I will now document these metrics to present them to my mentor. I also implemented dynamic range and float16 quantization on DenseNet201 and InceptionV3 and evaluated their accuracy: float16 caused minimal change in accuracy, while dynamic range dropped it considerably (see the sketch below). Plans for the upcoming week I tried implementing int8 quantization and model pruning but ran into a heap of errors, so this week I will rectify those errors and get these modules running. Earlier, when I experimented with pruning, I ran into a lot of OOM errors, so my mentor has graciously provided me with access to a Jupyter Hub hosted on the...
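For context, a minimal sketch of the two post-training quantization modes mentioned above, assuming the fine-tuned model has been saved as a Keras model; the file names and the example class weights are hypothetical.

```python
import tensorflow as tf

# Hypothetical: the model saved after class-weighted fine-tuning,
# e.g. model.fit(train_ds, class_weight={0: 1.0, 1: 3.0}, ...).
model = tf.keras.models.load_model("inceptionv3_pneumonia.h5")

# Dynamic range quantization: weights become int8, activations stay float.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_dynamic = converter.convert()

# Float16 quantization: weights stored as float16; in my runs this gave
# the smaller accuracy change of the two.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
tflite_fp16 = converter.convert()

with open("model_dynamic.tflite", "wb") as f:
    f.write(tflite_dynamic)
with open("model_fp16.tflite", "wb") as f:
    f.write(tflite_fp16)
```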

Week 2 : Coding Period

This week I continued training models on the dataset. I was running into OOM errors because the complete dataset could not be loaded into memory all at once. I overcame this by using mini-batching, which loads a small set of data into memory as and when required, according to the training batch size, and so reduces the load on memory (a sketch follows below). I trained DenseNet201 and InceptionV3 on the dataset. These models did not give sufficient accuracy, which is why I will be experimenting with more hyperparameters before proceeding to the next task of quantization. The current accuracy is around 85%; I want to stretch it above 95%. This will be my focus this week. Happy coding! :)
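Here is a minimal sketch of the mini-batching idea, using tf.data to stream batches from disk instead of loading everything at once; the directory layout, image size, and model head are hypothetical placeholders.

```python
import tensorflow as tf

BATCH_SIZE = 32  # only this many images are resident in memory per training step

# Hypothetical directory of images exported from the DICOM files,
# organised into one sub-folder per class.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train",
    image_size=(224, 224),
    batch_size=BATCH_SIZE,
    label_mode="binary",
)
# prefetch() overlaps loading the next batch with training on the current one.
train_ds = train_ds.prefetch(tf.data.AUTOTUNE)

# Transfer-learning head on DenseNet201 (InceptionV3 would be analogous).
base = tf.keras.applications.DenseNet201(
    include_top=False, weights="imagenet", input_shape=(224, 224, 3))
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255),   # raw pixel values -> [0, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)  # batches are loaded as and when required
```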

Week 1 : Coding Period

This week I began with Exploratory Data Analysis. I downloaded the RSNA Pneumonia Detection dataset. Since it is a DICOM image segmentation dataset, I purged all the segmentation information and kept only the classification labels. I noticed a class imbalance in the images, as the number of positive pneumonia cases was extremely low compared to the negative cases; this can be addressed with undersampling and oversampling techniques (a sketch follows below). Being a huge dataset of over 26k patients, it occasionally made my system run out of memory during preprocessing. I shifted my code to Colab and encountered similar errors there. To tackle this, I will use mini-batches for training so that the training process becomes more memory-efficient. By the end of this weekend, I will complete training one model on this dataset. Next week, I will train the rest of the models.
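As an illustration, a sketch of the label clean-up and undersampling steps, assuming the RSNA CSV layout (one row per bounding box, with patientId and Target columns); the exact file name and the handling in my notebook may differ.

```python
import pandas as pd

# One row per bounding box; keep only the classification label per patient.
df = pd.read_csv("stage_2_train_labels.csv")
labels = df[["patientId", "Target"]].drop_duplicates("patientId")

# Undersampling: shrink the majority (negative) class down to the size
# of the minority (positive) class.
pos = labels[labels["Target"] == 1]
neg = labels[labels["Target"] == 0].sample(n=len(pos), random_state=42)
balanced = pd.concat([pos, neg]).sample(frac=1, random_state=42)  # shuffle

print(balanced["Target"].value_counts())  # both classes now equal in size
```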

Week 4 : Last week of Community Bonding

This week I had a video meeting with my mentor, Priyanshu Sinha. We discussed a lot of things and I was able to clear most of my queries.
1. I explained that the datasets suggested earlier were not open source and that I would therefore be switching to other open-source options.
2. We finalized all the model compression methods that I will be trying. I will focus mainly on post-training methods, but I am also free to experiment with compression-aided methods.
3. We restructured the system architecture and decided on the functionalities of the Flask application. It will run on a Raspberry Pi emulated using Qemu, and Qemu will run inside a Docker container. I have successfully completed about three-fourths of this; integrating the Flask application is challenging and will be done later (a hypothetical sketch of such an endpoint follows after this list).
4. Based on the previous point, I modified my Dockerfile and created my first Merge Request.
5. I mapped out a workflow for my project. First I will begin with model training and then proceed ...
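As a rough idea of point 3, here is a hypothetical sketch of what a minimal Flask inference endpoint serving a compressed TFLite model could look like; the route, payload format, and model path are my own placeholders, since the actual integration is still pending.

```python
from flask import Flask, jsonify, request
import numpy as np
import tensorflow as tf

app = Flask(__name__)

# Hypothetical: load a compressed TFLite model once at startup.
interpreter = tf.lite.Interpreter(model_path="model_fp16.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

@app.route("/predict", methods=["POST"])
def predict():
    # Hypothetical contract: a flattened image array in the JSON body.
    data = np.array(request.get_json()["image"], dtype=np.float32)
    data = data.reshape(input_details[0]["shape"])
    interpreter.set_tensor(input_details[0]["index"], data)
    interpreter.invoke()
    score = float(interpreter.get_tensor(output_details[0]["index"])[0][0])
    return jsonify({"pneumonia_probability": score})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)  # reachable from outside the container
```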

Week 3 : Community Bonding cont.

Update : This week, I did the following tasks.
- Prepared a Dockerfile to send a merge request. This needed to be integrated with the previous Dockerfile.
- Implemented sample quantization and pruning methods on the DenseNet201 model. Quantization turned out fine, but pruning made me run into Out-Of-Memory (OOM) errors multiple times.
- Prepared my system for the project by installing all the required packages. I had to uninstall my Ubuntu system and dual-boot my PC again with an increased partition size so that it could accommodate the project as well as the Docker images.
- Finalized the model performance methods and packages. I found that the psutil Python library could help me perform model evaluation (a sketch follows after this list).
- Read about Knowledge Distillation and its related papers. This concept took time to understand, and I found more interesting methods for model compression.
Following are my goals for next week. These are urgent tasks and need to be done asap.
- Finalize datas...
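As a sketch of how psutil could fit in, the hypothetical helper below times one inference call and reports resident memory before and after; the function name and the MiB units are my own placeholders.

```python
import time
import psutil

process = psutil.Process()  # handle to the current Python process

def profile_inference(predict_fn, batch):
    """Hypothetical helper: time one inference call and report memory use."""
    mem_before = process.memory_info().rss / 1024 ** 2  # resident memory, MiB
    start = time.perf_counter()
    predict_fn(batch)
    latency = time.perf_counter() - start
    mem_after = process.memory_info().rss / 1024 ** 2
    return {"latency_s": latency,
            "mem_before_mib": mem_before,
            "mem_delta_mib": mem_after - mem_before}

# Usage (hypothetical): profile_inference(model.predict, sample_batch)
```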

Week 2 : Community Bonding cont.

Update : This week, I received a brief overview of my instructions from my mentor and began working on them. I finished reading about quantization and pruning of models. Apart from my other reading material, I referred to the following videos for a brief overview:
- TFWorld session by Raziel Alvarez - https://youtu.be/3JWRVx1OKQQ
- Inside TensorFlow session by Suharsh Sivakumar - https://youtu.be/4iq-d2AmfRU
I also emulated a Raspberry Pi device using the Qemu emulator inside a Docker container. Being new to Docker and Qemu, I faced a few errors, but in the end I completed the task successfully. This week I will read a few papers that my mentors have mentioned; my work will involve building upon the work performed by the researchers in those papers. I also plan to set up a detailed plan for the summer with my mentor.