Week 2: Coding Period
This week I continued training models on the dataset. I ran into out-of-memory (OOM) errors because the complete dataset could not be loaded into memory all at once. I overcame this by using mini-batching: only a small chunk of data is loaded into memory at a time, as and when required by the training batch size. This keeps memory usage low throughout training.
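The idea can be sketched with a simple generator that yields one mini-batch at a time instead of materializing the whole dataset; this is a minimal illustration, not the project's actual data-loading code, and `batch_generator` is a hypothetical helper name:

```python
def batch_generator(samples, batch_size):
    """Yield successive mini-batches so the full dataset never
    sits in memory at once (hypothetical helper for illustration)."""
    for start in range(0, len(samples), batch_size):
        yield samples[start:start + batch_size]

# Example: 10 samples with batch_size=4 -> batches of size 4, 4, 2
batches = list(batch_generator(list(range(10)), 4))
```

In a real training loop, each yielded batch would be read from disk (or decoded from image files) only when needed, so peak memory stays proportional to the batch size rather than the dataset size.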
I trained DenseNet201 and InceptionV3 on the dataset. Neither model reached sufficient accuracy, so I will experiment with more hyper-parameters before proceeding to the next task, quantization. The current accuracy is around 85%; I want to push it above 95%. That will be my focus this week.
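One common way to organize that hyper-parameter experimentation is a simple grid over the values to try; the search space below is purely hypothetical (the actual learning rates and batch sizes tried in the project may differ):

```python
from itertools import product

# Hypothetical search space -- substitute the values actually under test.
learning_rates = [1e-3, 1e-4]
batch_sizes = [16, 32]

# Enumerate every combination; each config would drive one training run.
configs = [
    {"lr": lr, "batch_size": bs}
    for lr, bs in product(learning_rates, batch_sizes)
]
```

Each entry in `configs` can then be passed to a training function, and the run with the best validation accuracy kept before moving on to quantization.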
Happy coding! :)