TFLite image classification model maker using Google Colab and Google Drive

Roberto Hizon Jr
5 min read · Mar 7, 2021


You will need a Google account to use Google Colab and Google Drive. Google Colab is a free service from Google that lets you run Python scripts and machine learning libraries on Google's powerful hardware. Google Drive comes with 15 GB of free storage and is where you will store your images.zip (or any filename you prefer) as the input dataset; it is also where the output model files (model.tflite and labels.txt) will be written once the script completes successfully. Google Colab connects to Google Drive, and you will need to authenticate and grant permission for this.

Three ipynb scripts that you may choose from:

All of these scripts produce the same model.tflite and labels.txt files, and they all ask for the same permission to access, read, and write to your Google Drive. They are all based on the TensorFlow documentation. Choose one of the following scripts, then follow the procedure for creating the model files as described below:

  1. Retraining_an_Image_Classifier_for_TFLite.ipynb — This was based on Retraining an Image Classifier and TF Hub for TF2: Retraining an image classifier.
  2. TFLite_Model_Maker_Image_Classifier.ipynb — This was also based on Retraining an Image Classifier and TF Hub for TF2: Retraining an image classifier to obtain the MODULE_HANDLE, then uses the make_image_classifier CLI command to generate the TFLite output files.
  3. TFLite_ModelMaker.ipynb — This was based on Flower classification with TensorFlow Lite Model Maker with TensorFlow 2.0, TFLite Model Maker, Image classification with TensorFlow Lite Model Maker, TensorFlow Lite Model Maker, or Image classification with TensorFlow Lite Model Maker.

The following are the detailed steps on how your model is created on your Google Drive:

Prepare the Image dataset

  • 1) An image dataset is a folder containing many images, with each image type kept in its own subfolder. Each subfolder represents a class in the model to be created. I suggest collecting hundreds of images for each class.
Sample image dataset
  • 2) Compress the main folder of the images into a zip file. Name it whatever you prefer.

Later on, you will set variables in the ipynb script that point to the zipped file on your Google Drive.

Image dataset zipped from the main folder.
  • 3) Upload the zipped file to your designated path on your Google Drive.

If you do not have a set of images to test this script out, you can download my sample zipped file here as a starter.
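The expected dataset layout and the compression step above can be sketched as follows. The folder name flower_photos and the class names here are placeholders for your own:

```python
import os
import shutil
import tempfile

# Build a tiny example of the expected layout:
#   flower_photos/daisy/*.jpg, flower_photos/roses/*.jpg, ...
# Folder and class names are placeholders -- use your own, with
# hundreds of real images per class subfolder.
root = tempfile.mkdtemp()
data_dir = os.path.join(root, "flower_photos")
for class_name in ("daisy", "roses"):
    os.makedirs(os.path.join(data_dir, class_name))
    # Stand-in for the actual image files in a real dataset.
    open(os.path.join(data_dir, class_name, "example.jpg"), "wb").close()

# Compress the main folder; this creates flower_photos.zip alongside it.
zip_path = shutil.make_archive(data_dir, "zip", root_dir=root, base_dir="flower_photos")
print(zip_path)
```

You can also zip the folder with your operating system's built-in compression; the only requirement is that the class subfolders sit inside one main folder in the archive.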

Prepare your .ipynb script

  • 1) Choose one of the three ipynb scripts mentioned above and place it on your Google Drive.
  • 2) Double-click the script file so that it opens in Google Colab.

Run the ipynb script on Google Colab

This section is executed entirely on Google Colab. In these steps, you will configure the notebook settings and set variables and various dropdown and checkbox options before running the script. When you first run the script, you must grant it permission to read and write to your Google Drive.

This illustrates the steps on giving permission to access, read, and write to your Google Drive.

After successful authentication, the script runs the remaining cells, which unzip your compressed image file, save it to session storage, and define, train, and create the model. If everything goes well, your model is exported back to your Google Drive.
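A minimal sketch of the mount-and-unzip steps these notebooks perform, assuming the standard google.colab.drive API; the Drive folder and zip filename are placeholders, and outside Colab the sketch falls back to a local directory:

```python
import os
import zipfile

# Mounting is Colab-specific; fall back to a local path elsewhere.
try:
    from google.colab import drive
    drive.mount("/content/drive")          # prompts for Drive permission on first run
    base_path = "/content/drive/MyDrive"   # placeholder -- adjust to your Drive folder
except ImportError:
    base_path = "."                        # fallback when not running in Colab

imagezipped = "images.zip"                 # placeholder -- your compressed dataset
zip_file = os.path.join(base_path, imagezipped)

# Unzip into session storage so training reads from fast local disk.
if os.path.exists(zip_file):
    with zipfile.ZipFile(zip_file) as zf:
        zf.extractall("dataset")
```

The mount step is what triggers the permission prompt shown in the screenshots; nothing else in the notebook can run against your Drive until it succeeds.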

This shows that your zipped image dataset and the ipynb script should reside in the same folder on your Google Drive. It also shows the newly generated output path after the script has successfully completed.
  • 1) Change the hardware accelerator to GPU for better and faster performance. You can find this dialog box in the main menu under Edit>Notebook settings or Runtime>Change runtime type.
Change hardware accelerator to GPU.
  • 2) Set the variables data_dir, base_path, and imagezipped. data_dir is the main folder of the image dataset before it was compressed. base_path is the path on your Google Drive where the ipynb file and the compressed image file reside. imagezipped is the name of your compressed image file. There is also an option to set the epoch value, e.g. to 10. An epoch is one full pass of the training data through the network; more epochs mean more backpropagation iterations to correct errors, which can increase accuracy but lengthens the training process.
This shows how variables get their values
  • 3) Set the script options, if any. You can choose between InceptionV3 and MobileNetV2 as your pretrained TF2 SavedModel module. You can check Do Data Augmentation to artificially enlarge your image dataset. You can check Do Fine Tuning for better performance and accuracy during training. The remaining settings are optimization settings for model.tflite.
Choose between the InceptionV3 and MobileNetV2 pre-trained TF2 SavedModels for the feature extractor layer. Choose MobileNetV2 if you are going to use a mobile device for inference testing.
Data Augmentation option to add more images to the dataset
Choose fine tuning to increase the overall model accuracy during training. This trains the feature extractor from the pretrained model alongside the newly added image dataset.
Optimization settings for model.tflite.
  • 4) Run the script from Runtime>Run all. Alternatively, you can run each cell manually. Note that the first cell handles authentication with Google Drive and mounts it to the session storage; you must grant permission there before proceeding to the next cell.
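For reference, the training step in TFLite_ModelMaker.ipynb roughly follows the TFLite Model Maker API sketched below. The paths, split ratio, and epoch count are placeholder assumptions, and depending on the library version labels.txt may be embedded in the model's metadata rather than written as a separate file:

```python
# Sketch of the Model Maker pipeline; values below are placeholders.
data_dir = "flower_photos"   # unzipped dataset folder (one subfolder per class)
epochs = 10                  # more epochs = longer training, potentially higher accuracy

def train_and_export(data_dir, epochs, export_dir="output"):
    # Requires the tflite-model-maker package
    # (in a Colab cell: !pip install tflite-model-maker).
    from tflite_model_maker import image_classifier
    from tflite_model_maker.image_classifier import DataLoader

    data = DataLoader.from_folder(data_dir)   # class labels come from subfolder names
    train_data, test_data = data.split(0.9)   # 90% train, 10% test -- an assumed split
    model = image_classifier.create(train_data, epochs=epochs)
    model.evaluate(test_data)
    # Writes model.tflite (and, in some versions, labels.txt) into export_dir.
    model.export(export_dir=export_dir)
```

This is a sketch of what the notebook automates for you, not a drop-in replacement for running its cells.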

Your TFLite model files are now exported to your Google Drive

After you successfully run the ipynb script, your model files, model.tflite and labels.txt, will have been created and exported to a new ‘output’ folder on your Google Drive.
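The exported labels.txt is a plain text file with one class name per line (the model.tflite itself can be loaded at inference time with tf.lite.Interpreter). A quick sketch of reading the labels back, using made-up class names:

```python
import os
import tempfile

# Write an illustrative labels.txt like the one the notebooks export
# (one class name per line; your own classes will differ).
out_dir = tempfile.mkdtemp()
labels_path = os.path.join(out_dir, "labels.txt")
with open(labels_path, "w") as f:
    f.write("daisy\nroses\nsunflowers\n")

# Load the labels for inference, e.g. to map the interpreter's
# highest-scoring output index back to a class name.
with open(labels_path) as f:
    labels = [line.strip() for line in f if line.strip()]
print(labels)  # ['daisy', 'roses', 'sunflowers']
```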

Other related links:

Consider donating via PayPal if you appreciate the effort that went into this article.
