Seasons of Serverless

Challenge 2: Lovely Ladoos

It’s Diwali season in India! A very popular delicacy eaten in India during Diwali is the ladoo, a ball of flour dipped in sugar syrup. Can you create a machine learning model to detect a proper ladoo? Learn more about this challenge.

Solution

  1. The solution creates an image classifier using the Custom Vision API
  2. The solution also creates serverless endpoints (sketched below) which are used to:
  • List all the images stored in Azure Blob Storage
  • Upload a new image to Azure Blob Storage
  • Run a prediction on an already uploaded image using the Custom Vision API
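
As a rough sketch of how these endpoints hang together, the blobs function can branch on the HTTP method. This skeleton is hypothetical scaffolding, not the original repo's code; the actual list and upload logic is filled in later in this post:

// blobs/index.js: skeleton of the HTTP-triggered endpoint (structure assumed)
module.exports = async function (context, req) {
  if (req.method === "GET") {
    // List all blobs in the container (see "List all the blobs" below)
    context.res = { status: 200, body: { blobs: [] } };
  } else if (req.method === "POST") {
    // Upload the posted image (see "Uploading the image" below)
    context.res = { status: 201, body: { uploaded: true } };
  } else {
    context.res = { status: 405, body: "Method not allowed" };
  }
};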

Serverless Compute

  • Function as a Service: A modular way of writing business logic as separate applications that are triggered by events, e.g., Azure Functions.
  • Stateless Compute Containers: The service does not need to read or store its previous state between starts.
  • Event-triggered and Ephemeral: Functions run only in response to events and exist only while handling them. If not triggered for a period, the service is spun down, freeing its resources for other services.
  • Dynamically Allocated: Because idle services are spun down and their resources reused, they are redeployed behind the scenes on the next trigger; compute is thus allocated dynamically.
  • Pay as you go: Pay only for the compute that has actually been used.

Advantages of using Serverless

  • Lower Operational and Infrastructure Cost
  • Easily Scalable
  • Usage-Based Billing
  • Easy Deployment

Disadvantages

  • Complex Architecture
  • Limited Execution Time: long-running work can hit the platform's timeout
  • Execution Frequency: infrequently triggered functions suffer cold starts

Azure CLI

Azure CLI Commands

# Log in to your Azure account
az login
# List all the available regions
az account list-locations \
--query "[].{Region:name}" \
--out table
# To see all the available subscriptions
az account list
# To set the subscription
az account set --subscription <Subscription-ID or Subscription-Name>

Create Azure Resource Group

az group create --name <Your-Resource-Group-Name> --location eastus

Blob Storage

# Create the storage account that will hold the images
az storage account create \
--name <Blob Storage Name> \
--resource-group <Resource Group Name> \
--location eastus \
--sku Standard_ZRS \
--encryption-services blob

Create Container

# Grant the signed-in user the "Storage Blob Data Contributor" role on the storage account
az ad signed-in-user show --query objectId -o tsv | az role assignment create \
--role "Storage Blob Data Contributor" \
--assignee @- \
--scope "/subscriptions/<Subscription>/resourceGroups/<Resource Group Name>/providers/Microsoft.Storage/storageAccounts/<Blob Storage Name>"
# Create the container, authenticating with the Azure AD login above
az storage container create \
--account-name <Storage Account> \
--name <Container Name> \
--auth-mode login

Azure Functions

Creating Functions Project

# This will generate a project folder containing host.json and local.settings.json, among other files
func init Seasons-of-Serverless-Solution-Lovely-Ladoos --worker-runtime node
cd Seasons-of-Serverless-Solution-Lovely-Ladoos
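
For reference, the generated host.json looks roughly like this; the exact content depends on your Core Tools version, so treat it as a sketch:

{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[1.*, 2.0.0)"
  }
}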

Creating Functions Template

# Each command scaffolds a pre-built template for an Http Trigger,
# with two files: function.json and index.js.
# We are going to write our logic in index.js.
# To create the blob endpoint
func new --template "Http Trigger" --name blobs
# To create the prediction endpoint
func new --template "Http Trigger" --name predict
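
The scaffolded function.json for an Http Trigger looks roughly like the following (a sketch of the default template; your Core Tools version may generate slightly different contents):

{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get", "post"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}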

Initialize your Node.js project with npm

npm init
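
Then install the packages used by the code in the following sections:

npm install @azure/storage-blob @azure/ms-rest-js @azure/cognitiveservices-customvision-prediction streamifier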

List all the blobs in a Blob Storage Container

const { BlobServiceClient, StorageSharedKeyCredential, newPipeline } = require('@azure/storage-blob');

// Authenticate against the storage account with the shared key from the environment
const sharedKeyCredential = new StorageSharedKeyCredential(
  process.env.AZURE_STORAGE_ACCOUNT_NAME,
  process.env.AZURE_STORAGE_ACCOUNT_ACCESS_KEY
);
const pipeline = newPipeline(sharedKeyCredential);
const containerName = process.env.CONTAINER_NAME;
const blobServiceClient = new BlobServiceClient(
  `https://${process.env.AZURE_STORAGE_ACCOUNT_NAME}.blob.core.windows.net`,
  pipeline
);
const containerClient = blobServiceClient.getContainerClient(containerName);

// Inside the function's async handler: fetch a flat segment of blob listings
const listBlobsResponse = await containerClient.listBlobFlatSegment();
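
To return the listing from the endpoint, the segment can be mapped to plain blob names. A minimal sketch, assuming it runs inside the function handler with access to context:

// Collect the blob names from the segment and return them as JSON
const blobNames = listBlobsResponse.segment.blobItems.map((blob) => blob.name);
context.res = {
  status: 200,
  body: { blobs: blobNames }
};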

Uploading the image to the Blob Storage

const { BlobServiceClient } = require('@azure/storage-blob');
const streamifier = require("streamifier");

const containerName = process.env.CONTAINER_NAME;
// Here we authenticate with the storage connection string instead of the shared key
const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.AzureWebJobsStorage);
const containerClient = blobServiceClient.getContainerClient(containerName);

// fileName and fileData come from the parsed multipart request body
const blockBlobClient = containerClient.getBlockBlobClient(fileName);
// The second argument of uploadStream is the buffer size used for the upload
const result = await blockBlobClient.uploadStream(
  streamifier.createReadStream(Buffer.from(fileData)),
  fileData.length
);
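
The snippet above assumes fileName and fileData have already been extracted from the request. One common approach (my assumption here, not necessarily what the original repository does) is the parse-multipart package:

const multipart = require("parse-multipart");

// Inside the handler: split the multipart body into its parts
const boundary = multipart.getBoundary(req.headers["content-type"]);
const parts = multipart.Parse(Buffer.from(req.body), boundary);

// Take the first uploaded file from the request
const fileName = parts[0].filename;
const fileData = parts[0].data;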

Predict the image with image URL

// Import Dependencies
const msRest = require("@azure/ms-rest-js");
const { PredictionAPIClient } = require("@azure/cognitiveservices-customvision-prediction");

// Get the env variables
const projectId = process.env.PROJECT_ID;
const publishedName = process.env.PUBLISHED_NAME;
const predictionKey = process.env.PREDICTION_KEY;
const endPoint = process.env.PREDICTION_ENDPOINT;
const predictionResourceId = process.env.PREDICTION_RESOURCE_ID;

// Authenticate against the Custom Vision prediction endpoint
const predictorCredentials = new msRest.ApiKeyCredentials({ inHeader: { "Prediction-key": predictionKey } });
const predictor = new PredictionAPIClient(predictorCredentials, endPoint);

// Classify the image at the given URL against the published iteration
const results = await predictor.classifyImageUrl(projectId, publishedName, { url: imageUrl });
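
The returned results object carries a predictions array; a minimal sketch of turning it into the endpoint's response, again assuming access to context inside the handler:

// Each prediction pairs a tagName with its probability
const predictions = results.predictions.map((p) => ({
  tag: p.tagName,
  probability: `${(p.probability * 100).toFixed(2)}%`
}));
context.res = {
  status: 200,
  body: predictions
};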

Custom Vision API

  • Log in to the Azure Portal, search for Cognitive Services → Custom Vision (or search for Custom Vision directly), and click Create.
  • Select the subscription and the resource group you created above, and pick the pricing tier best suited to your purpose; here I go with the Free (F0) pricing tier.
  • Deployment takes a few minutes.
  • Navigate via ‘Go to prediction resource’, i.e., the customvision.ai portal.
  • Create a New Project
Name: <Any Name>
Description: <Any Description>
Resources: <The Resource just created>
Project Types: Classification
Classification Types: Multiclass (Single tag per image)
Domains: Food
  • On the Custom Vision portal, click ‘Add Images’, upload the images of ladoos, tag them as “ladoo”, and then upload all the files.

For convenience, I have attached the dataset: ramanaditya/Seasons-of-Serverless-Solution-Lovely-Ladoos/tree/main/datasets

  • Similarly, upload the images of doughnuts and sesame buns.
  • Once all the images are uploaded, click Train. You can select either training option; here I selected “Quick Training”.
  • Once the training is completed, you will see the performance statistics (precision, recall, and AP).
  • Click “Publish” to publish your prediction model so it can be used in the Azure Functions.
  • While publishing, save the “Published Name” to your .env file.
  • Navigate to Settings and store the Project ID, Prediction Endpoint, Prediction Key, and Prediction Resource ID in the .env file as well (see the sketch below).
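
For local development with Azure Functions Core Tools, these values are typically read from local.settings.json rather than a plain .env file. A sketch with placeholder values, matching the variable names used in the code above (the exact layout is my assumption):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<Storage-Connection-String>",
    "FUNCTIONS_WORKER_RUNTIME": "node",
    "AZURE_STORAGE_ACCOUNT_NAME": "<Blob-Storage-Name>",
    "AZURE_STORAGE_ACCOUNT_ACCESS_KEY": "<Storage-Access-Key>",
    "CONTAINER_NAME": "<Container-Name>",
    "PROJECT_ID": "<Project-ID>",
    "PUBLISHED_NAME": "<Published-Name>",
    "PREDICTION_KEY": "<Prediction-Key>",
    "PREDICTION_ENDPOINT": "<Prediction-Endpoint>",
    "PREDICTION_RESOURCE_ID": "<Prediction-Resource-ID>"
  }
}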

Run locally

# Start the Functions host locally (serves at http://localhost:7071 by default)
func start

Check the working of API Endpoints

GET: /api/blobs

Lists all the blobs in the container.

POST: /api/blobs

Uploads a new image to the container.

GET: /api/predict?imageurl=<imageUrl>

Runs a prediction on the image at the given URL.
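
You can exercise the endpoints with curl against the local Functions host. A sketch; the form-field name in the upload request is hypothetical, since the post does not show how the request body is parsed:

# List the uploaded images
curl http://localhost:7071/api/blobs

# Upload a new image (multipart form data; the field name "image" is hypothetical)
curl -X POST -F "image=@./ladoo.jpg" http://localhost:7071/api/blobs

# Predict an already uploaded image by URL
curl "http://localhost:7071/api/predict?imageurl=<imageUrl>"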

Chefs for the Challenge

  1. Jasmine Greenaway, Cloud Advocate (Microsoft)
  2. Rama Soumya Naraparaju, Microsoft Learn Student Ambassador
  3. Aditya Raman (me), Microsoft Learn Student Ambassador

About the Author

Aditya Raman

Back End Developer | Software Engineer | DevOps Engineer | Data Science | Microsoft Learn Student Ambassador | Mentor MLH | Full Stack Developer