Seasons of Serverless, Lovely Ladoos

Aditya Raman
8 min read · Nov 30, 2020


Seasons of Serverless

This article is part of #SeasonsOfServerless.

Each week we will publish a challenge created by Azure Advocates with amazing Student Ambassadors around the world. Discover popular festive recipes and learn how Microsoft Azure empowers you to do more with Serverless Functions! 🍽 😍.

Explore our serverless Resources here and learn how you can contribute solutions here.

Check out the original repository:
Serverless Solution for Lovely Ladoos:

Challenge 2: Lovely Ladoos

Featured Region: India

Your Chefs: Jasmine Greenaway, Cloud Advocate (Microsoft) with Soumya Narapaju and Aditya Raman, Microsoft Student Ambassadors

It’s Diwali season in India! A very popular delicacy that Indians eat during Diwali is the ladoo, a ball of flour dipped in sugar syrup. Can you create a machine learning model to detect a proper ladoo? Learn more about this challenge.



  1. The solution creates an image classifier using the Custom Vision API
  2. The solution also creates serverless endpoints which are used to
  • List all the images stored in Azure Blob Storage
  • Upload a new image to Azure Blob Storage
  • Predict an already uploaded image using the Custom Vision API

Serverless Compute

Serverless computing enables developers to build applications faster by eliminating the need for them to manage infrastructure. With serverless applications, the cloud service provider automatically provisions, scales and manages the infrastructure required to run the code.

Altogether, it can be summarized as:

  • Function as a Service: A modular way of writing business logic as separate functions that are triggered by events, e.g., Azure Functions.
  • Stateless Compute Containers: The service does not need to read or store its previous state between starts.
  • Event-triggered and Ephemeral: Functions run only when triggered by events and exist only for the duration of the invocation. If not triggered for a period, the service is shut down, allowing other services to use its resources.
  • Dynamically Allocated: Because services are shut down when not in use, their resources can be used by other services; they are redeployed behind the scenes on demand, which is why the allocation is dynamic.
  • Pay as you go: You pay only for the compute that has actually been used.
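
The pay-as-you-go model above can be sketched as a simple calculation: cost scales with the number of executions and with memory-time consumed (GB-seconds), not with idle capacity. The rates and names below are purely illustrative placeholders, not actual Azure prices.

```javascript
// Sketch of consumption-plan billing: you pay per execution and per
// GB-second of memory-time, never for idle servers.
// Rates are illustrative placeholders, not real Azure prices.
function estimateMonthlyCost({ executions, avgDurationSec, memoryGb }) {
    const pricePerMillionExecutions = 0.20; // illustrative
    const pricePerGbSecond = 0.000016;      // illustrative
    const gbSeconds = executions * avgDurationSec * memoryGb;
    return (executions / 1e6) * pricePerMillionExecutions + gbSeconds * pricePerGbSecond;
}

// 1M invocations of a 0.5 s, 128 MB function
const cost = estimateMonthlyCost({ executions: 1_000_000, avgDurationSec: 0.5, memoryGb: 0.125 });
console.log(cost.toFixed(2)); // a small single-digit dollar amount
```

Halve the traffic and the bill halves with it, which is the practical meaning of "Billing is based upon Usage" below.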

Advantages of using Serverless

  • Lower Operational Cost
  • Easily Scalable
  • Billing Based on Usage
  • Easy Deployment
  • Low Cost

Disadvantages of using Serverless

  • Complex Architecture
  • Limited Execution Time
  • Execution Frequency

Azure CLI

The Azure command-line interface (Azure CLI) is a set of commands used to create and manage Azure resources. It is available across Azure services and is designed to get you working quickly with Azure, with an emphasis on automation.

Azure CLI Commands

Log in to Azure: This command opens a browser tab that lets you log in to the Azure portal.

az login

To check all the available locations

az account list-locations \
--query "[].{Region:name}" \
--out table

Setting up the subscription
If you perform this step, you can skip the subscription argument when creating the resource group, storage account, and applications; the chosen subscription is set as the default on your local system.

# To see all the available subscriptions
az account list
# To set the subscription
az account set --subscription <Subscription-ID or Subscription-Name>

Create Azure Resource Group

An Azure resource group is a container that holds resources, much like a directory on your system. It helps you manage resources that belong to the same solution or are otherwise related.

az group create --name <Your-Resource-Group-Name> --location eastus

Blob Storage

Azure Blob storage is Microsoft’s object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data. Unstructured data is data that doesn’t adhere to a particular data model or definition, such as text or binary data.

az storage account create \
--name <Blob Storage Name> \
--resource-group <Resource Group Name> \
--location eastus \
--sku Standard_ZRS \
--encryption-services blob
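
The storage account created above exposes a connection string, which is how the Function App authenticates to it later (via BlobServiceClient.fromConnectionString). A connection string is just a set of key=value pairs; the helper and sample values below are fabricated for illustration.

```javascript
// Split an Azure Storage connection string into its key=value parts.
// Only the first '=' in each segment separates key from value, so
// base64 account keys (which end in '=') are handled correctly.
function parseConnectionString(connStr) {
    const parts = {};
    for (const segment of connStr.split(';')) {
        if (!segment) continue;
        const idx = segment.indexOf('=');
        parts[segment.slice(0, idx)] = segment.slice(idx + 1);
    }
    return parts;
}

// Fabricated sample, not a real credential
const sample =
    'DefaultEndpointsProtocol=https;AccountName=mystorageacct;AccountKey=abc123==;EndpointSuffix=core.windows.net';
const parsed = parseConnectionString(sample);
console.log(parsed.AccountName); // "mystorageacct"
```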

Create Container

A container organizes a set of blobs, similar to a directory in a file system. A storage account can include an unlimited number of containers, and a container can store an unlimited number of blobs.
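
The account/container/blob hierarchy maps directly onto blob URLs of the form https://&lt;account&gt;.blob.core.windows.net/&lt;container&gt;/&lt;blob&gt;, which is what our GET /api/blobs endpoint will return for each image. A small sketch (the names are hypothetical):

```javascript
// Build the public URL for a blob from its account, container and name.
// Blob names may contain spaces, so the name segment is URL-encoded.
function blobUrl(accountName, containerName, blobName) {
    return `https://${accountName}.blob.core.windows.net/${containerName}/${encodeURIComponent(blobName)}`;
}

console.log(blobUrl('mystorageacct', 'ladoos', 'ladoo 01.jpg'));
// https://mystorageacct.blob.core.windows.net/ladoos/ladoo%2001.jpg
```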

# Grant yourself the "Storage Blob Data Contributor" role on the account
az ad signed-in-user show --query objectId -o tsv | az role assignment create \
--role "Storage Blob Data Contributor" \
--assignee @- \
--scope "/subscriptions/<Subscription>/resourceGroups/<Resource Group Name>/providers/Microsoft.Storage/storageAccounts/<Blob Storage Name>"

# Create the container
az storage container create \
--account-name <Storage Account> \
--name <Container Name> \
--auth-mode login

Azure Functions

Azure Functions is a serverless compute service that lets you run event-triggered code without having to explicitly provision or manage infrastructure.

Azure Functions Core Tools lets you develop and test functions locally using a terminal or command prompt. It is required for this solution and can be downloaded from here.

Creating Functions Project

This command initializes the project and creates a project directory; the --worker-runtime flag denotes the language you want to use in the application.

# This will generate a folder containing two files host.json and local.settings.json
func init Seasons-of-Serverless-Solution-Lovely-Ladoos --worker-runtime node
cd Seasons-of-Serverless-Solution-Lovely-Ladoos

Note: The values in these files can be changed to match the connection strings you have (or will have) for your database, storage, etc.

Creating Functions Template

To create the JavaScript HTTP-triggered functions, use the built-in HTTP Trigger template:

# Here a pre-built template for the Http Trigger will be created
# with two files, function.json and index.js
# We are going to write our logics in the index.js
# To create blob endpoint
func new --template "Http Trigger" --name blobs
# To create prediction endpoint
func new --template "Http Trigger" --name predict

Initialize your Node.js project with npm:

npm init

Refer to the GitHub Repository for the complete code: ramanaditya/Seasons-of-Serverless-Solution-Lovely-Ladoos

List all the blobs in a Blob Storage Container

const { BlobServiceClient, StorageSharedKeyCredential, newPipeline } = require('@azure/storage-blob');

// The account key variable name below is assumed; match it to the
// name used in your local.settings.json / .env
const sharedKeyCredential = new StorageSharedKeyCredential(
    process.env.AZURE_STORAGE_ACCOUNT_NAME,
    process.env.AZURE_STORAGE_ACCOUNT_ACCESS_KEY
);
const pipeline = newPipeline(sharedKeyCredential);
const containerName = process.env.CONTAINER_NAME;
const blobServiceClient = new BlobServiceClient(
    `https://${process.env.AZURE_STORAGE_ACCOUNT_NAME}.blob.core.windows.net`,
    pipeline
);
const containerClient = blobServiceClient.getContainerClient(containerName);
const listBlobsResponse = await containerClient.listBlobFlatSegment();

Uploading the image to the Blob Storage

const { BlobServiceClient } = require('@azure/storage-blob');
const streamifier = require("streamifier");

const containerName = process.env.CONTAINER_NAME;
const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.AzureWebJobsStorage);
const containerClient = blobServiceClient.getContainerClient(containerName);
const blockBlobClient = containerClient.getBlockBlobClient(fileName);
const result = await blockBlobClient.uploadStream(
    streamifier.createReadStream(Buffer.from(fileData)),
    fileData.length
);

Predict the image with image URL

// Import Dependencies
const msRest = require("@azure/ms-rest-js");
const { PredictionAPIClient } = require("@azure/cognitiveservices-customvision-prediction");
// Get the env variables
const projectId = process.env.PROJECT_ID
const publishedName = process.env.PUBLISHED_NAME
const predictionKey = process.env.PREDICTION_KEY
const endPoint = process.env.PREDICTION_ENDPOINT
const predictionResourceId = process.env.PREDICTION_RESOURCE_ID
const predictor_credentials = new msRest.ApiKeyCredentials({ inHeader: { "Prediction-key": predictionKey } });
const predictor = new PredictionAPIClient(predictor_credentials, endPoint);
const results = await predictor.classifyImageUrl(projectId, publishedName, { url: imageUrl });
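
The classifyImageUrl call resolves to a result whose predictions property is an array of tag/probability pairs. The hypothetical helper below (not part of the repository code) shows how the endpoint could pick the most probable tag out of that array; the sample data is fabricated.

```javascript
// Pick the most probable prediction from Custom Vision results.
// Each entry has the shape { tagName, probability }.
function topPrediction(predictions) {
    return predictions.reduce((best, p) => (p.probability > best.probability ? p : best));
}

// Fabricated sample mirroring the API's result shape
const sampleResults = {
    predictions: [
        { tagName: 'ladoo', probability: 0.97 },
        { tagName: 'doughnut', probability: 0.02 },
        { tagName: 'sesame bun', probability: 0.01 }
    ]
};
console.log(topPrediction(sampleResults.predictions).tagName); // "ladoo"
```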

Custom Vision API

Custom Vision lets you build, deploy, and improve your own image classifiers. An image classifier is an AI service that applies labels (which represent classes) to images, based on their visual characteristics.

Note: This is not the only possible approach; you could use a serverless function or the Azure CLI for the same task. We simply went with the simpler method of using the Azure Portal.

  • Log in the Azure Portal, search for Cognitive Services → Custom Vision or search for Custom Vision, and click Create.
  • Select the subscription and the resource group you created above, then select the pricing tier best suited to your purpose; here I went with the Free (F0) pricing tier.
  • Deployment takes a few minutes.
  • Click ‘Go to prediction resource’, which opens the Custom Vision portal.
  • Create a New Project
Name: <Any Name>
Description: <Any Description>
Resources: <The Resource just created>
Project Types: Classification
Classification Types: Multiclass (Single tag per image)
Domains: Food
  • On the Custom Vision portal, click ‘Add Images’, upload the images of ladoos, and tag them as “ladoo”. Then upload all the files.

For convenience, I have attached the dataset: ramanaditya/Seasons-of-Serverless-Solution-Lovely-Ladoos/tree/main/datasets

  • Similarly, upload the images of doughnuts and sesame buns.
  • Once all the images are uploaded, click Train. You can select either training option; here I selected “Quick Training”.
  • Once the training is completed, you will see the performance statistics.
  • Click “Publish” to publish your prediction model so it can be used from the Azure Functions.
  • While publishing, save the “Published Name” to your .env file.
  • Navigate to Settings and store the Project ID, Prediction Endpoint, Prediction Key, and Prediction Resource ID in the .env file.

Run locally

func start

Check that the API endpoints work

GET: /api/blobs

List all the images stored with their image URL


POST: /api/blobs

To upload the image to the blob storage


GET: /api/predict?imageurl=<imageUrl>

To predict the image


Chefs for the Challenge

  1. Jasmine Greenaway, Cloud Advocate (Microsoft)
  2. Rama Soumya Naraparaju, Microsoft Student Ambassador
  3. Aditya Raman (me), Microsoft Student Ambassador

About the Author

Aditya Raman

I am Aditya Raman, a Software Engineer and Developer. I am also a Gold Microsoft Learn Student Ambassador and an open-source contributor who loves DevOps, backend engineering, and data science.

If you want to connect with me, invite me as a speaker to your conference or workshop, or have an opportunity for me, please follow the links below.


