Bilal Shaikh — Updated On June 26th, 2023


AutoML stands for Automated Machine Learning. In 2018, Google launched Cloud AutoML, which gained a lot of interest and is one of the most significant tools in the field of Machine Learning and Artificial Intelligence. In this article, you will learn about AutoML, a no-code solution for building machine learning models, with the help of Google Cloud AutoML.


AutoML is a part of Vertex AI on the Google Cloud Platform. Vertex AI is the end-to-end solution for building and creating machine learning pipelines on the cloud; we will discuss the details of Vertex AI in a future article. AutoML mainly depends on two techniques: transfer learning and neural architecture search. You just need to provide the data; after that, AutoML will build an optimal custom model for your use case.

In this article, we will discuss the benefits, usage and practical implementation of AutoML with Python code on the Google Cloud Platform.

Learning Objectives

  • Understand the benefits of AutoML
  • Learn how to use AutoML programmatically, with code
  • Use the client library to create an ML pipeline

This article was published as a part of the Data Science Blogathon.

Problem Statement

Building a machine learning model is a time-consuming process and requires a lot of expertise, such as proficiency in a programming language, good knowledge of mathematics and statistics, and an understanding of machine learning algorithms. In the past, only people with technical skills could work in data science and build models; for non-technical people, building a machine learning model was a daunting task. However, the path was not easy even for the technical people who built models: once a model is built, its maintenance, deployment, and autoscaling require additional effort, man-hours, and a slightly different set of skills. To overcome these challenges, the global search giant Google launched Cloud AutoML in 2018, though it was made publicly available later.

Benefits of AutoML

  • AutoML reduces manual intervention and requires little Machine Learning expertise.
  • AutoML allows technical and non-technical people to build Machine Learning models without writing any code.
  • It takes care of every step of building a model, such as data pre-processing, feature engineering, model building, hyperparameter tuning, model evaluation, and prediction on test data, so you don’t need to write any code to perform these tasks.
  • AutoML has an intuitive user interface and provides different APIs.
  • AutoML also provides client libraries for Python and other programming languages.

Supported Types of Data

AutoML supports both unstructured and structured data, categorized into four types:

  1. Image
  2. Tabular
  3. Video
  4. Text

With these four data types, you can perform certain activities supported by AutoML.


With an image dataset, you can perform the below tasks in AutoML:

  • Image Classification (Single-label)
  • Image Classification (Multi-label)
  • Object Detection
  • Image Segmentation


With a tabular dataset, you can perform the following tasks:

  • Regression
  • Classification
  • Time Series Forecasting


You can perform the below activities with a video dataset:

  • Object Detection
  • Video Action Recognition
  • Video Object Tracking


AutoML text data supports the below tasks:

  • Sentiment Analysis
  • Text Classification (Single-label)
  • Text Classification (Multi-label)
  • Entity Extraction


To use AutoML, one should have an account on the Google Cloud Platform. Account set-up is a very simple process: just go to the URL, click on join, and it will ask for your Gmail id and password; an account then gets created on GCP. Click on the search bar and search for Vertex AI; on the left side you will see all the components of Vertex AI. Click on Workbench.

Workbench provides a JupyterLab where you can create a notebook instance on the cloud using a virtual machine. Select the “USER-MANAGED NOTEBOOKS” instance, click on “NEW NOTEBOOK”, choose Python 3, and leave the default settings as they are. It will take two to three minutes, and a JupyterLab will be created for you. You can also create a TensorFlow or PyTorch instance, with or without a GPU. Click on “OPEN JUPYTERLAB”, then click on Python 3 (ipykernel) in the Notebook section. Your Jupyter notebook is ready; now you can write code just as in your local Python Jupyter notebook.


AutoML Client Library in Python

We will create a tabular classification model for the demo using the AutoML client library in Python.

First, you need to install the two packages.

!pip install --upgrade google-cloud-aiplatform

!pip install --upgrade google-cloud-storage

Once these two packages are installed successfully, restart the kernel. You can restart the kernel in two ways: from the user interface, select the “Kernel” tab from the top bar and click “Restart Kernel”; or do it programmatically.

#restart the kernel programmatically

import os

if not os.getenv("IS_TESTING"):
  import IPython
  app = IPython.Application.instance()
  app.kernel.do_shutdown(True)  #restarts the kernel

Set your project ID, bucket name, and region. If you don’t know your project ID, run the code below to retrieve it using the gcloud command.

import os


if not os.getenv("IS_TESTING"):
  proj_output = !gcloud config list --format 'value(core.project)' 2>/dev/null
  PROJECT_ID = proj_output[0]
  print("Project ID: ", PROJECT_ID)
#set project id, bucket name and region

PROJECT_ID = '@YOUR PROJECT ID' #from the above code you can get your project id
BUCKET_NAME = 'gs://PROJECT_ID' #you can set your own bucket name
REGION = 'us-west1' #change the region if different
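Typos in these configuration values tend to surface only later as opaque API errors, so a quick sanity check can save time. Below is a minimal sketch; the placeholder values and the `validate_config` helper are illustrative, not part of the Google Cloud SDK:

```python
# Quick sanity checks on the configuration values (placeholder values shown)
PROJECT_ID = "my-project-id"        # hypothetical project id
BUCKET_NAME = "gs://my-project-id"  # bucket URIs must start with gs://
REGION = "us-west1"                 # regions look like 'us-west1'

def validate_config(project_id: str, bucket_name: str, region: str) -> bool:
    """Return True if the basic shape of each value looks right."""
    if not project_id or project_id.startswith("@"):
        return False                  # placeholder was never replaced
    if not bucket_name.startswith("gs://"):
        return False                  # Cloud Storage URIs need the gs:// prefix
    if len(region.split("-")) < 2:    # region strings contain a hyphen
        return False
    return True

print(validate_config(PROJECT_ID, BUCKET_NAME, REGION))  # → True
```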

Why do we need a bucket name? In AutoML, you can upload the data in three ways:

  • BigQuery
  • Cloud Storage
  • Local Drive (from local machine)

In this example, we upload the dataset from Cloud Storage; for that, we need to create a bucket to which we will upload our CSV file.

Create a bucket in cloud storage and set the data path from google cloud storage.

#using gsutil command we can create a bucket in cloud storage
! gsutil mb -l $REGION $BUCKET_NAME

#checking if the bucket created
! gsutil ls -al $BUCKET_NAME

#dataset path in gcs

IMPORT_FILE = 'data.csv'
gcs_path = f"{BUCKET_NAME}/{IMPORT_FILE}"
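Before importing, the CSV itself has to exist in the bucket. The sketch below creates a small sample data.csv locally with the columns used later in the training code; the column names (City, Age, Salary, Adopted) match that example, but the rows are made up for illustration:

```python
# Build a small sample data.csv locally with the columns used in training.
# The rows are illustrative only; replace with your real data.
import csv

rows = [
    {"City": "Madrid", "Age": 52, "Salary": 70000, "Adopted": "yes"},
    {"City": "Berlin", "Age": 34, "Salary": 55000, "Adopted": "no"},
    {"City": "Paris",  "Age": 41, "Salary": 62000, "Adopted": "yes"},
]

with open("data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["City", "Age", "Salary", "Adopted"])
    writer.writeheader()   # AutoML expects a header row naming the columns
    writer.writerows(rows)
```

The file can then be copied into the bucket, e.g. with `!gsutil cp data.csv $BUCKET_NAME`.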

Now we need to create a dataset in AutoML; after that, we will train the model on it.

#import necessary libraries

import os
from google.cloud import aiplatform

#initializing the AI platform
aiplatform.init(project=PROJECT_ID, location=REGION)

#creating dataset in AutoML
ds = aiplatform.TabularDataset.create(
  display_name = 'data_tabular', #set your own name
  gcs_source = gcs_path)

#create a training job in AutoML to run the model
job = aiplatform.AutoMLTabularTrainingJob(
  display_name = '#set your own name',
  optimization_prediction_type = 'classification',
  column_transformations = [
      {'categorical' : {'column_name': 'City'}},  #just randomly given the name
      {'numeric' : {'column_name': 'Age'}},
      {'numeric' : {'column_name': 'Salary'}}])

#run the training job
#this will take time, depending on your dataset
model = job.run(
  dataset = ds,
  target_column = 'Adopted',  #the label column in your CSV
  training_fraction_split = 0.8,
  test_fraction_split = 0.2,
  model_display_name = '#give your own name',
  disable_early_stopping = False)
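Vertex AI splits the dataset into training, validation, and test sets, and explicit fractions should together account for the whole dataset. A tiny sanity helper can catch bad values before the job is submitted; this is a sketch, and the `check_splits` helper is not part of the SDK:

```python
# Check that dataset split fractions are valid before submitting a job.
def check_splits(train: float, test: float, validation: float = 0.0) -> bool:
    """Each fraction must lie in [0, 1] and together they must not exceed 1.0."""
    fractions = [train, validation, test]
    if any(f < 0 or f > 1 for f in fractions):
        return False
    # Sum may be exactly 1.0, or less than 1.0 if a remainder is left
    # for defaults; it must never exceed 1.0.
    return abs(sum(fractions) - 1.0) < 1e-9 or sum(fractions) < 1.0

print(check_splits(0.8, 0.2))  # → True (the fractions used above)
```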

Once training is done, we will deploy our model using an endpoint. An endpoint is one of the components of Vertex AI where you can deploy your model and make online predictions.

#deploying the model
endpoint = model.deploy(machine_type = 'n1-standard-4')

This will take a few minutes. While creating an endpoint instance, choose your machine type wisely, as it determines the cost: a smaller machine type costs less, whereas a larger machine type costs more. For more clarity on pricing, please check the Google Cloud pricing documentation.

#making a prediction

pred = endpoint.predict(instances = [
  {'City': 'Madrid',
  'Age': 52,
  'Salary': 70000}])
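The call above returns a Prediction object whose `.predictions` field holds one result per instance. For AutoML tabular classification, each result typically pairs class labels with scores; the response shape below is a simulated illustration for parsing, not a captured API response:

```python
# Simulated shape of an AutoML tabular classification response (illustrative).
simulated_predictions = [
    {"classes": ["no", "yes"], "scores": [0.23, 0.77]},
]

def best_label(prediction: dict) -> tuple:
    """Pair each class with its score and return the highest-scoring pair."""
    pairs = zip(prediction["classes"], prediction["scores"])
    return max(pairs, key=lambda p: p[1])

label, score = best_label(simulated_predictions[0])
print(label, score)  # → yes 0.77
```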



Conclusion

Google Cloud AutoML is a powerful tool that anyone can use to build Machine Learning models without writing code. AutoML has a very interactive user interface from which you can build and deploy a model without extensive knowledge of algorithms and coding. The key takeaways from this article are:

  • You can leverage AutoML services programmatically with the help of the AutoML client library.
  • You can build different types of models in AutoML, such as image classification, text entity extraction, time-series forecasting, object detection, and many more.
  • You don’t need much ML expertise to use AutoML, and it reduces manual intervention.
  • AutoML empowers developers and data scientists to leverage the power of AI in their applications quickly and efficiently.

Frequently Asked Questions

Q1. Will AutoML take the job of Data Scientist?

A. No, AutoML will not take the job of a Data Scientist. AutoML has a lot of potential and automates Machine Learning, but if we want to build a custom model with total control of the code, we need a Data Scientist’s expertise.

Q2. Are pre-built APIs and AutoML doing the same work?

A. Pre-built APIs use a pre-built ML model and AutoML uses a custom-built ML model.

Q3. Can non-technical people use AutoML?

A. Yes, Anyone can use AutoML and build the Machine Learning model on the Google Cloud.

Q4. Is Google Cloud too costly?

A. It depends on the use case and cloud services you are going to use.

Q5. What is Google Cloud Vertex AI? Is it like AutoML?

A. Vertex AI is Google Cloud’s ML suite, which provides an end-to-end solution for building, deploying, and creating Machine Learning and Artificial Intelligence pipelines on the cloud. AutoML is one of the components of Vertex AI.

The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.
