Roadmap To Clear Azure DP-100: Designing and Implementing a Data Science Solution on Azure

Chrisx10 | Last Updated: 10 Jul, 2021
6 min read

This article was published as a part of the Data Science Blogathon

Motivation To Take Up DP-100

Data science, machine learning, MLOps, data engineering: all these frontiers of data are moving ahead at a rapid pace. The future of data science is being shaped by large firms such as Microsoft, Amazon, Databricks, and Google, which are driving innovation in the field. Given such fast-paced change, it makes sense to get certified with one of these big players and learn its product offering. Moreover, these platforms provide end-to-end solutions, from scalable data lakes to scalable clusters for both test and production, making life easier for data professionals. From a business perspective, all the infrastructure sits under one roof, on the cloud and on demand, and more and more businesses are inclined, or rather forced, to move to the cloud due to the ongoing pandemic.

How does DP-100 (Designing and Implementing a Data Science Solution on Azure) help a data scientist or anyone working with data?

In short, businesses gather data from various sources: mobile apps, POS systems, in-house tools, machines, and so on, and all of it is housed under different departments or databases; this is especially true for large legacy firms. One of the major hurdles for data scientists is getting the relevant data under a single roof to build models on and use in production. With Azure, all this data moves into a data lake; data manipulation can be done using SQL pools or Spark pools; and data cleaning, preprocessing, model building on test clusters (low cost), model monitoring, model fairness, data drift detection, and deployment on production clusters (highly scalable, higher cost) all happen in one place. The data scientist can focus on solving problems and let Azure do the heavy lifting.

Another use case is model tracking using MLflow (an open-source project from Databricks). Anybody who has participated in a DS hackathon knows that model tracking, logging metrics, and comparing models is a tedious task if you haven't set up a pipeline. In Azure, all of this is made easy using experiments: every model, metric, and artifact can be logged with a single line of code, as the sketch below shows.
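To make that concrete, here is a minimal sketch of single-line metric logging with MLflow against an Azure ML workspace. It assumes the azureml-mlflow package is installed and a config.json for your workspace is available locally; the experiment name and metric value are illustrative.

## Track a metric with MLflow on Azure ML (illustrative sketch)
import mlflow
from azureml.core import Workspace

ws = Workspace.from_config()  # reads workspace details from config.json
mlflow.set_tracking_uri(ws.get_mlflow_tracking_uri())  # point MLflow at Azure ML
mlflow.set_experiment('dp100-demo')  # runs are grouped under this experiment

with mlflow.start_run():
    mlflow.log_metric('accuracy', 0.91)  # the single line that logs a metric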

About Azure DP-100

Azure DP-100 (Designing and Implementing a Data Science Solution on Azure) is the go-to data science certification from Microsoft for all data enthusiasts. It's a self-paced learning experience, with freedom and flexibility. After completing it, one can work on Azure hassle-free and build models, track experiments, build pipelines, and tune hyperparameters the Azure way.

Requirements

  1. Basic knowledge of Python; having worked with it for at least 3-6 months makes it easier to prepare for the exam.
  2. Basic knowledge of machine learning. This helps you make sense of the code and answer the ML questions during the exam.
  3. Familiarity with Jupyter Notebook or JupyterLab. This is not mandatory, but since all the labs run on Jupyter notebooks, it makes them easier to work through.
  4. Knowledge of Databricks and MLflow can be leveraged to score better in the test; starting July 2021, these concepts are included in DP-100.
  5. Rs. 4,500 exam fee.
  6. Register for a free Azure account; you receive Rs. 13,000 in credits with which Azure ML can be explored. This is more than sufficient, but Azure ML is free only for the first 30 days, so make good use of the subscription.
  7. Most importantly, set your exam date 30 days from today and pay for it; this serves as a good motivating factor.

 

Useful links:

  • Azure Data Scientist Associate certification page
  • DP-100 exam page
  • Azure webpage

Is it worth it?

The exam costs about Rs. 4,500, and not many firms expect a certification during recruitment; it's good to have, but few recruiters demand it or are even aware of it. So the question arises: is it worth paying for? Is it worth my weekends? The answer is yes. Even a machine learning grandmaster or Python expert still has to learn the inner workings of Azure, because many methods are Azure-specific and exist to drive performance improvements; one cannot just dump Python code on the platform and expect optimal performance. Many processes are automated on Azure: for example, the AutoML module builds models with effectively one line of code, and hyperparameter tuning takes one line as well. No-code ML is another drag-and-drop tool that makes building models child's play. Containers, storage, key vaults, workspaces, and experiments are all Azure-specific tools and classes. Creating compute instances and working with pipelines and MLflow helps you understand MLOps concepts as well. It's a definite plus if you are working on Azure and want to explore the nitty-gritty of it. Overall, the rewards exceed the effort.
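As an illustration of the "one line" claim, here is a hedged sketch of an AutoML run; the dataset, label column, experiment, and cluster names are assumptions made for the example, not anything from the exam material.

## AutoML in (roughly) one line (illustrative sketch)
from azureml.core import Experiment, Workspace
from azureml.train.automl import AutoMLConfig

ws = Workspace.from_config()
train_ds = ws.datasets['my-training-data']  # a registered TabularDataset (assumed name)

automl_config = AutoMLConfig(task='classification',
                             training_data=train_ds,
                             label_column_name='label',  # assumed label column
                             primary_metric='AUC_weighted',
                             iterations=10,
                             compute_target=ws.compute_targets['aml-cluster'])

run = Experiment(ws, 'automl-demo').submit(automl_config)  # kicks off the whole search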

Preparation

  1. The exam is MCQ-based with about 60-80 questions, and the time provided is 180 minutes. This is more than sufficient to complete and review all the questions.
  2. Two lab-style or case-study questions are asked; these are must-answer questions and cannot be skipped.
  3. It is a proctored test, so make sure you prepare accordingly.
  4. Microsoft changes the pattern about twice a year, so it's best to review the updated exam pattern.
  5. It's easier if the exam preparation is divided into two parts: theory and labs.
  6. The theory is quite detailed and needs at least 1-2 weeks of preparation and review. All theory questions can be studied from the Microsoft docs; a detailed study of these docs is sufficient.
  7. The section that carries the highest number of questions is "Build and operate machine learning solutions with Azure Machine Learning".
  8. Labs are important as well. Even though practical lab questions won't be asked, they help you understand Azure-specific classes and methods, which constitute the majority of the questions.
  9. Pure machine learning questions will not be asked; for example, "what is the R2 score?" won't come up. What can be asked is how to log the R2 score for an experiment (see the sketch after this list), so ML as applied on Azure should be the focus.
  10. Microsoft also provides a paid, instructor-led course for DP-100. I don't see a need to take it up, as everything is covered in the MS docs.
  11. There are about 14 practice labs; run through each at least once to get the hang of the Azure workspace.
  12. Review the theory before appearing for the exam, so as not to get confused during it.
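For point 9, here is a minimal sketch of what logging R2 inside a training script looks like; the toy data and model are illustrative, and Run.get_context() also works offline, so you can try it locally.

## Log R2 from a training script (illustrative sketch)
from azureml.core import Run
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

run = Run.get_context()  # handle to the current experiment run

X, y = make_regression(n_samples=100, noise=10, random_state=0)  # toy data
model = LinearRegression().fit(X, y)
run.log('R2', r2_score(y, model.predict(X)))  # the one line the exam cares about
run.complete()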

 

Skills Measured:

  • Set up an Azure Machine Learning workspace
  • Run experiments and train models
  • Optimize and manage models
  • Deploy and consume models

Clone the repo to practice the Azure labs:

git clone https://github.com/microsoftdocs/ml-basics

A Few Important Azure Methods/Classes:

## Get a reference to an existing workspace
from azureml.core import Workspace
from azureml.core.model import Model

ws = Workspace.get(name='aml-workspace',
                   subscription_id='1234567-abcde-890-fgh...',
                   resource_group='aml-resources')

## Register a model
model = Model.register(workspace=ws,
                       model_name='classification_model',
                       model_path='model.pkl',  # local path
                       description='A classification model',
                       tags={'data-format': 'CSV'},
                       model_framework=Model.Framework.SCIKITLEARN,
                       model_framework_version='0.20.3')




## Run a .py file in a pipeline
from azureml.pipeline.steps import PythonScriptStep

step2 = PythonScriptStep(name='train model',
                         source_directory='scripts',
                         script_name='train_model.py',
                         compute_target='aml-cluster')
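A step on its own does nothing until it is wired into a pipeline and submitted; here is a hedged sketch of that, reusing step2 from above (the experiment name is an assumption).

## Assemble and submit the pipeline (illustrative sketch)
from azureml.core import Experiment, Workspace
from azureml.pipeline.core import Pipeline

ws = Workspace.from_config()
pipeline = Pipeline(workspace=ws, steps=[step2])  # steps run in dependency order
run = Experiment(ws, 'train-pipeline').submit(pipeline)
run.wait_for_completion(show_output=True)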




# Define the parallel run step configuration
from azureml.pipeline.steps import ParallelRunConfig, ParallelRunStep

parallel_run_config = ParallelRunConfig(
    source_directory='batch_scripts',
    entry_script='batch_scoring_script.py',
    mini_batch_size='5',
    error_threshold=10,
    output_action='append_row',
    environment=batch_env,       # an azureml Environment defined elsewhere
    compute_target=aml_cluster,  # a ComputeTarget object defined elsewhere
    node_count=4)

# Create the parallel run step
parallelrun_step = ParallelRunStep(
    name='batch-score',
    parallel_run_config=parallel_run_config,
    inputs=[batch_data_set.as_named_input('batch_data')],
    output=output_dir,
    arguments=[],
    allow_reuse=True
)
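The entry script named above must implement the init()/run() contract that ParallelRunStep expects. Here is a minimal sketch of such a script; the model name matches the registration example above, while the CSV-scoring logic is an illustrative assumption.

## batch_scoring_script.py (illustrative sketch)
import joblib
import pandas as pd
from azureml.core.model import Model

def init():
    # called once per worker: load the registered model into a global
    global model
    model = joblib.load(Model.get_model_path('classification_model'))

def run(mini_batch):
    # called once per mini-batch; for a FileDataset input, mini_batch
    # is a list of file paths, and one result per item must be returned
    results = []
    for file_path in mini_batch:
        data = pd.read_csv(file_path)
        results.append(f'{file_path}: {model.predict(data).tolist()}')
    return results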

A Few Important Concepts (not an exhaustive list):

  1. Creating compute clusters for test and production
  2. Creating pipeline steps
  3. Connecting a Databricks cluster to an Azure ML workspace
  4. Hyperparameter tuning methods (a HyperDrive sketch follows this list)
  5. Working with data: datasets and datastores
  6. Data drift
  7. Differential privacy
  8. Detecting model unfairness (MCQ questions)
  9. Model explanations using SHAP explainers
  10. Methods to remember:
    1. ScriptRunConfig
    2. PipelineData
    3. ParallelRunConfig
    4. PipelineEndpoint
    5. RunConfiguration
    6. init() / run()
    7. PublishedPipeline
    8. ComputeTarget.attach
    9. Dataset/Datastore methods
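For the hyperparameter tuning item above, here is a hedged HyperDrive sketch; the script, parameter name, and value ranges are illustrative assumptions, and the training script must itself log the primary metric (as in the R2 example earlier).

## Hyperparameter tuning with HyperDrive (illustrative sketch)
from azureml.core import Experiment, ScriptRunConfig, Workspace
from azureml.train.hyperdrive import (GridParameterSampling, HyperDriveConfig,
                                      PrimaryMetricGoal, choice)

ws = Workspace.from_config()
script_config = ScriptRunConfig(source_directory='scripts',
                                script='train_model.py',
                                compute_target='aml-cluster')

param_sampling = GridParameterSampling({'--reg_rate': choice(0.01, 0.1, 1.0)})

hyperdrive_config = HyperDriveConfig(run_config=script_config,
                                     hyperparameter_sampling=param_sampling,
                                     primary_metric_name='R2',  # must match a metric the script logs
                                     primary_metric_goal=PrimaryMetricGoal.MAXIMIZE,
                                     max_total_runs=6,
                                     max_concurrent_runs=2)

run = Experiment(ws, 'tune-demo').submit(hyperdrive_config)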

Azure DP-100 exam prep session

Azure Machine Learning workspace:

[Image: the Azure ML Studio workspace home page]

Azure Databricks, creating a cluster:

[Image: the Azure Databricks create-cluster page]

Azure Designer:

[Image: the Azure ML Designer canvas]

Exam Day

  1. Make sure to test your system a day in advance. Work laptops sometimes cause problems, so it's better to use a personal laptop.
  2. No books, papers, pens, or other stationery are allowed.
  3. The proctor does initial basic checks and lets you start the exam.
  4. Once the exam is submitted, the score is shown on screen and sent later in an email, so don't forget to check your mail.
  5. The certification is valid for 2 years only.

Good luck! Your next target should be DP-203 (Data Engineering on Microsoft Azure).

Here is my LinkedIn profile in case you want to connect with me; I'd be happy to connect. My Azure DS badge.
The media shown in this article are not owned by Analytics Vidhya and are used at the Author’s discretion.

Data scientist. Extensively using data mining, data processing algorithms, visualization, statistics, and predictive modeling to solve challenging business problems and generate insights. My responsibilities as a data scientist include, but are not limited to, developing analytical models, data cleaning, exploration, feature engineering, feature selection, modeling, building prototypes, documenting algorithms, and delivering insights for projects such as pricing analytics for a craft retailer, promotion analytics for a Fortune 500 wholesale club, and inventory management/demand forecasting for a jewelry retailer, while collaborating with on-site teams to deliver highly accurate results on time.
