Rishabh Jaiswal — Published On October 26, 2022 and Last Modified On October 26th, 2022

This article was published as a part of the Data Science Blogathon.


Introduction

In this article, we will learn how to build an object tracker using OpenCV in Python, and then use it to make a counter system.

A tracker keeps track of moving objects in the frame. In OpenCV, we can build a tracker class using Euclidean distance tracking, also called centroid tracking.

Object tracking and the counter system apply only to video or camera feeds. Our tracker will assign a specific id to each moving object, which will not change throughout the feed.

Generally, object trackers fall into two types based on their capabilities:

  • Multiple Object Tracker
  • Single Object Tracker

In this article, we will design a Multiple Object Tracker and use it to build a Vehicle Counter System in Python.

Available Tracking Algorithms

There are various tracking algorithms available, differing in their underlying approach and complexity.

Some popular tracking algorithms are given below:

  • DEEP SORT: This is one of the most popular deep learning-based tracking algorithms. It works in combination with the YOLO object detector and, in the backend, uses a Kalman filter, among other techniques, for its tracking. Deep learning-based models often outperform classical machine learning models.
  • Centroid Tracker algorithm: Centroid tracking follows object coordinates, using a distance threshold to decide whether two points belong to the same object. No machine learning is involved. It is a multi-step process in which we calculate the Euclidean distance between the centroids of all detected objects in subsequent frames. We will write a custom Python class to implement centroid tracking.

Centroid Tracker Algorithm Process

Step 1. In the first step, moving objects are detected, and a bounding box is found for each of them. From the bounding box coordinates, the centroid is calculated as the point where the diagonals of the bounding box rectangle intersect.
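The centroid calculation in Step 1 can be sketched as follows; the box format (x, y, w, h) matches what OpenCV's cv2.boundingRect returns later in this article:

```python
def centroid(x, y, w, h):
    """Return the centre of a bounding box given as (x, y, width, height).

    The centroid is the point where the rectangle's diagonals intersect,
    i.e. the midpoint between opposite corners.
    """
    cx = x + w // 2
    cy = y + h // 2
    return cx, cy

# A 40x20 box whose top-left corner is at (100, 50):
print(centroid(100, 50, 40, 20))  # (120, 60)
```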


Centroids calculation

Step 2. After calculating the centroid points, assign each a unique id, then calculate the Euclidean distance between every possible pair of centroids across subsequent frames.


Euclidean Distance between centroids

Step 3. The main idea of centroid tracking is that, between subsequent frames, the same object moves a smaller distance than any other point. Using this idea, the centroid pair with the minimum Euclidean distance is assigned the same id as the matching centroid from the previous frame.


Centroid distance difference

Step 4. Based on these Euclidean distance differences, we assign the previous ids to the objects in the subsequent frame.
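Steps 2 to 4 can be sketched as a minimal matching routine. The 35-pixel distance threshold below is an assumption for illustration; in practice it depends on the frame rate and how fast the objects move:

```python
import math

def match_centroids(prev, curr, max_dist=35):
    """Assign each current centroid the id of the nearest previous centroid.

    prev: dict mapping object_id -> (cx, cy) from the previous frame.
    curr: list of (cx, cy) centroids detected in the current frame.
    Returns a dict mapping object_id -> (cx, cy) for matched points only;
    centroids farther than max_dist from every known point stay unmatched.
    """
    matched = {}
    for cx, cy in curr:
        best_id, best_dist = None, max_dist
        for obj_id, (px, py) in prev.items():
            d = math.hypot(cx - px, cy - py)  # Euclidean distance
            if d < best_dist:
                best_id, best_dist = obj_id, d
        if best_id is not None:
            matched[best_id] = (cx, cy)
    return matched

prev = {0: (120, 60), 1: (300, 200)}
curr = [(125, 63), (500, 500)]   # id 0 moved slightly; (500, 500) is far away
print(match_centroids(prev, curr))  # {0: (125, 63)}
```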


Points being Tracked

We are using the subtraction technique on subsequent frames to capture the moving points (F(t+1) − F(t)). However, we are free to use any object detection algorithm, depending on our objective.

Application of Object Tracking

The involvement of deep learning makes object tracking more robust every day. Modern object trackers are far better than earlier ones, and they are being deployed extensively in areas such as:

  • Tracking Unwanted Behaviours
  • Crowd Detection
  • Vehicle Tracking
  • In the Aviation Industry
  • Tracker in the Malls and Shopping Complexes
  • Security Systems

Euclidean Distance Tracker in Python

By combining all the processes we discussed, we made a EuclideanDistTracker class in Python. This tracker object returns the tracking id and coordinates of the moving objects.

This class contains all the math involved in object tracking.

All the source code used in this article can be downloaded through this link. I encourage you to download the source code files in order to avoid typing mistakes in your workbook.

After downloading the tracker file, save it as tracker.py so it can be imported in Python as a module. We will import the tracker file later.

Functions Available in our Tracker Class

  • update: It takes bounding box coordinates as input, in the form of an array.
  • The tracker returns a list containing the object id and coordinates in the form [x, y, w, h, object_id].
  • x, y, w, h are the bounding box coordinates of a detected object, and object_id is the unique id assigned to that object.
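The downloadable tracker.py is the reference implementation; the sketch below is a minimal reconstruction of the same idea, so the details may differ from the original file (in particular, the 25-pixel distance threshold here is an assumption):

```python
import math

class EuclideanDistTracker:
    def __init__(self):
        self.center_points = {}  # object_id -> last known centroid
        self.id_count = 0        # next id to hand out

    def update(self, objects_rect):
        """Take a list of [x, y, w, h] boxes, return [[x, y, w, h, id], ...]."""
        objects_bbs_ids = []
        for x, y, w, h in objects_rect:
            cx, cy = x + w // 2, y + h // 2
            # Re-use an existing id if a previously seen centroid is close enough
            same_object = False
            for obj_id, (px, py) in self.center_points.items():
                if math.hypot(cx - px, cy - py) < 25:  # assumed threshold
                    self.center_points[obj_id] = (cx, cy)
                    objects_bbs_ids.append([x, y, w, h, obj_id])
                    same_object = True
                    break
            if not same_object:
                # A new object enters the frame: give it a fresh id
                self.center_points[self.id_count] = (cx, cy)
                objects_bbs_ids.append([x, y, w, h, self.id_count])
                self.id_count += 1
        # Keep only the centroids of objects still present in this frame
        self.center_points = {oid: self.center_points[oid]
                              for *_, oid in objects_bbs_ids}
        return objects_bbs_ids
```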

Libraries and Video Feed

Before starting, we will import the tracker object and initiate a video feed; it could be a camera feed or a video file.

import cv2
import numpy as np
from tracker import EuclideanDistTracker

tracker = EuclideanDistTracker()
cap = cv2.VideoCapture('RoadRage.mp4')  # video file or camera index
ret, frame1 = cap.read()
ret, frame2 = cap.read()

We are using a sample highway video provided by OpenCV.

cap.read() returns a boolean success flag and the frame read from the feed.

Video Feed in OpenCV

In this part, we will implement object detection using subsequent frame subtraction in OpenCV and use the detected objects for tracking using the tracker class.

Object detection might catch some noise from unwanted moving objects like trees and leaves. We will filter out these unwanted moving objects by defining a minimum threshold area.

while cap.isOpened():
    # Difference between two subsequent frames highlights the moving objects
    diff = cv2.absdiff(frame1, frame2)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    # Binary threshold to suppress noise in the frame difference
    _, threshold = cv2.threshold(blur, 20, 255, cv2.THRESH_BINARY)
    dilated = cv2.dilate(threshold, np.ones((3, 3), np.uint8), iterations=1)
    contours, _ = cv2.findContours(dilated, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
    detections = []
    for contour in contours:
        (x, y, w, h) = cv2.boundingRect(contour)
        if cv2.contourArea(contour) < 300:  # filter out small noisy contours
            continue
        detections.append([x, y, w, h])
    boxes_ids = tracker.update(detections)
    for box_id in boxes_ids:
        x, y, w, h, id = box_id
        cv2.putText(frame1, str(id), (x, y - 15), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), 2)
        cv2.rectangle(frame1, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow('frame', frame1)
    frame1 = frame2
    ret, frame2 = cap.read()
    if not ret:
        break
    if cv2.waitKey(30) == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
  • cv2.absdiff finds the difference between two subsequent frames; based on that difference, we catch the moving objects.
  • To find the objects, we apply contouring on the difference of subsequent frames after converting them to greyscale.
  • We apply a binary threshold to counter the noise.
  • After contour detection, we filter out contours with an area smaller than 300 px. Any contour with an area of 300 px or more is considered a detected object.
  • boxes_ids contains the associated object ids and bounding box coordinates.
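The loop above draws ids but does not yet count vehicles. A simple way to turn the tracker output into a counter is to record each id the first time its centroid crosses a horizontal counting line; the line position (250) and tolerance (15) below are assumptions that would need tuning for the actual video:

```python
counted_ids = set()       # ids we have already counted
COUNT_LINE_Y = 250        # assumed y-position of the counting line, in pixels
LINE_TOLERANCE = 15       # how close a centroid must be to the line to count

def count_vehicles(boxes_ids):
    """Add unseen object ids whose centroid lies on the counting line;
    return the running total."""
    for x, y, w, h, obj_id in boxes_ids:
        cy = y + h // 2
        if abs(cy - COUNT_LINE_Y) < LINE_TOLERANCE and obj_id not in counted_ids:
            counted_ids.add(obj_id)
    return len(counted_ids)

# Inside the main loop, after tracker.update(), one could draw the total:
# total = count_vehicles(boxes_ids)
# cv2.putText(frame1, 'Count: ' + str(total), (10, 40),
#             cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 0), 2)
```

Because each id is counted at most once, a vehicle lingering near the line across several frames is not double-counted.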

Output Frame:

Conclusion

In this article, we discussed Object Detection algorithms and learned the process of Euclidean Tracker. Using all the knowledge, we made a Tracker Class and implemented a Tracker in OpenCV.

We made a vehicle counter system using the Euclidean Tracker and object detection using frame subtraction.

Centroid tracking is easy to implement but less accurate than modern tracking algorithms. Euclidean and centroid trackers can fail when objects travel at different speeds, so they do not suit every real-world scenario.

  • DEEPSORT is a much more robust and accurate deep learning-based tracking algorithm.
  • The centroid Tracking algorithm is a multi-step tracking algorithm based on geometrical calculation.
  • Pairs with the minimum distance difference are considered the same object.
  • Centroid Tracking algorithms might not be a good fit for real-world problems.

Thanks for Reading!!

The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.

About the Author

Rishabh Jaiswal
