Become a Computer Vision Artist with Stanford’s Game Changing ‘Outpainting’ Algorithm (with GitHub link)

Pranav Dar 07 May, 2019 • 3 min read

Overview

  • Stanford researchers have designed ‘Outpainting’ – an algorithm that extrapolates and extends existing images
  • At the core of the algorithm are GANs; these were trained on a dataset of 36,500 images, with 100 held out for validation
  • You can try it out yourself using a Keras implementation that has been open sourced on GitHub


Introduction

Becoming an artist these days is easier than ever before (especially if you’re a programmer!) thanks to recent advances in computer vision algorithms. Your computer is your canvas and machine learning techniques are your paint brushes – there’s no reason to put off your artistic ambitions any longer. It’s time to get started!

If you are a keen follower of AVBytes, you must have read about a technique called “inpainting” (if you haven’t yet, it’s well worth a read). It is a popular computer vision technique that aims to restore missing parts of an image, and it has produced some exquisite results, as you will see in that article. Current state-of-the-art methods for inpainting involve GANs (Generative Adversarial Networks) and CNNs (Convolutional Neural Networks).

It was only a matter of time before someone from the ML community figured out a technique that goes beyond the scope of inpainting. This breakthrough has come from a couple of Stanford researchers, Mark Sabini and Gili Rusak, and the new technique is appropriately named “outpainting”.

This approach extends the GANs used for inpainting to predict what an existing image might look like beyond its visible borders. The algorithm then expands the canvas and paints in what it has predicted – and the results, as you can see in the image below, are truly astounding.
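
To make the setup concrete, here’s a minimal sketch (in NumPy) of how such a training pair might be constructed: the outer bands of a ground-truth image are blanked out, and the generator learns to repaint them while the discriminator judges whether the result looks real. The band width and mask geometry below are illustrative assumptions, not the paper’s exact configuration.

```python
import numpy as np

def make_outpainting_pair(image, band=32):
    """Blank out the left/right bands of `image` to form a training pair.

    The generator sees `masked` and must hallucinate the missing bands;
    the untouched original is the ground truth that the discriminator
    (and a reconstruction loss) compare against. The band width is an
    illustrative choice, not the paper's exact setting.
    """
    masked = image.copy()
    masked[:, :band, :] = 0      # blank the left band
    masked[:, -band:, :] = 0     # blank the right band
    return masked, image         # (generator input, ground truth)

# Example: a 128x128 RGB image yields a masked copy with 32-pixel
# blank strips on each side for the generator to fill in.
img = np.random.rand(128, 128, 3).astype("float32")
masked, target = make_outpainting_pair(img)
```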

For the dataset, the researchers used 36,500 images at 256×256 resolution, which were downsampled to 128×128; 100 of these were held out as a validation set.
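
If you’d like to reproduce that preprocessing step, a minimal sketch could look like the following; the directory name and file layout are assumptions for illustration.

```python
import glob

import numpy as np
from PIL import Image

# Hypothetical preprocessing: shrink each 256x256 source image to
# 128x128 and hold out 100 images for validation, mirroring the
# split described above. The directory name is an assumption.
paths = sorted(glob.glob("dataset/*.jpg"))

images = np.stack([
    np.asarray(Image.open(p).convert("RGB").resize((128, 128)),
               dtype=np.float32) / 255.0
    for p in paths
])

train, val = images[:-100], images[-100:]
print(train.shape, val.shape)  # e.g. (36400, 128, 128, 3) (100, 128, 128, 3)
```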

Even the research paper for outpainting has been written in a reader-friendly format. Instead of the usual page after page of theory, it is just 2 pages long – one describing how the technique was derived and how it works, and a second containing the references. Check out the image of the first page below, which lays out a step-by-step approach for designing and executing outpainting:

Wondering how to implement this on your own? Wonder no more – use this GitHub repository as your stepping stone. It is a Keras implementation of outpainting in Python, and it gives you the option to either build the model from scratch or use the pretrained model the creator has uploaded. Get started now!
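
If you take the pretrained route, usage could look roughly like the sketch below. The file name, input shape, and normalisation are assumptions – the repository’s README documents the actual model files and preprocessing.

```python
import numpy as np
from keras.models import load_model

# Hypothetical usage of a saved outpainting generator. The file name
# and the expected input shape are assumptions -- check the repo's
# README for the real artefacts and preprocessing.
generator = load_model("outpaint_generator.h5", compile=False)

image = np.random.rand(1, 128, 128, 3).astype("float32")  # stand-in input
extended = generator.predict(image)
print(extended.shape)
```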


Our take on this

What an awesome concept! If this doesn’t get your interest in computer vision going, I don’t know what will. Take this course to learn all about computer vision with deep learning, and get started on your path toward becoming a CV expert!

For the Keras model, there’s a caveat – as you’ll read in this Reddit discussion thread, there’s a chance the model was overfitted. The showcased results were generated from images that were part of the training set itself, which is why the model could extrapolate them so convincingly. It still did fairly well when tested on unseen data, just not as well as first imagined. But don’t let that dissuade you! The Stanford technique is still solid, and more refined frameworks built on outpainting will surely follow. Hope to see one from our Analytics Vidhya community as well!
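
One quick way to check for this kind of overfitting yourself is to compare reconstruction error on training images against the held-out set – a large gap is the red flag. Here’s a minimal sketch, reusing the hypothetical generator, train, val, and make_outpainting_pair names from the earlier snippets:

```python
import numpy as np

def mean_l1(generator, images, band=32):
    """Average L1 error between outpainted output and ground truth."""
    masked = np.stack([make_outpainting_pair(im, band)[0] for im in images])
    preds = generator.predict(masked)
    return float(np.mean(np.abs(preds - images)))

# An overfitted generator scores far better on images it memorised
# during training than on genuinely unseen ones; a large gap here is
# the tell-tale sign discussed in the Reddit thread.
print("train L1:", mean_l1(generator, train[:100]))
print("val   L1:", mean_l1(generator, val))
```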


Subscribe to AVBytes here to get regular data science, machine learning and AI updates in your inbox!


Pranav Dar

Senior Editor at Analytics Vidhya. Data visualization practitioner who loves reading and delving deeper into the data science and machine learning arts. Always looking for new ways to improve processes using ML and AI.

