Introduction to Google Firebase Cloud Storage using Python

TK Kaushik Jegannathan 16 Jul, 2022 • 6 min read

This article was published as a part of the Data Science Blogathon.

Introduction

Firebase is a very popular Backend as a Service (BaaS) offered by Google. It aims to replace conventional backend servers for web and mobile applications by offering multiple services on the same platform, like authentication, a real-time database, Firestore (a NoSQL database), cloud functions, machine learning, cloud storage, and many more. These services are cloud-based and production-ready, and they can automatically scale on demand without any need for configuration.

In my previous article, I covered Google Firestore, a cloud-based NoSQL database offered by Firebase. You can read my previous article on Google Firestore here. Another such offering is Cloud Storage, a powerful yet simple storage service from Firebase. The Cloud Storage offering in Firebase is backed by Google Cloud Storage, the same service available on the Google Cloud Platform (GCP). The free tier provides 5 GB of storage space for a bucket. In this article, we will learn about Cloud Storage and how it can be used to store and access files securely over the internet using Python.

Setting up Firebase to access Cloud Storage

It is mandatory to create a Firebase project to access Cloud Storage. Let’s go ahead and create a new Firebase project. Open Firebase in your browser, log in using your Google account, and click on “Create a new project,” which will ask for a project name. You can give your Firebase project any name; I named mine “cloud-storage-basics-python”. Click on Storage, which you can find in the left navigation bar as shown in the picture.


Once you click the get started button, you will be prompted to choose either production mode or testing mode. Choose test mode and proceed.


You will be prompted to choose a location for the cloud storage bucket. Choose the closest region to your current location to reduce latency and click done.


Connecting Python to Cloud Storage

To connect to Firebase Cloud Storage, we need to install a Python package called “firebase-admin.” This can be installed like any other Python package using pip. Note that older releases of this module raised an exception on Python 3.7 and above because async became a reserved keyword in Python 3.7; if you run into this, you can use Anaconda to create a new environment with Python 3.6. Run the following commands to create and activate a new environment in Anaconda.

conda create -n cloud_storage_env python=3.6.5
conda activate cloud_storage_env

To install the “firebase-admin” package, run the following.

pip install firebase-admin

To use any service offered by Firebase, we first need to perform authentication using our credentials. To get the credentials, go to project settings and click on “Service accounts.”


In the “Service Accounts” tab, you can find a code snippet for connecting to Google Firebase. Select python as the language and copy the code snippet. After copying the code snippet, click on “manage service account permissions.”


Now click on “Manage keys,” then click the “Add key” button, select “Create new key,” choose JSON as the file format, and click Create.


Now that we have the credentials, let’s connect to Firebase and start accessing the Cloud Storage service. To do so, paste the code snippet shown below and add the file path of the credentials file that was downloaded in the previous step. You can find your storage bucket link in your Firebase Cloud Storage console.

import firebase_admin
from firebase_admin import credentials, storage
cred = credentials.Certificate("path/to/your/credentials.json")
firebase_admin.initialize_app(cred,{'storageBucket': 'your_bucket_link_without_gs://'}) # connecting to firebase

Now that we have connected to Firebase, let’s try out the Cloud Storage service.

Using Google Cloud Storage

Now consider that you maintain a folder structure on your server and wish to replicate the same folder structure in your storage bucket as well. For this, we can directly use the “upload_from_filename()” function, which is a method of the blob object. Because the blob is named after the file’s relative path, the folder structure of each uploaded file is replicated in the bucket. This means that if you have a text file inside a folder named “text_files”, the same folder structure will also appear in your storage bucket. Now, let’s see how to use this function to upload files to our storage bucket.
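Since the blob name is taken directly from the local file path, Windows-style backslashes would otherwise end up in the object name. As a small illustration (the helper name to_blob_name is my own, not part of firebase-admin), a local path can be normalized before being used as a blob name:

```python
from pathlib import PurePath

def to_blob_name(local_path: str) -> str:
    """Normalize a local file path into a Cloud Storage blob name.

    Blob names always use forward slashes, regardless of the local OS,
    so backslashes from Windows paths are converted first.
    (Hypothetical helper, not part of firebase-admin.)
    """
    return PurePath(local_path.replace("\\", "/")).as_posix()
```

Calling bucket.blob(to_blob_name(file_path)) would then behave the same way on any operating system.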

First, I will upload an image file present in the root directory to our storage bucket. Once that is done, I will upload a text file present inside a folder named “text_docs” using the function described above.

file_path = "sample_image_file.jpg"
bucket = storage.bucket() # storage bucket
blob = bucket.blob(file_path)
blob.upload_from_filename(file_path)

We can see that the image file has been uploaded to our storage bucket in the root directory. Now let’s try to upload the text file present inside the “text_docs” directory.

file_path = "text_docs/sample_text_file.txt"
bucket = storage.bucket() # storage bucket
blob = bucket.blob(file_path)
blob.upload_from_filename(file_path)

We can see that the text file has been uploaded inside the text_docs folder, just like it is on our local machine.

Now consider that you do not maintain a folder structure on your server but wish to maintain a proper folder structure in your storage bucket. For this, we can also use the “upload_from_filename()” function with a slight modification. Let’s try to upload the image file into a folder named “images”. On our local machine, the image file sits in the root directory and there is no folder named images. We will also rename the image file while storing it in the storage bucket.

from google.cloud import storage
from google.oauth2 import service_account

def upload_blob(bucket_name, source_file_name, destination_blob_name):
    # this uses the google-cloud-storage client directly; note that this
    # "storage" import shadows the firebase_admin storage module imported earlier
    credentials = service_account.Credentials.from_service_account_file("path/to/your/credentials.json")
    storage_client = storage.Client(credentials=credentials)
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file_name)
    print(f"File {source_file_name} uploaded to {destination_blob_name}.")

upload_blob(firebase_admin.storage.bucket().name, 'sample_image_file.jpg', 'images/beautiful_picture.jpg')
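A “folder” in Cloud Storage is just a prefix in the object name, so composing the destination name is plain string work. Below is a minimal sketch of that idea; both helper names are hypothetical, not part of any Firebase API, and guess_content_type merely illustrates a value that could be passed to upload_from_filename’s optional content_type parameter:

```python
import mimetypes
from pathlib import PurePosixPath

def dest_blob_name(folder: str, source_file: str, rename: str = None) -> str:
    """Compose a destination blob name such as 'images/beautiful_picture.jpg'.

    The 'folder' exists only as a prefix in the object name; Cloud Storage
    has no real directories. (Hypothetical helper.)
    """
    name = rename if rename else PurePosixPath(source_file).name
    return f"{folder.strip('/')}/{name}"

def guess_content_type(filename: str) -> str:
    """Guess a MIME type for the uploaded file. (Hypothetical helper.)"""
    ctype, _ = mimetypes.guess_type(filename)
    return ctype or "application/octet-stream"
```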

Now let’s see if the image from our root directory has been uploaded inside a folder named “images” in our storage bucket. We can see that a new folder called “images” has been created, and the image file has also been uploaded inside that folder.


Now, if you want to access files from your bucket and download them, you can do that easily with a few lines of code. Let’s download the text file we uploaded to our storage bucket inside the text_docs folder and rename it to “downloaded_file.txt”. The code snippet shown below will download the file to our local machine.

credentials = service_account.Credentials.from_service_account_file("path/to/your/credentials.json")
storage_client = storage.Client(credentials=credentials)
bucket = storage_client.bucket(firebase_admin.storage.bucket().name)
bucket.blob('text_docs/sample_text_file.txt').download_to_filename('downloaded_file.txt')
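One caveat: download_to_filename writes to the path you give it but does not create missing parent directories, so a nested destination such as “downloads/docs/file.txt” needs them created first. A small stdlib-only sketch (the helper name is my own):

```python
import os

def prepare_local_path(path: str) -> str:
    """Create any missing parent directories so download_to_filename can write.

    Returns the path unchanged so it can be used inline.
    (Hypothetical helper, not part of firebase-admin.)
    """
    parent = os.path.dirname(path)
    if parent:
        os.makedirs(parent, exist_ok=True)
    return path
```

It could then be used as blob.download_to_filename(prepare_local_path('downloads/docs/downloaded_file.txt')).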

Now, if you want to share the files over the internet or make them public, you can directly access the “public_url” property of the blob object, which returns a URL for that file. Note that the URL will only resolve if the file is publicly readable. Let’s try to get the URL of all the files present in our storage bucket. To do so, we first fetch all the files in the bucket and then access their public URLs.

credentials = service_account.Credentials.from_service_account_file("path/to/your/credentials.json")
files = storage.Client(credentials=credentials).list_blobs(firebase_admin.storage.bucket().name) # fetch all the blobs in the bucket
for f in files:
    print('The public url is', f.public_url)
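The URL returned by public_url follows a predictable pattern, https://storage.googleapis.com/&lt;bucket&gt;/&lt;object&gt;. As a sketch, it can be built by hand with the standard library (the helper name is hypothetical, and the URL only resolves for publicly readable objects):

```python
from urllib.parse import quote

def public_object_url(bucket_name: str, blob_name: str) -> str:
    """Build the public URL pattern that blob.public_url returns.

    Slashes in the object name are kept so folder prefixes stay readable.
    (Hypothetical helper; the URL works only for public objects.)
    """
    return f"https://storage.googleapis.com/{bucket_name}/{quote(blob_name)}"
```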

Conclusion

Key takeaways from this article are:

  • Understanding how to set up a Firebase project in detail
  • Uploading and downloading files to and from the cloud-based storage bucket using python
  • Extracting a public URL for the files from our storage bucket for sharing across the internet

As mentioned earlier, Google Firebase offers many production-ready services for free that are hosted on Google Cloud. Firebase has been a lifesaver for many front-end developers who do not want to work directly with backend frameworks like Node.js or Flask to build a full-stack web or mobile application. If you are interested in learning about other services offered by Google Firebase, you can refer to my article on Firestore, a NoSQL database offered by Google. I will try to cover other services Google Firebase offers in the coming weeks, so stay tuned!

The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.
