Firebase NoSQL database available via Google Cloud Platform

Migrating a MySQL Database to GCP Firebase using Python and Pandas

I have a little VM on a well-known Developer Cloud which I love for quick prototyping for new and prospective clients.

It has all my Data Science and Machine Learning libraries installed, plus a Flask web server running on uWSGI behind an Nginx reverse proxy. For storage I have MongoDB, Redis, and MySQL installed and connected.

Therefore, it is easy for me to quickly scrape some data or train an ML model and put it in a somewhat presentable form onto this server to demonstrate what I have built.

This works great for small datasets, but as soon as any project gets bigger the Cloud question justifiably comes up.

There are as many viable options for your projects as there are clouds and data centers in the world, so I am not recommending anything in particular with regard to YOUR data or project.

However, if you find yourself having to migrate an SQL table to the NoSQL database Firestore, you will find that there aren't really any good out-of-the-box tools you can use.

Thankfully, Pandas is great for wrangling structured data and moving between different storage options.

To read in SQL I like using SQLAlchemy, a Python object-relational mapper.


A connection string like this:

'mysql+pymysql://user:password@database.server.com:25060/databasename'

is all you need to connect to a MySQL database on any server. Make sure your IP address is whitelisted if you are not on localhost.

Then, after installing pymysql and sqlalchemy via pip, you can read any relational database table into Pandas like this:

import pymysql
import pandas as pd
from sqlalchemy import create_engine

sqlEngine = create_engine('mysql+pymysql://user:password@database.server.com:port/databasename')

tableName = '<NAME_OF_YOUR_TABLE>'

try:
    df = pd.read_sql_table(tableName, con=sqlEngine)
except ValueError as vx:
    print("ValueError:", vx)
except Exception as ex:
    print("Another exception:", ex)
else:
    print("Table %s loaded successfully." % tableName)
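One thing worth checking before the upload: SQL NULLs come through Pandas as NaN. If you would rather store those as Firestore nulls than as floating-point NaN values, you can swap them for None first. A minimal sketch, assuming a DataFrame like the one read above (the column names here are made up for illustration):

```python
import pandas as pd
import numpy as np

# Stand-in for the table read from MySQL; NULLs arrive as NaN/None.
df = pd.DataFrame({"name": ["Ada", None], "score": [9.5, np.nan]})

# Cast to object first so the frame can hold None, then replace
# every missing value with None (stored as null in Firestore).
clean = df.astype(object).where(df.notna(), None)

records = clean.to_dict(orient="records")
```

Here records[1] comes out as {"name": None, "score": None}, which Firestore will happily store as nulls.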

Then, once you have the table loaded into Pandas, you can pip install GCP's firebase-admin module like this:

pip install firebase-admin

Go to the Firebase console and grab your service account JSON key.

Save them in your local ENV or directory of your choice.

Make sure to never push them to a public repo!

https://console.firebase.google.com/project/<YOUR_PROJECT_ID>/settings/serviceaccounts/adminsdk

With your keys ready to go, now all you need to do is:

import firebase_admin
from firebase_admin import credentials, firestore

cred = credentials.Certificate("<PATH_TO_YOUR_JSON_KEY_FILE>")

firebase_admin.initialize_app(cred, {
    "databaseURL": "https://<YOUR_PROJECT_ID>.firebaseio.com/"
})

db = firestore.client()

doc_ref = db.collection("<NAME_OF_COLLECTION>")

tmp = df.to_dict(orient="records")
list(map(lambda x: doc_ref.add(x), tmp))
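Note that mapping over doc_ref.add issues one network round trip per row. For larger tables, the Admin SDK's batched writes (capped at 500 operations per batch) cut that down considerably. A sketch, assuming db and df are set up as above; chunks and upload_in_batches are helper names I made up:

```python
def chunks(seq, size=500):
    # Firestore allows at most 500 writes per batch.
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

def upload_in_batches(db, records, collection_name):
    col = db.collection(collection_name)
    for group in chunks(records):
        batch = db.batch()
        for record in group:
            # .document() with no argument generates an auto-ID,
            # matching what doc_ref.add() does row by row.
            batch.set(col.document(), record)
        batch.commit()

# Usage (assumed names):
# upload_in_batches(db, df.to_dict(orient="records"), "<NAME_OF_COLLECTION>")
```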

Now you can go to the Firestore section of the Firebase console and see your data, ta-da!

I am new to writing on Medium. If you found this article helpful, please like and share. This will encourage me to write more Data Engineering Articles in the future. Thank you!