Oracle to BigQuery Migration: The Challenges, The Fixes, and The Wins
Introduction: The Bold Decision to Move
It started with an ambitious goal — move our entire data warehouse from Oracle to BigQuery. Our leadership envisioned a modern, scalable, and cost-effective cloud solution, and BigQuery seemed like the perfect fit. What they did not anticipate, and neither did we, was the chaos that would follow.
From data type mismatches to SQL rewrites, and from unexpected performance issues to hidden costs, the journey was nothing short of a data engineering horror story. But in the end, we not only survived — we thrived. This is our story.

Phase One: Planning
Like any responsible team, we started with a detailed migration plan. We mapped out:
- The data volume, which was more than 100 terabytes of Oracle data
- The number of tables, which exceeded 5,000 across multiple schemas (see the inventory sketch after this list)
- The transformation requirements, since our PL/SQL logic would not run in BigQuery and had to be rewritten
- The estimated downtime, which we aimed to keep as close to zero as possible
- The dependencies between tables, ensuring referential integrity
- The data ingestion strategy, deciding between batch and streaming approaches
- The security and access controls, since Oracle’s user-based security model differed from Google Cloud’s IAM
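Numbers like these are easiest to pull straight from Oracle's data dictionary. Below is a minimal sketch of that kind of inventory, assuming the python-oracledb driver and read access to the DBA_SEGMENTS and DBA_TABLES views; the user, password, and DSN are placeholders.

```python
# Sketch: size up an Oracle warehouse before migration.
# Assumes python-oracledb and SELECT access to the DBA_* views;
# the connection details below are placeholders.
import oracledb

conn = oracledb.connect(user="migration_ro", password="...", dsn="oraprod/ORCL")

with conn.cursor() as cur:
    # Total table segment size per schema, in terabytes.
    cur.execute("""
        SELECT owner, ROUND(SUM(bytes) / POWER(1024, 4), 2) AS tb
        FROM dba_segments
        WHERE segment_type LIKE 'TABLE%'
        GROUP BY owner
        ORDER BY tb DESC
    """)
    for owner, tb in cur:
        print(f"{owner}: {tb} TB")

    # Table count per schema, to scope the DDL translation work.
    cur.execute("SELECT owner, COUNT(*) FROM dba_tables GROUP BY owner")
    for owner, n in cur:
        print(f"{owner}: {n} tables")
```

An inventory like this is also a cheap way to spot dead tables that can be dropped instead of migrated.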
It all looked good on paper. But then reality hit.
Phase Two: Data Type Mismatches
Oracle and BigQuery do not speak the same language. Our first major hurdle was dealing with incompatible data types; one way to encode the mapping is sketched after this list.
- Oracle’s NUMBER(38,10) had to be mapped to BigQuery’s NUMERIC, which supports only nine digits of scale, so the tenth decimal digit was lost.
- VARCHAR2(4000) became STRING, but BigQuery’s requirement that strings be valid UTF-8 caused unexpected issues with special characters.
- CLOB and BLOB had no direct equivalents, forcing us to store large text as JSON or break it across multiple columns.
- DATE and TIMESTAMP behaved slightly differently (Oracle’s DATE carries a time-of-day component, while BigQuery’s DATE does not), requiring conversion functions to keep values consistent.
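Rules like these usually end up in a small schema converter. Below is a minimal sketch of such a type-mapping helper, with assumptions worth flagging: the function name map_oracle_type is ours for illustration, and BIGNUMERIC is shown as the lossless fallback for NUMBER columns whose scale exceeds NUMERIC’s nine digits.

```python
# Sketch: translate Oracle column type declarations into BigQuery
# standard SQL types. Illustrative only; map_oracle_type is our own
# name, and the mapping mirrors the trade-offs described above.
import re

def map_oracle_type(ora_type: str) -> str:
    t = ora_type.strip().upper()

    # NUMBER(p,s): an integral NUMBER up to 18 digits fits INT64;
    # up to 9 digits of scale fits NUMERIC (precision 38, scale 9).
    m = re.match(r"NUMBER\s*\((\d+)\s*,\s*(\d+)\)", t)
    if m:
        precision, scale = int(m.group(1)), int(m.group(2))
        if scale == 0 and precision <= 18:
            return "INT64"
        if precision <= 38 and scale <= 9:
            return "NUMERIC"
        # NUMBER(38,10) exceeds NUMERIC's scale and lands here;
        # BIGNUMERIC avoids the precision loss described above.
        return "BIGNUMERIC"

    if re.match(r"N?VARCHAR2|N?CHAR|N?CLOB", t):
        return "STRING"    # BigQuery strings must be valid UTF-8
    if t in ("BLOB", "RAW", "LONG RAW"):
        return "BYTES"
    if t == "DATE":
        return "DATETIME"  # Oracle DATE carries time-of-day; BigQuery DATE does not
    if t.startswith("TIMESTAMP"):
        return "TIMESTAMP"
    raise ValueError(f"no mapping for Oracle type {ora_type!r}")


# Example: map_oracle_type("NUMBER(38,10)")  -> "BIGNUMERIC"
#          map_oracle_type("VARCHAR2(4000)") -> "STRING"
```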