16.5. How Fast Is BigQuery?
BigQuery Speed in Action
Let's examine the execution details of a query I ran recently to understand how fast BigQuery can be. The query selected everything from a table (using SELECT *). The table was just under 10 gigabytes in size and contained 19 million rows of 60 columns each.
That's 1,140,000,000 cells of data in total (19 million rows × 60 columns), and BigQuery completed the entire query in just one minute and 48 seconds.
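For context, here is a minimal sketch of that query. The project, dataset, and table names are placeholders, not the real ones I used:

```sql
-- A minimal sketch with placeholder names: `my_project.my_dataset.big_table`
-- stands in for the real ~10 GB table with 19 million rows and 60 columns.
SELECT *
FROM `my_project.my_dataset.big_table`;
```

Note that SELECT * scans every column, so BigQuery bills you for the full table; selecting only the columns you actually need is the usual way to keep scans (and costs) down.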
Parallel Processing Power
During that brief time, BigQuery scaled out in parallel, using multiple slots to process the query; each slot represents one unit of virtual CPU processing power.
Thanks to this parallel approach, it finished in under two minutes a job that would have taken a single CPU nearly 29 minutes, shuffling 33 gigabytes of data along the way. That is far faster than anything you could achieve with Google Sheets, with APIs, or even by setting up your own server.
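If you want to pull the same numbers for your own queries, BigQuery exposes per-job statistics in its INFORMATION_SCHEMA views. A sketch, assuming your jobs run in the `region-us` location (adjust the region qualifier to match yours):

```sql
-- Wall-clock time vs. total slot time for recent jobs.
-- total_slot_ms is the CPU-equivalent work summed across all slots,
-- which is where a "29 minutes on a single CPU" figure comes from.
SELECT
  job_id,
  TIMESTAMP_DIFF(end_time, start_time, SECOND) AS wall_seconds,
  total_slot_ms / 1000 AS slot_seconds,
  total_bytes_processed / POW(2, 30) AS gib_processed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
ORDER BY creation_time DESC;
```

Dividing slot_seconds by wall_seconds gives the average number of slots that worked on the query in parallel.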
And because this example fell within BigQuery's free tier limits, it cost nothing!
Scalability: A Real-World Example
BigQuery's scalability is also impressive. I ran another query last week that took four hours and 26 minutes to complete. This particular query used more than 1,000 days' worth of CPU time and shuffled an astounding 85 terabytes of data.
This "impossible" query cost $450—a price tag that underscores its value when compared to alternative methods like using your own computer or server (which likely couldn't handle such an immense task).
In conclusion, BigQuery's ability to scale to whatever your workload needs while still delivering results quickly makes it an invaluable tool for handling large datasets, especially if you're willing to pay for its capabilities.
📩 Receive my weekly Looker Studio tips
🎥 Subscribe to my YouTube channel
🖇 Connect with me on LinkedIn