In my recent work, I needed to run some SQL snippets written for Redshift on Google's BigQuery platform. Since different data warehouses speak different dialects of SQL, some translation work couldn't be avoided.
Here are some tricks:
| Redshift | BigQuery |
| --- | --- |
| field::VARCHAR | CAST(field AS STRING) |
| isnull(), nvl() | ifnull() |
| dateadd() | date_add() |
| datediff() | date_diff() |
| UNION | UNION DISTINCT (BigQuery requires an explicit ALL or DISTINCT) |
| field ILIKE pattern | UPPER(field) LIKE UPPER(pattern) |
| split_part(string, delimiter, part) | split(string, delimiter)[safe_ordinal(part)] (safe_ordinal is 1-based like split_part; safe_offset is 0-based) |
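To make the table concrete, here is a rough before/after sketch applying a few of these rewrites to one query (the `dataset.orders` table and its columns are made up for illustration):

```sql
-- Redshift version (shown as comments):
-- SELECT id::VARCHAR               AS id_text,
--        nvl(note, 'n/a')          AS note,
--        split_part(tags, ',', 1)  AS first_tag
-- FROM orders
-- WHERE note ILIKE '%refund%';

-- BigQuery translation of the same query:
SELECT CAST(id AS STRING)                AS id_text,
       IFNULL(note, 'n/a')               AS note,
       SPLIT(tags, ',')[SAFE_ORDINAL(1)] AS first_tag
FROM `dataset.orders`
WHERE UPPER(note) LIKE UPPER('%refund%');
```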
In Redshift we can reuse a column alias within the same SELECT list, like this:
SELECT SQRT(score) AS new_score, new_score * 10 FROM ...
But in BigQuery we can't reference an alias defined with "AS" elsewhere in the same SELECT list, so the expression has to be repeated:
SELECT SQRT(score) AS new_score, SQRT(score) * 10 FROM ...
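If the expression is expensive or used many times, one way around the repetition is to compute the alias in a subquery (or a WITH clause) and reuse it in the outer query. A minimal sketch, assuming a hypothetical `dataset.scores` table with a `score` column:

```sql
-- Compute the alias once in an inner query, then reuse it outside.
SELECT
  new_score,
  new_score * 10 AS scaled_score
FROM (
  SELECT SQRT(score) AS new_score
  FROM `dataset.scores`
);
```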
Also, BigQuery offers the "WITH" clause as a replacement for temporary tables, which is very powerful:
WITH result AS (
  WITH example AS (
    SELECT * FROM `dataset.table`
  )
  SELECT * FROM example
)
SELECT * FROM result
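For comparison, here is a rough sketch of how a Redshift temp-table pattern might map onto a BigQuery WITH clause (the table and column names are hypothetical):

```sql
-- Redshift: stage intermediate results in a temporary table.
-- CREATE TEMP TABLE daily_totals AS
--   SELECT user_id, SUM(amount) AS total
--   FROM sales
--   GROUP BY user_id;
-- SELECT * FROM daily_totals WHERE total > 100;

-- BigQuery: the same staging expressed with a WITH (CTE) clause.
WITH daily_totals AS (
  SELECT user_id, SUM(amount) AS total
  FROM `dataset.sales`
  GROUP BY user_id
)
SELECT *
FROM daily_totals
WHERE total > 100;
```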