SnowflakeOperator

Use the SnowflakeOperator to execute SQL commands in a Snowflake database.
Using the Operator
Use the snowflake_conn_id argument to connect to your Snowflake instance, where the connection metadata is structured as follows:
Parameter | Input
---|---
Login: string | Snowflake user name
Password: string | Password for Snowflake user
Schema: string | Set schema to execute SQL operations on by default
Extra: dictionary | warehouse, account, database, region, role
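For illustration, a minimal sketch of defining such a connection in code; the connection id, credentials, and extra values below are placeholder assumptions, and in practice you would typically create the connection via the Airflow UI, CLI, or a secrets backend:

import json

from airflow.models.connection import Connection

# All values below are placeholders, not provider defaults.
snowflake_conn = Connection(
    conn_id="my_snowflake_conn",
    conn_type="snowflake",
    login="MY_USER",          # Login: Snowflake user name
    password="MY_PASSWORD",   # Password: password for the Snowflake user
    schema="PUBLIC",          # Schema: default schema for SQL operations
    extra=json.dumps(
        {
            # Extra: dictionary of Snowflake-specific settings
            "account": "my_account",
            "warehouse": "my_warehouse",
            "database": "my_database",
            "role": "my_role",
        }
    ),
)

# The URI form can be exported as an AIRFLOW_CONN_* environment variable.
print(snowflake_conn.get_uri())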
An example usage of the SnowflakeOperator is as follows:
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

# Run a single SQL string.
snowflake_op_sql_str = SnowflakeOperator(task_id="snowflake_op_sql_str", sql=CREATE_TABLE_SQL_STRING)

# Run a parameterized statement; `parameters` fills the %(id)s placeholder.
snowflake_op_with_params = SnowflakeOperator(
    task_id="snowflake_op_with_params",
    sql=SQL_INSERT_STATEMENT,
    parameters={"id": 56},
)

# Run a list of statements one by one.
snowflake_op_sql_list = SnowflakeOperator(task_id="snowflake_op_sql_list", sql=SQL_LIST)

# Run several statements in a single string, split on semicolons.
snowflake_op_sql_multiple_stmts = SnowflakeOperator(
    task_id="snowflake_op_sql_multiple_stmts",
    sql=SQL_MULTIPLE_STMTS,
    split_statements=True,
)

# `sql` can also point to a templated .sql file.
snowflake_op_template_file = SnowflakeOperator(
    task_id="snowflake_op_template_file",
    sql="example_snowflake_snowflake_op_template_file.sql",
)
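The SQL inputs referenced above come from the surrounding example DAG; a minimal sketch of what they might look like, where the table name and values are illustrative assumptions:

# Illustrative definitions for the SQL inputs used above; the table name and
# values are assumptions, not part of the operator's API.
SNOWFLAKE_SAMPLE_TABLE = "sample_table"

CREATE_TABLE_SQL_STRING = (
    f"CREATE OR REPLACE TRANSIENT TABLE {SNOWFLAKE_SAMPLE_TABLE} (name VARCHAR(250), id INT);"
)
# %(id)s is a bind parameter filled from the operator's `parameters` dict.
SQL_INSERT_STATEMENT = f"INSERT INTO {SNOWFLAKE_SAMPLE_TABLE} VALUES ('name', %(id)s)"
# A list of statements is executed one by one.
SQL_LIST = [SQL_INSERT_STATEMENT % {"id": n} for n in range(10)]
# A single string holding several statements, split by the operator when
# split_statements=True.
SQL_MULTIPLE_STMTS = "; ".join(SQL_LIST)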
Note

Parameters passed to the operator take priority over the parameters already given in the Airflow connection metadata (such as schema, role, database, and so forth).
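For example, a minimal sketch of overriding connection-level settings for a single task; all values below are placeholders:

from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

snowflake_op_override = SnowflakeOperator(
    task_id="snowflake_op_override",
    sql="SELECT CURRENT_WAREHOUSE(), CURRENT_DATABASE(), CURRENT_ROLE();",
    warehouse="REPORTING_WH",  # takes priority over the connection's warehouse
    database="ANALYTICS",      # takes priority over the connection's database
    schema="PUBLIC",           # takes priority over the connection's schema
    role="ANALYST",            # takes priority over the connection's role
)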
SnowflakeSqlApiOperator
Use the SnowflakeSqlApiOperator to execute SQL commands in a Snowflake database via the Snowflake SQL API.
You can also run this operator in deferrable mode by setting the deferrable param to True. This frees the Airflow worker slot while the task is deferred; polling for the query status happens on the trigger.
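A minimal sketch of a deferred run; the task id and SQL below are placeholders:

from airflow.providers.snowflake.operators.snowflake import SnowflakeSqlApiOperator

snowflake_sql_api_op_deferred = SnowflakeSqlApiOperator(
    task_id="snowflake_sql_api_op_deferred",
    sql="SELECT 1;",
    statement_count=1,
    deferrable=True,  # the worker slot is released while a trigger polls Snowflake
)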
Using the Operator
Use the snowflake_conn_id argument to connect to your Snowflake instance. The connection metadata is structured as in the table above.
An example usage of the SnowflakeSqlApiOperator is as follows:
from airflow.providers.snowflake.operators.snowflake import SnowflakeSqlApiOperator

# statement_count must match the number of statements in the SQL payload.
snowflake_sql_api_op_sql_multiple_stmt = SnowflakeSqlApiOperator(
    task_id="snowflake_op_sql_multiple_stmt",
    sql=SQL_MULTIPLE_STMTS,
    statement_count=len(SQL_LIST),
)
Note

Parameters passed to the operator take priority over the parameters already given in the Airflow connection metadata (such as schema, role, database, and so forth).