What
`ParamEscaper`'s `escape_string()` gives incorrect behavior on Databricks SQL and in Databricks notebooks.
It replaces a single quote `'` with `''`, but the correct way to escape `'` is with a backslash, like `\'`.
You can verify in PySpark with:
```python
assert spark.sql("select 'cat''s meow' as my_col").head(1)[0]['my_col'] == "cats meow"
assert spark.sql("select 'cat\\'s meow' as my_col").head(1)[0]['my_col'] == "cat's meow"
```

Note that because it starts as a Python literal, we need two backslashes `\\` so that Python first escapes `\\'` to `\'`, and then Spark escapes that to `'`.
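The two escaping strategies can also be compared in plain Python, without a cluster. This is a sketch: `escape_sql_doubling` mirrors what `ParamEscaper` appears to do, `escape_sql_backslash` is the backslash treatment described above, and both function names are mine.

```python
def escape_sql_doubling(s: str) -> str:
    # assumed current ParamEscaper behavior: double each single quote
    return s.replace("'", "''")

def escape_sql_backslash(s: str) -> str:
    # backslash escaping: escape backslashes first, then single quotes
    return s.replace('\\', '\\\\').replace("'", "\\'")

assert escape_sql_doubling("cat's meow") == "cat''s meow"
assert escape_sql_backslash("cat's meow") == "cat\\'s meow"
# a value that already contains a backslash gets that backslash doubled too
assert escape_sql_backslash("cat\\'s meow") == "cat\\\\\\'s meow"
```

The order of the two `replace` calls matters: escaping backslashes first means the backslashes introduced by the quote replacement are not themselves re-escaped.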
I don't know what the motivation for this implementation was, but the result seems to be concatenation instead of escaping the quote character.
Reproduction in databricks-sql-python
The following demonstrates the issue in version ~~1.2.2~~ 2.0.5 of databricks-sql-python against a serverless SQL warehouse in Azure (v2022.30), plus an implementation without parameter substitution showing an escape treatment that does work:
```python
from typing import List
import os

from databricks import sql
from databricks.sql import ServerOperationError

server_hostname = os.environ.get('DBT_DATABRICKS_HOST')
http_path = f'/sql/1.0/endpoints/{os.environ.get("DBT_DATABRICKS_ENDPOINT")}'
access_token = os.environ.get('DBT_DATABRICKS_TOKEN')
user_agent_entry = "dbt-databricks/1.2.2"

connection = sql.connect(
    server_hostname=server_hostname,
    http_path=http_path,
    access_token=access_token,
    _user_agent_entry=user_agent_entry,
)
cursor = connection.cursor()

def get_result_using_parameter_bindings(p: List[str]):
    try:
        cursor.execute('select %s as my_col', p)
        result = list(cursor.fetchall())[0]['my_col']
    except ServerOperationError as exc:
        result = exc.message.strip()[:20] + '...'
    return result

def get_result_using_fstring(p: List[str]):
    try:
        escaped = p[0].replace('\\', '\\\\').replace("'", "\\'")
        cursor.execute(f"select '{escaped}' as my_col")
        result = list(cursor.fetchall())[0]['my_col']
    except ServerOperationError as exc:
        result = exc.message.strip()[:20] + '...'
    return result

params = [
    ["cat's meow"],
    ["cat\'s meow"],
    ["cat\\'s meow"],
    ["cat''s meow"],
]

for p in params:
    # using databricks-sql-python's parameter substitution
    param_binding_result = get_result_using_parameter_bindings(p)
    # using a manually built and escaped query
    f_string_result = get_result_using_fstring(p)
    print('\nparameter value:', p[0],
          'parameter-binding result:', param_binding_result,
          'round-trip ok?', p[0] == param_binding_result)
    print('parameter value:', p[0],
          'f-string result:', f_string_result,
          'round-trip ok?', p[0] == f_string_result)
    assert p[0] == f_string_result

cursor.close()
connection.close()
```
The output is:
```
bash_1 | parameter value: cat's meow parameter-binding result: cats meow round-trip ok? False
bash_1 | parameter value: cat's meow f-string result: cat's meow round-trip ok? True
bash_1 |
bash_1 | parameter value: cat's meow parameter-binding result: cats meow round-trip ok? False
bash_1 | parameter value: cat's meow f-string result: cat's meow round-trip ok? True
bash_1 |
bash_1 | parameter value: cat\'s meow parameter-binding result: [PARSE_SYNTAX_ERROR]... round-trip ok? False
bash_1 | parameter value: cat\'s meow f-string result: cat\'s meow round-trip ok? True
bash_1 |
bash_1 | parameter value: cat''s meow parameter-binding result: cats meow round-trip ok? False
bash_1 | parameter value: cat''s meow f-string result: cat''s meow round-trip ok? True
```

Expected results
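The `[PARSE_SYNTAX_ERROR]` and `cats meow` rows follow from the SQL that quote doubling produces. A sketch of the generated statement (the helper name is mine; the doubling behavior is as observed above):

```python
def sql_with_doubling(value: str) -> str:
    # assumed current behavior: single quotes doubled, backslashes untouched
    return "select '{}' as my_col".format(value.replace("'", "''"))

# For input cat's meow this yields: select 'cat''s meow' as my_col
# Spark reads that as the adjacent literals 'cat' and 's meow' and
# concatenates them, giving cats meow.
print(sql_with_doubling("cat's meow"))

# For input cat\'s meow this yields: select 'cat\''s meow' as my_col
# Spark reads \' as an escaped quote and the next ' as the closing quote,
# leaving `s meow' as my_col` dangling -> [PARSE_SYNTAX_ERROR]
print(sql_with_doubling("cat\\'s meow"))
```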
String parameters with single quotes and backslashes should be properly reproduced:
- `"cat's meow"` would be escaped as `"cat\\'s meow"` and the resulting SQL would return `cat's meow`
- `"cat\\'s meow"` would escape to `"cat\\\\\\'s meow"` and the SQL would return `cat\'s meow`
Suggested fix
I'm not sure how this is usually implemented, but in my example just doing `param.replace('\\', '\\\\').replace("'", "\\'")` at least preserves single quotes and backslashes, which are probably the most common cases. It would also preserve escaped unicode literals like `\U0001F44D`.
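A sketch of what this could look like as a fixed escaper. This is a hypothetical standalone function, not the actual `ParamEscaper.escape_string` signature; I'm only illustrating the replacement order and the quoting:

```python
def escape_string_fixed(value: str) -> str:
    # hypothetical fix: escape backslashes first, then single quotes,
    # and wrap the result in single quotes for use as a SQL literal
    escaped = value.replace('\\', '\\\\').replace("'", "\\'")
    return f"'{escaped}'"

assert escape_string_fixed("cat's meow") == "'cat\\'s meow'"
# a literal \U0001F44D sequence survives, with its backslash doubled,
# so the server reconstructs the original text
assert escape_string_fixed("\\U0001F44D") == "'\\\\U0001F44D'"
```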
How I encountered it
I'm using dbt with Databricks and noticed, on upgrading from dbt-databricks 1.0 to 1.2.2, that single quotes started disappearing from our "seeds" (CSV files loaded as Delta tables). dbt-databricks had changed to use this library's new parameter-binding functionality, whereas (I assume) it must previously have injected the values into the SQL as literals.