Complete SeaDatabricksClient (Execution Phase) #580
Status: Closed

Commits (355)
904c304  jprakash-db  PySQL Connector split into connector and sqlalchemy (#444)
e5ac8c6  jprakash-db  Removed CI CD for python3.8 (#490)
e06fd2f  jprakash-db  Added CI CD upto python 3.12 (#491)
7126f97  jprakash-db  Merging changes from v3.7.1 release (#488)
b1245da  jprakash-db  Bumped up to version 4.0.0 (#493)
b54b04a  dhirschfeld  Support Python 3.13 and update deps (#510)
e1d7f71  samikshya-db  Improve debugging + fix PR review template (#514)
1c721e0  jprakash-db  Forward porting all changes into 4.x.x. uptil v3.7.3 (#529)
f1fa67a  jprakash-db  Updated the actions/cache version (#532)
512d37c  jprakash-db  Updated the CODEOWNERS (#531)
ff0ec64  shivam2680  Add version check for urllib3 in backoff calculation (#526)
c5900c9  jprakash-db  Support multiple timestamp formats in non arrow flow (#533)
c86e99b  shivam2680  prepare release for v4.0.1 (#534)
bcd6f01  jprakash-db  Relaxed bound for python-dateutil (#538)
ca51d1d  jprakash-db  Bumped up the version for 4.0.2 (#539)
d403fcd  jprakash-db  Added example for async execute query (#537)
1b4154c  jprakash-db  Added urllib3 version check (#547)
eac0433  jprakash-db  Bump version to 4.0.3 (#549)
1b0fc9b  varun-edachali-dbx  decouple session class from existing Connection
b45871e  varun-edachali-dbx  add open property to Connection to ensure maintenance of existing API
561c351  varun-edachali-dbx  update unit tests to address ThriftBackend through session instead of…
a3188ad  varun-edachali-dbx  chore: move session specific tests from test_client to test_session
ff7abf6  varun-edachali-dbx  formatting (black)
e7e2333  varun-edachali-dbx  use connection open property instead of long chain through session
fd4fae6  varun-edachali-dbx  trigger integration workflow
677e66a  varun-edachali-dbx  fix: ensure open attribute of Connection never fails
4ec8703  varun-edachali-dbx  introduce databricksClient interface and thrift backend implementation
6ecb6bf  varun-edachali-dbx  change names of ThriftBackend -> ThriftDatabricksClient in tests
fe2ce17  varun-edachali-dbx  formatting: black + re-organise backend into new dir
568b1f4  jprakash-db  Update CODEOWNERS (#562)
ebbd150  varun-edachali-dbx  remove un-necessary example change
aa8af45  varun-edachali-dbx  [empty commit] trigger integration tests
e7be76b  varun-edachali-dbx  introduce normalised sessionId and CommandId for (near) complete back…
7461715  varun-edachali-dbx  fix: Any is not defined
b295acd  varun-edachali-dbx  fix: get_session_id_hex() is not defined
8afc5d5  varun-edachali-dbx  command_handle -> command_id in ExecuteResponse
b8f9146  varun-edachali-dbx  fix: active op handle -> active command id in Cursor
7c733ee  varun-edachali-dbx  fixed (most) tests by accounting for normalised Session interface
0917ea1  varun-edachali-dbx  fix: convert command id to operationHandle in status_request
3fd2a46  varun-edachali-dbx  decouple session class from existing Connection
03d3ae7  varun-edachali-dbx  add open property to Connection to ensure maintenance of existing API
fb0fa46  varun-edachali-dbx  use connection open property instead of long chain through session
e3770cd  varun-edachali-dbx  trigger integration workflow
3b8002f  varun-edachali-dbx  fix: ensure open attribute of Connection never fails
f1a350a  varun-edachali-dbx  fix: de-complicate earlier connection open logic
98b0dc7  varun-edachali-dbx  Revert "fix: de-complicate earlier connection open logic"
afee423  varun-edachali-dbx  [empty commit] attempt to trigger ci e2e workflow
2d24fdd  saishreeeee  PECOBLR-86 improve logging on python driver (#556)
a5561e8  varun-edachali-dbx  Revert "Merge remote-tracking branch 'upstream/sea-migration' into de…
0d890a5  varun-edachali-dbx  Reapply "Merge remote-tracking branch 'upstream/sea-migration' into d…
c2aa762  varun-edachali-dbx  fix: separate session opening logic from instantiation
fe642da  varun-edachali-dbx  chore: use get_handle() instead of private session attribute in client
394333c  varun-edachali-dbx  fix: remove accidentally removed assertions
ef07acd  varun-edachali-dbx  generalise open session, fix session tests to consider positional args
1ef46cf  varun-edachali-dbx  formatting (black)
76ca997  varun-edachali-dbx  correct session logic after duplication during merge
afc6f8f  varun-edachali-dbx  args -> kwargs in tests
f6660ba  varun-edachali-dbx  delegate protocol version to SessionId
9871a93  varun-edachali-dbx  ids -> backend/types
595d795  varun-edachali-dbx  update open session with normalised SessionId
10ee940  varun-edachali-dbx  remove merge artifacts, account for result set
3d75d6c  varun-edachali-dbx  fix: import CommandId in client tests
b8e1bbd  varun-edachali-dbx  expect session_id in protocol version getter
dac08f2  varun-edachali-dbx  enforce ResultSet return in exec commands in backend client
7b0cbed  varun-edachali-dbx  abstract Command State away from Thrift specific types
9267ef9  varun-edachali-dbx  close_command return not used, replacing with None and logging resp
5f00532  wyattscarpenter  move py.typed to correct places (#403)
59ed5ce  wyattscarpenter  Upgrade mypy (#406)
c95951d  Hodnebo  Do not retry failing requests with status code 401 (#408)
335d918  jprakash-db  [PECO-1715] Remove username/password (BasicAuth) auth option (#409)
ee4f94c  kravets-levko  [PECO-1751] Refactor CloudFetch downloader: handle files sequentially…
1c8bb11  kravets-levko  Fix CloudFetch retry policy to be compatible with all `urllib3` versi…
9de280e  kravets-levko  Disable SSL verification for CloudFetch links (#414)
04b626a  kravets-levko  Prepare relese 3.3.0 (#415)
2a01173  kfollesdal  Fix pandas 2.2.2 support (#416)
b1faa09  jackyhu-db  [PECO-1801] Make OAuth as the default authenticator if no authenticat…
270edcf  kravets-levko  [PECO-1857] Use SSL options with HTTPS connection pool (#425)
8523fd3  kravets-levko  Prepare release v3.4.0 (#430)
763f070  jprakash-db  [PECO-1926] Create a non pyarrow flow to handle small results for the…
1e0d9d5  shivam2680  [PECO-1961] On non-retryable error, ensure PySQL includes useful info…
890cdd7  jprakash-db  Reformatted all the files using black (#448)
9bdee1d  jackyhu-db  Prepare release v3.5.0 (#457)
cdd7a19  jackyhu-db  [PECO-2051] Add custom auth headers into cloud fetch request (#460)
fcc2da9  jackyhu-db  Prepare release 3.6.0 (#461)
d354309  jprakash-db  [ PECO - 1768 ] PySQL: adjust HTTP retry logic to align with Go and N…
d63544e  jprakash-db  [ PECO-2065 ] Create the async execution flow for the PySQL Connector…
5bbf223  jprakash-db  Fix for check_types github action failing (#472)
9c62b21  arredond  Remove upper caps on dependencies (#452)
7bb7ca6  jprakash-db  Updated the doc to specify native parameters in PUT operation is not …
438a080  jprakash-db  Incorrect rows in inline fetch result (#479)
eb50411  jprakash-db  Bumped up to version 3.7.0 (#482)
2a5b9c7  jprakash-db  PySQL Connector split into connector and sqlalchemy (#444)
d31aa59  jprakash-db  Removed CI CD for python3.8 (#490)
3e62c90  jprakash-db  Added CI CD upto python 3.12 (#491)
f9a6b13  jprakash-db  Merging changes from v3.7.1 release (#488)
a941575  jprakash-db  Bumped up to version 4.0.0 (#493)
032c276  newwingbird  Updated action's version (#455)
d36889d  dhirschfeld  Support Python 3.13 and update deps (#510)
22e5ce4  samikshya-db  Improve debugging + fix PR review template (#514)
7772403  jprakash-db  Forward porting all changes into 4.x.x. uptil v3.7.3 (#529)
8b27150  jprakash-db  Updated the actions/cache version (#532)
398db45  jprakash-db  Updated the CODEOWNERS (#531)
c962b63  shivam2680  Add version check for urllib3 in backoff calculation (#526)
c246872  shivam2680  [ES-1372353] make user_agent_header part of public API (#530)
326f338  madhav-db  Updates runner used to run DCO check to use databricks-protected-runn…
37e73a9  jprakash-db  Support multiple timestamp formats in non arrow flow (#533)
3d7123c  shivam2680  prepare release for v4.0.1 (#534)
132e1b7  jprakash-db  Relaxed bound for python-dateutil (#538)
46090c0  jprakash-db  Bumped up the version for 4.0.2 (#539)
28249c0  jprakash-db  Added example for async execute query (#537)
5ab0a2c  jprakash-db  Added urllib3 version check (#547)
6528cd1  jprakash-db  Bump version to 4.0.3 (#549)
8f7754b  vikrantpuppala  Cleanup fields as they might be deprecated/removed/change in the futu…
f7d3865  jayantsing-db  Refactor decimal conversion in PyArrow tables to use direct casting (…
61cc398  shivam2680  [PECOBLR-361] convert column table to arrow if arrow present (#551)
554d011  varun-edachali-dbx  decouple session class from existing Connection
6f28297  varun-edachali-dbx  add open property to Connection to ensure maintenance of existing API
983ec03  varun-edachali-dbx  update unit tests to address ThriftBackend through session instead of…
6f3b5b7  varun-edachali-dbx  chore: move session specific tests from test_client to test_session
29a2840  varun-edachali-dbx  formatting (black)
0d28b69  varun-edachali-dbx  use connection open property instead of long chain through session
8cb8cdd  varun-edachali-dbx  trigger integration workflow
4495f9b  varun-edachali-dbx  fix: ensure open attribute of Connection never fails
c744117  varun-edachali-dbx  introduce databricksClient interface and thrift backend implementation
ef5a06b  varun-edachali-dbx  change names of ThriftBackend -> ThriftDatabricksClient in tests
abbaaa5  varun-edachali-dbx  fix: remove excess debug log
33765cb  varun-edachali-dbx  fix: replace thrift_backend with backend in result set param
788d8c7  varun-edachali-dbx  fix: replace module replacement with concrete mock instance in execut…
4debbd3  varun-edachali-dbx  formatting: black + re-organise backend into new dir
0e6e215  varun-edachali-dbx  fix: sql.thrift_backend -> sql.backend.thrift_backend in tests and ex…
925394c  jprakash-db  Update CODEOWNERS (#562)
4ad6c8d  madhav-db  Enhance Cursor close handling and context manager exception managemen…
51369c8  saishreeeee  PECOBLR-86 improve logging on python driver (#556)
9541464  jprakash-db  Update github actions run conditions (#569)
cbdd3d7  varun-edachali-dbx  remove un-necessary example change
ca38e95  varun-edachali-dbx  [empty commit] trigger integration tests
b40c0fd  varun-edachali-dbx  fix: use backend in Cursor, not thrift_backend
35ed462  varun-edachali-dbx  fix: backend references in integration tests
37f3af1  varun-edachali-dbx  fix: thrift_backend -> backend in ResultSet reference in e2e test
09c5e2f  varun-edachali-dbx  introduce normalised sessionId and CommandId for (near) complete back…
4ce6aab  varun-edachali-dbx  fix: Any is not defined
307f447  varun-edachali-dbx  fix: get_session_id_hex() is not defined
802d8dc  varun-edachali-dbx  command_handle -> command_id in ExecuteResponse
944d446  varun-edachali-dbx  fix: active op handle -> active command id in Cursor
6338083  varun-edachali-dbx  fixed (most) tests by accounting for normalised Session interface
3658a91  varun-edachali-dbx  fix: convert command id to operationHandle in status_request
8ef6ed6  varun-edachali-dbx  decouple session class from existing Connection
61300b2  varun-edachali-dbx  add open property to Connection to ensure maintenance of existing API
44e7d17  varun-edachali-dbx  formatting (black)
d2035ea  varun-edachali-dbx  use connection open property instead of long chain through session
8b4451b  varun-edachali-dbx  trigger integration workflow
d21d2c3  varun-edachali-dbx  fix: ensure open attribute of Connection never fails
21068a3  varun-edachali-dbx  fix: de-complicate earlier connection open logic
476e763  varun-edachali-dbx  Revert "fix: de-complicate earlier connection open logic"
1e1cf1e  varun-edachali-dbx  [empty commit] attempt to trigger ci e2e workflow
b408c2c  saishreeeee  PECOBLR-86 improve logging on python driver (#556)
73649f2  varun-edachali-dbx  Revert "Merge remote-tracking branch 'upstream/sea-migration' into de…
a61df99  varun-edachali-dbx  Reapply "Merge remote-tracking branch 'upstream/sea-migration' into d…
e1a2c0e  varun-edachali-dbx  fix: separate session opening logic from instantiation
71ba9d5  varun-edachali-dbx  chore: use get_handle() instead of private session attribute in client
160ba9f  varun-edachali-dbx  fix: remove accidentally removed assertions
6b3436f  varun-edachali-dbx  generalise open session, fix session tests to consider positional args
30849dc  varun-edachali-dbx  formatting (black)
4d455bb  varun-edachali-dbx  correct session logic after duplication during merge
6fc0834  varun-edachali-dbx  args -> kwargs in tests
d254e48  varun-edachali-dbx  delegate protocol version to SessionId
370627d  varun-edachali-dbx  ids -> backend/types
ca1b57d  varun-edachali-dbx  update open session with normalised SessionId
6c120c0  jprakash-db  Merging changes from v3.7.1 release (#488)
cdf6865  dhirschfeld  Support Python 3.13 and update deps (#510)
12ce717  jprakash-db  Updated the actions/cache version (#532)
1215fd8  shivam2680  Add version check for urllib3 in backoff calculation (#526)
dd083f6  jprakash-db  Support multiple timestamp formats in non arrow flow (#533)
8d30436  jprakash-db  Added example for async execute query (#537)
066aef9  jprakash-db  Added urllib3 version check (#547)
1ed3514  varun-edachali-dbx  decouple session class from existing Connection
ca80f94  varun-edachali-dbx  formatting (black)
6027fb1  varun-edachali-dbx  use connection open property instead of long chain through session
7a2f9b5  varun-edachali-dbx  trigger integration workflow
39294e9  varun-edachali-dbx  fix: ensure open attribute of Connection never fails
709e910  varun-edachali-dbx  Revert "fix: de-complicate earlier connection open logic"
1ad0ace  varun-edachali-dbx  [empty commit] attempt to trigger ci e2e workflow
913da63  saishreeeee  PECOBLR-86 improve logging on python driver (#556)
d8159e7  varun-edachali-dbx  Revert "Merge remote-tracking branch 'upstream/sea-migration' into de…
0b91183  varun-edachali-dbx  Reapply "Merge remote-tracking branch 'upstream/sea-migration' into d…
ff78b5f  varun-edachali-dbx  fix: separate session opening logic from instantiation
c1d53d2  madhav-db  Enhance Cursor close handling and context manager exception managemen…
a5a8e51  saishreeeee  PECOBLR-86 improve logging on python driver (#556)
f7be10c  jprakash-db  New Complex type test table + Github Action changes (#575)
a888dd6  varun-edachali-dbx  remove excess logs, assertions, instantiations
29a2985  varun-edachali-dbx  Merge remote-tracking branch 'origin/sea-migration' into backend-inte…
9b9735e  varun-edachali-dbx  formatting (black) + remove excess log (merge artifact)
0a8226c  varun-edachali-dbx  fix typing
42263c4  varun-edachali-dbx  remove un-necessary check
ac984e4  varun-edachali-dbx  remove un-necessary replace call
8da84e8  varun-edachali-dbx  introduce __str__ methods for CommandId and SessionId
f4f27e3  varun-edachali-dbx  Merge remote-tracking branch 'origin/backend-interface' into fetch-in…
4e3ccce  varun-edachali-dbx  correct some merge artifacts
04eb8c1  varun-edachali-dbx  replace match case with if else for compatibility with older python v…
ca425eb  varun-edachali-dbx  correct TOperationState literal, remove un-necessary check
7a47dd0  varun-edachali-dbx  chore: remove duplicate def
00d9aeb  varun-edachali-dbx  correct typing
eecc67d  varun-edachali-dbx  docstrings for DatabricksClient interface
9800636  varun-edachali-dbx  stronger typing of Cursor and ExecuteResponse
e07f56c  varun-edachali-dbx  remove utility functions from backend interface, fix circular import
73fb141  varun-edachali-dbx  rename info to properties
d838653  varun-edachali-dbx  newline for cleanliness
6654f06  varun-edachali-dbx  fix circular import
89425f9  varun-edachali-dbx  formatting (black)
93e55e8  varun-edachali-dbx  to_hex_id -> get_hex_id
7689d75  varun-edachali-dbx  better comment on protocol version getter
1ec8c45  varun-edachali-dbx  formatting (black)
80b7bc3  varun-edachali-dbx  Merge remote-tracking branch 'origin/backend-interface' into fetch-in…
904efe7  varun-edachali-dbx  stricter typing for cursor
c91bc37  varun-edachali-dbx  correct typing
42a0d08  varun-edachali-dbx  init sea backend
1056bc2  varun-edachali-dbx  move test script into experimental dir
d59880c  varun-edachali-dbx  formatting (black)
16ff4ec  varun-edachali-dbx  cleanup: removed excess comments, validated decisions
0bba7f1  varun-edachali-dbx  init sea exec
1da5694  varun-edachali-dbx  introduce models
f23ef8f  varun-edachali-dbx  introduce req resp models, update example tester script
95b0781  varun-edachali-dbx  add unit tests for sea backend
ef09dbe  varun-edachali-dbx  Merge remote-tracking branch 'origin/sess-sea' into exec-sea
d552695  varun-edachali-dbx  introduce unit tests for sea backend
b8a170e  varun-edachali-dbx  typing, change DESCRIBE TABLE to SHOW COLUMNS
1003319  varun-edachali-dbx  remove model redundancies
9f11c9d  varun-edachali-dbx  raise ServerOpError in case of not SUCCEEDED state
e1c7091  varun-edachali-dbx  simplify logging, comments
b6d7c0c  varun-edachali-dbx  review metadata ops
611d79f  varun-edachali-dbx  result compression
edd4c87  varun-edachali-dbx  client side table_types filtering
9395141  varun-edachali-dbx  preliminary table filtering
7fdc01d  varun-edachali-dbx  filters and sea_result_set unit tests
2871e05  varun-edachali-dbx  stronger typing on ResultSet
15a8efc  varun-edachali-dbx  fix type issues
f559c79  varun-edachali-dbx  move sea stuff into sea/ dir
8fe566b  varun-edachali-dbx  Merge branch 'sea-migration' into exec-sea
e351834  varun-edachali-dbx  fix simple merge artifacts
7f96b70  varun-edachali-dbx  remove ExecuteResponse namedtuple, use as backend obj store
d3d3fc2  varun-edachali-dbx  fix imports in tests
53bc6ac  varun-edachali-dbx  remove accidental changes in workflows
40f15a7  varun-edachali-dbx  formatting (black)
aba27ab  varun-edachali-dbx  account for new exec resp
4148aac  varun-edachali-dbx  account for new exec resp
0d51d1b  varun-edachali-dbx  op_state -> status in ResultSet
d01f3e4  varun-edachali-dbx  add unit tests for SeaResultSet and ResultSetFilter
d80ea49  varun-edachali-dbx  formatting (black)
ac381f3  varun-edachali-dbx  change test name in example script
92b778c  varun-edachali-dbx  cleaner, organised test scripts
examples/experimental/sea_connector_test.py (147 changes: 96 additions, 51 deletions)

```python
"""
Main script to run all SEA connector tests.

This script imports and runs all the individual test modules and displays
a summary of test results with visual indicators.
"""
import os
import sys
import logging
import importlib.util
from typing import Dict, Callable, List, Tuple

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Define test modules and their main test functions
TEST_MODULES = [
    "test_sea_session",
    "test_sea_sync_query",
    "test_sea_async_query",
    "test_sea_metadata",
]


def load_test_function(module_name: str) -> Callable:
    """Load a test function from a module."""
    module_path = os.path.join(
        os.path.dirname(os.path.abspath(__file__)), "tests", f"{module_name}.py"
    )
    spec = importlib.util.spec_from_file_location(module_name, module_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)

    # Get the main test function (assuming it starts with "test_")
    for name in dir(module):
        if name.startswith("test_") and callable(getattr(module, name)):
            # For sync and async query modules, we want the main function that runs both tests
            if name == f"test_sea_{module_name.replace('test_sea_', '')}_exec":
                return getattr(module, name)

    # Fallback to the first test function found
    for name in dir(module):
        if name.startswith("test_") and callable(getattr(module, name)):
            return getattr(module, name)

    raise ValueError(f"No test function found in module {module_name}")


def run_tests() -> List[Tuple[str, bool]]:
    """Run all tests and return results."""
    results = []
    for module_name in TEST_MODULES:
        try:
            test_func = load_test_function(module_name)
            logger.info(f"\n{'=' * 50}")
            logger.info(f"Running test: {module_name}")
            logger.info(f"{'-' * 50}")
            success = test_func()
            results.append((module_name, success))
            status = "✅ PASSED" if success else "❌ FAILED"
            logger.info(f"Test {module_name}: {status}")
        except Exception as e:
            logger.error(f"Error loading or running test {module_name}: {str(e)}")
            import traceback

            logger.error(traceback.format_exc())
            results.append((module_name, False))
    return results


def print_summary(results: List[Tuple[str, bool]]) -> None:
    """Print a summary of test results."""
    logger.info(f"\n{'=' * 50}")
    logger.info("TEST SUMMARY")
    logger.info(f"{'-' * 50}")
    passed = sum(1 for _, success in results if success)
    total = len(results)
    for module_name, success in results:
        status = "✅ PASSED" if success else "❌ FAILED"
        logger.info(f"{status} - {module_name}")
    logger.info(f"{'-' * 50}")
    logger.info(f"Total: {total} | Passed: {passed} | Failed: {total - passed}")
    logger.info(f"{'=' * 50}")


if __name__ == "__main__":
    # Check if required environment variables are set
    required_vars = [
        "DATABRICKS_SERVER_HOSTNAME",
        "DATABRICKS_HTTP_PATH",
        "DATABRICKS_TOKEN",
    ]
    missing_vars = [var for var in required_vars if not os.environ.get(var)]
    if missing_vars:
        logger.error(f"Missing required environment variables: {', '.join(missing_vars)}")
        logger.error("Please set these variables before running the tests.")
        sys.exit(1)

    # Run all tests
    results = run_tests()

    # Print summary
    print_summary(results)

    # Exit with appropriate status code
    all_passed = all(success for _, success in results)
    sys.exit(0 if all_passed else 1)
```
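The runner above locates each test module by file path and loads it through `importlib.util` rather than a normal `import`, so the `tests/` directory does not need to be on `sys.path`. A minimal, self-contained sketch of that loading pattern; the throwaway `demo_module.py` written to a temp directory is purely illustrative, not part of the PR:

```python
import importlib.util
import os
import tempfile


def load_module_from_path(module_name: str, module_path: str):
    """Load a Python module directly from a file path, as the runner does."""
    spec = importlib.util.spec_from_file_location(module_name, module_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)  # executes the module body
    return module


# Illustrative: write a throwaway module to disk, then load it by path.
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "demo_module.py")
    with open(path, "w") as f:
        f.write("def test_demo():\n    return True\n")
    mod = load_module_from_path("demo_module", path)
    # Pick the first callable named "test_*", mirroring the runner's fallback scan.
    test_fn = next(
        getattr(mod, n)
        for n in dir(mod)
        if n.startswith("test_") and callable(getattr(mod, n))
    )
    result = test_fn()
```

The same scan explains the runner's preference logic: it first looks for a `test_sea_<name>_exec` aggregate function, and only then falls back to the first `test_*` callable found.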
examples/experimental/tests/__init__.py (1 change: 1 addition, 0 deletions)

```python
# This file makes the tests directory a Python package
```
examples/experimental/tests/test_sea_async_query.py (165 changes: 165 additions, 0 deletions)

```python
"""
Test for SEA asynchronous query execution functionality.
"""
import os
import sys
import logging
import time

from databricks.sql.client import Connection
from databricks.sql.backend.types import CommandState

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


def test_sea_async_query_with_cloud_fetch():
    """
    Test executing a query asynchronously using the SEA backend with cloud fetch enabled.

    This function connects to a Databricks SQL endpoint using the SEA backend,
    executes a simple query asynchronously with cloud fetch enabled, and verifies
    that execution completes successfully.
    """
    server_hostname = os.environ.get("DATABRICKS_SERVER_HOSTNAME")
    http_path = os.environ.get("DATABRICKS_HTTP_PATH")
    access_token = os.environ.get("DATABRICKS_TOKEN")
    catalog = os.environ.get("DATABRICKS_CATALOG")

    if not all([server_hostname, http_path, access_token]):
        logger.error("Missing required environment variables.")
        logger.error(
            "Please set DATABRICKS_SERVER_HOSTNAME, DATABRICKS_HTTP_PATH, and DATABRICKS_TOKEN."
        )
        return False

    try:
        # Create connection with cloud fetch enabled
        logger.info(
            "Creating connection for asynchronous query execution with cloud fetch enabled"
        )
        connection = Connection(
            server_hostname=server_hostname,
            http_path=http_path,
            access_token=access_token,
            catalog=catalog,
            schema="default",
            use_sea=True,
            user_agent_entry="SEA-Test-Client",
            use_cloud_fetch=True,
        )
        logger.info(
            f"Successfully opened SEA session with ID: {connection.get_session_id_hex()}"
        )

        # Execute a simple query asynchronously
        cursor = connection.cursor()
        logger.info(
            "Executing asynchronous query with cloud fetch: SELECT 1 as test_value"
        )
        cursor.execute_async("SELECT 1 as test_value")
        logger.info(
            "Asynchronous query submitted successfully with cloud fetch enabled"
        )

        # Check query state
        logger.info("Checking query state...")
        while cursor.is_query_pending():
            logger.info("Query is still pending, waiting...")
            time.sleep(1)

        logger.info("Query is no longer pending, getting results...")
        cursor.get_async_execution_result()
        logger.info(
            "Successfully retrieved asynchronous query results with cloud fetch enabled"
        )

        # Close resources
        cursor.close()
        connection.close()
        logger.info("Successfully closed SEA session")
        return True
    except Exception as e:
        logger.error(
            f"Error during SEA asynchronous query execution test with cloud fetch: {str(e)}"
        )
        import traceback

        logger.error(traceback.format_exc())
        return False


def test_sea_async_query_without_cloud_fetch():
    """
    Test executing a query asynchronously using the SEA backend with cloud fetch disabled.

    This function connects to a Databricks SQL endpoint using the SEA backend,
    executes a simple query asynchronously with cloud fetch disabled, and verifies
    that execution completes successfully.
    """
    server_hostname = os.environ.get("DATABRICKS_SERVER_HOSTNAME")
    http_path = os.environ.get("DATABRICKS_HTTP_PATH")
    access_token = os.environ.get("DATABRICKS_TOKEN")
    catalog = os.environ.get("DATABRICKS_CATALOG")

    if not all([server_hostname, http_path, access_token]):
        logger.error("Missing required environment variables.")
        logger.error(
            "Please set DATABRICKS_SERVER_HOSTNAME, DATABRICKS_HTTP_PATH, and DATABRICKS_TOKEN."
        )
        return False

    try:
        # Create connection with cloud fetch disabled
        logger.info(
            "Creating connection for asynchronous query execution with cloud fetch disabled"
        )
        connection = Connection(
            server_hostname=server_hostname,
            http_path=http_path,
            access_token=access_token,
            catalog=catalog,
            schema="default",
            use_sea=True,
            user_agent_entry="SEA-Test-Client",
            use_cloud_fetch=False,
            enable_query_result_lz4_compression=False,
        )
        logger.info(
            f"Successfully opened SEA session with ID: {connection.get_session_id_hex()}"
        )

        # Execute a simple query asynchronously
        cursor = connection.cursor()
        logger.info(
            "Executing asynchronous query without cloud fetch: SELECT 1 as test_value"
        )
        cursor.execute_async("SELECT 1 as test_value")
        logger.info(
            "Asynchronous query submitted successfully with cloud fetch disabled"
        )

        # Check query state
        logger.info("Checking query state...")
        while cursor.is_query_pending():
            logger.info("Query is still pending, waiting...")
            time.sleep(1)

        logger.info("Query is no longer pending, getting results...")
        cursor.get_async_execution_result()
        logger.info(
            "Successfully retrieved asynchronous query results with cloud fetch disabled"
        )

        # Close resources
        cursor.close()
        connection.close()
        logger.info("Successfully closed SEA session")
        return True
    except Exception as e:
        logger.error(
            f"Error during SEA asynchronous query execution test without cloud fetch: {str(e)}"
        )
        import traceback

        logger.error(traceback.format_exc())
        return False


def test_sea_async_query_exec():
    """
    Run both asynchronous query tests and return overall success.
    """
    with_cloud_fetch_success = test_sea_async_query_with_cloud_fetch()
    logger.info(
        f"Asynchronous query with cloud fetch: "
        f"{'✅ PASSED' if with_cloud_fetch_success else '❌ FAILED'}"
    )

    without_cloud_fetch_success = test_sea_async_query_without_cloud_fetch()
    logger.info(
        f"Asynchronous query without cloud fetch: "
        f"{'✅ PASSED' if without_cloud_fetch_success else '❌ FAILED'}"
    )

    return with_cloud_fetch_success and without_cloud_fetch_success


if __name__ == "__main__":
    success = test_sea_async_query_exec()
    sys.exit(0 if success else 1)
```
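Both async tests follow the same submit/poll/fetch shape: `execute_async` returns immediately, the loop polls `is_query_pending` with a sleep between checks, and `get_async_execution_result` is called once the query leaves the pending state. A self-contained sketch of that polling loop; `FakeCursor` is a stand-in invented for illustration, not the real `databricks.sql` cursor:

```python
import time


class FakeCursor:
    """Stand-in cursor whose query finishes after a few polls (illustrative only)."""

    def __init__(self, polls_until_done: int = 3):
        self._remaining = polls_until_done
        self.result = None

    def execute_async(self, query: str) -> None:
        # Submission returns immediately; the query "runs" in the background.
        self._query = query

    def is_query_pending(self) -> bool:
        # Pretend the query stays pending for a fixed number of polls.
        self._remaining -= 1
        return self._remaining > 0

    def get_async_execution_result(self):
        # Pretend the finished query produced a single row.
        self.result = [(1,)]
        return self.result


def run_async_query(cursor, query: str, poll_interval: float = 0.01):
    """Submit, poll until no longer pending, then fetch: the shape used above."""
    cursor.execute_async(query)
    while cursor.is_query_pending():
        time.sleep(poll_interval)
    return cursor.get_async_execution_result()


rows = run_async_query(FakeCursor(), "SELECT 1 as test_value")
```

A fixed one-second sleep, as in the tests, is fine for a smoke test; production code would typically cap total wait time or back off between polls.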