Introduce SeaDatabricksClient (Session Implementation) #577
Closed
Commits (325)
e71ed00 Refactor decimal conversion in PyArrow tables to use direct casting (… (jayantsing-db)
c7ea206 [PECOBLR-361] convert column table to arrow if arrow present (#551) (shivam2680)
a47163e decouple session class from existing Connection (varun-edachali-dbx)
6d92c76 add open property to Connection to ensure maintenance of existing API (varun-edachali-dbx)
0c0675f update unit tests to address ThriftBackend through session instead of… (varun-edachali-dbx)
c021435 chore: move session specific tests from test_client to test_session (varun-edachali-dbx)
9b82d61 formatting (black) (varun-edachali-dbx)
8789e32 use connection open property instead of long chain through session (varun-edachali-dbx)
035b5fd trigger integration workflow (varun-edachali-dbx)
6253d4f fix: ensure open attribute of Connection never fails (varun-edachali-dbx)
f9fcb79 introduce databricksClient interface and thrift backend implementation (varun-edachali-dbx)
7925954 change names of ThriftBackend -> ThriftDatabricksClient in tests (varun-edachali-dbx)
54804b9 fix: remove excess debug log (varun-edachali-dbx)
cbaa1f3 fix: replace thrift_backend with backend in result set param (varun-edachali-dbx)
98b0984 fix: replace module replacement with concrete mock instance in execut… (varun-edachali-dbx)
882dda7 formatting: black + re-organise backend into new dir (varun-edachali-dbx)
a57f6c3 fix: sql.thrift_backend -> sql.backend.thrift_backend in tests and ex… (varun-edachali-dbx)
d4a49ca introduce Result Set interface and concrete thrift implementation (varun-edachali-dbx)
574df21 create concrete ThriftResultSet instead of ResultSet type (varun-edachali-dbx)
8ea5cf4 ensure backend client returns a ResultSet type in backend tests (varun-edachali-dbx)
3eba92a formatting (black) (varun-edachali-dbx)
6735fd5 fix: correct client tests (varun-edachali-dbx)
12388e8 formatting (black) (varun-edachali-dbx)
853e6c3 remove excess comments (nit) (varun-edachali-dbx)
4f8b54c Update CODEOWNERS (#562) (jprakash-db)
12ee56f Enhance Cursor close handling and context manager exception managemen… (madhav-db)
a5152d0 PECOBLR-86 improve logging on python driver (#556) (saishreeeee)
2c2984e Update github actions run conditions (#569) (jprakash-db)
4979422 fix: thrift_backend->backend, ResultSet -> ThriftResultSet (varun-edachali-dbx)
6361e85 fix: thrift_backend -> backend in e2e test_driver (varun-edachali-dbx)
904c304 PySQL Connector split into connector and sqlalchemy (#444) (jprakash-db)
e5ac8c6 Removed CI CD for python3.8 (#490) (jprakash-db)
e06fd2f Added CI CD upto python 3.12 (#491) (jprakash-db)
7126f97 Merging changes from v3.7.1 release (#488) (jprakash-db)
b1245da Bumped up to version 4.0.0 (#493) (jprakash-db)
b54b04a Support Python 3.13 and update deps (#510) (dhirschfeld)
e1d7f71 Improve debugging + fix PR review template (#514) (samikshya-db)
1c721e0 Forward porting all changes into 4.x.x. uptil v3.7.3 (#529) (jprakash-db)
f1fa67a Updated the actions/cache version (#532) (jprakash-db)
512d37c Updated the CODEOWNERS (#531) (jprakash-db)
ff0ec64 Add version check for urllib3 in backoff calculation (#526) (shivam2680)
c5900c9 Support multiple timestamp formats in non arrow flow (#533) (jprakash-db)
c86e99b prepare release for v4.0.1 (#534) (shivam2680)
bcd6f01 Relaxed bound for python-dateutil (#538) (jprakash-db)
ca51d1d Bumped up the version for 4.0.2 (#539) (jprakash-db)
d403fcd Added example for async execute query (#537) (jprakash-db)
1b4154c Added urllib3 version check (#547) (jprakash-db)
eac0433 Bump version to 4.0.3 (#549) (jprakash-db)
1b0fc9b decouple session class from existing Connection (varun-edachali-dbx)
b45871e add open property to Connection to ensure maintenance of existing API (varun-edachali-dbx)
561c351 update unit tests to address ThriftBackend through session instead of… (varun-edachali-dbx)
a3188ad chore: move session specific tests from test_client to test_session (varun-edachali-dbx)
ff7abf6 formatting (black) (varun-edachali-dbx)
e7e2333 use connection open property instead of long chain through session (varun-edachali-dbx)
fd4fae6 trigger integration workflow (varun-edachali-dbx)
677e66a fix: ensure open attribute of Connection never fails (varun-edachali-dbx)
4ec8703 introduce databricksClient interface and thrift backend implementation (varun-edachali-dbx)
6ecb6bf change names of ThriftBackend -> ThriftDatabricksClient in tests (varun-edachali-dbx)
fe2ce17 formatting: black + re-organise backend into new dir (varun-edachali-dbx)
568b1f4 Update CODEOWNERS (#562) (jprakash-db)
ebbd150 remove un-necessary example change (varun-edachali-dbx)
aa8af45 [empty commit] trigger integration tests (varun-edachali-dbx)
e7be76b introduce normalised sessionId and CommandId for (near) complete back… (varun-edachali-dbx)
7461715 fix: Any is not defined (varun-edachali-dbx)
b295acd fix: get_session_id_hex() is not defined (varun-edachali-dbx)
8afc5d5 command_handle -> command_id in ExecuteResponse (varun-edachali-dbx)
b8f9146 fix: active op handle -> active command id in Cursor (varun-edachali-dbx)
7c733ee fixed (most) tests by accounting for normalised Session interface (varun-edachali-dbx)
0917ea1 fix: convert command id to operationHandle in status_request (varun-edachali-dbx)
3fd2a46 decouple session class from existing Connection (varun-edachali-dbx)
03d3ae7 add open property to Connection to ensure maintenance of existing API (varun-edachali-dbx)
fb0fa46 use connection open property instead of long chain through session (varun-edachali-dbx)
e3770cd trigger integration workflow (varun-edachali-dbx)
3b8002f fix: ensure open attribute of Connection never fails (varun-edachali-dbx)
f1a350a fix: de-complicate earlier connection open logic (varun-edachali-dbx)
98b0dc7 Revert "fix: de-complicate earlier connection open logic" (varun-edachali-dbx)
afee423 [empty commit] attempt to trigger ci e2e workflow (varun-edachali-dbx)
2d24fdd PECOBLR-86 improve logging on python driver (#556) (saishreeeee)
a5561e8 Revert "Merge remote-tracking branch 'upstream/sea-migration' into de… (varun-edachali-dbx)
0d890a5 Reapply "Merge remote-tracking branch 'upstream/sea-migration' into d… (varun-edachali-dbx)
c2aa762 fix: separate session opening logic from instantiation (varun-edachali-dbx)
fe642da chore: use get_handle() instead of private session attribute in client (varun-edachali-dbx)
394333c fix: remove accidentally removed assertions (varun-edachali-dbx)
ef07acd generalise open session, fix session tests to consider positional args (varun-edachali-dbx)
1ef46cf formatting (black) (varun-edachali-dbx)
76ca997 correct session logic after duplication during merge (varun-edachali-dbx)
afc6f8f args -> kwargs in tests (varun-edachali-dbx)
f6660ba delegate protocol version to SessionId (varun-edachali-dbx)
9871a93 ids -> backend/types (varun-edachali-dbx)
595d795 update open session with normalised SessionId (varun-edachali-dbx)
10ee940 remove merge artifacts, account for result set (varun-edachali-dbx)
3d75d6c fix: import CommandId in client tests (varun-edachali-dbx)
b8e1bbd expect session_id in protocol version getter (varun-edachali-dbx)
dac08f2 enforce ResultSet return in exec commands in backend client (varun-edachali-dbx)
7b0cbed abstract Command State away from Thrift specific types (varun-edachali-dbx)
9267ef9 close_command return not used, replacing with None and logging resp (varun-edachali-dbx)
5f00532 move py.typed to correct places (#403) (wyattscarpenter)
59ed5ce Upgrade mypy (#406) (wyattscarpenter)
c95951d Do not retry failing requests with status code 401 (#408) (Hodnebo)
335d918 [PECO-1715] Remove username/password (BasicAuth) auth option (#409) (jackyhu-db)
ee4f94c [PECO-1751] Refactor CloudFetch downloader: handle files sequentially… (kravets-levko)
1c8bb11 Fix CloudFetch retry policy to be compatible with all `urllib3` versi… (kravets-levko)
9de280e Disable SSL verification for CloudFetch links (#414) (kravets-levko)
04b626a Prepare relese 3.3.0 (#415) (kravets-levko)
2a01173 Fix pandas 2.2.2 support (#416) (kfollesdal)
b1faa09 [PECO-1801] Make OAuth as the default authenticator if no authenticat… (jackyhu-db)
270edcf [PECO-1857] Use SSL options with HTTPS connection pool (#425) (kravets-levko)
8523fd3 Prepare release v3.4.0 (#430) (kravets-levko)
763f070 [PECO-1926] Create a non pyarrow flow to handle small results for the… (jprakash-db)
1e0d9d5 [PECO-1961] On non-retryable error, ensure PySQL includes useful info… (shivam2680)
890cdd7 Reformatted all the files using black (#448) (jprakash-db)
9bdee1d Prepare release v3.5.0 (#457) (jackyhu-db)
cdd7a19 [PECO-2051] Add custom auth headers into cloud fetch request (#460) (jackyhu-db)
fcc2da9 Prepare release 3.6.0 (#461) (jackyhu-db)
d354309 [ PECO - 1768 ] PySQL: adjust HTTP retry logic to align with Go and N… (jprakash-db)
d63544e [ PECO-2065 ] Create the async execution flow for the PySQL Connector… (jprakash-db)
5bbf223 Fix for check_types github action failing (#472) (jprakash-db)
9c62b21 Remove upper caps on dependencies (#452) (arredond)
7bb7ca6 Updated the doc to specify native parameters in PUT operation is not … (jprakash-db)
438a080 Incorrect rows in inline fetch result (#479) (jprakash-db)
eb50411 Bumped up to version 3.7.0 (#482) (jprakash-db)
2a5b9c7 PySQL Connector split into connector and sqlalchemy (#444) (jprakash-db)
d31aa59 Removed CI CD for python3.8 (#490) (jprakash-db)
3e62c90 Added CI CD upto python 3.12 (#491) (jprakash-db)
f9a6b13 Merging changes from v3.7.1 release (#488) (jprakash-db)
a941575 Bumped up to version 4.0.0 (#493) (jprakash-db)
032c276 Updated action's version (#455) (newwingbird)
d36889d Support Python 3.13 and update deps (#510) (dhirschfeld)
22e5ce4 Improve debugging + fix PR review template (#514) (samikshya-db)
7772403 Forward porting all changes into 4.x.x. uptil v3.7.3 (#529) (jprakash-db)
8b27150 Updated the actions/cache version (#532) (jprakash-db)
398db45 Updated the CODEOWNERS (#531) (jprakash-db)
c962b63 Add version check for urllib3 in backoff calculation (#526) (shivam2680)
c246872 [ES-1372353] make user_agent_header part of public API (#530) (shivam2680)
326f338 Updates runner used to run DCO check to use databricks-protected-runn… (madhav-db)
37e73a9 Support multiple timestamp formats in non arrow flow (#533) (jprakash-db)
3d7123c prepare release for v4.0.1 (#534) (shivam2680)
132e1b7 Relaxed bound for python-dateutil (#538) (jprakash-db)
46090c0 Bumped up the version for 4.0.2 (#539) (jprakash-db)
28249c0 Added example for async execute query (#537) (jprakash-db)
5ab0a2c Added urllib3 version check (#547) (jprakash-db)
6528cd1 Bump version to 4.0.3 (#549) (jprakash-db)
8f7754b Cleanup fields as they might be deprecated/removed/change in the futu… (vikrantpuppala)
f7d3865 Refactor decimal conversion in PyArrow tables to use direct casting (… (jayantsing-db)
61cc398 [PECOBLR-361] convert column table to arrow if arrow present (#551) (shivam2680)
554d011 decouple session class from existing Connection (varun-edachali-dbx)
6f28297 add open property to Connection to ensure maintenance of existing API (varun-edachali-dbx)
983ec03 update unit tests to address ThriftBackend through session instead of… (varun-edachali-dbx)
6f3b5b7 chore: move session specific tests from test_client to test_session (varun-edachali-dbx)
29a2840 formatting (black) (varun-edachali-dbx)
0d28b69 use connection open property instead of long chain through session (varun-edachali-dbx)
8cb8cdd trigger integration workflow (varun-edachali-dbx)
4495f9b fix: ensure open attribute of Connection never fails (varun-edachali-dbx)
c744117 introduce databricksClient interface and thrift backend implementation (varun-edachali-dbx)
ef5a06b change names of ThriftBackend -> ThriftDatabricksClient in tests (varun-edachali-dbx)
abbaaa5 fix: remove excess debug log (varun-edachali-dbx)
33765cb fix: replace thrift_backend with backend in result set param (varun-edachali-dbx)
788d8c7 fix: replace module replacement with concrete mock instance in execut… (varun-edachali-dbx)
4debbd3 formatting: black + re-organise backend into new dir (varun-edachali-dbx)
0e6e215 fix: sql.thrift_backend -> sql.backend.thrift_backend in tests and ex… (varun-edachali-dbx)
925394c Update CODEOWNERS (#562) (jprakash-db)
4ad6c8d Enhance Cursor close handling and context manager exception managemen… (madhav-db)
51369c8 PECOBLR-86 improve logging on python driver (#556) (saishreeeee)
9541464 Update github actions run conditions (#569) (jprakash-db)
cbdd3d7 remove un-necessary example change (varun-edachali-dbx)
ca38e95 [empty commit] trigger integration tests (varun-edachali-dbx)
b40c0fd fix: use backend in Cursor, not thrift_backend (varun-edachali-dbx)
35ed462 fix: backend references in integration tests (varun-edachali-dbx)
37f3af1 fix: thrift_backend -> backend in ResultSet reference in e2e test (varun-edachali-dbx)
09c5e2f introduce normalised sessionId and CommandId for (near) complete back… (varun-edachali-dbx)
4ce6aab fix: Any is not defined (varun-edachali-dbx)
307f447 fix: get_session_id_hex() is not defined (varun-edachali-dbx)
802d8dc command_handle -> command_id in ExecuteResponse (varun-edachali-dbx)
944d446 fix: active op handle -> active command id in Cursor (varun-edachali-dbx)
6338083 fixed (most) tests by accounting for normalised Session interface (varun-edachali-dbx)
3658a91 fix: convert command id to operationHandle in status_request (varun-edachali-dbx)
8ef6ed6 decouple session class from existing Connection (varun-edachali-dbx)
61300b2 add open property to Connection to ensure maintenance of existing API (varun-edachali-dbx)
44e7d17 formatting (black) (varun-edachali-dbx)
d2035ea use connection open property instead of long chain through session (varun-edachali-dbx)
8b4451b trigger integration workflow (varun-edachali-dbx)
d21d2c3 fix: ensure open attribute of Connection never fails (varun-edachali-dbx)
21068a3 fix: de-complicate earlier connection open logic (varun-edachali-dbx)
476e763 Revert "fix: de-complicate earlier connection open logic" (varun-edachali-dbx)
1e1cf1e [empty commit] attempt to trigger ci e2e workflow (varun-edachali-dbx)
b408c2c PECOBLR-86 improve logging on python driver (#556) (saishreeeee)
73649f2 Revert "Merge remote-tracking branch 'upstream/sea-migration' into de… (varun-edachali-dbx)
a61df99 Reapply "Merge remote-tracking branch 'upstream/sea-migration' into d… (varun-edachali-dbx)
e1a2c0e fix: separate session opening logic from instantiation (varun-edachali-dbx)
71ba9d5 chore: use get_handle() instead of private session attribute in client (varun-edachali-dbx)
160ba9f fix: remove accidentally removed assertions (varun-edachali-dbx)
6b3436f generalise open session, fix session tests to consider positional args (varun-edachali-dbx)
30849dc formatting (black) (varun-edachali-dbx)
4d455bb correct session logic after duplication during merge (varun-edachali-dbx)
6fc0834 args -> kwargs in tests (varun-edachali-dbx)
d254e48 delegate protocol version to SessionId (varun-edachali-dbx)
370627d ids -> backend/types (varun-edachali-dbx)
ca1b57d update open session with normalised SessionId (varun-edachali-dbx)
6c120c0 Merging changes from v3.7.1 release (#488) (jprakash-db)
cdf6865 Support Python 3.13 and update deps (#510) (dhirschfeld)
12ce717 Updated the actions/cache version (#532) (jprakash-db)
1215fd8 Add version check for urllib3 in backoff calculation (#526) (shivam2680)
dd083f6 Support multiple timestamp formats in non arrow flow (#533) (jprakash-db)
8d30436 Added example for async execute query (#537) (jprakash-db)
066aef9 Added urllib3 version check (#547) (jprakash-db)
1ed3514 decouple session class from existing Connection (varun-edachali-dbx)
ca80f94 formatting (black) (varun-edachali-dbx)
6027fb1 use connection open property instead of long chain through session (varun-edachali-dbx)
7a2f9b5 trigger integration workflow (varun-edachali-dbx)
39294e9 fix: ensure open attribute of Connection never fails (varun-edachali-dbx)
709e910 Revert "fix: de-complicate earlier connection open logic" (varun-edachali-dbx)
1ad0ace [empty commit] attempt to trigger ci e2e workflow (varun-edachali-dbx)
913da63 PECOBLR-86 improve logging on python driver (#556) (saishreeeee)
d8159e7 Revert "Merge remote-tracking branch 'upstream/sea-migration' into de… (varun-edachali-dbx)
0b91183 Reapply "Merge remote-tracking branch 'upstream/sea-migration' into d… (varun-edachali-dbx)
ff78b5f fix: separate session opening logic from instantiation (varun-edachali-dbx)
c1d53d2 Enhance Cursor close handling and context manager exception managemen… (madhav-db)
a5a8e51 PECOBLR-86 improve logging on python driver (#556) (saishreeeee)
f7be10c New Complex type test table + Github Action changes (#575) (jprakash-db)
a888dd6 remove excess logs, assertions, instantiations (varun-edachali-dbx)
29a2985 Merge remote-tracking branch 'origin/sea-migration' into backend-inte… (varun-edachali-dbx)
9b9735e formatting (black) + remove excess log (merge artifact) (varun-edachali-dbx)
0a8226c fix typing (varun-edachali-dbx)
42263c4 remove un-necessary check (varun-edachali-dbx)
ac984e4 remove un-necessary replace call (varun-edachali-dbx)
8da84e8 introduce __str__ methods for CommandId and SessionId (varun-edachali-dbx)
f4f27e3 Merge remote-tracking branch 'origin/backend-interface' into fetch-in… (varun-edachali-dbx)
4e3ccce correct some merge artifacts (varun-edachali-dbx)
04eb8c1 replace match case with if else for compatibility with older python v… (varun-edachali-dbx)
ca425eb correct TOperationState literal, remove un-necessary check (varun-edachali-dbx)
7a47dd0 chore: remove duplicate def (varun-edachali-dbx)
00d9aeb correct typing (varun-edachali-dbx)
eecc67d docstrings for DatabricksClient interface (varun-edachali-dbx)
9800636 stronger typing of Cursor and ExecuteResponse (varun-edachali-dbx)
e07f56c remove utility functions from backend interface, fix circular import (varun-edachali-dbx)
73fb141 rename info to properties (varun-edachali-dbx)
d838653 newline for cleanliness (varun-edachali-dbx)
6654f06 fix circular import (varun-edachali-dbx)
89425f9 formatting (black) (varun-edachali-dbx)
93e55e8 to_hex_id -> get_hex_id (varun-edachali-dbx)
7689d75 better comment on protocol version getter (varun-edachali-dbx)
1ec8c45 formatting (black) (varun-edachali-dbx)
80b7bc3 Merge remote-tracking branch 'origin/backend-interface' into fetch-in… (varun-edachali-dbx)
904efe7 stricter typing for cursor (varun-edachali-dbx)
c91bc37 correct typing (varun-edachali-dbx)
42a0d08 init sea backend (varun-edachali-dbx)
1056bc2 move test script into experimental dir (varun-edachali-dbx)
d59880c formatting (black) (varun-edachali-dbx)
16ff4ec cleanup: removed excess comments, validated decisions (varun-edachali-dbx)
95b0781 add unit tests for sea backend (varun-edachali-dbx)
Files changed
14 changes: 11 additions & 3 deletions .github/workflows/code-quality-checks.yml
10 changes: 7 additions & 3 deletions .github/workflows/integration.yml
65 changes: 65 additions & 0 deletions examples/experimental/sea_connector_test.py
```python
import os
import sys
import logging
from databricks.sql.client import Connection

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger(__name__)

def test_sea_session():
    """
    Test opening and closing a SEA session using the connector.

    This function connects to a Databricks SQL endpoint using the SEA backend,
    opens a session, and then closes it.

    Required environment variables:
    - DATABRICKS_SERVER_HOSTNAME: Databricks server hostname
    - DATABRICKS_HTTP_PATH: HTTP path for the SQL endpoint
    - DATABRICKS_TOKEN: Personal access token for authentication
    """
    server_hostname = os.environ.get("DATABRICKS_SERVER_HOSTNAME")
    http_path = os.environ.get("DATABRICKS_HTTP_PATH")
    access_token = os.environ.get("DATABRICKS_TOKEN")
    catalog = os.environ.get("DATABRICKS_CATALOG")

    if not all([server_hostname, http_path, access_token]):
        logger.error("Missing required environment variables.")
        logger.error("Please set DATABRICKS_SERVER_HOSTNAME, DATABRICKS_HTTP_PATH, and DATABRICKS_TOKEN.")
        sys.exit(1)

    logger.info(f"Connecting to {server_hostname}")
    logger.info(f"HTTP Path: {http_path}")
    if catalog:
        logger.info(f"Using catalog: {catalog}")

    try:
        logger.info("Creating connection with SEA backend...")
        connection = Connection(
            server_hostname=server_hostname,
            http_path=http_path,
            access_token=access_token,
            catalog=catalog,
            schema="default",
            use_sea=True,
            user_agent_entry="SEA-Test-Client"  # add custom user agent
        )

        logger.info(f"Successfully opened SEA session with ID: {connection.get_session_id_hex()}")
        logger.info(f"backend type: {type(connection.session.backend)}")

        # Close the connection
        logger.info("Closing the SEA session...")
        connection.close()
        logger.info("Successfully closed SEA session")

    except Exception as e:
        logger.error(f"Error testing SEA session: {str(e)}")
        import traceback
        logger.error(traceback.format_exc())
        sys.exit(1)

    logger.info("SEA session test completed successfully")

if __name__ == "__main__":
    test_sea_session()
```
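For reference, a minimal sketch of driving this experimental script from Python rather than a shell wrapper. The hostname, HTTP path, and token values below are placeholders (assumptions, not real credentials), and the relative script path assumes the command is run from the repository root.

```python
# Minimal sketch: run the experimental SEA session test with the required
# environment variables set in-process. All three values below are placeholders.
import os
import runpy

os.environ.setdefault("DATABRICKS_SERVER_HOSTNAME", "example.cloud.databricks.com")  # placeholder
os.environ.setdefault("DATABRICKS_HTTP_PATH", "/sql/1.0/warehouses/<warehouse-id>")  # placeholder
os.environ.setdefault("DATABRICKS_TOKEN", "<personal-access-token>")                 # placeholder

# Equivalent to running `python examples/experimental/sea_connector_test.py` from the repo root.
runpy.run_path("examples/experimental/sea_connector_test.py", run_name="__main__")
```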
212 changes: 212 additions & 0 deletions src/databricks/sql/backend/databricks_client.py
```python
"""
Abstract client interface for interacting with Databricks SQL services.

Implementations of this class are responsible for:
- Managing connections to Databricks SQL services
- Handling authentication
- Executing SQL queries and commands
- Retrieving query results
- Fetching metadata about catalogs, schemas, tables, and columns
- Managing error handling and retries
"""

from abc import ABC, abstractmethod
from typing import Dict, Tuple, List, Optional, Any, Union, TYPE_CHECKING

if TYPE_CHECKING:
    from databricks.sql.client import Cursor

from databricks.sql.thrift_api.TCLIService import ttypes
from databricks.sql.backend.types import SessionId, CommandId, CommandState
from databricks.sql.utils import ExecuteResponse
from databricks.sql.types import SSLOptions

# Forward reference for type hints
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from databricks.sql.result_set import ResultSet


class DatabricksClient(ABC):
    # == Connection and Session Management ==

    @abstractmethod
    def open_session(
        self,
        session_configuration: Optional[Dict[str, Any]],
        catalog: Optional[str],
        schema: Optional[str],
    ) -> SessionId:
        """
        Opens a new session with the Databricks SQL service.

        This method establishes a new session with the server and returns a session
        identifier that can be used for subsequent operations.

        Args:
            session_configuration: Optional dictionary of configuration parameters for the session
            catalog: Optional catalog name to use as the initial catalog for the session
            schema: Optional schema name to use as the initial schema for the session

        Returns:
            SessionId: A session identifier object that can be used for subsequent operations

        Raises:
            Error: If the session configuration is invalid
            OperationalError: If there's an error establishing the session
            InvalidServerResponseError: If the server response is invalid or unexpected
        """
        pass

    @abstractmethod
    def close_session(self, session_id: SessionId) -> None:
        """
        Closes an existing session with the Databricks SQL service.

        This method terminates the session identified by the given session ID and
        releases any resources associated with it.

        Args:
            session_id: The session identifier returned by open_session()

        Raises:
            ValueError: If the session ID is invalid
            OperationalError: If there's an error closing the session
        """
        pass

    # == Query Execution, Command Management ==

    @abstractmethod
    def execute_command(
        self,
        operation: str,
        session_id: SessionId,
        max_rows: int,
        max_bytes: int,
        lz4_compression: bool,
        cursor: "Cursor",
        use_cloud_fetch: bool,
        parameters: List[ttypes.TSparkParameter],
        async_op: bool,
        enforce_embedded_schema_correctness: bool,
    ) -> Union["ResultSet", None]:
        pass

    @abstractmethod
    def cancel_command(self, command_id: CommandId) -> None:
        """
        Cancels a running command or query.

        This method attempts to cancel a command that is currently being executed.
        It can be called from a different thread than the one executing the command.

        Args:
            command_id: The command identifier to cancel

        Raises:
            ValueError: If the command ID is invalid
            OperationalError: If there's an error canceling the command
        """
        pass

    @abstractmethod
    def close_command(self, command_id: CommandId) -> None:
        pass

    @abstractmethod
    def get_query_state(self, command_id: CommandId) -> CommandState:
        pass

    @abstractmethod
    def get_execution_result(
        self,
        command_id: CommandId,
        cursor: "Cursor",
    ) -> "ResultSet":
        pass

    # == Metadata Operations ==

    @abstractmethod
    def get_catalogs(
        self,
        session_id: SessionId,
        max_rows: int,
        max_bytes: int,
        cursor: "Cursor",
    ) -> "ResultSet":
        pass

    @abstractmethod
    def get_schemas(
        self,
        session_id: SessionId,
        max_rows: int,
        max_bytes: int,
        cursor: "Cursor",
        catalog_name: Optional[str] = None,
        schema_name: Optional[str] = None,
    ) -> "ResultSet":
        pass

    @abstractmethod
    def get_tables(
        self,
        session_id: SessionId,
        max_rows: int,
        max_bytes: int,
        cursor: "Cursor",
        catalog_name: Optional[str] = None,
        schema_name: Optional[str] = None,
        table_name: Optional[str] = None,
        table_types: Optional[List[str]] = None,
    ) -> "ResultSet":
        pass

    @abstractmethod
    def get_columns(
        self,
        session_id: SessionId,
        max_rows: int,
        max_bytes: int,
        cursor: "Cursor",
        catalog_name: Optional[str] = None,
        schema_name: Optional[str] = None,
        table_name: Optional[str] = None,
        column_name: Optional[str] = None,
    ) -> "ResultSet":
        pass

    # == Properties ==

    @property
    @abstractmethod
    def staging_allowed_local_path(self) -> Union[None, str, List[str]]:
        """
        Gets the allowed local paths for staging operations.

        Returns:
            Union[None, str, List[str]]: The allowed local paths for staging operations,
            or None if staging is not allowed
        """
        pass

    @property
    @abstractmethod
    def ssl_options(self) -> SSLOptions:
        """
        Gets the SSL options for this client.

        Returns:
            SSLOptions: The SSL configuration options
        """
        pass

    @property
    @abstractmethod
    def max_download_threads(self) -> int:
        """
        Gets the maximum number of download threads for cloud fetch operations.

        Returns:
            int: The maximum number of download threads
        """
        pass
```
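To make the contract concrete, here is a rough sketch of the call sequence a higher-level Connection/Cursor would drive through a concrete DatabricksClient (Thrift today, SEA once the SeaDatabricksClient lands). It is illustrative only: the real driver wires these calls internally, the `client` and `cursor` objects are assumed to come from that wiring, and the row/byte limits are arbitrary placeholders.

```python
# Illustrative sketch of the session/command lifecycle defined by DatabricksClient.
# Assumes a connector build that ships databricks.sql.backend; `client` may be any
# concrete backend implementation and `cursor` the Cursor the driver associates with it.
from databricks.sql.backend.databricks_client import DatabricksClient


def run_query(client: DatabricksClient, cursor, query: str):
    # Session management: open the session, use it, and always close it.
    session_id = client.open_session(session_configuration=None, catalog=None, schema=None)
    try:
        # Synchronous execution returns a ResultSet (with async_op=True it may return None).
        result_set = client.execute_command(
            operation=query,
            session_id=session_id,
            max_rows=10_000,             # placeholder limit
            max_bytes=10 * 1024 * 1024,  # placeholder limit
            lz4_compression=False,
            cursor=cursor,
            use_cloud_fetch=False,
            parameters=[],
            async_op=False,
            enforce_embedded_schema_correctness=False,
        )
        return result_set
    finally:
        client.close_session(session_id)
```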