zero323/pyspark-stubs (public archive)

This repository was archived by the owner on Nov 22, 2022. It is now read-only.

Apache (Py)Spark type annotations (stub files).

A collection of the Apache Spark stub files. These files were generated by stubgen and manually edited to include accurate type hints.
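For illustration, a stub (.pyi) fragment might look like the following. The signatures here are simplified sketches for explanation only, not the exact annotations shipped by this package:

```python
# Simplified sketch of a .pyi stub fragment -- illustrative signatures,
# not the actual annotations from pyspark-stubs.
from typing import List, Optional

class Row: ...

class DataFrame:
    # A stub declares signatures only; "..." stands in for each body.
    def filter(self, condition: str) -> "DataFrame": ...
    def take(self, num: int) -> List[Row]: ...
    def first(self) -> Optional[Row]: ...
```

A type checker reads these declarations instead of (or alongside) the runtime module, which is how tools like stubgen bootstrap annotations for packages that ship none.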

Tests and configuration files were originally contributed to the Typeshed project. Please refer to its contributors list and license for details.

Important

This project has been merged into the main Apache Spark repository (SPARK-32714). All further development for Spark 3.1 and onwards will continue there.

For Spark 2.4 and 3.0, development of this package will continue until their official deprecation.

Motivation

  • Static error detection (see SPARK-20631).

  • Improved autocompletion.

Installation and usage

Please note that the guidelines for distribution of type information are still a work in progress (PEP 561 - Distributing and Packaging Type Information). Currently the installation script overlays existing Spark installations (pyi stub files are copied next to their py counterparts in the PySpark installation directory). If this approach is not acceptable, you can add the stub files to the search path manually.
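As a sketch of the manual approach (the path below is hypothetical), you can point mypy at a directory holding the stubs via its search path instead of overlaying the Spark installation:

```shell
# Hypothetical location for the copied stub files; mypy consults MYPYPATH
# when resolving imports, so the Spark installation itself stays untouched.
export MYPYPATH="$HOME/stubs/pyspark-stubs"
# Then run the checker as usual, e.g.:
#   mypy my_spark_job.py
echo "MYPYPATH=$MYPYPATH"
```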

According to PEP 484:

    Third-party stub packages can use any location for stub storage. Type checkers should search for them using PYTHONPATH.

Moreover:

    Third-party stub packages can use any location for stub storage. Type checkers should search for them using PYTHONPATH. A default fallback directory that is always checked is shared/typehints/python3.5/ (or 3.6, etc.)

Please check usage before proceeding.

The package is available on PyPI:

pip install pyspark-stubs

and conda-forge:

conda install -c conda-forge pyspark-stubs

Depending on your environment, you might also need a type checker, like Mypy or Pytype [1], and an autocompletion tool, like Jedi.

Editor                                   | Type checking | Autocompletion | Notes
-----------------------------------------|---------------|----------------|-------------------------
Atom                                     | ✔ [2]         | ✔ [3]          | Through plugins.
IPython / Jupyter Notebook               | ✔ [4]         | ✔              |
PyCharm                                  | ✔             | ✔              |
PyDev                                    | ✔ [5]         | ?              |
VIM / Neovim                             | ✔ [6]         | ✔ [7]          | Through plugins.
Visual Studio Code                       | ✔ [8]         | ✔ [9]          | Completion with plugin.
Environment independent / other editors  | ✔ [10]        | ✔ [11]         | Through Mypy and Jedi.

This package is tested against the MyPy development branch and, in rare cases (primarily important upstream bugfixes), may not be compatible with the preceding MyPy release.

PySpark Version Compatibility

Package versions follow PySpark versions, with the exception of maintenance releases - i.e. pyspark-stubs==2.3.0 should be compatible with pyspark>=2.3.0,<2.4.0. Maintenance releases (post1, post2, ..., postN) are reserved for internal annotation updates.
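Under that scheme, pinning matching minor versions in a requirements file might look like this (the versions shown are only an example of the pattern):

```
pyspark>=2.3.0,<2.4.0
pyspark-stubs>=2.3.0,<2.4.0
```

Keeping both constraints on the same minor series ensures the annotations describe the API you actually run against.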

API Coverage:

As of release 2.4.0, most of the public API is covered. For details, please check the API coverage document.

See also

Disclaimer

Apache Spark, Spark, PySpark, Apache, and the Spark logo are trademarks of The Apache Software Foundation. This project is not owned, endorsed, or sponsored by The Apache Software Foundation.

Footnotes

[1] Not supported or tested.
[2] Requires atom-mypy or equivalent.
[3] Requires autocomplete-python-jedi or equivalent.
[4] It is possible to use magics to type check directly in the notebook. In general though, you'll have to export the whole notebook to a .py file and run the type checker on the result.
[5] Requires PyDev 7.0.3 or later.
[6] TODO Using vim-mypy, syntastic or Neomake.
[7] With jedi-vim.
[8] With the Mypy linter.
[9] With the Python extension for Visual Studio Code.
[10] Just use your favorite checker directly, optionally combined with a tool like entr.
[11] See the Jedi editor plugins list.
