Core Spark functionality.
org.apache.spark.SparkContext serves as the main entry point to Spark, while org.apache.spark.rdd.RDD is the data type representing a distributed collection, and provides most parallel operations.
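As a minimal sketch (the application name, master URL, and data below are placeholders, not anything mandated by Spark), a Scala program creates a SparkContext from a SparkConf and then works with RDDs:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkContextExample {
  def main(args: Array[String]): Unit = {
    // Placeholder app name and local master; real applications usually
    // receive the master from spark-submit instead of hard-coding it.
    val conf = new SparkConf().setAppName("example").setMaster("local[2]")
    val sc = new SparkContext(conf)

    // RDD[Int] built from a local collection; map and reduce run in parallel.
    val squares = sc.parallelize(1 to 100).map(x => x * x)
    println(squares.reduce(_ + _))

    sc.stop()
  }
}
```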
In addition, org.apache.spark.rdd.PairRDDFunctions contains operations available only on RDDs of key-value pairs, such as groupByKey and join; org.apache.spark.rdd.DoubleRDDFunctions contains operations available only on RDDs of Doubles; and org.apache.spark.rdd.SequenceFileRDDFunctions contains operations available on RDDs that can be saved as SequenceFiles. These operations are automatically available on any RDD of the right type (e.g. RDD[(Int, Int)]) through implicit conversions.
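For example, in the sketch below (with made-up data), an RDD[(String, Int)] picks up groupByKey and join from PairRDDFunctions, and an RDD[Double] picks up mean from DoubleRDDFunctions, with no explicit wrapping:

```scala
import org.apache.spark.SparkContext

// Assumes an existing SparkContext `sc`. In Spark releases before 1.3 the
// implicit conversions also required `import org.apache.spark.SparkContext._`.
def implicitConversionExamples(sc: SparkContext): Unit = {
  val scores = sc.parallelize(Seq(("alice", 1), ("bob", 2), ("alice", 3)))
  val ages   = sc.parallelize(Seq(("alice", 30), ("bob", 25)))

  // groupByKey and join come from PairRDDFunctions via implicit conversion.
  val grouped = scores.groupByKey()          // RDD[(String, Iterable[Int])]
  val joined  = scores.join(ages)            // RDD[(String, (Int, Int))]

  // mean comes from DoubleRDDFunctions via implicit conversion.
  val avg = sc.parallelize(Seq(1.0, 2.0, 3.0)).mean()

  println(grouped.collect().mkString(", "))
  println(joined.collect().mkString(", "))
  println(avg)
}
```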
Java programmers should reference the org.apache.spark.api.java package for Spark programming APIs in Java.
Classes and methods marked with Experimental are user-facing features which have not been officially adopted by the Spark project. These are subject to change or removal in minor releases.
Classes and methods marked with Developer API are intended for advanced users who want to extend Spark through lower-level interfaces. These are subject to change or removal in minor releases.
Spark Java programming APIs.
Set of interfaces to represent functions in Spark's Java API.
Users create implementations of these interfaces to pass functions to various Java API methods for Spark. Please visit Spark's Java programming guide for more details.
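A minimal sketch of that pattern (data and configuration are placeholders): an explicit implementation of the Function interface is passed to JavaRDD.map. The example is written in Scala, but the call pattern is the same for Java code:

```scala
import java.util.Arrays

import org.apache.spark.SparkConf
import org.apache.spark.api.java.JavaSparkContext
import org.apache.spark.api.java.function.Function

object JavaFunctionExample {
  def main(args: Array[String]): Unit = {
    // Placeholder configuration for a local run.
    val conf = new SparkConf().setAppName("java-function-example").setMaster("local[2]")
    val jsc = new JavaSparkContext(conf)

    val words = jsc.parallelize(Arrays.asList("spark", "rdd", "function"))

    // An explicit implementation of the Function interface, exactly as a
    // Java program would supply it; map invokes call() on each element.
    val length = new Function[String, Integer] {
      override def call(s: String): Integer = Integer.valueOf(s.length)
    }
    val lengths = words.map(length)   // JavaRDD[Integer]

    println(lengths.collect())
    jsc.stop()
  }
}
```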