Pedro changes for videos #5


Merged
jleetutorial merged 3 commits into master from pedro-changes_for_videos
Dec 17, 2017
33 changes: 0 additions & 33 deletions build.gradle

This file was deleted.

Binary file removed gradle/wrapper/gradle-wrapper.jar
6 changes: 0 additions & 6 deletions gradle/wrapper/gradle-wrapper.properties

This file was deleted.

160 changes: 0 additions & 160 deletions gradlew

This file was deleted.

90 changes: 0 additions & 90 deletions gradlew.bat

This file was deleted.

7 changes: 3 additions & 4 deletions pairRdd/aggregation/reducebykey/WordCount.py
@@ -1,9 +1,8 @@
-from pyspark import SparkContext
+from pyspark import SparkContext, SparkConf

 if __name__ == "__main__":

-    sc = SparkContext("local", "wordCounts")
-    sc.setLogLevel("ERROR")
+    conf = SparkConf().setAppName("wordCounts").setMaster("local[3]")
+    sc = SparkContext(conf = conf)

     lines = sc.textFile("in/word_count.text")
     wordRdd = lines.flatMap(lambda line: line.split(" "))
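The diff keeps the word-count pipeline itself untouched; only the context construction changes. As a minimal pure-Python sketch of what the `flatMap` + `reduceByKey` steps compute (no Spark required; the input lines are hypothetical stand-ins for `in/word_count.text`):

```python
from collections import defaultdict

# Hypothetical stand-in for the contents of in/word_count.text
lines = ["spark makes big data simple", "big data big insights"]

# flatMap(lambda line: line.split(" ")): one flat list of words
words = [w for line in lines for w in line.split(" ")]

# map to (word, 1) pairs, then reduceByKey(lambda x, y: x + y)
counts = defaultdict(int)
for w in words:
    counts[w] += 1

print(dict(counts))
```

This mirrors the RDD semantics locally; in the actual script the same work is distributed across the `local[3]` threads named in the new `SparkConf`.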
7 changes: 3 additions & 4 deletions pairRdd/create/PairRddFromRegularRdd.py
@@ -1,9 +1,8 @@
-from pyspark import SparkContext
+from pyspark import SparkContext, SparkConf

 if __name__ == "__main__":

-    sc = SparkContext("local", "create")
-    sc.setLogLevel("ERROR")
+    conf = SparkConf().setAppName("create").setMaster("local")
+    sc = SparkContext(conf = conf)

     inputStrings = ["Lily 23", "Jack 29", "Mary 29", "James 8"]
     regularRDDs = sc.parallelize(inputStrings)
7 changes: 3 additions & 4 deletions pairRdd/create/PairRddFromTupleList.py
@@ -1,9 +1,8 @@
-from pyspark import SparkContext
+from pyspark import SparkContext, SparkConf

 if __name__ == "__main__":

-    sc = SparkContext("local", "create")
-    sc.setLogLevel("ERROR")
+    conf = SparkConf().setAppName("create").setMaster("local")
+    sc = SparkContext(conf = conf)

     tuples = [("Lily", 23), ("Jack", 29), ("Mary", 29), ("James", 8)]
     pairRDD = sc.parallelize(tuples)
15 changes: 9 additions & 6 deletions pairRdd/groupbykey/GroupByKeyVsReduceByKey.py
@@ -1,18 +1,21 @@
-from pyspark import SparkContext
+from pyspark import SparkContext, SparkConf

 if __name__ == "__main__":

-    sc = SparkContext("local", "GroupByKeyVsReduceByKey")
-    sc.setLogLevel("ERROR")
+    conf = SparkConf().setAppName('GroupByKeyVsReduceByKey').setMaster("local[*]")
+    sc = SparkContext(conf = conf)

     words = ["one", "two", "two", "three", "three", "three"]
     wordsPairRdd = sc.parallelize(words).map(lambda word: (word, 1))

-    wordCountsWithReduceByKey = wordsPairRdd.reduceByKey(lambda x, y: x + y).collect()
+    wordCountsWithReduceByKey = wordsPairRdd \
+        .reduceByKey(lambda x, y: x + y) \
+        .collect()
     print("wordCountsWithReduceByKey: {}".format(list(wordCountsWithReduceByKey)))

     wordCountsWithGroupByKey = wordsPairRdd \
         .groupByKey() \
-        .mapValues(lambda intIterable:len(intIterable)) \
+        .mapValues(len) \
         .collect()
     print("wordCountsWithGroupByKey: {}".format(list(wordCountsWithGroupByKey)))
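Beyond the `SparkConf` migration, this diff only reformats the chains and swaps `lambda intIterable: len(intIterable)` for the equivalent bare `len`. The point the file demonstrates, that `reduceByKey` and `groupByKey` followed by `mapValues(len)` yield the same counts, can be sketched without Spark (the helper functions below are local illustrations, not Spark APIs):

```python
words = ["one", "two", "two", "three", "three", "three"]
pairs = [(w, 1) for w in words]

# Local sketch of reduceByKey: fold values per key with the given function
def reduce_by_key(pairs, fn):
    out = {}
    for k, v in pairs:
        out[k] = fn(out[k], v) if k in out else v
    return sorted(out.items())

# Local sketch of groupByKey().mapValues(len): collect all values per
# key first, then count them (shuffles every value in real Spark)
def group_by_key_counts(pairs):
    groups = {}
    for k, v in pairs:
        groups.setdefault(k, []).append(v)
    return sorted((k, len(vs)) for k, vs in groups.items())

print(reduce_by_key(pairs, lambda x, y: x + y))
print(group_by_key_counts(pairs))
```

Both print `[('one', 1), ('three', 3), ('two', 2)]`; the difference in real Spark is that `reduceByKey` combines values map-side before the shuffle, while `groupByKey` ships every `(word, 1)` pair across the network.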


7 changes: 3 additions & 4 deletions pairRdd/join/JoinOperations.py
@@ -1,9 +1,8 @@
-from pyspark import SparkContext
+from pyspark import SparkContext, SparkConf

 if __name__ == "__main__":

-    sc = SparkContext("local", "JoinOperations")
-    sc.setLogLevel("ERROR")
+    conf = SparkConf().setAppName("JoinOperations").setMaster("local[1]")
+    sc = SparkContext(conf = conf)

     ages = sc.parallelize([("Tom", 29), ("John", 22)])
     addresses = sc.parallelize([("James", "USA"), ("John", "UK")])
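The two RDDs above feed an inner join on the key, which keeps only keys present on both sides. A minimal pure-Python sketch of that semantics (a local illustration, not Spark's shuffle-based implementation):

```python
ages = [("Tom", 29), ("John", 22)]
addresses = [("James", "USA"), ("John", "UK")]

# Local sketch of RDD.join: pair each left value with every right
# value sharing the same key; keys on only one side are dropped
def inner_join(left, right):
    rmap = {}
    for k, v in right:
        rmap.setdefault(k, []).append(v)
    return [(k, (lv, rv)) for k, lv in left for rv in rmap.get(k, [])]

print(inner_join(ages, addresses))  # [('John', (22, 'UK'))]
```

Only "John" appears in both datasets, so the join produces the single pair `('John', (22, 'UK'))`; "Tom" and "James" would survive a `leftOuterJoin` or `rightOuterJoin` respectively, but not the plain `join`.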
