In this article, we are going to see how to scrape Reddit using Python. We will be using Python's PRAW (Python Reddit API Wrapper) module to scrape the data. PRAW stands for Python Reddit API Wrapper, and it lets us access the Reddit API through Python scripts.
Installation
To install PRAW, run the following command on the command prompt:
pip install praw
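If you want to make sure the installation worked, a quick optional check is to import the module and print its version:

Python3
import praw

# Print the installed PRAW version to confirm the package is importable
print(praw.__version__)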
Creating a Reddit App
Step 1: To extract data from Reddit, we need to create a Reddit app. You can create a new Reddit app at https://www.reddit.com/prefs/apps.
Reddit - Create an App
Step 2: Click on "are you a developer? create an app...".
Step 3: A form like this will show up on your screen. Enter the name and description of your choice. In the redirect uri box, enter http://localhost:8080
App Form
Step 4: After entering the details, click on "create app".
Developed Application
The Reddit app has been created. Now, we can use Python and PRAW to scrape data from Reddit. Note down the client_id, secret, and user_agent values. These values will be used to connect to Reddit using Python.
Creating a PRAW Instance
In order to connect to Reddit, we need to create a PRAW instance. There are two types of PRAW instances:
- Read-only Instance: Using read-only instances, we can only scrape publicly available information on Reddit. For example, retrieving the top 5 posts from a particular subreddit.
- Authorized Instance: Using an authorized instance, you can do everything you do with your Reddit account. Actions like upvote, post, comment, etc., can be performed.
Python3
import praw

# Read-only instance
reddit_read_only = praw.Reddit(client_id="",       # your client id
                               client_secret="",   # your client secret
                               user_agent="")      # your user agent

# Authorized instance
reddit_authorized = praw.Reddit(client_id="",       # your client id
                                client_secret="",   # your client secret
                                user_agent="",      # your user agent
                                username="",        # your reddit username
                                password="")        # your reddit password
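Hard-coding credentials is fine for a quick test, but as an alternative you can read them from environment variables. A minimal sketch is shown below; the variable names REDDIT_CLIENT_ID, REDDIT_CLIENT_SECRET, and REDDIT_USER_AGENT are example names of my choosing, not anything PRAW requires.

Python3
import os
import praw

# Read credentials from environment variables instead of hard-coding them.
# Set REDDIT_CLIENT_ID, REDDIT_CLIENT_SECRET and REDDIT_USER_AGENT in your
# shell before running the script (the names are just examples).
reddit_read_only = praw.Reddit(
    client_id=os.environ["REDDIT_CLIENT_ID"],
    client_secret=os.environ["REDDIT_CLIENT_SECRET"],
    user_agent=os.environ["REDDIT_USER_AGENT"],
)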
Now that we have created an instance, we can use Reddit's API to extract data. In this tutorial, we will only be using the read-only instance.
Scraping Reddit Subreddits
There are different ways of extracting data from a subreddit. The posts in a subreddit are sorted as hot, new, top, controversial, etc. You can use any sorting method of your choice.
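For example, assuming the read-only instance created above, the other sort orders are exposed as similarly named methods on the subreddit object. The subreddit name and limits below are arbitrary choices for illustration:

Python3
subreddit = reddit_read_only.subreddit("Python")

# Newest posts in the subreddit
for post in subreddit.new(limit=3):
    print("New:", post.title)

# Most controversial posts of the past week
for post in subreddit.controversial(time_filter="week", limit=3):
    print("Controversial:", post.title)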
Let's extract some information from the redditdev subreddit.
Python3
import praw
import pandas as pd

reddit_read_only = praw.Reddit(client_id="",       # your client id
                               client_secret="",   # your client secret
                               user_agent="")      # your user agent

subreddit = reddit_read_only.subreddit("redditdev")

# Display the name of the Subreddit
print("Display Name:", subreddit.display_name)

# Display the title of the Subreddit
print("Title:", subreddit.title)

# Display the description of the Subreddit
print("Description:", subreddit.description)
Output:
Name, Title, and Description
Now let's extract 5 hot posts from the Python subreddit:
Python3
subreddit = reddit_read_only.subreddit("Python")

for post in subreddit.hot(limit=5):
    print(post.title)
    print()
Output:
Top 5 hot posts
We will now save the top posts of the Python subreddit in a pandas data frame:
Python3
# Scraping the top posts of the current month
posts = subreddit.top(time_filter="month")

posts_dict = {"Title": [], "Post Text": [],
              "ID": [], "Score": [],
              "Total Comments": [], "Post URL": []}

for post in posts:
    # Title of each post
    posts_dict["Title"].append(post.title)

    # Text inside a post
    posts_dict["Post Text"].append(post.selftext)

    # Unique ID of each post
    posts_dict["ID"].append(post.id)

    # The score of a post
    posts_dict["Score"].append(post.score)

    # Total number of comments inside the post
    posts_dict["Total Comments"].append(post.num_comments)

    # URL of each post
    posts_dict["Post URL"].append(post.url)

# Saving the data in a pandas dataframe
top_posts = pd.DataFrame(posts_dict)
top_posts
Output:
Top posts of the Python subreddit
Exporting Data to a CSV File:
Python3
import pandas as pd

top_posts.to_csv("Top Posts.csv", index=True)
Output:
CSV File of Top Posts
Scraping Reddit Posts:
To extract data from Reddit posts, we need the URL of the post. Once we have the URL, we need to create a submission object.
Python3
import praw
import pandas as pd

reddit_read_only = praw.Reddit(client_id="",       # your client id
                               client_secret="",   # your client secret
                               user_agent="")      # your user agent

# URL of the post
url = "https://www.reddit.com/r/IAmA/comments/m8n4vt/\
im_bill_gates_cochair_of_the_bill_and_melinda/"

# Creating a submission object
submission = reddit_read_only.submission(url=url)
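Before looking at the comments, you can optionally inspect a few basic attributes of the submission object to confirm that the right post was loaded:

Python3
# Basic information about the post we just loaded
print("Title:", submission.title)
print("Score:", submission.score)
print("Number of comments:", submission.num_comments)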
We will extract the best comments from the post we have selected. For this, we will need the MoreComments object from the praw module. To extract the comments, we will use a for-loop on the submission object. All the comments will be added to the post_comments list. We will also add an if-statement in the for-loop to check whether a comment has the object type MoreComments. If it does, it is a placeholder for additional nested comments rather than an actual comment, so we skip it and move on. Finally, we will convert the list into a pandas data frame.
Python3
from praw.models import MoreComments

post_comments = []

for comment in submission.comments:
    if type(comment) == MoreComments:
        continue

    post_comments.append(comment.body)

# creating a dataframe
comments_df = pd.DataFrame(post_comments, columns=['comment'])
comments_df
Output:
Comments converted into a pandas dataframe
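Note that the loop above simply skips the MoreComments placeholders, so deeply nested replies are left out. If you want the full comment tree, PRAW provides the replace_more() method on the comment forest; a minimal sketch:

Python3
# Replace all MoreComments placeholders so the full comment tree is loaded.
# limit=None keeps replacing until none remain (this can be slow for large
# threads because it triggers additional API requests).
submission.comments.replace_more(limit=None)

# comments.list() flattens the tree into a single list of comments
all_comments = [comment.body for comment in submission.comments.list()]
print("Total comments fetched:", len(all_comments))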