Scrape Hacktoberfest Events 2020 fixes #16 #24

Open
jiyoungsin wants to merge 1 commit into realpython:master from jiyoungsin:master
55 changes: 55 additions & 0 deletions scripts/35_scrape_hacktoberfest_events.py
@@ -0,0 +1,55 @@
import requests
import pandas
from bs4 import BeautifulSoup

# create a soup object from the HTML in the response
url = "https://hacktoberfest.digitalocean.com/events"
response = requests.get(url)
html = response.text
soup = BeautifulSoup(html, "html.parser")

# lists to collect the scraped data
all_names = []
all_locations = []
all_dates = []
all_time_zones = []
all_urls = []

# iterate over every "tr" row marked as a past event
for tr_element in soup.find_all("tr", attrs={"class": "past"}):

    # for each row, find the relevant cell and append its value to the matching list
    name_element = tr_element.find("td", attrs={"class": "event_name"})
    name = name_element.text.strip()
    all_names.append(name)

    location_element = tr_element.find("td", attrs={"class": "location"})
    location = location_element.text.strip()
    all_locations.append(location)

    date_element = tr_element.find("td", attrs={"data-label": "date"})
    date = date_element.text.strip()
    all_dates.append(date)

    time_zone_element = tr_element.find("td", attrs={"data-label": "zone"})
    time_zone = time_zone_element.text.strip()
    all_time_zones.append(time_zone)

    url_element = tr_element.find("a", attrs={"class": "emphasis"})
    url = url_element['href']
    all_urls.append(url)

# set up the comma-separated values (CSV) structure
csv_name = "events.csv"
csv_structure = {
    "Name": all_names,
    "Location": all_locations,
    "Date": all_dates,
    "Time Zone": all_time_zones,
    "URL": all_urls,
}
# create the CSV file
data_frame = pandas.DataFrame(csv_structure)
data_frame.to_csv(csv_name, index=False, encoding='utf-8')
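
As a quick sanity check, the generated events.csv can be read back with pandas (a minimal sketch, assuming the script above has already run and written events.csv to the working directory):

import pandas

# load the CSV produced by the scraper and preview the first rows
events = pandas.read_csv("events.csv")
print(events.head())
print(len(events), "events scraped")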


