python code to wrangle the johns hopkins coronavirus website by state in python

To scrape the Johns Hopkins Coronavirus website by state using Python, you can use the BeautifulSoup library to parse the page's HTML and pandas to store the extracted data in a DataFrame.

Here is an example code to scrape and wrangle the data by state:

main.py
import requests
from bs4 import BeautifulSoup
import pandas as pd

# URL for the US state-level data
url = 'https://coronavirus.jhu.edu/us-map'

# Send a GET request and fail fast on HTTP errors
res = requests.get(url)
res.raise_for_status()

# Parse the HTML content using BeautifulSoup
soup = BeautifulSoup(res.content, 'html.parser')

# Extract the first table containing the state-level data.
# Note: this page renders much of its content with JavaScript,
# so the table may not be present in the raw HTML; if it isn't,
# a browser-automation tool such as Selenium would be needed.
tables = soup.find_all('table')
if not tables:
    raise RuntimeError('No table found in the page HTML')
table = tables[0]

# Loop through each row of the table (skipping the header row)
# and extract the state name, confirmed cases, and deaths
data = []
for row in table.find_all('tr')[1:]:
    cells = row.find_all('td')
    state = cells[0].text.strip()
    confirmed = cells[1].text.strip()
    deaths = cells[2].text.strip()
    data.append([state, confirmed, deaths])

# Convert the list into a pandas DataFrame
df = pd.DataFrame(data, columns=['State', 'Confirmed Cases', 'Deaths'])

# Print the first few rows of the DataFrame
print(df.head())

This code gives you a DataFrame of state-level Coronavirus data for the United States, including the number of confirmed cases and the number of deaths. You can then wrangle the data further as needed.
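As a sketch of that further wrangling: the scraped values arrive as strings with thousands separators, so a typical next step is to cast them to integers, derive a column, and sort. The sample rows below are hypothetical stand-ins for the scraped DataFrame; with the real scrape you would start from the `df` produced above instead.

import pandas as pd

# Hypothetical sample mirroring the shape of the scraped DataFrame;
# in practice, use the df produced by the scraping code above.
df = pd.DataFrame(
    [['California', '1,234,567', '12,345'],
     ['Texas', '987,654', '9,876']],
    columns=['State', 'Confirmed Cases', 'Deaths'],
)

# Strip the thousands separators and cast the counts to integers
for col in ['Confirmed Cases', 'Deaths']:
    df[col] = df[col].str.replace(',', '', regex=False).astype(int)

# Example wrangling: add a case-fatality ratio and sort by case count
df['CFR (%)'] = (df['Deaths'] / df['Confirmed Cases'] * 100).round(2)
df = df.sort_values('Confirmed Cases', ascending=False).reset_index(drop=True)
print(df)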

gistlib by LogSnag