gkarthik/crawl-covid19-cases
Crawl cases and save to file

Dependencies

  - Scrapy (the crawl is run with the scrapy command-line tool)

Configuration

  1. Add a config file (./covid19/config.py) with the URLs to which ASCII tables are posted (use case: a Slack bot):
slack_sandiego_post_url = "<post-url>"
  2. Create the ./logs/ and ./data/ directories.
  3. Run this command from the base directory:
scrapy crawl --logfile logs/$(date +%Y-%m-%d-%H-%M.log) -o data/items.csv sandiego
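The directory-creation step can be done in one command (assuming a POSIX shell, run from the repository root):

```shell
# One-time setup: create the log and data directories the crawl writes to.
# -p makes this safe to re-run if the directories already exist.
mkdir -p logs data
```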

Contribution

Pick a region and write a spider. I've tried to keep the classes TestingStats and CaseCategories as general as possible.
