This repository uses a Python script to import data from the GeoJSON endpoint into a SQLite database for use with Datasette.
- Python 3.6+
- `requests` library
- `sqlite3` (included with Python)
- `geojson-to-sqlite` tool
- Install dependencies:

  ```
  pip install requests datasette geojson-to-sqlite
  ```

- Run the script:

  ```
  python script.py
  ```
The script will:

- Fetch GeoJSON from Tampa's ArcGIS endpoint
- Create/update the SQLite database (`data.db`)
- Normalize timestamps (`CREATEDDATE`, `LASTUPDATE`)
- Track new entries with `date_added`
- Mark removed entries as `archived` with `date_archived`
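The normalize/track/archive steps above can be sketched as follows. This is an illustration, not the contents of `script.py`: the `normalize_timestamp` and `sync` helpers are hypothetical, and the assumption that ArcGIS timestamps arrive as epoch milliseconds is mine.

```python
from datetime import datetime, timezone

def normalize_timestamp(ms):
    """Convert an epoch-milliseconds value (assumed ArcGIS format)
    to an ISO 8601 string."""
    if ms is None:
        return None
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc).strftime(
        "%Y-%m-%dT%H:%M:%S")

def sync(existing, fetched, today):
    """Merge freshly fetched records into existing rows keyed by RECORDID,
    preserving date_added and archiving rows that left the feed."""
    fetched_ids = {r["RECORDID"] for r in fetched}
    merged = {}
    for r in fetched:
        prev = existing.get(r["RECORDID"], {})
        merged[r["RECORDID"]] = {
            **r,
            "date_added": prev.get("date_added", today),  # keep first-seen date
            "archived": 0,
            "date_archived": None,
        }
    # Rows no longer present in the feed are kept but flagged as archived
    for rid, row in existing.items():
        if rid not in fetched_ids:
            merged[rid] = {**row, "archived": 1,
                           "date_archived": row.get("date_archived") or today}
    return merged
```

Under this scheme records are never deleted, so entries that drop out of the city feed remain queryable in Datasette.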
- Schedule daily updates (optional):

  ```
  crontab -e
  ```

  Add this line to run at midnight daily:

  ```
  0 0 * * * cd /path/to/project && /usr/bin/python script.py
  ```
Deploy with Datasette:

- Start a local server:

  ```
  datasette data.db
  ```

- Deploy to the cloud (optional):

  ```
  datasette publish cloudrun data.db
  ```
For more deployment options, see Datasette documentation.
This repository uses GitHub Actions to:
- Run the script daily at midnight UTC
- Deploy updated database to Vercel via datasette-publish-vercel
To set up automated deployment:
- Create a Vercel account and obtain an API token
- Add the token as a GitHub repository secret named `VERCEL_TOKEN`
- Enable GitHub Actions in repository settings
- Push code to main branch to trigger initial deployment
The live deployment will be available at: https://your-project-name.vercel.app
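The workflow described above might look roughly like this. This is a sketch, not the repository's actual workflow file: the path, job names, action versions, and project name are all illustrative.

```yaml
# .github/workflows/update.yml (illustrative)
name: Update and deploy
on:
  schedule:
    - cron: "0 0 * * *"   # midnight UTC, matching the schedule above
  workflow_dispatch:       # allow manual runs

jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install requests datasette datasette-publish-vercel geojson-to-sqlite
      - run: python script.py
      - run: datasette publish vercel data.db --project your-project-name --token "$VERCEL_TOKEN"
        env:
          VERCEL_TOKEN: ${{ secrets.VERCEL_TOKEN }}
```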
The database contains these fields:

- `RECORDID` (Primary Key)
- Location: `ADDRESS`, `UNIT`, `geometry` (GeoJSON Point)
- Status: `APPSTATUS`, `TENTATIVEHEARING`, `TENTATIVETIME`
- Metadata: `CREATEDDATE`, `LASTUPDATE` (ISO format)
- System: `date_added`, `date_archived`, `archived`
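The field list above corresponds roughly to the following schema. The table name `permits` and the column types are assumptions for illustration; the actual table is created by geojson-to-sqlite, so details may differ.

```python
import sqlite3

# Hypothetical DDL mirroring the documented fields
SCHEMA = """
CREATE TABLE IF NOT EXISTS permits (
    RECORDID TEXT PRIMARY KEY,
    ADDRESS TEXT,
    UNIT TEXT,
    geometry TEXT,              -- GeoJSON Point stored as JSON text
    APPSTATUS TEXT,
    TENTATIVEHEARING TEXT,
    TENTATIVETIME TEXT,
    CREATEDDATE TEXT,           -- ISO 8601
    LASTUPDATE TEXT,            -- ISO 8601
    date_added TEXT,
    date_archived TEXT,
    archived INTEGER DEFAULT 0
)
"""

conn = sqlite3.connect(":memory:")  # the real script targets data.db
conn.execute(SCHEMA)
conn.execute(
    "INSERT INTO permits (RECORDID, ADDRESS, date_added) VALUES (?, ?, ?)",
    ("REC-1", "123 Main St", "2024-01-01"),
)
# New rows default to archived = 0
row = conn.execute(
    "SELECT archived FROM permits WHERE RECORDID = ?", ("REC-1",)
).fetchone()
```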
Temporary files:

- `temp.geojson` - deleted after import
- `data.db` - SQLite database (add to `.gitignore`)
Code is licensed under MIT. Data sourced from City of Tampa Open Data.