# schedule-foamer

Celery worker that downloads GTFS static feeds, parses them, and writes to PostgreSQL tables managed by cafe-car.
## Quick start

```sh
cp .env.example .env
# Edit .env with your DATABASE_URL and Redis URLs
uv sync
uv run pre-commit install  # install git hooks (required once per clone)
uv run celery -A worker.celery_app worker -l info
```
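The `.env` values referenced above might look like the following. These are hypothetical placeholder values for illustration; the actual variable names and defaults are defined in `.env.example`:

```ini
# Hypothetical example values -- replace with your own
DATABASE_URL=postgresql://gtfs:gtfs@localhost:5432/gtfs
CELERY_BROKER_URL=redis://localhost:6379/0
CELERY_RESULT_BACKEND=redis://localhost:6379/1
```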
## Development commands

```sh
uv run pre-commit install       # install git hooks (required once per clone)
uv run ruff check src/          # lint
uv run ruff check --fix src/    # lint + autofix
```
## Docker

Built and run as part of cafe-car's docker-compose:

```sh
cd ../cafe-car
docker compose up --build
```
## Tasks

| Task | Description |
|---|---|
| `worker.tasks.load_feed` | Download and parse a single GTFS feed by ID |
| `worker.tasks.refresh_all_feeds` | Enqueue `load_feed` for every feed (runs daily at 02:00 UTC) |
| `worker.tasks.ensure_all_feeds_scheduled` | Re-enqueue feeds with no status or failed status (runs every minute) |