Local Development Setup
This document defines engineering standards for setting up a backend project locally.
Rule: This page should stay generic. Project-specific setup details belong in the project repo’s own docs (and must be the source of truth for exact versions, ports, and environment variables).
Prerequisites
What every backend repo must define
- Supported Python version(s): a single pinned version (or a tight range) documented in the repo (at minimum in the README).
- Database requirements: the database engine + any required extensions documented in the repo.
- Dependency management: one clear install path documented in the repo (recommended: a pyproject.toml-based install, if the repo is packaged).
- How to run locally: one command (or Make target) documented in the repo.
Common “Python version pin” files (optional)
These are not required by the standards. They’re just common ways projects make the supported Python version explicit:
- .python-version: a tiny file used by tools like pyenv to auto-select the Python version in a folder.
- pyproject.toml: a standard Python project config file; many repos use it to declare supported Python versions and tool settings.
- runtime.txt: a simple “runtime hint” file used by some deployment platforms to pick a Python version.
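For reference, the same supported version might appear in these files roughly like this (the version number is illustrative only; the exact runtime.txt format depends on the platform):

```
# .python-version (read by pyenv)
3.12

# pyproject.toml (PEP 621 project metadata)
[project]
requires-python = ">=3.12,<3.13"

# runtime.txt (some deployment platforms; format varies)
python-3.12
```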
Baseline local dependencies (generic)
- Git: required.
- Python: required.
- A Postgres client (CLI or GUI): recommended for day-to-day debugging.
Non-standard (never required by these standards):
- IDE choice (use what you prefer).
- Docker/containers (nice to have in some environments, but must not be required for all developers).
- Any specific API client application (optional).
Initial Setup
1. Clone and Navigate to Repository
git clone <repository-url>
cd <project-name>
2. Create Virtual Environment
Using venv (recommended):
python3 -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
3. Install Dependencies
Rule: Install dependencies using the repo’s declared mechanism (don’t invent a new one).
pip install --upgrade pip
# Example: pyproject.toml based repos (recommended)
pip install -e .
# Example: requirements.txt based repos (allowed)
# pip install -r requirements.txt
4. Install Development Tooling (repo-defined)
Rule: Dev tooling must be pinned and repeatable (CI and local should use the same versions).
Acceptable patterns include (use one; document it in the repo):
- A pyproject.toml workflow with extras (e.g., pip install -e ".[dev]") when the repo is packaged.
- A single requirements.txt that includes both runtime + dev tools (allowed).
- A separate dev requirements file (commonly named requirements-dev.txt) (allowed).
- A tool-managed lockfile workflow (if the repo uses one).
If you’re unsure which one applies, look for the repo’s “getting started” instructions and CI configuration.
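As a sketch of the recommended pyproject.toml pattern (package name, tool choices, and version pins below are illustrative; the repo’s own file is authoritative):

```toml
[project]
name = "example-service"        # illustrative name
requires-python = ">=3.12"
dependencies = [
    "fastapi==0.115.0",         # illustrative runtime pin
]

[project.optional-dependencies]
dev = [
    "black==24.8.0",            # illustrative dev pins; CI should use the same
    "flake8==7.1.1",
    "mypy==1.11.2",
    "pytest==8.3.3",
]
```

With this shape, pip install -e ".[dev]" installs runtime and dev tooling in one step, and CI can run the identical command.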
Database Setup
Postgres + pgvector (standard)
Standard: Use PostgreSQL with pgvector available in all environments (local/dev/staging/prod), even if the project doesn’t use embeddings yet. This keeps teams “RAG-ready” without a later infrastructure scramble.
Notes:
- Some projects may not enable the extension in every database; if so, document that explicitly.
- Exact Postgres major version should match production where feasible and be documented in the repo.
Provisioning options (choose what works for your machine)
- Local Postgres install: good when you prefer native tooling.
- Remote dev database: good when local installs are constrained.
- Containerized Postgres: optional; do not assume it’s available.
Create database + enable pgvector
createdb <project_db_local>
# Enable pgvector (IF NOT EXISTS makes this safe to re-run)
psql <project_db_local> -c "CREATE EXTENSION IF NOT EXISTS vector;"
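To confirm the extension is actually in place, a quick sanity check (assumes psql access to the database):

```sql
-- Should return one row with the installed version if pgvector is enabled
SELECT extname, extversion FROM pg_extension WHERE extname = 'vector';

-- If it isn't, check whether the server has the extension available to CREATE
SELECT name, default_version FROM pg_available_extensions WHERE name = 'vector';
```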
Environment Configuration
.env standards (generic)
Rule: The repo must provide a committed .env.example (or equivalent) that:
- Contains placeholders only (no real keys, no secrets).
- Groups variables by category (database, auth, integrations, workers, etc.).
- Documents which variables are required for “minimal local run”.
Create your local .env (not committed) based on .env.example.
# Example shape only (project must define actual names/meaning)
ENVIRONMENT=local
# Database
DATABASE_URL=postgresql+asyncpg://<user>:<password>@<host>:<port>/<db_name>
# Auth / secrets
SECRET_KEY=<required>
# Optional feature flags
FEATURE_FLAG_EXAMPLE=0
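One lightweight way to make the “required for minimal local run” list enforceable is a small startup check. A sketch (the variable names follow the example shape above and are not mandated by these standards):

```python
import os
import sys

# Variables the example above treats as required for a minimal local run.
REQUIRED_VARS = ["ENVIRONMENT", "DATABASE_URL", "SECRET_KEY"]

def check_required_env(required=REQUIRED_VARS, env=os.environ):
    """Return the required variables that are missing or empty."""
    return [name for name in required if not env.get(name)]

if __name__ == "__main__":
    missing = check_required_env()
    if missing:
        print(f"Missing required environment variables: {', '.join(missing)}")
        sys.exit(1)
```

Calling this early (before the app boots) turns a confusing mid-request failure into an immediate, readable error.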
Running the Application
Development Server
Rule: Prefer a single documented “run locally” command (or Make target) so onboarding is consistent.
uvicorn <module_path>:<app_object> --reload
If the project uses a non-default host/port or requires auth for docs, that must be documented in the repo (not hard-coded in these standards).
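A Make target is one common way to provide that single documented command. A minimal sketch (the placeholders are the repo’s to fill in; the targets shown are not mandated):

```make
# Makefile (illustrative)
.PHONY: install run

install:
	pip install -e ".[dev]"

run:
	uvicorn <module_path>:<app_object> --reload
```

The benefit is that “make run” stays stable even if the underlying command changes.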
Code Quality Tools
These standards require a single baseline quality toolchain for Python backends.
Required baseline (standardized)
- Formatter: black
- Linter: flake8
- Type checker: mypy
- Tests: pytest (and async plugins if the repo is async)
Rule: These checks must run in CI for every PR. Running them locally is strongly recommended, but CI is the enforcement point.
What these mean (plain English)
- Formatter: automatically rewrites code into a consistent style (so the team doesn’t argue about whitespace).
- Linter: finds likely bugs and style issues (unused imports, suspicious code, etc.).
- Type checker: verifies type hints (helps catch certain classes of bugs earlier).
- Tests: verify behavior (the minimum bar even if you skip everything else).
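Since CI is the enforcement point, each repo needs a CI job that runs all four checks on every PR. As an illustration only (this sketch assumes GitHub Actions; the CI system, job names, and Python version are repo choices):

```yaml
# .github/workflows/checks.yml (illustrative)
name: checks
on: [pull_request]
jobs:
  quality:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"   # match the repo's pinned version
      - run: pip install -e ".[dev]"
      - run: python -m black --check .
      - run: python -m flake8 .
      - run: python -m mypy .
      - run: python -m pytest
```

Note black runs with --check in CI (fail on unformatted code) rather than rewriting files.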
Standard commands
If the tools are installed in your environment, use:
# Format code
python -m black .
# Lint code
python -m flake8 .
# Type-check code
python -m mypy .
# Run tests
python -m pytest
Install note (keep it simple)
Rule: Each repo must document exactly how to install these tools (and pin versions somewhere in the repo). Recommended: declare runtime deps + dev/test deps via pyproject.toml (extras), so pip install -e ".[dev]" installs everything needed for local work and CI.