Compare commits


19 commits

Author SHA1 Message Date
2337747ca6
Merge branch 'dev' (staging/no-tag) 2025-06-06 18:10:00 -07:00
a58a0e642a
fix(deploy): remove invalid parameters for download-artifact, add back the missing setuptools-scm dependency 2025-06-06 18:09:01 -07:00
e183a990e7
Merge branch 'dev' (staging/no-tag) 2025-06-06 17:28:45 -07:00
291231c886
fix(deploy): correct env vars, docker compose project names, and workflow outputs
- Standardize environment variable from IS_PROD to PROD across all scripts
- Add missing -p flag to docker compose commands for consistent project naming
- Fix GitHub Actions workflow to use environment vars instead of job outputs
- Consolidate metadata setup and fix artifact naming in build/deploy jobs
- Correct service paths in docker-compose_core.yml
2025-06-06 17:26:50 -07:00
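The project-naming fix above can be sketched as a small helper (hypothetical; the real logic lives in the workflow and deploy scripts, and the `pkmntrade-club` project name is an assumption): `PROD` selects the compose files, and `-p` pins the compose project name so container names stay stable across deploys.

```shell
# Hypothetical sketch of the standardized invocation: PROD (was IS_PROD)
# selects the compose files, and -p fixes the project name.
compose_cmd() {
  local prod="$1" project="pkmntrade-club"   # project name is an assumption
  local files="-f docker-compose_core.yml"
  if [ "$prod" = "true" ]; then
    files="$files -f docker-compose_web.yml"
  else
    files="$files -f docker-compose_staging.yml"
  fi
  echo "docker compose -p $project $files up -d --no-build"
}

compose_cmd true    # prints the production invocation
compose_cmd false   # prints the staging invocation
```

Without `-p`, compose derives the project name from the working directory, which can differ between CI and the server and break `down`/`up` pairing.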
a012c2a2f6
Merge branch 'dev' (staging/no-tag) 2025-06-06 14:39:53 -07:00
f20c4f9474
feat: add dynamic versioning and automated deployment with rollback capability
- Implement setuptools-scm for dynamic version management from git tags
- Refactor CI/CD into separate build and deploy jobs with artifact sharing
- Add versioned releases with timestamp-based deployment directories
- Implement health checks and automatic rollback on deployment failure
- Extract deployment logic into reusable shell scripts
- Add Docker layer caching to speed up builds
- Include version info in Django context and build args
2025-06-06 14:38:23 -07:00
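The timestamp-based deployment directories with rollback described above can be sketched roughly like this (an illustration, not the project's actual deploy script; directory names are assumptions): each deploy gets its own timestamped release directory, and a `current` symlink is repointed to switch versions, so rollback is just repointing the link.

```shell
# Minimal sketch of timestamped release directories with a `current` symlink.
deploy_release() {
  local root="$1"
  local release="$root/releases/$(date +%Y%m%d%H%M%S)"
  mkdir -p "$release"
  # ... compose files, .env, and certs would be copied into "$release" here ...
  ln -sfn "$release" "$root/current"   # switch; rollback = repoint to old dir
  echo "$release"
}

root=$(mktemp -d)
first=$(deploy_release "$root")
sleep 1                               # ensure a distinct timestamp
second=$(deploy_release "$root")
echo "current -> $(readlink "$root/current")"
```

Because old release directories are kept, a failed health check can roll back by repointing `current` at the previous release and restarting the containers.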
445be58cd3
Merge branch 'dev' (staging/no-tag) 2025-06-01 19:11:28 -07:00
46619bd5e1
fix(templates): Add missing cache templatetag to trade_acceptance partial
This commit updates the `trade_acceptance.html` template partial to include the `cache` templatetag in its `{% load %}` directive, preventing a missing tag error.
2025-06-01 19:08:00 -07:00
48ea0eb48e
fix(dev): Resolve Dev Debug Environment Issues and Streamline Local Setup
This commit addresses issues with the local development and debugging environment, making the developer experience smoother and more reliable.

Key changes include:

- **VSCode Debugger:** Corrected launch configuration (`.vscode/launch.json`) to properly run the Django development server with debugging enabled. It now correctly sets `DEBUG=True`, uses `0.0.0.0:8000`, and specifies the correct working directory.
- **Docker Compose:** Exposed the PostgreSQL port (`5432:5432`) in `docker-compose.yml` to allow direct connections from the host, facilitating local development and debugging without needing to run the full application stack.
- **Environment Variables:**
    - Updated `.gitignore` to ignore all `.env.*` files, allowing for environment-specific configurations.
    - Modified `src/pkmntrade_club/django_project/settings.py` to use `localhost` for `DJANGO_DATABASE_URL` and `REDIS_URL` by default, aligning with the exposed Docker services for easier local development. Default `DISABLE_SIGNUPS` and `DISABLE_CACHE` are now `True` for a more typical local dev setup.
- **Management Commands & Scripts:**
    - Adjusted `manage.py` to correctly append the project's root directory to `sys.path`, resolving potential import issues when running management commands.
    - Significantly improved `scripts/reset-db_make-migrations_seed-data.sh`:
        - Removed reliance on sourcing `.env` directly.
        - Ensured the database service (`db`) is started independently before migrations.
        - Added explicit steps for running `prebuild.sh`, migrations, and `collectstatic`.
        - Switched to using `uv run manage.py loaddata` for seeding, which is more consistent with the project's tooling.
- **Django Settings:** Added `SESSION_ENGINE = "django.contrib.sessions.backends.signed_cookies"` and `SESSION_COOKIE_HTTPONLY = True` for improved session management and security.

These changes collectively fix the previously problematic development setup, enabling straightforward debugging and a more efficient workflow for local development.
2025-06-01 19:06:56 -07:00
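The "localhost by default" behavior the settings change gives can be illustrated with shell parameter defaults (the settings themselves use `environ.Env` in Python; the exact default URLs below are assumptions): an explicit environment variable always wins, and local dev gets sensible values without any `.env` file.

```shell
# Illustrative defaults matching the commit's intent; unset first for a clean demo.
unset DJANGO_DATABASE_URL REDIS_URL DISABLE_SIGNUPS DISABLE_CACHE
: "${DJANGO_DATABASE_URL:=postgres://postgres:postgres@localhost:5432/postgres}"
: "${REDIS_URL:=redis://localhost:6379/0}"
: "${DISABLE_SIGNUPS:=True}"   # safer default for local dev
: "${DISABLE_CACHE:=True}"

echo "db=$DJANGO_DATABASE_URL cache=$REDIS_URL"
```

With the PostgreSQL port exposed from Docker, these localhost defaults let the VSCode debugger run `manage.py runserver` directly against the containerized services.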
4af7512293
fix(settings): Remove erroneously added DJANGO_SETTINGS_MODULE from settings.py, as it was causing `ModuleNotFoundError: No module named 'pkmntrade_club.django_project.settings'` when using django-admin commands 2025-05-23 23:22:03 -07:00
552c74f626
Merge branch 'dev' (staging/no-tag) 2025-05-23 21:55:17 -07:00
b26ca10489
fix(docker): Add missing ALLOWED_HOSTS environment variables to docker-compose_web.yml and docker-compose_staging.yml to prevent security errors. 2025-05-23 21:54:55 -07:00
35894ab545
Merge branch 'dev' (staging/no-tag) 2025-05-23 21:47:53 -07:00
acbbc33efa
fix(static): For now, replace @ symbols in static js filenames, as granian doesn't handle URL-encoded static paths properly yet. Static file handling is faster with granian, though, so we want to keep it. 2025-05-23 21:35:52 -07:00
51de3c7a6d
feat(dev): Enable hot reloading and streamline local development
This commit significantly improves the local development experience by enabling hot reloading for the Django application. This is achieved by installing the project as an editable package within the Docker services.

Key changes:

- **Hot Reloading:**
    - Modified `docker-compose.yml` for `web` and `celery` services to use `uv pip install --editable . --no-deps`.
    - Mounted the project root (`./`) to `/code` in `web` and `celery` services to facilitate the editable install.
- **Docker & Build Enhancements:**
    - Added `uv` binary to stage-1 in the `Dockerfile` for faster package operations.
    - Adjusted file permissions in `Dockerfile` during the app copy.
    - Set `DEBUG=true` for the `web` service in `docker-compose.yml` for easier local debugging.
    - Changed `restart` policy to `unless-stopped` for `web` and `celery` dev services.
    - Added a healthcheck for the `redis` service in the dev `docker-compose.yml`.
- **Code & Script Cleanup:**
    - Removed the custom `HealthCheckView` from the `home` app, as health checks are now handled by django-health-checks.
    - Updated paths and commands in `scripts/entrypoint.sh`, `scripts/prebuild.sh`, and `scripts/reset-db_make-migrations_seed-data.sh` to align with the new setup and remove obsolete steps (e.g., db cache table creation; we now use redis).
2025-05-23 21:19:33 -07:00
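The redis healthcheck and `depends_on` ordering above boil down to a readiness wait. A generic sketch (the real checks are declared in compose YAML; the `probe` below is a stand-in that becomes ready on its third call):

```shell
# Retry a probe command until it succeeds or attempts run out.
wait_for() {
  local attempts="$1"; shift
  local i=1
  while ! "$@" >/dev/null 2>&1; do
    if [ "$i" -ge "$attempts" ]; then
      echo "gave up after $attempts attempts" >&2
      return 1
    fi
    i=$((i + 1))
    sleep 0.1
  done
  echo "ready after $i attempt(s)"
}

# Demo probe: succeeds on its third invocation.
marker=$(mktemp)
probe() {
  n=$(cat "$marker" 2>/dev/null || echo 0)
  n=$((n + 1)); echo "$n" > "$marker"
  [ "$n" -ge 3 ]
}

wait_for 10 probe   # prints "ready after 3 attempt(s)"
```

Compose's `condition: service_healthy` does the same thing declaratively, which is why `web` and `celery` no longer need ad-hoc sleeps before touching the database.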
02f23dba28
refactor(docker): Enhance settings.py and deployment
This commit significantly refactors the Docker setup and application
configuration for improved robustness and flexibility.

Key changes include:

- Centralized Environment Variables:
  - Default values for essential settings (database, email, cache, etc.)
    are now defined in `django_project/settings.py` using `environ.Env`.
    This provides sensible defaults and reduces reliance on `.env` files,
    especially during Docker image builds.
  - `docker-compose.yml` no longer defines environment variables directly
    for `web` and `worker` services, deferring to `.env` and settings defaults.

- Dockerfile & Entrypoint Improvements:
  - `DJANGO_SETTINGS_MODULE` is now exclusively set as an ENV in `Dockerfile`, instead of setting it in `entrypoint.sh`
  - `entrypoint.sh` now conditionally appends `--static-path-mount`
    only to the `granian` command, leveraging the upgraded Granian's
    (v2.3.0+) ability to serve static files directly. The `STATIC_ROOT` is
    dynamically fetched from Django settings.

- Dependency Updates:
  - Upgraded `granian` from 2.2.5 to 2.3.1.
  - Upgraded `click` from 8.2.0 to 8.2.1.
  - `uv.lock` reflects these and other minor transitive dependency updates.

- Configuration Adjustments in `settings.py`:
  - Added defaults for all env variables, set to sensible local dev values
  - Introduced a `SCHEME` environment variable (defaulting to 'http')
    used for `CSRF_TRUSTED_ORIGINS`, `META_SITE_PROTOCOL`,
    `ACCOUNT_DEFAULT_HTTP_PROTOCOL`, etc.
  - `TIME_ZONE` and various email settings (host, port, user, password, TLS)
    are now configurable via environment variables with defaults.
  - `CELERY_TIMEZONE` now defaults to the `TIME_ZONE` setting.
  - Removed the unused `SCW_SECRET_KEY` variable (previously used for
    EMAIL auth).
2025-05-23 18:46:29 -07:00
d4948e7cd3
fix: Ensure deploy script runs once and is part of entrypoint
The deploy.sh script is now re-added to the entrypoint.sh script
to ensure it runs only during first container startup.

A flag file (/flags/.deployed) is now created after a successful deployment.
The deploy.sh script checks for this flag and will not re-run
deployment steps unless FORCE_DEPLOY is set to true. This prevents
unnecessary re-runs of migrations, collectstatic, etc., on subsequent
container starts within the same deployment.

Corrected permissions for `/app/.cursor-server` and created a `/flags`
directory with appropriate permissions in the `Dockerfile`. Added
ENV DJANGO_SETTINGS_MODULE with default value to `Dockerfile`.
2025-05-23 18:39:29 -07:00
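The flag-file guard described above can be sketched as follows (paths are illustrative; the real flag is `/flags/.deployed` and the guarded steps are migrations, collectstatic, etc.): deploy steps run once per deployment, later container restarts skip them, and `FORCE_DEPLOY=true` overrides the skip.

```shell
# Idempotency guard: run deploy steps only if no flag file exists,
# unless FORCE_DEPLOY=true.
FLAG_DIR=$(mktemp -d)   # stand-in for /flags
run_deploy() {
  if [ -f "$FLAG_DIR/.deployed" ] && [ "${FORCE_DEPLOY:-false}" != "true" ]; then
    echo "already deployed, skipping"
    return 0
  fi
  echo "running deploy steps"
  # ... migrate, collectstatic, etc. would go here ...
  touch "$FLAG_DIR/.deployed"
}

run_deploy   # first container start: prints "running deploy steps"
run_deploy   # restart in the same deployment: prints "already deployed, skipping"
```

Because the flag lives in a per-deployment directory, a fresh deployment starts with no flag and the steps run again exactly once.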
c87d73435b
feat: Enhance gatekeeper resilience and host handling
This commit improves the gatekeeper system's robustness and monitoring capabilities, and simplifies host header management for backend services.

Key changes include:

**Gatekeeper Health, Management & Resilience:**
- Implemented active health checking for individual gatekeeper containers within the `gatekeeper-manager` service.
    - The manager now periodically curls the `/metrics` endpoint of each gatekeeper container.
    - Reports health status to a new Gatus `services_gatekeeper` endpoint.
    - Automatically attempts to restart the gatekeeper stack if any gatekeeper instance is unhealthy or if the expected number of gatekeepers is not running.
- Refactored the `gatekeeper-manager` shell script for improved state management and signal handling:
    - Introduced `STARTED`, `RESTARTING`, `TERMINATING` state flags for more controlled operations.
    - Enhanced SIGTERM and SIGHUP handling to gracefully manage gatekeeper lifecycles.
    - Added `apk add curl` to ensure `curl` is available in the manager container.
- Renamed the gatekeeper Docker Compose template from `docker-compose_gatekeeper.template.yml` to `gatekeepers.template.yml` and its output to `gatekeepers.yml`.
- Updated `dockergen-gatekeeper` to watch the new template file and notify the correct `gatekeeper-manager` service instance (e.g., `pkmntrade-club-gatekeeper-manager-1`).
- Protected services are now discovered by looking for a `gatekeeper=true` label.

**Host Header Management & `ALLOWED_HOSTS` Simplification:**
- HAProxy configuration (`haproxy.cfg`) now consistently sets the `Host` HTTP header for requests to all backend services (e.g., `pkmntrade.club`, `staging.pkmntrade.club`). This centralizes and standardizes host information.
- Consequently, explicit `ALLOWED_HOSTS` environment variables have been removed from the `web` and `celery` service definitions in `docker-compose_web.yml` and `docker-compose_staging.yml`. Backend Django applications should now rely on the `Host` header set by HAProxy for request validation.
- The `gatekeepers.template.yml` now defines a `TARGET_HOST` environment variable for proxied services (e.g., `web`, `web-staging`). This aligns with the ALLOWED_HOSTS on the target to ensure requests aren't blocked.

**Gatus Monitoring & Configuration Updates:**
- In Gatus configuration (`gatus/config.template.yaml`):
    - The "Redis" external service endpoint has been renamed to "Cache" for better clarity and to fit the theme of simple names.
    - A new external service endpoint "Gatekeeper" has been added to monitor the overall health reported by the `gatekeeper-manager`.
    - Health checks for "Web Worker" endpoints (both main and staging) now include the appropriate `Host` header (e.g., `Host: pkmntrade.club`) to ensure accurate health assessments by Django.
- In `docker-compose_core.yml`, the `curl` commands used by `db-redis-healthcheck` for database and cache health now append `|| true`. This prevents the script from exiting on a curl error (e.g., timeout, connection refused), ensuring that the failure is still reported to Gatus via the `success=false` parameter rather than the script terminating prematurely.

These changes collectively make the gatekeeper system more fault-tolerant, provide better visibility into its status, and streamline the configuration of backend applications by standardizing how they receive host information.
2025-05-23 16:16:59 -07:00
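The `|| true` rationale in the last point deserves a concrete illustration: under `set -e`, a failing `curl` would terminate the reporter script before it could push `success=false` to Gatus. A toy reproduction, with `probe` standing in for a timing-out `curl` (not the project's actual healthcheck script):

```shell
set -e
report() { echo "push to Gatus: success=$1"; }
probe() { false; }   # stand-in for a curl that times out

probe || report false            # `|| …` keeps set -e from killing the script
echo "reporter script reached the end"
```

The same protection applies when a command appears in an `if` condition; either form lets the failure be observed and reported instead of aborting the loop.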
6aa15d1af9
feat: Implement dynamic Gatekeeper proxy and enhance service health monitoring
- **Implemented Dynamic Gatekeeper (Anubis) Proxy:**
  - Introduced Anubis as a Gatekeeper proxy layer for services (`web`, `web-staging`, `feedback`, `health`).
  - Added `docker-gen` setup (`docker-compose_gatekeeper.template.yml`, `gatekeeper-manager`) to dynamically configure Anubis instances based on container labels (`enable_gatekeeper=true`).
  - Updated HAProxy to route traffic through the respective Gatekeeper services.

- **Enhanced Service Health Monitoring & Checks:**
  - Integrated `django-health-check` into the Django application, providing detailed health endpoints (e.g., `/health/`).
  - Replaced the custom health check view with `django-health-check` URLs.
  - Added `psutil` for system metrics in health checks.
  - Made Gatus configuration dynamic using `docker-gen` (`config.template.yaml`), allowing automatic discovery and monitoring of service instances (e.g., web workers).
  - Externalized Gatus SMTP credentials to environment variables.
  - Strengthened `docker-compose_core.yml` with a combined `db-redis-healthcheck` service reporting to Gatus.
  - Added explicit health checks for `db` and `redis` services in `docker-compose.yml`.

- **Improved Docker & Compose Configuration:**
  - Added `depends_on` conditions in `docker-compose.yml` for `web` and `celery` services to wait for the database.
  - Updated `ALLOWED_HOSTS` in `docker-compose_staging.yml` and `docker-compose_web.yml` to include internal container names for Gatekeeper communication.
  - Set `DEBUG=False` for staging services.
  - Removed `.env.production` from `.gitignore` (standardized to `.env`).
  - Streamlined `scripts/entrypoint.sh` by removing the call to the no-longer-present `/deploy.sh`.

- **Dependency Updates:**
  - Added `django-health-check>=3.18.3` and `psutil>=7.0.0` to `pyproject.toml` and `uv.lock`.
  - Updated `settings.py` to include `health_check` apps, configuration, and use `REDIS_URL` consistently.

- **Streamlined deployment script used in GHA:**
  - Updated the workflow to copy new server files and create a new `.env` file in the temporary directory before moving them into place.
  - Consolidated the stopping and removal of old containers into a single step for better clarity and efficiency.
  - Reduced container downtime by rearranging stop/start steps.
2025-05-23 00:15:19 -07:00
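The gatekeeper-manager's "expected vs. running" check from the commits above can be sketched like this (the container lister is stubbed out; the real manager inspects docker by label and curls each instance's `/metrics` endpoint):

```shell
# Compare running gatekeeper count against the expected count and decide
# whether to restart the stack. The lister is a stub for `docker ps`.
EXPECTED_GATEKEEPERS=3
list_running_gatekeepers() {
  printf '%s\n' gatekeeper-1 gatekeeper-2   # stubbed: only 2 of 3 running
}

running=$(list_running_gatekeepers | wc -l)
if [ "$running" -lt "$EXPECTED_GATEKEEPERS" ]; then
  echo "unhealthy: restarting gatekeeper stack"
else
  echo "healthy"
fi
```

The real manager also treats a failing `/metrics` curl on any individual instance as unhealthy, so a wedged-but-running container still triggers a restart.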
41 changed files with 1478 additions and 518 deletions

.envrc Normal file → Executable file

@@ -8,43 +8,37 @@ on:
branches: [main]
jobs:
build-deploy:
# Job 1: Build the Docker image
build:
runs-on: ubuntu-latest
outputs:
repo: ${{ steps.meta.outputs.REPO }}
repo-name: ${{ steps.meta.outputs.REPO_NAME_ONLY }}
repo-path: ${{ steps.meta.outputs.REPO_PROJECT_PATH }}
image-tar: ${{ steps.meta.outputs.REPO_NAME_ONLY }}-${{ github.ref_name }}_${{ github.sha }}.tar
tags: ${{ steps.generated_docker_tags.outputs.tag }}
prod: ${{ steps.meta.outputs.prod }}
steps:
- name: Checkout the repo
uses: actions/checkout@v4
- name: Get full and partial repository name
- name: Ensure scripts are executable
run: chmod +x scripts/*.sh
- name: Setup build metadata and environment
id: meta
run: |
echo "GITHUB_REPOSITORY: ${{ github.repository }}"
if [[ "${{ github.repository }}" == *".git" ]]; then
if [[ "${{ github.repository }}" == "https://"* ]]; then
echo "GITHUB_REPOSITORY ends in .git and is a URL"
REPO=$(echo "${{ github.repository }}" | sed 's/\.git$//' | cut -d'/' -f4-5 | sed 's/[^a-zA-Z0-9\/-]/-/g')
else
echo "GITHUB_REPOSITORY ends in .git and is not a URL"
REPO=$(echo "${{ github.repository }}" | sed 's/\.git$//' | sed 's/[^a-zA-Z0-9\/-]/-/g')
fi
else
echo "GITHUB_REPOSITORY is not a URL"
REPO=$(echo "${{ github.repository }}" | sed 's/[^a-zA-Z0-9\/-]/-/g')
fi
echo "REPO=$REPO" >> $GITHUB_OUTPUT
REPO_NAME_ONLY=$(echo "$REPO" | cut -d'/' -f2)
echo "REPO_NAME_ONLY=$REPO_NAME_ONLY" >> $GITHUB_OUTPUT
REPO_PROJECT_PATH=/srv/$(echo "$REPO_NAME_ONLY")
echo "REPO_PROJECT_PATH=$REPO_PROJECT_PATH" >> $GITHUB_OUTPUT
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Set PROD environment variable
run: |
echo "✅ Exit script on any error"
set -eu -o pipefail
prod_value=""
# Parse repository name and set outputs
eval "$(./scripts/parse-repository-name.sh '${{ github.repository }}')"
echo "REPO=$REPO" >> $GITHUB_OUTPUT
echo "REPO_NAME_ONLY=$REPO_NAME_ONLY" >> $GITHUB_OUTPUT
echo "REPO_PROJECT_PATH=$REPO_PROJECT_PATH" >> $GITHUB_OUTPUT
# Determine PROD environment
prod_value=""
echo "🔍 Check if PROD is set via vars; if not, determine from github.ref"
if [ -z "${{ vars.PROD }}" ]; then
prod_value="${{ startsWith(github.ref, 'refs/tags/v') && !endsWith(github.ref, '-prerelease') }}"
@@ -53,148 +47,141 @@ jobs:
prod_value="${{ vars.PROD }}"
echo "📦 PROD mode already set to: ${prod_value}"
fi
echo "prod=${prod_value}" >> $GITHUB_OUTPUT
# Set environment variables for subsequent steps
echo "🖊️ Writing determined values to GITHUB_ENV:"
echo "PROD=${prod_value}" >> $GITHUB_ENV
echo "PROD=${prod_value} -> GITHUB_ENV"
echo "IMAGE_TAR_NAME=${REPO_NAME_ONLY}-${{ github.ref_name }}_${{ github.sha }}.tar" >> $GITHUB_ENV
echo "IMAGE_TAR_NAME=${REPO_NAME_ONLY}-${{ github.ref_name }}_${{ github.sha }}.tar -> GITHUB_ENV"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Generate tags
id: generated_docker_tags
run: |
echo "✅ Exit script on any error"
set -eu -o pipefail
# echo current shell
echo "Current shell: $SHELL"
IMAGE_BASE_NAME="${{ steps.meta.outputs.REPO }}"
GIT_SHA="${{ github.sha }}"
GIT_REF="${{ github.ref }}"
echo "Inputs for tagging:"
echo "IMAGE_BASE_NAME: $IMAGE_BASE_NAME"
echo "GIT_SHA: $GIT_SHA"
echo "GIT_REF: $GIT_REF"
echo "PROD status for tagging: ${PROD}"
TAG_LIST=()
VERSION_TAG_LIST=()
if [[ -n "$GIT_SHA" ]]; then
SHORT_SHA=$(echo "$GIT_SHA" | cut -c1-7)
TAG_LIST+=("${IMAGE_BASE_NAME}:sha-${SHORT_SHA}")
TAG_LIST+=("${IMAGE_BASE_NAME}:sha-${GIT_SHA}")
else
echo "🔴 No Git SHA found, cannot generate tags. Aborting."
exit 1
fi
GIT_TAG_VERSION=""
# Extract version only if GIT_REF is a tag like refs/tags/vX.Y.Z or refs/tags/vX.Y.Z-prerelease
if [[ "$GIT_REF" == refs/tags/v* ]]; then
GIT_TAG_VERSION=$(echo "$GIT_REF" | sed 's%refs/tags/v%%' | sed 's%-prerelease$%%')
if [[ "$GIT_TAG_VERSION" =~ ^([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
echo "Detected Git tag: v$GIT_TAG_VERSION"
MAJOR=${BASH_REMATCH[1]}
MINOR=${BASH_REMATCH[2]}
PATCH=${BASH_REMATCH[3]}
echo "Parsed version: Major=$MAJOR, Minor=$MINOR, Patch=$PATCH from v$GIT_TAG_VERSION"
if [ "$MAJOR" -gt 0 ]; then
VERSION_TAG_LIST+=("${IMAGE_BASE_NAME}:v${MAJOR}")
else
echo " Major version is 0 (v$GIT_TAG_VERSION). Skipping MAJOR-only tag v0. Please reference by MAJOR.MINOR or MAJOR.MINOR.PATCH."
fi
VERSION_TAG_LIST+=("${IMAGE_BASE_NAME}:v${MAJOR}.${MINOR}")
VERSION_TAG_LIST+=("${IMAGE_BASE_NAME}:v${MAJOR}.${MINOR}.${PATCH}")
else
echo "⚠️ Git tag 'v$GIT_TAG_VERSION' is not a valid semantic version (x.y.z) but should be. Aborting."
exit 1
fi
fi
if [ "$PROD" = "true" ]; then
echo "📦 Generating PROD tags."
TAG_LIST+=("${IMAGE_BASE_NAME}:stable")
TAG_LIST+=("${IMAGE_BASE_NAME}:latest")
if [ ${#VERSION_TAG_LIST[@]} -gt 0 ]; then
TAG_LIST+=("${VERSION_TAG_LIST[@]}")
else
echo "🔴 PROD mode is true, but Git ref ($GIT_REF) is not a valid version tag. This is unexpected, aborting."
exit 1
fi
else # Non-PROD
echo "🔨 Generating STAGING tags."
TAG_LIST+=("${IMAGE_BASE_NAME}:staging")
TAG_LIST+=("${IMAGE_BASE_NAME}:latest-staging")
if [ ${#VERSION_TAG_LIST[@]} -gt 0 ]; then
echo "🔨 This is also a prerelease version, generating version tags with '-prerelease' suffix."
VERSION_TAG_LIST=("${VERSION_TAG_LIST[@]/%/-prerelease}")
TAG_LIST+=("${VERSION_TAG_LIST[@]}")
else
echo " Git ref ($GIT_REF) is not a valid version tag. Skipping versioned -prerelease tag generation."
fi
fi
# Use the script to generate tags
TAG_LIST=$(./scripts/generate-docker-tags.sh \
"${{ steps.meta.outputs.REPO }}" \
"${{ github.sha }}" \
"${{ github.ref }}" \
"$PROD")
echo "Final list of generated tags:"
printf "%s\n" "${TAG_LIST[@]}"
echo "$TAG_LIST"
if [[ -z "$GIT_SHA" || ${#TAG_LIST[@]} -lt 4 ]]; then
echo "⚠️ No tags (or too few) were generated based on the logic. Need at least 4 tags: Git commit short and full length SHA tags, a latest/latest-staging tag, and a stable/staging tag. This is unexpected, aborting."
TAG_COUNT=$(echo "$TAG_LIST" | wc -l)
if [[ -z "${{ github.sha }}" || $TAG_COUNT -lt 4 ]]; then
echo "⚠️ No tags (or too few) were generated based on the logic. Need at least 4 tags. Generated: $TAG_COUNT"
exit 1
fi
# Output the tags for the docker build action (output name is 'tag')
# Output the tags for the docker build action
{
echo "tag<<EOF"
printf "%s\n" "${TAG_LIST[@]}"
echo "$TAG_LIST"
echo "EOF"
} >> "$GITHUB_OUTPUT"
- name: Run prebuild tasks
run: ./scripts/prebuild.sh
- name: Cache Docker layers
uses: actions/cache@v4
with:
path: /tmp/.buildx-cache
key: ${{ runner.os }}-buildx-${{ github.sha }}
restore-keys: |
${{ runner.os }}-buildx-
- name: Extract version for Docker build
id: extract_version
run: |
echo "🔄 Chdir to src/pkmntrade_club/theme/static_src"
cd src/pkmntrade_club/theme/static_src
pip install setuptools-scm
VERSION=$(python -c "from setuptools_scm import get_version; print(get_version())")
echo "VERSION=${VERSION}" >> $GITHUB_ENV
echo "📦 Install npm dependencies"
npm install .
echo "🔨 Build the tailwind theme css"
npm run build
# - name: Cache Docker layers
# uses: actions/cache@v4
# with:
# path: ${{ runner.temp }}/.cache
# key: ${{ runner.os }}-${{ steps.meta.outputs.REPO_NAME_ONLY }}-${{ github.ref_name }}-${{ github.sha }}
# restore-keys: |
# ${{ runner.os }}-${{ steps.meta.outputs.REPO_NAME_ONLY }}-${{ github.ref_name }}-
# - name: Inject buildx-cache
# uses: reproducible-containers/buildkit-cache-dance@4b2444fec0c0fb9dbf175a96c094720a692ef810 # v2.1.4
# with:
# cache-source: ${{ runner.temp }}/.cache/buildx-cache
- name: Build container
uses: docker/build-push-action@v6
with:
outputs: type=docker,dest=${{ runner.temp }}/${{ steps.meta.outputs.REPO_NAME_ONLY }}-${{ github.ref_name }}_${{ github.sha }}.tar
tags: ${{ steps.generated_docker_tags.outputs.tag }}
build-args: CACHE_DIR=${{ runner.temp }}/.cache/dockerfile-cache
build-args: |
VERSION=${{ env.VERSION }}
context: .
#cache-from: type=local,src=${{ runner.temp }}/.cache/buildx-cache
#cache-to: type=local,src=${{ runner.temp }}/.cache/buildx-cache-new,mode=max
# - name: Rotate cache # along with cache-from & cache-to: prevents cache from growing indefinitely
# run: |
# rm -rf ${{ runner.temp }}/.cache/buildx-cache
# mv ${{ runner.temp }}/.cache/buildx-cache-new ${{ runner.temp }}/.cache/buildx-cache
# - name: Upload container as artifact
# uses: actions/upload-artifact@v4
# with:
# name: ${{ steps.meta.outputs.REPO_NAME_ONLY }}-${{ github.ref_name }}_${{ github.sha }}.tar
# path: ${{ runner.temp }}/${{ steps.meta.outputs.REPO_NAME_ONLY }}-${{ github.ref_name }}_${{ github.sha }}.tar
# if-no-files-found: error
# compression-level: 0
cache-from: type=local,src=/tmp/.buildx-cache
cache-to: type=local,dest=/tmp/.buildx-cache-new,mode=max
- name: Rotate cache
run: |
rm -rf /tmp/.buildx-cache
mv /tmp/.buildx-cache-new /tmp/.buildx-cache
- name: Upload container as artifact
uses: actions/upload-artifact@v4
with:
name: ${{ env.IMAGE_TAR_NAME }}
path: ${{ runner.temp }}/${{ steps.meta.outputs.REPO_NAME_ONLY }}-${{ github.ref_name }}_${{ github.sha }}.tar
if-no-files-found: error
retention-days: 1
# Job 2: Deploy (only runs on main branch or tags)
deploy:
needs: build
runs-on: ubuntu-latest
if: github.event_name == 'push' && (github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/'))
# Determine environment based on ref
environment: ${{ (startsWith(github.ref, 'refs/tags/v') && !endsWith(github.ref, '-prerelease')) && 'production' || 'staging' }}
steps:
- name: Checkout the repo
uses: actions/checkout@v4
- name: Ensure scripts are executable
run: chmod +x scripts/*.sh
- name: Setup deployment metadata and environment
id: meta
run: |
echo "✅ Exit script on any error"
set -eu -o pipefail
# Parse repository name and set outputs
eval "$(./scripts/parse-repository-name.sh '${{ github.repository }}')"
echo "REPO=$REPO" >> $GITHUB_OUTPUT
echo "REPO_NAME_ONLY=$REPO_NAME_ONLY" >> $GITHUB_OUTPUT
echo "REPO_PROJECT_PATH=$REPO_PROJECT_PATH" >> $GITHUB_OUTPUT
# Determine PROD environment
prod_value=""
echo "🔍 Check if PROD is set via vars; if not, determine from github.ref"
if [ -z "${{ vars.PROD }}" ]; then
prod_value="${{ startsWith(github.ref, 'refs/tags/v') && !endsWith(github.ref, '-prerelease') }}"
echo "📦 PROD mode unset, determined from github.ref (starts with v and does not end with -prerelease?): ${prod_value}"
else
prod_value="${{ vars.PROD }}"
echo "📦 PROD mode already set to: ${prod_value}"
fi
echo "prod=${prod_value}" >> $GITHUB_OUTPUT
# Set all deployment environment variables
echo "📝 Setting deployment environment variables"
echo "REPO_PROJECT_PATH=${REPO_PROJECT_PATH}" >> $GITHUB_ENV
echo "REPO_NAME_ONLY=${REPO_NAME_ONLY}" >> $GITHUB_ENV
echo "IMAGE_TAR_NAME=${REPO_NAME_ONLY}-${{ github.ref_name }}_${{ github.sha }}.tar" >> $GITHUB_ENV
echo "PROD=${prod_value}" >> $GITHUB_ENV
- name: Download container artifact
uses: actions/download-artifact@v4
with:
name: ${{ env.IMAGE_TAR_NAME }}
path: ${{ runner.temp }}
- name: Get Deploy Secrets
uses: bitwarden/sm-action@v2
with:
@@ -205,6 +192,7 @@ jobs:
9aefe34e-c2cf-442e-973c-b2dd0032b6cf > ENV_FILE_BASE64
d3bb47f8-bfc0-4a61-9cee-b2df0147a02a > CF_PEM_CERT
5f658ddf-aadd-4464-b501-b2df0147c338 > CF_PEM_CA
- name: Set up SSH
run: |
mkdir -p $HOME/.ssh
@@ -224,82 +212,40 @@
ControlPath $HOME/.ssh/control-%C
ControlPersist yes
END
- name: Run Deploy Script
- name: Deploy to Server
env:
DOCKER_HOST: ssh://deploy
REPO_PROJECT_PATH: ${{ env.REPO_PROJECT_PATH }}
REPO_NAME_ONLY: ${{ env.REPO_NAME_ONLY }}
IMAGE_TAR: ${{ runner.temp }}/${{ env.IMAGE_TAR_NAME }}
PROD: ${{ env.PROD }}
run: |
echo "✅ Exit script on any error"
set -eu -o pipefail
./scripts/deploy-to-server.sh
echo "⚙️ Set docker host to ssh://deploy so that all docker commands are run on the remote server"
export DOCKER_HOST=ssh://deploy
echo "🚀 Enable and start docker service"
ssh deploy "sudo systemctl enable --now docker.service"
echo "💾 Load the new docker image (${{ steps.meta.outputs.REPO_NAME_ONLY }}-${{ github.ref_name }}_${{ github.sha }}.tar)"
docker load -i "${{ runner.temp }}/${{ steps.meta.outputs.REPO_NAME_ONLY }}-${{ github.ref_name }}_${{ github.sha }}.tar"
echo "🧹 Remove the docker image artifact"
rm "${{ runner.temp }}/${{ steps.meta.outputs.REPO_NAME_ONLY }}-${{ github.ref_name }}_${{ github.sha }}.tar"
echo "🛑 Stop and remove containers before updating compose files"
#ssh deploy "cd ${{ steps.meta.outputs.REPO_PROJECT_PATH}} && docker compose -f docker-compose_core.yml down"
if [ "${PROD}" = true ]; then
ssh deploy "cd ${{ steps.meta.outputs.REPO_PROJECT_PATH}} && docker compose -f docker-compose_web.yml down"
- name: Health Check and Rollback
run: |
# Determine the correct URL based on environment
if [ "${{ env.PROD }}" = "true" ]; then
# Ensure PRODUCTION_DOMAIN is set
if [ -z "${{ vars.PRODUCTION_DOMAIN }}" ]; then
echo "Error: PRODUCTION_DOMAIN is not set"
exit 1
fi
HEALTH_CHECK_URL="https://${{ vars.PRODUCTION_DOMAIN }}/health/"
else
ssh deploy "cd ${{ steps.meta.outputs.REPO_PROJECT_PATH}} && docker compose -f docker-compose_staging.yml down"
# Ensure STAGING_DOMAIN is set
if [ -z "${{ vars.STAGING_DOMAIN }}" ]; then
echo "Error: STAGING_DOMAIN is not set"
exit 1
fi
HEALTH_CHECK_URL="https://${{ vars.STAGING_DOMAIN }}/health/"
fi
echo "💾 Copy files to server"
ssh deploy "mkdir -p ${{ steps.meta.outputs.REPO_PROJECT_PATH}}"
scp -pr ./server/* deploy:${{ steps.meta.outputs.REPO_PROJECT_PATH}}/
echo "📝 Create .env file"
printf "%s" "${ENV_FILE_BASE64}" | base64 -d | ssh deploy "cat > ${{ steps.meta.outputs.REPO_PROJECT_PATH}}/.env && chmod 600 ${{ steps.meta.outputs.REPO_PROJECT_PATH}}/.env"
echo "🔑 Set up certificates"
ssh deploy "mkdir -p ${{ steps.meta.outputs.REPO_PROJECT_PATH}}/certs && chmod 550 ${{ steps.meta.outputs.REPO_PROJECT_PATH}}/certs && chown 99:root ${{ steps.meta.outputs.REPO_PROJECT_PATH}}/certs"
printf "%s" "$CF_PEM_CERT" | ssh deploy "cat > ${{ steps.meta.outputs.REPO_PROJECT_PATH}}/certs/crt.pem && chmod 440 ${{ steps.meta.outputs.REPO_PROJECT_PATH}}/certs/crt.pem && chown 99:root ${{ steps.meta.outputs.REPO_PROJECT_PATH}}/certs/crt.pem"
printf "%s" "$CF_PEM_CA" | ssh deploy "cat > ${{ steps.meta.outputs.REPO_PROJECT_PATH}}/certs/ca.pem && chmod 440 ${{ steps.meta.outputs.REPO_PROJECT_PATH}}/certs/ca.pem && chown 99:root ${{ steps.meta.outputs.REPO_PROJECT_PATH}}/certs/ca.pem"
echo "🚀 Start the new containers"
if [ "${PROD}" = true ]; then
ssh deploy "cd ${{ steps.meta.outputs.REPO_PROJECT_PATH }} && docker compose -f docker-compose_core.yml -f docker-compose_web.yml up -d --no-build"
else
ssh deploy "cd ${{ steps.meta.outputs.REPO_PROJECT_PATH }} && docker compose -f docker-compose_core.yml -f docker-compose_staging.yml up -d --no-build"
fi
# echo "🚀 Start the new containers, zero-downtime"
# if [ "${PROD}" = true ]; then
# ssh deploy <<<END
# cd ${{ steps.meta.outputs.REPO_PROJECT_PATH}}
# old_container_id=$(docker compose -f docker-compose_web.yml ps -f name=web -q | tail -n1)
# docker compose -f docker-compose_web.yml up -d --no-build --no-recreate
# new_container_id=$(docker compose -f docker-compose_web.yml ps -f name=web -q | head -n1)
# # not needed, but might be useful at some point
# #new_container_ip=$(docker inspect -f '{{range.NetworkSettings.Networks}}{{.IPAddress}}{{end}}' $new_container_id)
# #new_container_name=$(docker inspect -f '{{.Name}}' $new_container_id | cut -c2-)
# sleep 100 # change to wait for healthcheck in the future
# #docker compose -f docker-compose_core.yml kill -s SIGUSR2 loba
# docker stop $old_container_id
# docker rm $old_container_id
# #docker compose -f docker-compose_core.yml kill -s SIGUSR2 loba
# END
# else
# ssh deploy <<<END
# cd ${{ steps.meta.outputs.REPO_PROJECT_PATH}}
# old_container_id=$(docker compose -f docker-compose_staging.yml ps -f name=web-staging -q | tail -n1)
# docker compose -f docker-compose_staging.yml up -d --no-build --no-recreate
# new_container_id=$(docker compose -f docker-compose_staging.yml ps -f name=web-staging -q | head -n1)
# # not needed, but might be useful at some point
# #new_container_ip=$(docker inspect -f '{{range.NetworkSettings.Networks}}{{.IPAddress}}{{end}}' $new_container_id)
# #new_container_name=$(docker inspect -f '{{.Name}}' $new_container_id | cut -c2-)
# sleep 100 # change to wait for healthcheck in the future
# #docker compose -f docker-compose_core.yml kill -s SIGUSR2 loba
# docker stop $old_container_id
# docker rm $old_container_id
# #docker compose -f docker-compose_core.yml kill -s SIGUSR2 loba
# END
# fi
echo "🧹 Prune all unused images"
docker system prune -f
# Copy script to remote and execute
scp scripts/health-check-and-rollback.sh deploy:/tmp/
ssh deploy "chmod +x /tmp/health-check-and-rollback.sh"
ssh deploy "/tmp/health-check-and-rollback.sh '${{ env.REPO_PROJECT_PATH }}' '${{ env.PROD }}' '$HEALTH_CHECK_URL' 30"
ssh deploy "rm -f /tmp/health-check-and-rollback.sh"
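The tag-generation logic that the workflow now delegates to `scripts/generate-docker-tags.sh` can be approximated as follows (a reconstruction from the diff above, not the script itself): SHA tags always, `stable`/`latest` plus plain version tags for prod, `staging`/`latest-staging` plus `-prerelease` version tags otherwise, and no `v0` major-only tag.

```shell
# Approximate reconstruction of the tag rules shown in the workflow diff.
generate_tags() {
  local image="$1" sha="$2" ref="$3" prod="$4" tags=""
  tags="$image:sha-$(printf '%.7s' "$sha") $image:sha-$sha"
  if [ "$prod" = "true" ]; then
    tags="$tags $image:stable $image:latest"
  else
    tags="$tags $image:staging $image:latest-staging"
  fi
  case "$ref" in
    refs/tags/v*)
      local v="${ref#refs/tags/v}"; v="${v%-prerelease}"
      local major="${v%%.*}" rest="${v#*.}" suffix=""
      local minor="${rest%%.*}" patch="${rest#*.}"
      [ "$prod" = "true" ] || suffix="-prerelease"
      if [ "$major" -gt 0 ]; then   # skip the ambiguous v0 major-only tag
        tags="$tags $image:v$major$suffix"
      fi
      tags="$tags $image:v$major.$minor$suffix $image:v$major.$minor.$patch$suffix"
      ;;
  esac
  echo "$tags"
}

generate_tags repo/app 0123456789abcdef refs/tags/v1.2.3 true
```

The workflow's "at least 4 tags" sanity check follows from these rules: two SHA tags plus the stable-or-staging pair are always present.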

.gitignore

@@ -1,5 +1,4 @@
.env.production
.env
.env.*
src/pkmntrade_club/staticfiles/*
!src/pkmntrade_club/staticfiles/.gitkeep
src/pkmntrade_club/media/*

.vscode/launch.json

@@ -6,10 +6,14 @@
"type": "debugpy",
"request": "launch",
"program": "${workspaceFolder}/manage.py",
"args": ["runserver"],
"cwd": "${workspaceFolder}",
"args": ["runserver", "0.0.0.0:8000"],
"django": true,
"justMyCode": true,
"preLaunchTask": "Run db standalone"
"preLaunchTask": "Run db standalone",
"env": {
"DEBUG": "True"
},
}
]
}


@@ -62,6 +62,7 @@ ENV PATH=/app/bin:$PATH
ENV PYTHONPATH=/app
ENV PYTHONUNBUFFERED=1
ENV HOME=/app
ENV DJANGO_SETTINGS_MODULE=pkmntrade_club.django_project.settings
WORKDIR /app
@@ -85,8 +86,9 @@ EOT
# See <https://hynek.me/articles/docker-signals/>.
STOPSIGNAL SIGINT
COPY --from=build --chown=app:app /app /app
COPY --from=build --chown=app:app --chmod=u+rw /app /app
COPY --from=ghcr.io/astral-sh/uv:0.7.2 /uv /app/bin/uv
COPY --chown=app:app --chmod=700 /scripts/entrypoint.sh /entrypoint.sh
COPY --chown=app:app --chmod=700 /scripts/deploy.sh /deploy.sh
COPY --chown=app:app --chmod=700 /manage.py /app/manage.py
@@ -94,11 +96,13 @@ COPY --chown=app:app --chmod=700 /manage.py /app/manage.py
ENTRYPOINT ["/entrypoint.sh"]
RUN --mount=type=cache,target=${CACHE_DIR} \
mkdir -p /app/.cursor-server && chown app:app /app /app/.cursor-server
mkdir -p /app/.cursor-server && chmod 755 /app/.cursor-server && chown app:app /app /app/.cursor-server
RUN --mount=type=cache,target=${CACHE_DIR} \
mkdir -p /flags && chmod 700 /flags && chown app:app /flags
USER app
EXPOSE 8000
CMD ["granian", "--interface", "wsgi", "pkmntrade_club.django_project.wsgi:app", "--host", "0.0.0.0", "--port", "8000", "--workers", "1", "--backpressure", "16", "--workers-kill-timeout", "180", "--access-log"]
#, "--static-path-mount", "./staticfiles"


@@ -1,54 +1,54 @@
services:
web:
build: .
command: ["django-admin", "runserver", "0.0.0.0:8000"]
command: bash -c "cd /code && uv pip install --editable . --no-deps && python manage.py runserver 0.0.0.0:8000"
ports:
- 8000:8000
restart: always
- "8000:8000"
restart: unless-stopped
environment:
- DEBUG=true
volumes:
- ./seed:/seed:ro
# DANGEROUS DUE TO DOCKERFILE PACKAGE BUILDING/INSTALLATION
#- ./src/pkmntrade_club:/app/lib/python3.12/site-packages/pkmntrade_club:ro
env_file:
- .env
environment:
- DEBUG=true
- PUBLIC_HOST=localhost
- ALLOWED_HOSTS=127.0.0.1,localhost
- DISABLE_CACHE=false
- ./:/code
depends_on:
db:
condition: service_healthy
redis:
condition: service_started
celery:
build: .
command: ["celery", "-A", "pkmntrade_club.django_project", "worker", "-l", "INFO", "-B", "-E"]
restart: always
env_file:
- .env
environment:
- DEBUG=true
- PUBLIC_HOST=localhost
- ALLOWED_HOSTS=127.0.0.1,localhost
- DISABLE_CACHE=false
command: bash -c "cd /code && uv pip install --editable . --no-deps && celery -A pkmntrade_club.django_project worker -l INFO -B -E"
restart: unless-stopped
volumes:
- ./:/code
depends_on:
db:
condition: service_healthy
redis:
condition: service_started
redis:
image: redis:latest
restart: always
ports:
- 6379:6379
# depends_on:
# db:
# condition: service_healthy
# db:
# image: postgres:16
# restart: always
# ports:
# - 5432:5432
# volumes:
# - postgres_data:/var/lib/postgresql/data/
# environment:
# - "POSTGRES_HOST_AUTH_METHOD=trust"
# healthcheck:
# test: ["CMD", "pg_isready", "-U", "postgres", "-d", "postgres"]
# interval: 10s
# timeout: 5s
# retries: 5
# volumes:
# postgres_data:
healthcheck:
test: ["CMD", "redis-cli", "ping"]
interval: 10s
timeout: 5s
retries: 5
db:
image: postgres:16
restart: always
ports:
- 5432:5432
volumes:
- postgres_data:/var/lib/postgresql/data/
environment:
- "POSTGRES_HOST_AUTH_METHOD=trust"
healthcheck:
test: ["CMD", "pg_isready", "-U", "postgres", "-d", "postgres"]
interval: 10s
timeout: 5s
retries: 5
volumes:
postgres_data:


@@ -6,6 +6,8 @@ import sys
def main():
"""Run administrative tasks."""
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "pkmntrade_club.django_project.settings")
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
try:
from django.core.management import execute_from_command_line
except ImportError as exc:


@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
[project]
name = "pkmntrade-club"
version = "0.1.0"
dynamic = ["version"]
description = "A django project for trading Pokémon TCG Pocket Cards"
readme = "README.md"
requires-python = ">=3.12"
@@ -43,18 +43,20 @@ dependencies = [
"django-daisy==1.0.13",
"django-debug-toolbar==4.4.6",
"django-environ==0.12.0",
"django-health-check>=3.18.3",
"django-linear-migrations>=2.17.0",
"django-meta==2.4.2",
"django-tailwind-4[reload]==0.1.4",
"django-widget-tweaks==1.5.0",
"gevent==25.4.1",
"granian==2.2.5",
"granian==2.3.1",
"gunicorn==23.0.0",
"idna==3.4",
"oauthlib==3.2.2",
"packaging==23.1",
"pillow>=11.2.1",
"playwright==1.52.0",
"psutil>=7.0.0",
"psycopg==3.2.3",
"psycopg-binary==3.2.3",
"pycparser==2.21",
@@ -67,6 +69,8 @@ dependencies = [
"typing-extensions==4.9.0",
"urllib3==1.26.14",
"whitenoise==6.7.0",
"django-parler>=2.3",
"setuptools-scm>=8.3.1",
]
[project.scripts]
@@ -77,3 +81,8 @@ Homepage = "https://pkmntrade.club"
[tool.setuptools.packages.find]
where = ["src"]
[tool.setuptools_scm]
version_scheme = "no-guess-dev"
tag_regex = "^v(?P<version>[0-9]+(?:\\.[0-9]+)*(?:-.*)?)"
fallback_version = "0.0.0+unknown"

scripts/deploy-to-server.sh

@@ -0,0 +1,124 @@
#!/bin/bash
set -euo pipefail
# Main deployment script with versioned releases
# Usage: ./deploy-to-server.sh
# Source retry function
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
source "${SCRIPT_DIR}/retry.sh"
# Required environment variables (should be set by GitHub Actions)
: "${DOCKER_HOST:?Error: DOCKER_HOST not set}"
: "${REPO_PROJECT_PATH:?Error: REPO_PROJECT_PATH not set}"
: "${REPO_NAME_ONLY:?Error: REPO_NAME_ONLY not set}"
: "${IMAGE_TAR:?Error: IMAGE_TAR not set}"
: "${ENV_FILE_BASE64:?Error: ENV_FILE_BASE64 not set}"
: "${CF_PEM_CERT:?Error: CF_PEM_CERT not set}"
: "${CF_PEM_CA:?Error: CF_PEM_CA not set}"
: "${PROD:?Error: PROD not set}"
echo "⚙️ Docker host: $DOCKER_HOST"
# Generate deployment timestamp
DEPLOYMENT_TIMESTAMP=$(date +%Y%m%d_%H%M%S)
RELEASES_PATH="${REPO_PROJECT_PATH}/releases"
NEW_RELEASE_PATH="${RELEASES_PATH}/${DEPLOYMENT_TIMESTAMP}"
CURRENT_LINK_PATH="${REPO_PROJECT_PATH}/current"
echo "📅 Deployment version: ${DEPLOYMENT_TIMESTAMP}"
echo "🚀 Enable and start docker service"
retry ssh deploy "sudo systemctl enable --now docker.service"
echo "💾 Load the new docker image ($IMAGE_TAR)"
if [ ! -f "$IMAGE_TAR" ]; then
echo "Error: Docker image tar file not found: $IMAGE_TAR"
exit 1
fi
retry docker load -i "$IMAGE_TAR"
echo "📁 Create versioned release directory"
ssh deploy "mkdir -p '${NEW_RELEASE_PATH}'"
echo "💾 Copy new files to server"
# Check if server directory exists before copying
if [ -d "./server" ]; then
retry scp -pr ./server/* "deploy:${NEW_RELEASE_PATH}/"
else
echo "❌ Error: server directory not found"
exit 1
fi
echo "📝 Create new .env file"
printf "%s" "${ENV_FILE_BASE64}" | base64 -d | ssh deploy "cat > '${NEW_RELEASE_PATH}/.env' && chmod 600 '${NEW_RELEASE_PATH}/.env'"
echo "🔑 Set up certs"
ssh deploy "mkdir -p '${NEW_RELEASE_PATH}/certs' && chmod 550 '${NEW_RELEASE_PATH}/certs' && chown 99:root '${NEW_RELEASE_PATH}/certs'"
printf "%s" "$CF_PEM_CERT" | ssh deploy "cat > '${NEW_RELEASE_PATH}/certs/crt.pem' && chmod 440 '${NEW_RELEASE_PATH}/certs/crt.pem' && chown 99:root '${NEW_RELEASE_PATH}/certs/crt.pem'"
printf "%s" "$CF_PEM_CA" | ssh deploy "cat > '${NEW_RELEASE_PATH}/certs/ca.pem' && chmod 440 '${NEW_RELEASE_PATH}/certs/ca.pem' && chown 99:root '${NEW_RELEASE_PATH}/certs/ca.pem'"
echo "🔄 Prepare deployment (stop current containers)"
# Copy script to remote and execute with parameters
scp "${SCRIPT_DIR}/prepare-deployment.sh" deploy:/tmp/
ssh deploy "chmod +x /tmp/prepare-deployment.sh && /tmp/prepare-deployment.sh '${REPO_PROJECT_PATH}' '${PROD}' '${CURRENT_LINK_PATH}'"
ssh deploy "rm -f /tmp/prepare-deployment.sh"
echo "📝 Save deployment metadata"
ssh deploy "echo '${DEPLOYMENT_TIMESTAMP}' > '${NEW_RELEASE_PATH}/.deployment_version'"
ssh deploy "echo '${PROD}' > '${NEW_RELEASE_PATH}/.deployment_env'"
# Save previous version info for potential rollback
ssh deploy "if [ -L '${CURRENT_LINK_PATH}' ]; then readlink -f '${CURRENT_LINK_PATH}' > '${NEW_RELEASE_PATH}/.previous_version'; fi"
echo "🔗 Update current symlink to new release"
ssh deploy "ln -sfn '${NEW_RELEASE_PATH}' '${CURRENT_LINK_PATH}'"
# TODO: implement zero-downtime deployment
# echo "🚀 Start the new containers, zero-downtime"
# if [ "${PROD}" = true ]; then
# ssh deploy <<EOF
# cd ${{ steps.meta.outputs.REPO_PROJECT_PATH}}
# old_container_id=$(docker compose -f docker-compose_web.yml ps -f name=web -q | tail -n1)
# docker compose -f docker-compose_web.yml up -d --no-build --no-recreate
# new_container_id=$(docker compose -f docker-compose_web.yml ps -f name=web -q | head -n1)
# # not needed, but might be useful at some point
# #new_container_ip=$(docker inspect -f '{{range.NetworkSettings.Networks}}{{.IPAddress}}{{end}}' $new_container_id)
# #new_container_name=$(docker inspect -f '{{.Name}}' $new_container_id | cut -c2-)
# sleep 100 # change to wait for healthcheck in the future
# #docker compose -f docker-compose_core.yml kill -s SIGUSR2 loba
# docker stop $old_container_id
# docker rm $old_container_id
# #docker compose -f docker-compose_core.yml kill -s SIGUSR2 loba
# EOF
# else
# ssh deploy <<EOF
# cd ${{ steps.meta.outputs.REPO_PROJECT_PATH}}
# old_container_id=$(docker compose -f docker-compose_staging.yml ps -f name=web-staging -q | tail -n1)
# docker compose -f docker-compose_staging.yml up -d --no-build --no-recreate
# new_container_id=$(docker compose -f docker-compose_staging.yml ps -f name=web-staging -q | head -n1)
# # not needed, but might be useful at some point
# #new_container_ip=$(docker inspect -f '{{range.NetworkSettings.Networks}}{{.IPAddress}}{{end}}' $new_container_id)
# #new_container_name=$(docker inspect -f '{{.Name}}' $new_container_id | cut -c2-)
# sleep 100 # change to wait for healthcheck in the future
# #docker compose -f docker-compose_core.yml kill -s SIGUSR2 loba
# docker stop $old_container_id
# docker rm $old_container_id
# #docker compose -f docker-compose_core.yml kill -s SIGUSR2 loba
# EOF
# fi
echo "🚀 Start the new containers"
if [ "$PROD" = "true" ]; then
retry ssh deploy "cd '${CURRENT_LINK_PATH}' && docker compose -f docker-compose_core.yml -f docker-compose_web.yml -p pkmntrade-club up -d --no-build"
else
retry ssh deploy "cd '${CURRENT_LINK_PATH}' && docker compose -f docker-compose_core.yml -f docker-compose_web.yml -f docker-compose_staging.yml -p pkmntrade-club up -d --no-build"
fi
echo "🧹 Prune unused Docker resources"
ssh deploy "docker system prune -f"
echo "🗑️ Clean up old releases (keep last 5)"
ssh deploy "cd '${RELEASES_PATH}' && ls -dt */ 2>/dev/null | tail -n +6 | xargs -r rm -rf || true"
echo "✅ Deployment completed. Version: ${DEPLOYMENT_TIMESTAMP}"


@@ -1,5 +1,10 @@
#!/bin/bash
if [[ -f /flags/.deployed && "$FORCE_DEPLOY" != "true" ]]; then
echo "*** Previously deployed successfully."
exit 0
fi
echo "*** Running makemigrations --check to make sure migrations are up to date..."
django-admin makemigrations --noinput --check 2>&1 || exit 1
@@ -12,4 +17,7 @@ django-admin clear_cache 2>&1
echo "*** Running collectstatic..."
django-admin collectstatic -c --no-input 2>&1
echo "*** Marking as deployed..."
touch /flags/.deployed
echo "*** Deployed successfully!"
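The `/flags/.deployed` guard above makes the deploy step idempotent. A minimal, self-contained sketch of the same pattern (the flag path, `run_deploy`, and its messages are stand-ins, not the real script):

```shell
#!/bin/bash
# Sketch of the idempotency guard: a marker file short-circuits repeat runs
# unless FORCE_DEPLOY=true. Uses /tmp instead of /flags for the demo.
FLAG=/tmp/gr_deploy_demo.deployed
FORCE_DEPLOY="${FORCE_DEPLOY:-false}"

run_deploy() {
  if [[ -f "$FLAG" && "$FORCE_DEPLOY" != "true" ]]; then
    echo "already deployed"
    return 0
  fi
  echo "deploying"
  touch "$FLAG"   # mark success so the next run is a no-op
}

rm -f "$FLAG"
run_deploy   # first run performs the deploy
run_deploy   # second run exits early
rm -f "$FLAG"
```

Setting `FORCE_DEPLOY=true` in the environment bypasses the marker, which is how a redeploy of the same release would be forced.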


@@ -6,15 +6,14 @@ if [ "$1" == "" ]; then
exit;
fi
if [ "$DJANGO_SETTINGS_MODULE" == "" ]; then
echo "Environment variable 'DJANGO_SETTINGS_MODULE' not set. Exiting."
exit;
else
export DJANGO_SETTINGS_MODULE=$DJANGO_SETTINGS_MODULE
fi
echo "Running deploy.sh... (if you get an APP_REGISTRY_NOT_READY error, there's probably an error in settings.py)"
/deploy.sh
echo "Environment is correct and deploy.sh has been run - executing command: '$@'"
exec "$@" && exit 0
if [ "$1" == "granian" ]; then
granian --version
echo "Appending static files path to granian command (requires granian >= 2.3.0)"
STATIC_ROOT=$(python -c 'import os; import pkmntrade_club; from django.conf import settings; print(settings.STATIC_ROOT)')
set -- "$@" --static-path-mount "$STATIC_ROOT"
fi
echo "Environment is correct - executing command: '$@'"
exec "$@"


@@ -0,0 +1,49 @@
#!/bin/bash
set -euo pipefail
# Generate Docker tags based on git ref and environment
# Usage: ./generate-docker-tags.sh IMAGE_BASE GIT_SHA GIT_REF PROD
if [ $# -ne 4 ]; then
echo "Error: Invalid number of arguments" > /dev/stderr
echo "Usage: $0 IMAGE_BASE GIT_SHA GIT_REF PROD" > /dev/stderr
exit 1
fi
IMAGE_BASE="$1"
GIT_SHA="$2"
GIT_REF="$3"
PROD="$4"
# Validate inputs
if [ -z "$IMAGE_BASE" ] || [ -z "$GIT_SHA" ]; then
echo "Error: IMAGE_BASE and GIT_SHA cannot be empty" > /dev/stderr
exit 1
fi
# Always include SHA tags
echo "${IMAGE_BASE}:sha-${GIT_SHA:0:7}"
echo "${IMAGE_BASE}:sha-${GIT_SHA}"
# Handle version tags
if [[ "$GIT_REF" =~ ^refs/tags/v([0-9]+)\.([0-9]+)\.([0-9]+)(-.*)?$ ]]; then
MAJOR="${BASH_REMATCH[1]}"
MINOR="${BASH_REMATCH[2]}"
PATCH="${BASH_REMATCH[3]}"
PRERELEASE="${BASH_REMATCH[4]}"
if [[ -z "$PRERELEASE" ]] && [[ "$PROD" == "true" ]]; then
echo "${IMAGE_BASE}:latest"
echo "${IMAGE_BASE}:stable"
[[ "$MAJOR" -gt 0 ]] && echo "${IMAGE_BASE}:v${MAJOR}"
echo "${IMAGE_BASE}:v${MAJOR}.${MINOR}"
echo "${IMAGE_BASE}:v${MAJOR}.${MINOR}.${PATCH}"
else
echo "${IMAGE_BASE}:latest-staging"
echo "${IMAGE_BASE}:staging"
echo "${IMAGE_BASE}:v${MAJOR}.${MINOR}.${PATCH}-prerelease"
fi
elif [[ "$PROD" == "false" ]]; then
echo "${IMAGE_BASE}:latest-staging"
echo "${IMAGE_BASE}:staging"
fi
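The ref-matching at the heart of the tag generator can be exercised on its own. A minimal sketch using the same bash regex (the `v1.2.3` ref is an example value, not a tag from this repo's history):

```shell
#!/bin/bash
# Sketch: split a release tag ref with the script's regex; BASH_REMATCH
# holds the capture groups (major, minor, patch, optional prerelease).
GIT_REF="refs/tags/v1.2.3"
if [[ "$GIT_REF" =~ ^refs/tags/v([0-9]+)\.([0-9]+)\.([0-9]+)(-.*)?$ ]]; then
  MAJOR="${BASH_REMATCH[1]}"
  MINOR="${BASH_REMATCH[2]}"
  PATCH="${BASH_REMATCH[3]}"
  PRERELEASE="${BASH_REMATCH[4]}"
  # A clean prod tag would fan out into the cascading version tags:
  echo "v${MAJOR} v${MAJOR}.${MINOR} v${MAJOR}.${MINOR}.${PATCH}"
fi
```

A ref like `refs/tags/v1.2.3-rc1` would match too, with `PRERELEASE="-rc1"`, which routes the build into the staging tag branch instead.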


@@ -0,0 +1,102 @@
#!/bin/bash
set -euo pipefail
# Perform health check and rollback if necessary
# Usage: ./health-check-and-rollback.sh REPO_PROJECT_PATH PROD HEALTH_CHECK_URL [MAX_ATTEMPTS]
if [ $# -lt 3 ]; then
echo "Error: Invalid number of arguments"
echo "Usage: $0 REPO_PROJECT_PATH PROD HEALTH_CHECK_URL [MAX_ATTEMPTS]"
exit 1
fi
REPO_PROJECT_PATH="$1"
PROD="$2"
HEALTH_CHECK_URL="$3"
MAX_ATTEMPTS="${4:-30}"
CURRENT_LINK_PATH="${REPO_PROJECT_PATH}/current"
RELEASES_PATH="${REPO_PROJECT_PATH}/releases"
echo "🏥 Performing health check..."
echo "Health check URL: $HEALTH_CHECK_URL"
get_current_version() {
if [ -L "$CURRENT_LINK_PATH" ]; then
basename "$(readlink -f "$CURRENT_LINK_PATH")"
else
echo "unknown"
fi
}
ATTEMPT=0
while [ "$ATTEMPT" -lt "$MAX_ATTEMPTS" ]; do
# Check if the service is responding with 200 OK
HTTP_CODE=$(curl -s -o /dev/null -w '%{http_code}' -m 10 "$HEALTH_CHECK_URL" || echo '000')
if [ "$HTTP_CODE" = "200" ]; then
echo "✅ Health check passed! (HTTP $HTTP_CODE)"
CURRENT_VERSION=$(get_current_version)
echo "📌 Current version: ${CURRENT_VERSION}"
exit 0
fi
ATTEMPT=$((ATTEMPT + 1))
if [ "$ATTEMPT" -eq "$MAX_ATTEMPTS" ]; then
echo "❌ Health check failed after $MAX_ATTEMPTS attempts (Last HTTP code: $HTTP_CODE)"
echo "🔄 Rolling back deployment..."
FAILED_VERSION=$(get_current_version)
echo "❌ Failed version: ${FAILED_VERSION}"
# Check if we have a previous version to roll back to
if [ -f "${CURRENT_LINK_PATH}/.previous_version" ]; then
PREVIOUS_VERSION_PATH=$(cat "${CURRENT_LINK_PATH}/.previous_version")
PREVIOUS_VERSION=$(basename "$PREVIOUS_VERSION_PATH")
if [ -d "$PREVIOUS_VERSION_PATH" ]; then
echo "🔄 Rolling back to version: ${PREVIOUS_VERSION}"
# Stop failed deployment containers
cd "$CURRENT_LINK_PATH"
echo "Stopping failed deployment containers..."
docker compose -f docker-compose_web.yml -p pkmntrade-club down || true
if [ "$PROD" = "false" ]; then
docker compose -f docker-compose_staging.yml -p pkmntrade-club down || true
fi
docker compose -f docker-compose_core.yml -p pkmntrade-club down || true
# Switch symlink back to previous version
ln -sfn "$PREVIOUS_VERSION_PATH" "$CURRENT_LINK_PATH"
# Start previous version containers
cd "$CURRENT_LINK_PATH"
docker compose -f docker-compose_core.yml -p pkmntrade-club up -d --no-build
if [ "$PROD" = "true" ]; then
docker compose -f docker-compose_web.yml -p pkmntrade-club up -d --no-build
else
docker compose -f docker-compose_web.yml -f docker-compose_staging.yml -p pkmntrade-club up -d --no-build
fi
echo "✅ Rollback completed to version: ${PREVIOUS_VERSION}"
# Mark failed version
if [ -d "${RELEASES_PATH}/${FAILED_VERSION}" ]; then
touch "${RELEASES_PATH}/${FAILED_VERSION}/.failed"
echo "$(date): Health check failed, rolled back to ${PREVIOUS_VERSION}" > "${RELEASES_PATH}/${FAILED_VERSION}/.failure_reason"
fi
else
echo "❌ Previous version directory not found: $PREVIOUS_VERSION_PATH"
exit 1
fi
else
echo "❌ No previous version information found. Cannot rollback!"
echo "💡 This might be the first deployment or the previous version info is missing."
exit 1
fi
exit 1
fi
echo "⏳ Waiting for service to be healthy... (attempt $ATTEMPT/$MAX_ATTEMPTS, HTTP code: $HTTP_CODE)"
sleep 10
done
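The bounded polling loop above can be sketched without the network or rollback parts (the `probe` function is a stand-in for the real `curl` against `HEALTH_CHECK_URL`, and the 10-second sleep is dropped for brevity):

```shell
#!/bin/bash
# Sketch of the bounded health-check loop: poll until the probe passes or
# MAX_ATTEMPTS is exhausted. `probe` fakes a service that becomes healthy
# once three attempts have been spent.
MAX_ATTEMPTS=5
ATTEMPT=0
probe() { [ "$ATTEMPT" -ge 3 ]; }

while [ "$ATTEMPT" -lt "$MAX_ATTEMPTS" ]; do
  if probe; then
    echo "healthy after $ATTEMPT attempts"
    break
  fi
  ATTEMPT=$((ATTEMPT + 1))   # in the real script: sleep 10, then re-curl
done
```

With the real script's 30 attempts and 10-second sleeps, a deploy gets roughly five minutes to come up before the rollback path runs.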

scripts/manage-releases.sh

@@ -0,0 +1,120 @@
#!/bin/bash
set -euo pipefail
# Manage deployment releases
# Usage: ./manage-releases.sh REPO_PROJECT_PATH COMMAND [ARGS]
if [ $# -lt 2 ]; then
echo "Error: Invalid number of arguments"
echo "Usage: $0 REPO_PROJECT_PATH COMMAND [ARGS]"
echo "Commands:"
echo " list - List all releases"
echo " current - Show current release"
echo " rollback VERSION - Rollback to specific version"
echo " cleanup [KEEP] - Clean up old releases (default: keep 5)"
exit 1
fi
REPO_PROJECT_PATH="$1"
COMMAND="$2"
CURRENT_LINK_PATH="${REPO_PROJECT_PATH}/current"
RELEASES_PATH="${REPO_PROJECT_PATH}/releases"
case "$COMMAND" in
list)
echo "📋 Available releases:"
if [ -d "$RELEASES_PATH" ]; then
for release in $(ls -dt "${RELEASES_PATH}"/*/); do
version=$(basename "$release")
status=""
# Check if it's current
if [ -L "$CURRENT_LINK_PATH" ] && [ "$(readlink -f "$CURRENT_LINK_PATH")" = "$(realpath "$release")" ]; then
status=" [CURRENT]"
fi
# Check if it failed
if [ -f "${release}/.failed" ]; then
status="${status} [FAILED]"
fi
echo " - ${version}${status}"
done
else
echo "No releases found"
fi
;;
current)
if [ -L "$CURRENT_LINK_PATH" ]; then
current_version=$(basename "$(readlink -f "$CURRENT_LINK_PATH")")
echo "📌 Current version: ${current_version}"
else
echo "❌ No current deployment found"
fi
;;
rollback)
if [ $# -lt 3 ]; then
echo "Error: VERSION required for rollback"
exit 1
fi
TARGET_VERSION="$3"
TARGET_PATH="${RELEASES_PATH}/${TARGET_VERSION}"
if [ ! -d "$TARGET_PATH" ]; then
echo "Error: Version ${TARGET_VERSION} not found"
exit 1
fi
echo "🔄 Rolling back to version: ${TARGET_VERSION}"
# Read environment from target version
if [ -f "${TARGET_PATH}/.deployment_env" ]; then
PROD=$(cat "${TARGET_PATH}/.deployment_env")
else
echo "Warning: Could not determine environment, assuming staging"
PROD="false"
fi
# Stop current containers
if [ -L "$CURRENT_LINK_PATH" ] && [ -d "$CURRENT_LINK_PATH" ]; then
cd "$CURRENT_LINK_PATH"
docker compose -f docker-compose_web.yml down || true
[ "$PROD" = "false" ] && docker compose -f docker-compose_staging.yml down || true
docker compose -f docker-compose_core.yml down || true
fi
# Update symlink
ln -sfn "$TARGET_PATH" "$CURRENT_LINK_PATH"
# Start containers
cd "$CURRENT_LINK_PATH"
docker compose -f docker-compose_core.yml up -d --no-build
if [ "$PROD" = "true" ]; then
docker compose -f docker-compose_web.yml up -d --no-build
else
docker compose -f docker-compose_web.yml -f docker-compose_staging.yml up -d --no-build
fi
echo "✅ Rollback completed"
;;
cleanup)
KEEP_COUNT="${3:-5}"
echo "🗑️ Cleaning up old releases (keeping last ${KEEP_COUNT})"
if [ -d "$RELEASES_PATH" ]; then
cd "$RELEASES_PATH"
ls -dt */ 2>/dev/null | tail -n +$((KEEP_COUNT + 1)) | xargs -r rm -rf || true
echo "✅ Cleanup completed"
else
echo "No releases directory found"
fi
;;
*)
echo "Error: Unknown command: $COMMAND"
exit 1
;;
esac
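The `cleanup` retention arithmetic is easy to misread, so here is a minimal sketch with fake release names (`r7`…`r1` stand in for timestamp directories, newest first as `ls -dt` would order them):

```shell
#!/bin/bash
# Sketch of the retention arithmetic: tail -n +N prints from line N onward,
# so +$((KEEP_COUNT + 1)) skips the newest KEEP_COUNT entries and emits only
# the old releases that should be deleted.
KEEP_COUNT=5
OLD=$(printf '%s\n' r7 r6 r5 r4 r3 r2 r1 | tail -n +$((KEEP_COUNT + 1)))
echo "$OLD"
```

Here only `r2` and `r1` survive the filter and would be passed to `xargs -r rm -rf`.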


@@ -0,0 +1,36 @@
#!/bin/bash
set -euo pipefail
# Parse repository name and generate project paths
# Usage: ./parse-repository-name.sh GITHUB_REPOSITORY
if [ $# -eq 0 ]; then
echo "Error: No repository name provided" > /dev/stderr
echo "Usage: $0 GITHUB_REPOSITORY" > /dev/stderr
exit 1
fi
GITHUB_REPOSITORY="$1"
echo "GITHUB_REPOSITORY: $GITHUB_REPOSITORY" > /dev/stderr
if [[ "$GITHUB_REPOSITORY" == *".git" ]]; then
if [[ "$GITHUB_REPOSITORY" == "https://"* ]]; then
echo "GITHUB_REPOSITORY ends in .git and is a URL" > /dev/stderr
REPO=$(echo "$GITHUB_REPOSITORY" | sed 's/\.git$//' | cut -d'/' -f4-5 | sed 's/[^a-zA-Z0-9\/-]/-/g')
else
echo "GITHUB_REPOSITORY ends in .git and is not a URL" > /dev/stderr
REPO=$(echo "$GITHUB_REPOSITORY" | sed 's/\.git$//' | sed 's/[^a-zA-Z0-9\/-]/-/g')
fi
else
echo "GITHUB_REPOSITORY does not end in .git" > /dev/stderr
REPO=$(echo "$GITHUB_REPOSITORY" | sed 's/[^a-zA-Z0-9\/-]/-/g')
fi
REPO_NAME_ONLY=$(echo "$REPO" | cut -d'/' -f2)
REPO_PROJECT_PATH="/srv/${REPO_NAME_ONLY}"
# Output in format that can be sourced - using printf %q for proper escaping
printf "export REPO=%q\n" "$REPO"
printf "export REPO_NAME_ONLY=%q\n" "$REPO_NAME_ONLY"
printf "export REPO_PROJECT_PATH=%q\n" "$REPO_PROJECT_PATH"
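The sanitization step is worth seeing in isolation: anything outside `[a-zA-Z0-9/-]` becomes a dash, so a dot in a repo name turns into `-`. A minimal sketch (the input slug is illustrative, modeled on this project's image name):

```shell
#!/bin/bash
# Sketch of the slug sanitization and path derivation from the script.
REPO=$(echo "badbl0cks/pkmntrade.club" | sed 's/[^a-zA-Z0-9\/-]/-/g')
REPO_NAME_ONLY=$(echo "$REPO" | cut -d'/' -f2)
REPO_PROJECT_PATH="/srv/${REPO_NAME_ONLY}"
echo "$REPO_PROJECT_PATH"
```

The dot in `pkmntrade.club` becomes a dash, yielding `/srv/pkmntrade-club`, which matches the deployment paths hardcoded elsewhere in the compose files.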


@@ -1,11 +1,12 @@
#!/bin/bash
cd src/pkmntrade_club/
# Remove all files in staticfiles except .gitkeep
if [ -d "staticfiles" ]; then
find staticfiles -type f ! -name '.gitkeep' -delete
find staticfiles -type d -empty -delete
fi
# Build the tailwind theme css
cd src/pkmntrade_club/theme/static_src
cd theme/static_src
npm install . && npm run build


@@ -0,0 +1,44 @@
#!/bin/bash
set -euo pipefail
# Prepare deployment by stopping containers
# Usage: ./prepare-deployment.sh REPO_PROJECT_PATH PROD CURRENT_LINK_PATH
if [ $# -ne 3 ]; then
echo "Error: Invalid number of arguments"
echo "Usage: $0 REPO_PROJECT_PATH PROD CURRENT_LINK_PATH"
exit 1
fi
REPO_PROJECT_PATH="$1"
PROD="$2"
CURRENT_LINK_PATH="$3"
# Ensure base directory exists
if [ ! -d "$REPO_PROJECT_PATH" ]; then
echo "⚠️ Directory $REPO_PROJECT_PATH does not exist, creating it..."
mkdir -p "$REPO_PROJECT_PATH"
fi
# If current symlink exists, stop containers in that directory
if [ -L "$CURRENT_LINK_PATH" ] && [ -d "$CURRENT_LINK_PATH" ]; then
echo "🛑 Stopping containers in current deployment..."
cd "$CURRENT_LINK_PATH"
# Stop containers
if [ -f "docker-compose_web.yml" ]; then
docker compose -f docker-compose_web.yml -p pkmntrade-club down || true
fi
if [ "$PROD" = "false" ] && [ -f "docker-compose_staging.yml" ]; then
docker compose -f docker-compose_staging.yml -p pkmntrade-club down || true
fi
if [ -f "docker-compose_core.yml" ]; then
docker compose -f docker-compose_core.yml -p pkmntrade-club down || true
fi
echo "✅ Containers stopped"
else
echo "ℹ️ No current deployment found (symlink doesn't exist or doesn't point to a valid directory)"
fi
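The `[ -L ... ] && [ -d ... ]` guard above is doing two distinct checks: the symlink must exist, and it must resolve to a real directory. A dangling link (for example, after a release directory was pruned) passes `-L` but fails `-d`. A self-contained sketch under a temp directory (paths are illustrative):

```shell
#!/bin/bash
# Sketch of the symlink guard: valid link vs. dangling link.
DEMO=$(mktemp -d)
mkdir -p "$DEMO/release"
ln -sfn "$DEMO/release" "$DEMO/current"

# Link exists and resolves: both tests pass.
[ -L "$DEMO/current" ] && [ -d "$DEMO/current" ] && echo "valid deployment link"

# Remove the target: the link still exists (-L) but no longer resolves (-d).
rm -rf "$DEMO/release"
if [ -L "$DEMO/current" ] && ! [ -d "$DEMO/current" ]; then
  echo "dangling link skipped"
fi
rm -rf "$DEMO"
```

This is why the script treats a first deploy and a broken `current` link the same way: neither has containers worth stopping.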


@@ -4,26 +4,23 @@ set -e
echo "Remaking migrations..."
find . -path "*/migrations/0*.py" -delete
set -a
source .env
set +a
uv run manage.py makemigrations --noinput
echo "Resetting database... "
echo "Resetting dev database... "
docker compose down \
&& docker volume rm -f pkmntradeclub_postgres_data \
&& ./scripts/rebuild-and-run.sh
&& docker compose up -d db
# Wait for the database to be ready.
echo "Waiting 15 seconds for the database to be ready and for any auto-run migrations to complete..."
sleep 15
echo "Running prebuild..."
./scripts/prebuild.sh
echo "Creating cache table..."
docker compose exec -it web bash -c "django-admin createcachetable django_cache"
echo "Running migrations..."
uv run manage.py migrate --noinput
echo "Loading seed data..."
docker compose exec -it web bash -c "django-admin loaddata /seed/0*"
uv run manage.py loaddata ./seed/0*
docker compose down
echo "Running collectstatic..."
uv run manage.py collectstatic -c --no-input
echo "Done!"

scripts/retry.sh

@@ -0,0 +1,23 @@
#!/bin/bash
# Retry function with exponential backoff
# Usage: source retry.sh && retry <command>
retry() {
local max_attempts=3
local delay=5
local attempt=1
until "$@"; do
if [ "$attempt" -ge "$max_attempts" ]; then
echo "Command failed after $max_attempts attempts: $*"
return 1
fi
echo "Command failed (attempt $attempt/$max_attempts): $*"
echo "Retrying in $delay seconds..."
sleep "$delay"
attempt=$((attempt + 1))
delay=$((delay * 2)) # Exponential backoff
done
}
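A quick sketch of the helper in action. The function is reproduced here (with shorter delays) so the snippet is self-contained, and `flaky` is a made-up stand-in for a transiently failing `ssh`/`scp` call:

```shell
#!/bin/bash
# Same shape as scripts/retry.sh, with a 1s base delay for the demo.
retry() {
  local max_attempts=3
  local delay=1
  local attempt=1
  until "$@"; do
    if [ "$attempt" -ge "$max_attempts" ]; then
      echo "Command failed after $max_attempts attempts: $*"
      return 1
    fi
    sleep "$delay"
    attempt=$((attempt + 1))
    delay=$((delay * 2))   # exponential backoff: 1s, 2s, ...
  done
}

tries=0
flaky() { tries=$((tries + 1)); [ "$tries" -ge 3 ]; }  # fails twice, then succeeds

retry flaky && echo "succeeded on attempt $tries"
```

With three attempts and a doubling delay, a command gets two retries before the deploy step aborts, which keeps transient SSH hiccups from failing the whole pipeline.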


@@ -1,5 +1,5 @@
services:
db-healthcheck:
db-redis-healthcheck:
image: stephenc/postgresql-cli:latest
command:
- "sh"
@@ -9,26 +9,47 @@ services:
sleep 30;
while true; do
pg_output=$$(pg_isready -d ${DJANGO_DATABASE_URL} 2>&1);
exit_code=$$?;
if [ $$exit_code -eq 0 ]; then
success="true";
error="";
pg_exit_code=$$?;
if [ $$pg_exit_code -eq 0 ]; then
pg_success="true";
pg_error="";
else
success="false";
error="$$pg_output";
pg_success="false";
pg_error="$$pg_output";
fi;
curl -s -f -X POST \
--connect-timeout 10 \
--max-time 15 \
--header "Authorization: Bearer ${GATUS_TOKEN}" \
http://health:8080/api/v1/endpoints/db_pg-isready/external?success=$$success&error=$$error;
if [ "$$success" = "true" ]; then
"http://health:8080/api/v1/endpoints/services_database/external?success=$$pg_success&error=$$pg_error" || true
if [ "$$pg_success" = "true" ]; then
echo " Database is OK";
sleep 60;
else
echo "Database is not OK: $$pg_output";
exit 1;
fi;
redis_output=$$(echo -e "ping\nquit" | curl -v --max-time 10 --connect-timeout 10 telnet://redis:6379 2>&1 | grep "+PONG");
redis_exit_code=$$?;
if [ $$redis_exit_code -eq 0 ]; then
redis_success="true";
redis_error="";
else
redis_success="false";
redis_error="$$redis_output";
fi;
curl -s -f -X POST \
--connect-timeout 10 \
--max-time 15 \
--header "Authorization: Bearer ${GATUS_TOKEN}" \
"http://health:8080/api/v1/endpoints/services_cache/external?success=$$redis_success&error=$$redis_error";
if [ "$$redis_success" = "true" ]; then
echo " Redis is OK";
else
echo "Redis is not OK: $$redis_output";
exit 1;
fi;
sleep 60;
done
env_file:
- .env
@@ -46,41 +67,193 @@ services:
feedback:
restart: always
image: getfider/fider:stable
labels:
- "enable_gatekeeper=true"
env_file:
- .env
cadvisor:
volumes:
- /:/rootfs:ro
- /var/run:/var/run:ro
- /sys:/sys:ro
- /var/lib/docker/:/var/lib/docker:ro
- /dev/disk/:/dev/disk:ro
privileged: true
devices:
- /dev/kmsg
image: gcr.io/cadvisor/cadvisor:v0.52.1
# cadvisor:
# volumes:
# - /:/rootfs:ro
# - /var/run:/var/run:ro
# - /sys:/sys:ro
# - /var/lib/docker/:/var/lib/docker:ro
# - /dev/disk/:/dev/disk:ro
# privileged: true
# devices:
# - /dev/kmsg
# image: gcr.io/cadvisor/cadvisor:v0.52.1
redis:
image: redis:latest
restart: always
ports:
- 6379:6379
# anubis:
# image: ghcr.io/techarohq/anubis:latest
# env_file:
# - .env
# dockergen:
# image: jwilder/docker-gen:latest
# container_name: dockergen_gatus_config
# command: -watch -notify-sighup gatus_service -only-exposed /app/config.template.yml /app/config.yaml
# restart: unless-stopped
# volumes:
# - /var/run/docker.sock:/tmp/docker.sock:ro
# - ./gatus:/app
# depends_on:
# - health
healthcheck:
test: ["CMD", "redis-cli", "ping"]
interval: 10s
timeout: 5s
retries: 5
start_period: 10s
dockergen-health:
image: nginxproxy/docker-gen:latest
command: -wait 15s -watch /gatus/config.template.yaml /gatus/config.yaml
restart: unless-stopped
volumes:
- /var/run/docker.sock:/tmp/docker.sock:ro
- ./gatus:/gatus
dockergen-gatekeeper:
image: nginxproxy/docker-gen:latest
command: -wait 15s -watch /gatekeeper/gatekeepers.template.yml /gatekeeper/gatekeepers.yml -notify-sighup pkmntrade-club-gatekeeper-manager-1
restart: unless-stopped
volumes:
- /var/run/docker.sock:/tmp/docker.sock:ro
- ./:/gatekeeper
gatekeeper-manager:
image: docker:latest
restart: always
stop_signal: SIGTERM
volumes:
- /srv:/srv:ro
- /var/run/docker.sock:/var/run/docker.sock
environment:
- REFRESH_INTERVAL=60
entrypoint: ["/bin/sh", "-c"]
command:
- |
set -eu -o pipefail
apk add --no-cache curl
COMPOSE_FILE_PATH="/srv/pkmntrade-club/current/gatekeepers.yml"
PROJECT_DIR_PATH="/srv/pkmntrade-club/current"
PROJECT_NAME_TAG="gatekeepers"
TERMINATING="false"
RESTARTING="false"
STARTED="false"
gatekeeper_down() {
echo "$(date +'%Y-%m-%d %H:%M:%S') [INFO]: Downing gatekeepers (Project: $$PROJECT_NAME_TAG)..."
cd "$$PROJECT_DIR_PATH"
if ! docker compose -p "$$PROJECT_NAME_TAG" -f "$$COMPOSE_FILE_PATH" down; then
echo "$(date +'%Y-%m-%d %H:%M:%S') [WARN]: 'docker compose down' for $$PROJECT_NAME_TAG encountered an issue, but proceeding."
else
STARTED="false"
fi
}
gatekeeper_up() {
if [ "$$TERMINATING" = "true" ]; then return; fi
echo "$(date +'%Y-%m-%d %H:%M:%S') [INFO]: Upping gatekeepers (Project: $$PROJECT_NAME_TAG, File: $$COMPOSE_FILE_PATH)..."
cd "$$PROJECT_DIR_PATH"
if ! docker compose -p "$$PROJECT_NAME_TAG" -f "$$COMPOSE_FILE_PATH" up -d --remove-orphans; then
echo "$(date +'%Y-%m-%d %H:%M:%S') [ERROR]: 'docker compose up' for $$PROJECT_NAME_TAG failed. Will retry."
else
STARTED="true"
fi
}
restart_gatekeepers() {
if [ "$$TERMINATING" = "true" -o "$$RESTARTING" = "true" -o "$$STARTED" = "false" ]; then return; fi
RESTARTING="true"
echo "$(date +'%Y-%m-%d %H:%M:%S') [INFO]: Restarting gatekeepers."
gatekeeper_down
gatekeeper_up
echo "$(date +'%Y-%m-%d %H:%M:%S') [INFO]: Gatekeepers restarted."
RESTARTING="false"
}
gatekeeper_healthcheck() {
if [ "$$TERMINATING" = "true" -o "$$RESTARTING" = "true" -o "$$STARTED" = "false" ]; then
echo "$(date +'%Y-%m-%d %H:%M:%S') [INFO]: Gatekeeper Manager is terminating/restarting/not started. Skipping healthcheck."
return 0
fi
ERROR_MSG=""
echo "$(date +'%Y-%m-%d %H:%M:%S') [INFO]: Checking gatekeepers health..."
num_containers=$$(docker ps -q -a --filter "label=gatekeeper" | wc -l)
if [ "$$num_containers" -eq 0 ]; then
ERROR_MSG="No gatekeepers found. Healthcheck failed."
elif [ $$(docker ps -q -a --filter "label=gatekeeper" --filter "status=running" | wc -l) -ne "$$num_containers" ]; then
ERROR_MSG="Gatekeeper containers are missing or not running. Healthcheck failed."
else
# check for 200 status code from each gatekeeper container
for container in $$(docker ps -q -a --filter "label=gatekeeper"); do
if [ $$(curl -s -o /dev/null -w "%{http_code}" -H "X-Real-Ip: 127.0.0.1" http://$$container:9090/metrics) -ne 200 ]; then
container_name=$$(docker ps -a --filter "label=gatekeeper" --filter "id=$$container" --format "{{.Names}}")
ERROR_MSG="Gatekeeper container $$container_name is unhealthy. Healthcheck failed."
fi
done
fi
if [ "$$ERROR_MSG" != "" ]; then
echo "$(date +'%Y-%m-%d %H:%M:%S') [ERROR]: $$ERROR_MSG"
curl -s -f -X POST \
--connect-timeout 10 \
--max-time 15 \
--header "Authorization: Bearer ${GATUS_TOKEN}" \
"http://health:8080/api/v1/endpoints/services_gatekeeper/external?success=false&error=$$ERROR_MSG" || true
restart_gatekeepers
return 1
else
echo "$(date +'%Y-%m-%d %H:%M:%S') [INFO]: All gatekeepers are OK/HEALTHY."
curl -s -f -X POST \
--connect-timeout 10 \
--max-time 15 \
--header "Authorization: Bearer ${GATUS_TOKEN}" \
"http://health:8080/api/v1/endpoints/services_gatekeeper/external?success=true&error=HEALTHY" || true
fi
}
handle_sigterm() {
if [ "$$TERMINATING" = "true" ]; then return; fi
TERMINATING="true"
echo "$(date +'%Y-%m-%d %H:%M:%S') [INFO]: SIGTERM received. Initiating graceful shutdown for gatekeepers."
curl -s -f -X POST \
--connect-timeout 10 \
--max-time 15 \
--header "Authorization: Bearer ${GATUS_TOKEN}" \
"http://health:8080/api/v1/endpoints/services_gatekeeper/external?success=false&error=SIGTERM%20received.%20Shutting%20down." || true
gatekeeper_down
echo "$(date +'%Y-%m-%d %H:%M:%S') [INFO]: Gatekeepers shut down. Gatekeeper Manager exiting."
exit 0
}
handle_sighup() {
if [ "$$TERMINATING" = "true" -o "$$RESTARTING" = "true" -o "$$STARTED" = "false" ]; then return; fi
echo "$(date +'%Y-%m-%d %H:%M:%S') [INFO]: SIGHUP received."
restart_gatekeepers
}
trap 'handle_sigterm' SIGTERM
trap 'handle_sighup' SIGHUP
echo "$(date +'%Y-%m-%d %H:%M:%S') [INFO]: Gatekeeper Manager started."
echo "$(date +'%Y-%m-%d %H:%M:%S') [INFO]: Periodic refresh enabled: $$REFRESH_INTERVAL seconds. Initial wait started."
while [ "$$TERMINATING" = "false" ]; do
# 'sleep x &' and 'wait $!' allows signals to interrupt the sleep.
# '|| true' ensures the loop continues if 'wait' is killed by a handled signal (SIGHUP/SIGTERM)
# SIGTERM handler exits completely, so loop won't continue. SIGHUP handler doesn't exit.
sleep $$REFRESH_INTERVAL &
wait $! || true
echo "$(date +'%Y-%m-%d %H:%M:%S') [INFO]: Periodic healthcheck and refresh triggered."
if [ ! -f "$$COMPOSE_FILE_PATH" ]; then
echo "$(date +'%Y-%m-%d %H:%M:%S') [ERROR]: Gatekeepers.yml has not been generated after $$REFRESH_INTERVAL seconds. Please check dockergen-gatekeeper is running correctly. Exiting."
exit 1
fi
if gatekeeper_healthcheck && [ "$$RESTARTING" = "false" ]; then
gatekeeper_up
fi
done
health:
image: twinproduction/gatus:latest
restart: always
labels:
- "enable_gatekeeper=true"
env_file:
- .env
environment:


@ -3,24 +3,30 @@ x-common: &common
restart: always
env_file:
- .env
environment:
- DEBUG=True
- DISABLE_SIGNUPS=True
- PUBLIC_HOST=staging.pkmntrade.club
- ALLOWED_HOSTS=staging.pkmntrade.club,127.0.0.1
services:
web-staging:
<<: *common
environment:
- DEBUG=False
- DISABLE_SIGNUPS=True
- PUBLIC_HOST=staging.pkmntrade.club
- ALLOWED_HOSTS=staging.pkmntrade.club,127.0.0.1
labels:
- "enable_gatekeeper=true"
deploy:
mode: replicated
replicas: 2
# healthcheck:
# test: ["CMD", "curl", "-f", "http://127.0.0.1:8000"]
# test: ["CMD", "curl", "-f", "http://127.0.0.1:8000/health"]
# interval: 30s
# timeout: 10s
# retries: 3
# start_period: 30s
celery-staging:
<<: *common
environment:
- DEBUG=False
- DISABLE_SIGNUPS=True
- PUBLIC_HOST=staging.pkmntrade.club
- ALLOWED_HOSTS=staging.pkmntrade.club,127.0.0.1
command: ["celery", "-A", "pkmntrade_club.django_project", "worker", "-l", "INFO", "-B", "-E"]


@ -2,11 +2,6 @@ x-common: &common
restart: always
env_file:
- .env
environment:
- DEBUG=False
- DISABLE_SIGNUPS=True
- PUBLIC_HOST=pkmntrade.club
- ALLOWED_HOSTS=pkmntrade.club,127.0.0.1
services:
web:
@ -14,17 +9,28 @@ services:
image: ghcr.io/xe/x/httpdebug
entrypoint: ["/ko-app/httpdebug", "--bind", ":8000"]
#image: badbl0cks/pkmntrade-club:stable
#command: ["granian", "--interface", "wsgi", "pkmntrade_club.django_project.wsgi:app", "--host", "0.0.0.0", "--port", "8000", "--workers", "1", "--workers-kill-timeout", "180", "--access-log"]
environment:
- DEBUG=False
- DISABLE_SIGNUPS=True
- PUBLIC_HOST=pkmntrade.club
- ALLOWED_HOSTS=pkmntrade.club,127.0.0.1
labels:
- "enable_gatekeeper=true"
deploy:
mode: replicated
replicas: 4
# healthcheck:
# test: ["CMD", "curl", "-f", "http://127.0.0.1:8000"]
# test: ["CMD", "curl", "-f", "http://127.0.0.1:8000/health"]
# interval: 30s
# timeout: 10s
# retries: 3
# start_period: 30s
celery:
<<: *common
image: badbl0cks/pkmntrade-club:stable
command: ["celery", "-A", "pkmntrade_club.django_project", "worker", "-l", "INFO", "-B", "-E"]
# celery:
# <<: *common
# image: badbl0cks/pkmntrade-club:stable
# environment:
# - DEBUG=False
# - DISABLE_SIGNUPS=True
# - PUBLIC_HOST=pkmntrade.club
# - ALLOWED_HOSTS=pkmntrade.club,127.0.0.1
# command: ["celery", "-A", "pkmntrade_club.django_project", "worker", "-l", "INFO", "-B", "-E"]


@ -0,0 +1,45 @@
services:
{{ $all_containers := whereLabelValueMatches . "enable_gatekeeper" "true" }}
{{ $all_containers = sortObjectsByKeysAsc $all_containers "Name" }}
{{ range $container := $all_containers }}
{{ $serviceLabel := index $container.Labels "com.docker.compose.service" }}
{{ $containerNumber := index $container.Labels "com.docker.compose.container-number" }}
{{ $port := "" }}
{{ if eq $serviceLabel "web" }}
{{ $port = ":8000" }}
{{ end }}
{{ if eq $serviceLabel "web-staging" }}
{{ $port = ":8000" }}
{{ end }}
{{ if eq $serviceLabel "feedback" }}
{{ $port = ":3000" }}
{{ end }}
{{ if eq $serviceLabel "health" }}
{{ $port = ":8080" }}
{{ end }}
gatekeeper-{{ $serviceLabel }}-{{ $containerNumber }}:
image: ghcr.io/techarohq/anubis:latest
container_name: pkmntrade-club-gatekeeper-{{ $serviceLabel }}-{{ $containerNumber }}
env_file:
- .env
environment:
- TARGET=http://{{ $container.Name }}{{ $port }}
{{ if eq $serviceLabel "web" }}
- TARGET_HOST=pkmntrade.club # pass this host to django, which checks it with ALLOWED_HOSTS
{{ end }}
{{ if eq $serviceLabel "web-staging" }}
- TARGET_HOST=staging.pkmntrade.club # pass this host to django, which checks it with ALLOWED_HOSTS
{{ end }}
labels:
- gatekeeper=true
networks:
default:
aliases:
- pkmntrade-club-gatekeeper-{{ $serviceLabel }}
- gatekeeper-{{ $serviceLabel }}
{{ end }}
networks:
default:
name: pkmntrade-club_default
external: true
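As a sanity check on the template logic, here is roughly what it renders for a single matching container, assuming the compose project is named pkmntrade-club (consistent with the network name above and the container names in the Gatus config) and the first web replica is pkmntrade-club-web-1. This is an illustrative sketch of the output, not generated text:

```
services:
  gatekeeper-web-1:
    image: ghcr.io/techarohq/anubis:latest
    container_name: pkmntrade-club-gatekeeper-web-1
    env_file:
      - .env
    environment:
      - TARGET=http://pkmntrade-club-web-1:8000
      - TARGET_HOST=pkmntrade.club
    labels:
      - gatekeeper=true
    networks:
      default:
        aliases:
          - pkmntrade-club-gatekeeper-web
          - gatekeeper-web
```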


@ -0,0 +1,154 @@
storage:
type: postgres
path: "${GATUS_DATABASE_URL}"
web:
read-buffer-size: 32768
connectivity:
checker:
target: 1.1.1.1:53
interval: 60s
external-endpoints:
- name: Database
group: Services
token: "${GATUS_TOKEN}"
alerts:
- type: email
- name: Cache
group: Services
token: "${GATUS_TOKEN}"
alerts:
- type: email
- name: Gatekeeper
group: Services
token: "${GATUS_TOKEN}"
alerts:
- type: email
endpoints:
- name: Domain
group: Expirations
url: "https://pkmntrade.club"
interval: 1h
conditions:
- "[DOMAIN_EXPIRATION] > 720h"
alerts:
- type: email
- name: Certificate
group: Expirations
url: "https://pkmntrade.club"
interval: 1h
conditions:
- "[CERTIFICATE_EXPIRATION] > 240h"
alerts:
- type: email
- name: Cloudflare
group: DNS
url: "1.1.1.1"
interval: 60s
dns:
query-name: "pkmntrade.club"
query-type: "A"
conditions:
- "[DNS_RCODE] == NOERROR"
alerts:
- type: email
- name: Google
group: DNS
url: "8.8.8.8"
interval: 60s
dns:
query-name: "pkmntrade.club"
query-type: "A"
conditions:
- "[DNS_RCODE] == NOERROR"
alerts:
- type: email
- name: Quad9
group: DNS
url: "9.9.9.9"
interval: 60s
dns:
query-name: "pkmntrade.club"
query-type: "A"
conditions:
- "[DNS_RCODE] == NOERROR"
alerts:
- type: email
- name: Load Balancer
group: Services
url: "http://loba/"
interval: 60s
conditions:
- "[STATUS] == 200"
- "[BODY] == OK/HEALTHY"
alerts:
- type: email
- name: Feedback
group: Main
url: "http://pkmntrade-club-feedback-1:3000/"
interval: 60s
conditions:
- "[STATUS] == 200"
alerts:
- type: email
{{ $all_containers := . }}
{{ $web_containers := list }}
{{ $web_staging_containers := list }}
{{ range $container := $all_containers }}
{{ $serviceLabel := index $container.Labels "com.docker.compose.service" }}
{{ if eq $serviceLabel "web" }}
{{ $web_containers = append $web_containers $container }}
{{ end }}
{{ if eq $serviceLabel "web-staging" }}
{{ $web_staging_containers = append $web_staging_containers $container }}
{{ end }}
{{ end }}
{{ $web_containers = sortObjectsByKeysAsc $web_containers "Name" }}
{{ $web_staging_containers = sortObjectsByKeysAsc $web_staging_containers "Name" }}
{{ range $container := $web_containers }}
{{ $containerNumber := index $container.Labels "com.docker.compose.container-number" }}
- name: "Web Worker {{ $containerNumber }}"
group: Main
url: "http://{{ $container.Name }}:8000/health/"
headers:
Host: "pkmntrade.club"
interval: 60s
conditions:
- "[STATUS] == 200"
# - "[BODY] == OK/HEALTHY"
alerts:
- type: email
{{ end }}
{{ range $container := $web_staging_containers }}
{{ $containerNumber := index $container.Labels "com.docker.compose.container-number" }}
- name: "Web Worker {{ $containerNumber }}"
group: Staging
url: "http://{{ $container.Name }}:8000/health/"
headers:
Host: "staging.pkmntrade.club"
interval: 60s
conditions:
- "[STATUS] == 200"
# - "[BODY] == OK/HEALTHY"
alerts:
- type: email
{{ end }}
alerting:
email:
from: "${GATUS_SMTP_FROM}"
username: "${GATUS_SMTP_USER}"
password: "${GATUS_SMTP_PASS}"
host: "${GATUS_SMTP_HOST}"
port: ${GATUS_SMTP_PORT}
to: "${GATUS_SMTP_TO}"
client:
insecure: false
default-alert:
enabled: true
failure-threshold: 3
success-threshold: 2
send-on-resolved: true
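The external-endpoints entries above (Database, Cache, Gatekeeper) are never polled by Gatus itself; a sidecar pushes status into them over the external endpoint API, as the gatekeeper manager script earlier does with curl, using the key `<group>_<name>` lowercased. A hypothetical helper sketching the URL construction (`gatus_external_url` is not part of the project, just an illustration):

```python
from urllib.parse import urlencode

def gatus_external_url(key, success, error):
    # key is the lowercased "<group>_<name>" of the external endpoint,
    # e.g. group "Services" + name "Gatekeeper" -> "services_gatekeeper".
    query = urlencode({"success": str(success).lower(), "error": error})
    return f"http://health:8080/api/v1/endpoints/{key}/external?{query}"

url = gatus_external_url("services_gatekeeper", True, "HEALTHY")
# The actual report is a POST with the matching bearer token, e.g.:
#   curl -s -f -X POST -H "Authorization: Bearer $GATUS_TOKEN" "<url>"
print(url)
# -> http://health:8080/api/v1/endpoints/services_gatekeeper/external?success=true&error=HEALTHY
```

Using `urlencode` also sidesteps the manual percent-encoding the shell script has to do for messages with spaces.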


@ -8,14 +8,19 @@ connectivity:
target: 1.1.1.1:53
interval: 60s
external-endpoints:
- name: pg_isready
group: db
- name: Database
group: Services
token: "${GATUS_TOKEN}"
alerts:
- type: email
- name: Redis
group: Services
token: "${GATUS_TOKEN}"
alerts:
- type: email
endpoints:
- name: Domain
group: expirations
group: Expirations
url: "https://pkmntrade.club"
interval: 1h
conditions:
@ -23,7 +28,7 @@ endpoints:
alerts:
- type: email
- name: Certificate
group: expirations
group: Expirations
url: "https://pkmntrade.club"
interval: 1h
conditions:
@ -31,7 +36,7 @@ endpoints:
alerts:
- type: email
- name: Cloudflare
group: dns
group: DNS
url: "1.1.1.1"
interval: 60s
dns:
@ -42,7 +47,7 @@ endpoints:
alerts:
- type: email
- name: Google
group: dns
group: DNS
url: "8.8.8.8"
interval: 60s
dns:
@ -53,7 +58,7 @@ endpoints:
alerts:
- type: email
- name: Quad9
group: dns
group: DNS
url: "9.9.9.9"
interval: 60s
dns:
@ -64,7 +69,7 @@ endpoints:
alerts:
- type: email
- name: HAProxy
group: loadbalancer
group: Load Balancer
url: "http://loba/"
interval: 60s
conditions:
@ -73,60 +78,22 @@ endpoints:
alerts:
- type: email
- name: Feedback
group: backends
group: Services
url: "http://feedback:3000/"
interval: 60s
conditions:
- "[STATUS] == 200"
alerts:
- type: email
- name: Web Worker 1
group: backends
url: "http://pkmntrade-club-web-1:8000/health/"
interval: 60s
conditions:
- "[STATUS] == 200"
#- "[BODY] == OK/HEALTHY"
#- [BODY].database == UP
# must return json like {"database": "UP"} first
alerts:
- type: email
- name: Web Worker 2
group: backends
url: "http://pkmntrade-club-web-2:8000/health/"
interval: 60s
conditions:
- "[STATUS] == 200"
#- "[BODY] == OK/HEALTHY"
alerts:
- type: email
- name: Web Worker 3
group: backends
url: "http://pkmntrade-club-web-3:8000/health/"
interval: 60s
conditions:
- "[STATUS] == 200"
#- "[BODY] == OK/HEALTHY"
alerts:
- type: email
- name: Web Worker 4
group: backends
url: "http://pkmntrade-club-web-4:8000/health/"
interval: 60s
conditions:
- "[STATUS] == 200"
#- "[BODY] == OK/HEALTHY"
alerts:
- type: email
# todo: add cadvisor checks via api https://github.com/google/cadvisor/blob/master/docs/api.md
alerting:
email:
from: noreply@pkmntrade.club
username: dd2cd354-de6d-4fa4-bfe8-31c961cb4e90
password: 1622e8a1-9a45-4a7f-8071-cccca29d8675
host: smtp.tem.scaleway.com
port: 465
to: rob@badblocks.email
from: "${GATUS_SMTP_FROM}"
username: "${GATUS_SMTP_USER}"
password: "${GATUS_SMTP_PASS}"
host: "${GATUS_SMTP_HOST}"
port: ${GATUS_SMTP_PORT}
to: "${GATUS_SMTP_TO}"
client:
insecure: false
default-alert:


@ -21,7 +21,7 @@ defaults
timeout http-request 120s
option httplog
frontend web_frontend
frontend haproxy_entrypoint
bind :443 ssl crt /certs/crt.pem verify required ca-file /certs/ca.pem
use_backend %[req.hdr(host),lower,word(1,:)] # strip out port from host
@ -34,17 +34,23 @@ backend basic_check
backend pkmntrade.club
balance leastconn
server-template web- 10 web:8000 check resolvers docker_resolver init-addr libc,none
http-request set-header Host pkmntrade.club
server-template gatekeeper-web- 4 gatekeeper-web:8000 check resolvers docker_resolver init-addr libc,none
backend staging.pkmntrade.club
balance leastconn
server-template web-staging- 10 web-staging:8000 check resolvers docker_resolver init-addr libc,none
http-request set-header Host staging.pkmntrade.club
server-template gatekeeper-web-staging- 4 gatekeeper-web-staging:8000 check resolvers docker_resolver init-addr libc,none
backend feedback.pkmntrade.club
server feedback-1 feedback:3000
balance leastconn
http-request set-header Host feedback.pkmntrade.club
server-template gatekeeper-feedback- 4 gatekeeper-feedback:8000 check resolvers docker_resolver init-addr libc,none
backend health.pkmntrade.club
server health-1 health:8080
balance leastconn
http-request set-header Host health.pkmntrade.club
server-template gatekeeper-health- 4 gatekeeper-health:8000 check resolvers docker_resolver init-addr libc,none
#EOF - trailing newline required


@ -0,0 +1,5 @@
"""pkmntrade.club - A django project for trading Pokémon TCG Pocket Cards"""
from pkmntrade_club._version import __version__, get_version, get_version_info
__all__ = ['__version__', 'get_version', 'get_version_info']


@ -0,0 +1,61 @@
"""
Version module for pkmntrade.club
This module provides version information from git tags via setuptools-scm.
"""
from importlib.metadata import version, PackageNotFoundError

try:
    __version__ = version("pkmntrade-club")
except PackageNotFoundError:
    # Package is not installed; fall back to setuptools_scm, which reads the
    # version directly from git metadata. Imported lazily (and aliased so it
    # doesn't shadow get_version() below) so the ImportError branch can fire
    # when setuptools_scm is unavailable.
    try:
        from setuptools_scm import get_version as _scm_get_version
        __version__ = _scm_get_version(root='../../..', relative_to=__file__)
    except (ImportError, LookupError):
        __version__ = "0.0.0+unknown"

def get_version():
    """Return the current version."""
    return __version__

def get_version_info():
    """Return detailed version information."""
    import re
    # Parse version string (e.g., "1.2.3", "1.2.3.dev4+gabc1234", "1.2.3-prerelease")
    match = re.match(
        r'^(\d+)\.(\d+)\.(\d+)'
        r'(?:\.dev(\d+))?'
        r'(?:\+g([a-f0-9]+))?'
        r'(?:-(.+))?$',
        __version__
    )
    if match:
        major, minor, patch, dev, git_sha, prerelease = match.groups()
        return {
            'version': __version__,
            'major': int(major),
            'minor': int(minor),
            'patch': int(patch),
            'dev': int(dev) if dev else None,
            'git_sha': git_sha,
            'prerelease': prerelease,
            'is_release': dev is None and not prerelease,
            'is_prerelease': bool(prerelease),
            'is_dev': dev is not None
        }
    return {
        'version': __version__,
        'major': 0,
        'minor': 0,
        'patch': 0,
        'dev': None,
        'git_sha': None,
        'prerelease': None,
        'is_release': False,
        'is_prerelease': False,
        'is_dev': True
    }
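As a sanity check on the parsing logic in get_version_info() above, the same regex splits a setuptools-scm style dev version like so (standalone snippet, not part of the module):

```python
import re

# Identical pattern to the one used in get_version_info().
pattern = re.compile(
    r'^(\d+)\.(\d+)\.(\d+)'
    r'(?:\.dev(\d+))?'
    r'(?:\+g([a-f0-9]+))?'
    r'(?:-(.+))?$'
)
major, minor, patch, dev, git_sha, prerelease = pattern.match(
    "1.2.3.dev4+gabc1234"
).groups()
print(major, minor, patch, dev, git_sha, prerelease)
# -> 1 2 3 4 abc1234 None
```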


@ -4,3 +4,9 @@ def cache_settings(request):
    return {
        'CACHE_TIMEOUT': settings.CACHE_TIMEOUT,
    }

def version_info(request):
    return {
        'VERSION': settings.VERSION,
        'VERSION_INFO': settings.VERSION_INFO,
    }
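The two context variables exposed above can then be referenced from any template. A hypothetical footer fragment (illustrative only, not from the project's templates):

```
<footer>v{{ VERSION }}{% if VERSION_INFO.git_sha %} ({{ VERSION_INFO.git_sha }}){% endif %}</footer>
```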


@ -4,9 +4,29 @@ import environ
import os
import logging
import sys
from django.utils.translation import gettext_lazy as _
from pkmntrade_club._version import __version__, get_version_info
# set default values to local dev values
env = environ.Env(
DEBUG=(bool, False)
DEBUG=(bool, False), # MUST default to False for security: if the app can't read .env, debug output stays disabled
DISABLE_SIGNUPS=(bool, True),
DISABLE_CACHE=(bool, True),
DJANGO_DATABASE_URL=(str, 'postgresql://postgres@localhost:5432/postgres?sslmode=disable'),
DJANGO_EMAIL_HOST=(str, ''),
DJANGO_EMAIL_PORT=(int, 587),
DJANGO_EMAIL_USER=(str, ''),
DJANGO_EMAIL_PASSWORD=(str, ''),
DJANGO_EMAIL_USE_TLS=(bool, True),
DJANGO_DEFAULT_FROM_EMAIL=(str, ''),
SECRET_KEY=(str, '0000000000000000000000000000000000000000000000000000000000000000'),
ALLOWED_HOSTS=(str, 'localhost,127.0.0.1'),
PUBLIC_HOST=(str, 'localhost'),
ACCOUNT_EMAIL_VERIFICATION=(str, 'none'),
SCHEME=(str, 'http'),
REDIS_URL=(str, 'redis://localhost:6379'),
CACHE_TIMEOUT=(int, 604800),
TIME_ZONE=(str, 'America/Los_Angeles'),
)
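The Env(...) constructor above registers a (cast, default) pair per variable: a value present in the process environment wins, otherwise the declared default applies, which is why DEBUG must default to False. A minimal stdlib sketch of those assumed semantics (`read_env` is illustrative, not django-environ itself):

```python
import os

def read_env(name, cast, default):
    # Explicit environment variable wins; declared default otherwise.
    raw = os.environ.get(name)
    if raw is None:
        return default
    if cast is bool:
        # django-environ-style truthy strings
        return raw.strip().lower() in ('true', '1', 'yes', 'on')
    return cast(raw)

assert read_env('PKMNTRADE_CLUB_DEBUG', bool, False) is False  # unset -> secure default
os.environ['DJANGO_EMAIL_PORT'] = '2525'
assert read_env('DJANGO_EMAIL_PORT', int, 587) == 2525         # explicit value wins
```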
LOGGING = {
@ -59,6 +79,16 @@ BASE_DIR = Path(__file__).resolve().parent.parent
# Take environment variables from .env file
environ.Env.read_env(os.path.join(BASE_DIR, '.env'))
SCHEME = env('SCHEME')
PUBLIC_HOST = env('PUBLIC_HOST')
REDIS_URL = env('REDIS_URL')
CACHE_TIMEOUT = env('CACHE_TIMEOUT')
DISABLE_SIGNUPS = env('DISABLE_SIGNUPS')
DISABLE_CACHE = env('DISABLE_CACHE')
VERSION = __version__
VERSION_INFO = get_version_info()
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/dev/howto/deployment/checklist/
@ -66,11 +96,6 @@ environ.Env.read_env(os.path.join(BASE_DIR, '.env'))
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = env('SECRET_KEY')
# Scaleway Secret Key
SCW_SECRET_KEY = env('SCW_SECRET_KEY')
DISABLE_SIGNUPS = env('DISABLE_SIGNUPS', default=False)
# https://docs.djangoproject.com/en/dev/ref/settings/#debug
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = env('DEBUG')
@ -85,9 +110,7 @@ try:
except Exception:
logging.getLogger(__name__).info(f"Error determining server hostname for allowed hosts.")
PUBLIC_HOST = env('PUBLIC_HOST')
CSRF_TRUSTED_ORIGINS = [f"https://{PUBLIC_HOST}"]
CSRF_TRUSTED_ORIGINS = [f"{SCHEME}://{PUBLIC_HOST}"]
FIRST_PARTY_APPS = [
'pkmntrade_club.accounts',
@ -118,6 +141,15 @@ INSTALLED_APPS = [
"crispy_tailwind",
"tailwind",
"django_linear_migrations",
'health_check',
'health_check.db',
'health_check.cache',
'health_check.storage',
'health_check.contrib.migrations',
'health_check.contrib.celery',
'health_check.contrib.celery_ping',
'health_check.contrib.psutil',
'health_check.contrib.redis',
"meta",
] + FIRST_PARTY_APPS
@ -131,9 +163,9 @@ if DEBUG:
TAILWIND_APP_NAME = 'theme'
META_SITE_NAME = 'PKMN Trade Club'
META_SITE_PROTOCOL = 'https'
META_SITE_PROTOCOL = SCHEME
META_USE_SITES = True
META_IMAGE_URL = f'https://{PUBLIC_HOST}/'
META_IMAGE_URL = f'{SCHEME}://{PUBLIC_HOST}/'
# https://docs.djangoproject.com/en/dev/ref/settings/#middleware
MIDDLEWARE = [
@ -155,6 +187,11 @@ if DEBUG:
"django_browser_reload.middleware.BrowserReloadMiddleware",
]
HEALTH_CHECK = {
'DISK_USAGE_MAX': 90, # percent
'MEMORY_MIN': 100, # in MB
}
DAISY_SETTINGS = {
'SITE_TITLE': 'PKMN Trade Club Admin',
'DONT_SUPPORT_ME': True,
@ -181,6 +218,7 @@ TEMPLATES = [
"django.contrib.auth.context_processors.auth",
"django.contrib.messages.context_processors.messages",
"pkmntrade_club.common.context_processors.cache_settings",
"pkmntrade_club.common.context_processors.version_info",
],
},
},
@ -208,14 +246,13 @@ AUTH_PASSWORD_VALIDATORS = [
},
]
# Internationalization
# https://docs.djangoproject.com/en/dev/topics/i18n/
# https://docs.djangoproject.com/en/dev/ref/settings/#language-code
LANGUAGE_CODE = "en-us"
# https://docs.djangoproject.com/en/dev/ref/settings/#time-zone
TIME_ZONE = "UTC"
TIME_ZONE = env('TIME_ZONE')
# https://docs.djangoproject.com/en/dev/ref/settings/#std:setting-USE_I18N
USE_I18N = True
@ -268,19 +305,14 @@ CRISPY_TEMPLATE_PACK = "tailwind"
# https://docs.djangoproject.com/en/dev/ref/settings/#email-backend
EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
# EMAIL_HOST = "smtp.resend.com"
# EMAIL_PORT = 587
# EMAIL_HOST_USER = "resend"
# EMAIL_HOST_PASSWORD = RESEND_API_KEY
# EMAIL_USE_TLS = True
EMAIL_HOST = "smtp.tem.scaleway.com"
EMAIL_PORT = 587
EMAIL_HOST_USER = "dd2cd354-de6d-4fa4-bfe8-31c961cb4e90"
EMAIL_HOST_PASSWORD = SCW_SECRET_KEY
EMAIL_USE_TLS = True
EMAIL_HOST = env('DJANGO_EMAIL_HOST')
EMAIL_PORT = env('DJANGO_EMAIL_PORT')
EMAIL_HOST_USER = env('DJANGO_EMAIL_USER')
EMAIL_HOST_PASSWORD = env('DJANGO_EMAIL_PASSWORD')
EMAIL_USE_TLS = env('DJANGO_EMAIL_USE_TLS')
# https://docs.djangoproject.com/en/dev/ref/settings/#default-from-email
DEFAULT_FROM_EMAIL = "noreply@pkmntrade.club"
DEFAULT_FROM_EMAIL = env('DJANGO_DEFAULT_FROM_EMAIL')
# django-debug-toolbar
# https://django-debug-toolbar.readthedocs.io/en/latest/installation.html
@ -289,7 +321,7 @@ INTERNAL_IPS = [
"127.0.0.1",
]
# for docker + debug toolbar
# for docker
hostname, _, ips = socket.gethostbyname_ex(socket.gethostname())
for ip in ips:
INTERNAL_IPS.append(ip)
@ -324,7 +356,7 @@ ACCOUNT_EMAIL_REQUIRED = True
ACCOUNT_EMAIL_VERIFICATION = env('ACCOUNT_EMAIL_VERIFICATION')
ACCOUNT_EMAIL_NOTIFICATIONS = True
ACCOUNT_EMAIL_UNKNOWN_ACCOUNTS = False
ACCOUNT_DEFAULT_HTTP_PROTOCOL = "https"
ACCOUNT_DEFAULT_HTTP_PROTOCOL = SCHEME
ACCOUNT_LOGIN_ON_EMAIL_CONFIRMATION = True
ACCOUNT_USERNAME_MIN_LENGTH = 2
ACCOUNT_CHANGE_EMAIL = True
@ -340,12 +372,12 @@ SOCIALACCOUNT_EMAIL_AUTHENTICATION = False
SOCIALACCOUNT_EMAIL_AUTHENTICATION_AUTO_CONNECT = False
SOCIALACCOUNT_ONLY = False
CACHE_TIMEOUT = 604800 # 1 week
SESSION_ENGINE = "django.contrib.sessions.backends.signed_cookies"
SESSION_COOKIE_HTTPONLY = True
# auto-detection doesn't work properly sometimes, so we'll just use the DEBUG setting
DEBUG_TOOLBAR_CONFIG = {"SHOW_TOOLBAR_CALLBACK": lambda request: DEBUG}
DISABLE_CACHE = env('DISABLE_CACHE', default=DEBUG)
if DISABLE_CACHE:
CACHES = {
"default": {
@ -356,12 +388,12 @@ else:
CACHES = {
"default": {
"BACKEND": "django.core.cache.backends.redis.RedisCache",
"LOCATION": "redis://redis:6379",
"LOCATION": REDIS_URL,
}
}
CELERY_BROKER_URL = "redis://redis:6379"
CELERY_RESULT_BACKEND = "redis://redis:6379"
CELERY_TIMEZONE = "America/Los_Angeles"
CELERY_BROKER_URL = REDIS_URL
CELERY_RESULT_BACKEND = REDIS_URL
CELERY_TIMEZONE = TIME_ZONE
CELERY_ENABLE_UTC = True
CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler"


@ -4,10 +4,11 @@ from debug_toolbar.toolbar import debug_toolbar_urls
urlpatterns = [
    path("admin/", admin.site.urls),
    path('account/', include('pkmntrade_club.accounts.urls')),
    path("accounts/", include("allauth.urls")),
    path("", include("pkmntrade_club.home.urls")),
    path("cards/", include("pkmntrade_club.cards.urls")),
    path('account/', include('pkmntrade_club.accounts.urls')),
    path("health/", include('health_check.urls')),
    path("trades/", include("pkmntrade_club.trades.urls")),
    path("__reload__/", include("django_browser_reload.urls")),
] + debug_toolbar_urls()


@ -1,9 +1,7 @@
from django.urls import path
from .views import HomePageView, HealthCheckView
from .views import HomePageView
urlpatterns = [
    path("", HomePageView.as_view(), name="home"),
    path("health", HealthCheckView.as_view(), name="health"),
    path("health/", HealthCheckView.as_view(), name="health"),
]


@ -139,29 +139,3 @@ class HomePageView(TemplateView):
    def get(self, request, *args, **kwargs):
        """Override get method to add caching"""
        return super().get(request, *args, **kwargs)

class HealthCheckView(View):
    def get(self, request, *args, **kwargs):
        try:
            from django.db import connection
            connection.cursor().execute("SELECT 1")
        except Exception as e:
            return HttpResponse("Database connection failed", status=500)
        try:
            from pkmntrade_club.trades.models import TradeOffer
            with contextlib.redirect_stdout(None):
                print(TradeOffer.objects.count())
        except Exception as e:
            return HttpResponse("DB models not reachable, but db is reachable", status=500)
        try:
            from django.core.cache import cache
            cache.set("test", "test")
            with contextlib.redirect_stdout(None):
                print(cache.get("test"))
        except Exception as e:
            return HttpResponse("Cache not reachable", status=500)
        return HttpResponse("OK/HEALTHY")


@ -38,8 +38,8 @@
<link rel="stylesheet" href="{% static 'css/base.css' %}">
<!-- Floating UI -->
<script src="{% static 'js/floating-ui_core@1.6.9.9.min.js' %}"></script>
<script src="{% static 'js/floating-ui_dom@1.6.13.13.min.js' %}"></script>
<script src="{% static 'js/floating-ui_core-1.6.9.9.min.js' %}"></script>
<script src="{% static 'js/floating-ui_dom-1.6.13.13.min.js' %}"></script>
<script defer src="{% static 'js/card-multiselect.js' %}"></script>
<link rel="stylesheet" href="{% static 'css/card-multiselect.css' %}">
@ -130,13 +130,13 @@
</div>
<!-- Alpine Plugins -->
<script defer src="{% static 'js/alpinejs.collapse@3.14.8.min.js' %}"></script>
<script defer src="{% static 'js/alpinejs.collapse-3.14.8.min.js' %}"></script>
<!-- Alpine Core -->
<script defer src="{% static 'js/alpinejs@3.14.8.min.js' %}"></script>
<script defer src="{% static 'js/alpinejs-3.14.8.min.js' %}"></script>
<!-- 100% privacy-first, no tracking analytics -->
<script data-goatcounter="https://stats.pkmntrade.club/count" async src="/static/js/count@v4.js"></script>
<!-- Goatcounter: 100% privacy-first, no tracking analytics -->
<script data-goatcounter="https://stats.pkmntrade.club/count" async src="{% static 'js/count-v4.js' %}"></script>
{% block javascript %}{% endblock %}
</body>


@ -1,4 +1,4 @@
{% load gravatar card_badge %}
{% load gravatar card_badge cache %}
{% cache CACHE_TIMEOUT trade_acceptance cache_key %}
<div class="card card-border bg-base-100 shadow-lg max-w-90 mx-auto">

uv.lock generated

@ -104,14 +104,14 @@ wheels = [
[[package]]
name = "click"
version = "8.2.0"
version = "8.2.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/cd/0f/62ca20172d4f87d93cf89665fbaedcd560ac48b465bd1d92bfc7ea6b0a41/click-8.2.0.tar.gz", hash = "sha256:f5452aeddd9988eefa20f90f05ab66f17fce1ee2a36907fd30b05bbb5953814d", size = 235857 }
sdist = { url = "https://files.pythonhosted.org/packages/60/6c/8ca2efa64cf75a977a0d7fac081354553ebe483345c734fb6b6515d96bbc/click-8.2.1.tar.gz", hash = "sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202", size = 286342 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a2/58/1f37bf81e3c689cc74ffa42102fa8915b59085f54a6e4a80bc6265c0f6bf/click-8.2.0-py3-none-any.whl", hash = "sha256:6b303f0b2aa85f1cb4e5303078fadcbcd4e476f114fab9b5007005711839325c", size = 102156 },
{ url = "https://files.pythonhosted.org/packages/85/32/10bb5764d90a8eee674e9dc6f4db6a0ab47c8c4d0d83c27f7c39ac415a4d/click-8.2.1-py3-none-any.whl", hash = "sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b", size = 102215 },
]
[[package]]
@ -313,6 +313,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/83/b3/0a3bec4ecbfee960f39b1842c2f91e4754251e0a6ed443db9fe3f666ba8f/django_environ-0.12.0-py2.py3-none-any.whl", hash = "sha256:92fb346a158abda07ffe6eb23135ce92843af06ecf8753f43adf9d2366dcc0ca", size = 19957 },
]
[[package]]
name = "django-health-check"
version = "3.18.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "django" },
]
sdist = { url = "https://files.pythonhosted.org/packages/66/e9/0699ea3debfda75e5960ff99f56974136380e6f8202d453de7357e1f67fc/django_health_check-3.18.3.tar.gz", hash = "sha256:18b75daca4551c69a43f804f9e41e23f5f5fb9efd06cf6a313b3d5031bb87bd0", size = 20919 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e2/1e/3b23b580762cca7456427731de9b90718d15eec02ebe096437469d767dfe/django_health_check-3.18.3-py2.py3-none-any.whl", hash = "sha256:f5f58762b80bdf7b12fad724761993d6e83540f97e2c95c42978f187e452fa07", size = 30331 },
]
[[package]]
name = "django-linear-migrations"
version = "2.17.0"
@ -334,6 +346,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/a6/78/2fb6ff7df06fe4ad31f3f9b9b80e682317b6d22188148dca52e0ec87bf4a/django_meta-2.4.2-py2.py3-none-any.whl", hash = "sha256:afc6b77c3885db0cd97883d1dc3df47f91a9c7951b2f4928fee91ca60a7d0ff2", size = 27792 },
]
[[package]]
name = "django-parler"
version = "2.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "django" },
]
sdist = { url = "https://files.pythonhosted.org/packages/8c/2b/2423d31620efe8ab0d0390e60afab4f9cc2e62d4bf39fe0e05df0eef1b93/django-parler-2.3.tar.gz", hash = "sha256:2c8f5012ceb5e49af93b16ea3fe4d0c83d70b91b2d0f470c05d7d742b6f3083d", size = 69167 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/47/38/11f1a7e3d56f3a6b74cf99e307f2554b741cadebc9b1c45b05e2ec1f35a2/django_parler-2.3-py3-none-any.whl", hash = "sha256:8f6c8061e4b5690f1ee2d8e5760940ef06bf78a5bfa033d11178377559c749cf", size = 83288 },
]
[[package]]
name = "django-tailwind-4"
version = "0.1.4"
@ -405,41 +429,41 @@ wheels = [
[[package]]
name = "granian"
version = "2.2.5"
version = "2.3.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "click" },
]
sdist = { url = "https://files.pythonhosted.org/packages/d8/59/064df25d63fbfc27c7ec48c1d0efe3fffe6b70b8d3d03c59f136f390cad7/granian-2.2.5.tar.gz", hash = "sha256:90b832270b6b03a41b1706051113a3ffcca307860d5c864dc1f47ea290fc4b58", size = 94178 }
sdist = { url = "https://files.pythonhosted.org/packages/82/0f/04aacf7ec30ba04018c7be761e5a6964d73cf82da5969b35e912e8e4e662/granian-2.3.1.tar.gz", hash = "sha256:5e9bddf3580e8ffccfaa97196672a6351630c959c37eb2498772504759a9f1ba", size = 100302 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/7c/90/84dd92375dfb33876c82a05e1942c8800931b7c439299d5e1485ef7216c8/granian-2.2.5-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:2ff8916ba37490fef696a461c2a43498b8865b3dcfa73e3dbff9d72ea2f6fbb9", size = 2848961 },
{ url = "https://files.pythonhosted.org/packages/2e/0d/e62d645ec01ac0b6dd3860949eda41de4e2ec1b014dc50b11a34989d4c4d/granian-2.2.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6cb56eccde7fe1c94ffb9ae60d516221c57b2e29224b6c6c2484ded044852320", size = 2592306 },
{ url = "https://files.pythonhosted.org/packages/b8/94/f955777a6d75c79198d8ca34132d04698dd0bf9b833833646e77c4fb164f/granian-2.2.5-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1883c7e3e1b1154ba49d1317e42a660b2c12a7bda8e4bc79b9279904db01d48b", size = 3114729 },
{ url = "https://files.pythonhosted.org/packages/a4/9c/814f88c8bf4eb1b9758bacf38838c8d3de3deb9c51b8d7ecdf5dd524988a/granian-2.2.5-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:84ce4b171e3dd10a8d1b0ddf6a09665faae257ca5e35394af0784d1682252903", size = 2877067 },
{ url = "https://files.pythonhosted.org/packages/3b/e9/aa321896f8ce46e370a3b52dbd4104d3a145e47884cb885da1492bb20486/granian-2.2.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2cdded34f6a0f9f4bdbb26535c4b16312b38b7edb799b39e2282f57b605919ea", size = 3049879 },
{ url = "https://files.pythonhosted.org/packages/e4/b0/14f73043a7762138681a07d7bf18125e7a7d7ba5e2b96406ccf281ad3251/granian-2.2.5-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:41f90d5123c1d772c24dfa552551831cd96742f72be26d550d3ac0bae733e870", size = 2960461 },
{ url = "https://files.pythonhosted.org/packages/f5/7d/c004df81422bfe1857f768a98550c8f1017140f00a6d9179e37ce29086dc/granian-2.2.5-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:fadb34659db0e4eaba19e3d3547eaa4f75a64a63f222a57df7badcc17d3842d9", size = 2865627 },
{ url = "https://files.pythonhosted.org/packages/64/31/70bbfef65e5612f790d2b7140309ccde8736744203b78ef281288b6f201a/granian-2.2.5-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:bbeeeb96e7148cc36afa56f652da24f65098bd5e64a528ce894f60ab2db85ff7", size = 3128615 },
{ url = "https://files.pythonhosted.org/packages/5e/5c/179a7f3f0c1c46ccaee8ef9b78dd679db304461c2538907e826c20a0025d/granian-2.2.5-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:2ae5587061a95b936ecaaf337e1aff35d19e27c1872d09e1a86debf1010841f8", size = 2997155 },
{ url = "https://files.pythonhosted.org/packages/a6/99/331354780b32f1fc626e510856e43d094fe68e6ac101805cef9351e3078f/granian-2.2.5-cp312-cp312-win_amd64.whl", hash = "sha256:fc1fa600bf0be3e8a3e2a49fb013aa9edf740dbf1ab14a19cad75088bd44dae4", size = 2586303 },
{ url = "https://files.pythonhosted.org/packages/b7/61/652d9817f6dff310950ab835b8838c44a370fa5c3ac8f997f4ec2738a403/granian-2.2.5-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:00d0d1797df1e0c3c9e438670e0f1f27500efef543ced42415d821e6162f884e", size = 2848540 },
{ url = "https://files.pythonhosted.org/packages/f3/50/c63b8b7d4951be43ba5f1c9d3e67f9fde1ddddaca61164ab7ae70f3405c3/granian-2.2.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:05d0852938a40948ce48a14a0186c9757b738e2715bd817a7931cb5b65aff4cb", size = 2591960 },
{ url = "https://files.pythonhosted.org/packages/23/c5/631c10134ced73dfcf03f3ba1157aa02dffa1d30cd5ec3b85a5d469c7090/granian-2.2.5-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:19b6817f25f1c29858e06c2ded96d16734ebb5c7d0f2d29f71c0e6e3da240906", size = 3113616 },
{ url = "https://files.pythonhosted.org/packages/64/d2/7015aa7b6faedccb1498acd0b2f838c1cf15b13faa4052077b3a82d7035c/granian-2.2.5-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a6a79710035d5c5964e3f4a528e0d9b74de5d73a69b1ea48142804469a4c635f", size = 2876933 },
{ url = "https://files.pythonhosted.org/packages/6b/fb/284b5fee9630f512c1ba9f54992f321a9f3b29e1d9c71199fb3cd700eb1a/granian-2.2.5-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8a8b508b785e0a68593b111e598a199567c6fb98841cbd9cd1b5a10baa4cc13d", size = 3049641 },
{ url = "https://files.pythonhosted.org/packages/71/c5/6e92f8d897017b53ac2e9608f268eccfa268433179dda5f7f6c6e87d71b6/granian-2.2.5-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:9c6f1f4351ccf9783db6d6da022f1ba83ef4c83c2d26f52ca5d30acf5fbac2df", size = 2960064 },
{ url = "https://files.pythonhosted.org/packages/43/c7/86422d387da46eb956660d9a1fd12da07c165bd1033fc32badee854e4797/granian-2.2.5-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:ac26b7933e3302b526d481b7c61f017288e06eb56bf9168133f649097b2ce5ab", size = 2865506 },
{ url = "https://files.pythonhosted.org/packages/e8/68/f6e5f9b087e1ede11fcd4dbb8d70bff8eed4f9b5ea691863035292ec9d39/granian-2.2.5-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:4950e77678378995df3b2be5544179ae5757a3ab6079272f14a161e14f0fe1eb", size = 3128304 },
{ url = "https://files.pythonhosted.org/packages/40/31/65595f29a42fb7b6639ca4566d547219655796c45ad372cba7168dff2689/granian-2.2.5-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:928389b89ffe88af96fbd8120fc3cb64afe9a62797000f6e7e6ff88ff5612ccc", size = 2996875 },
{ url = "https://files.pythonhosted.org/packages/bd/76/6435e413702cc80963044627f96e03c49110bdf86e11a571e76560df5edc/granian-2.2.5-cp313-cp313-win_amd64.whl", hash = "sha256:db1f3c2ae4496e82803014587b822c702d5ea263c35a8edf1a2b098ee9459c9a", size = 2586059 },
{ url = "https://files.pythonhosted.org/packages/a1/3e/fa02abd294ddf5e0e432c01727cc76a931d030e4f24141cfdcdfb078357a/granian-2.2.5-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:f0b688402347e646a2d2b95114eef2eb786ec4c9cb747157e7e892f809e0bb3f", size = 2697330 },
{ url = "https://files.pythonhosted.org/packages/e9/c8/a1dfaec4b6308e47a90c3e1920f681db36449829be445fef653e8ef7d3fa/granian-2.2.5-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4e8fd8688c79fd7d9dec162b5217869288c1da64ce26518d9fbb69d8f8e97ac9", size = 2466298 },
{ url = "https://files.pythonhosted.org/packages/c6/f8/ea86317f6582be1b3ac6b29781631ae69c5d4693e5da9467fd9fb18abe02/granian-2.2.5-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:12fe0a5f37affa90d2d576b9a7c4e1bbe18ff4cce59f6cd05d33375e6d5b4b5a", size = 2816442 },
{ url = "https://files.pythonhosted.org/packages/dd/26/fd6d5d19ce5a4a149cc93c05d7522ce90ee6728c56b13035a2d5259404bc/granian-2.2.5-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:48a43bf791ec7ec8c75cf538f5957875aedc5b4653688c9292887911738d3f51", size = 2735825 },
{ url = "https://files.pythonhosted.org/packages/06/34/148a6f3918dbb71824845edbe2a6d8512a52ae2a8c323a9071002a68d6d1/granian-2.2.5-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:bb44abd371bf054626aa750ad25dfe6b17216a0dbf10faa4f6a61a2fea57eaf6", size = 2857911 },
{ url = "https://files.pythonhosted.org/packages/28/1c/6c0c5aeae2a090ac046065944863fb76608c6b09c5249fda46148391b128/granian-2.2.5-cp313-cp313t-musllinux_1_1_armv7l.whl", hash = "sha256:4cdccad590be2183eed4e11f4aef7e62c5326df777e4aaefceecb23edea474ad", size = 3118748 },
{ url = "https://files.pythonhosted.org/packages/1a/34/22ada66b585c9a3076c63777491dc6daf1773a86cb262a613cd9af3cb24f/granian-2.2.5-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:0ea5be0b02b78a024fc730cf2a271e58ec553aa39b0b43bdb11492c4c82024ba", size = 2989738 },
{ url = "https://files.pythonhosted.org/packages/7f/6e/6063f3a44e20dcfa838467b5a3358b907e367edf3596056f86abed532085/granian-2.2.5-cp313-cp313t-win_amd64.whl", hash = "sha256:3e83de670fd6c75b405f28f62053d3477433650f102cb88e6582df6acced0a6c", size = 2510704 },
{ url = "https://files.pythonhosted.org/packages/26/cd/30917d357be84957b3e8a1a82ac45e14fdfeb8e1afcd82ffe50a94f759f1/granian-2.3.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9067454777e14b8d3d5ad2d3e5f11ee2cc1ae18c09455908d44e6b5a0d018207", size = 3047974 },
{ url = "https://files.pythonhosted.org/packages/27/a5/c3752565733da327441e602e6dddafa79219e752c25ee70416b48df30321/granian-2.3.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:4851094be97758f9d72a7d304021edeaf902818a5635b13ea1784092090098d8", size = 2722452 },
{ url = "https://files.pythonhosted.org/packages/77/0a/38d6eb581c43cffb5b4a87a2bbd8b3f39e0874785d95db5d94c01f259809/granian-2.3.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7c6f805468151a756142d59fab8414c986bdbdeea26983178d5e3c00705aaba6", size = 3365275 },
{ url = "https://files.pythonhosted.org/packages/33/44/5fa5aab9a1cf27295bee81e22ecf758adef68c12244c0fd8d9d82da350e2/granian-2.3.1-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:300954af2a155d8827c6c7af45d8bff8c080303c23fac502e21c34cfb3f92de1", size = 3001384 },
{ url = "https://files.pythonhosted.org/packages/21/25/df592394d957933dbe905510dc4ad35141ea3e49fd4e562bc529727a8f44/granian-2.3.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:73659a68149d159cfadcf44fcf6cdb91b027aa4ebb7ad561a9febbfaaecc903b", size = 3215845 },
{ url = "https://files.pythonhosted.org/packages/ef/67/9213bf996d0e687939924468615762d106fd38f8c098f34266648f465d2b/granian-2.3.1-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:3fa062db9d6fe7e57aa9c0043e30d50e1ee9fcf4226768b1b13a4fddef81d761", size = 3163131 },
{ url = "https://files.pythonhosted.org/packages/25/de/1829c71fd0cba459a9bfc998138ca3ff18f8b913c9ae3c3a3c8c675ceb0c/granian-2.3.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:4d00d26d7d02436ca2d1a26b6390710bea6d98cd519927835b04991043777852", size = 3139498 },
{ url = "https://files.pythonhosted.org/packages/e7/8a/ce0adbeefcd7a78981d6a4f99709d82344a5c76d163d741547e9cc6f864e/granian-2.3.1-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:bd35d898c7be876924e73b13004ccee20b6bc071bf851c3d7eb029f01be22306", size = 3460123 },
{ url = "https://files.pythonhosted.org/packages/58/f1/1918ff96843125ae12480eb7430692fa3243e42206b263918b087c34852c/granian-2.3.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5979218d189e8da8de50a690c7399e1f0b052419c0b10dd20210ec73dfe29f83", size = 3273879 },
{ url = "https://files.pythonhosted.org/packages/3b/14/bc6f049852d26045839dc6bddfd06ca6efa8966b939fa4c747ad7c9ab9bf/granian-2.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:878dea8b920a52d7ab36ee8110f960a8a2dde1cb0d43331bf71e1815f1051628", size = 2766821 },
{ url = "https://files.pythonhosted.org/packages/e2/c1/71c4a64ca6e65e390ba82270c967318956ea67ed1467f68fc1bd236cc338/granian-2.3.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:5e97505ba05f73d76669f732221d01c1c69b0ce01384db872d0b0c240cc422e4", size = 3047647 },
{ url = "https://files.pythonhosted.org/packages/1b/f7/e1135ee1f9b6188438ca0b406f081b040ebf5c8bcd290b8a2086c4e1cdf3/granian-2.3.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:8f0add56a7188830e82ac76bc8c11ab758edaec889f6f744023d4cd0ac45a908", size = 2721628 },
{ url = "https://files.pythonhosted.org/packages/ad/d8/73d953e94c7d62c60db7c0ae8448bed578fdbd85b6aa6d939f52541da266/granian-2.3.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:006fd310d23e904802ea293abdc909c1871762f955deeb4c32232f7ddec37a3f", size = 3364877 },
{ url = "https://files.pythonhosted.org/packages/a3/5e/a81c96fb365cee2395092250d97f600b6bc4b477702f98e1befbef27f937/granian-2.3.1-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:172ed6e87b588eb54dfaf3a39a486b06c82e0accbe3b86427333ea3a57c9b2c9", size = 3001067 },
{ url = "https://files.pythonhosted.org/packages/40/b0/9270bf7d1b612923d14d9783dde0d065afec62146753a2d64f17ef49e955/granian-2.3.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:413617122d940bbcf793bd1a9ba6a0fabadd5ba75b03acf99c101f735030dc0e", size = 3215182 },
{ url = "https://files.pythonhosted.org/packages/e3/27/a1166372c0f40fde0ea3778e2ddacbf752d4e1ce3a2ecb49b5e100c7fbaf/granian-2.3.1-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:a7a9b93e7dd2f63a1762343c6d396eef52163fb2cea044569102ae41fa3fd607", size = 3163178 },
{ url = "https://files.pythonhosted.org/packages/57/cc/53d257346b9274001141421fca0436df67662634dfdd9f6ac5a595737804/granian-2.3.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:b6eea65f77527aeb5deb75186c90063e4368a94f7076aa9495067804f06d0345", size = 3138821 },
{ url = "https://files.pythonhosted.org/packages/9e/79/a1f50daf41dc1a50384a359dcd1f02a844c0a9b4f009a5d5c399c1893a9a/granian-2.3.1-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:d1ae09ae5424ca5e57dbea660088878aca858f176b2ddf26dc5bf685b939b567", size = 3460024 },
{ url = "https://files.pythonhosted.org/packages/e7/3a/78038ab237eda59707d0f0e0fae750ff21db543e7175dfb7ac3b87a87b7f/granian-2.3.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:8c10fdee7afee4d8b69a92f937552c94b979521e6f0169bb300c9612a85b9989", size = 3273042 },
{ url = "https://files.pythonhosted.org/packages/62/2c/ab47958d0d808fda5659535a30214ed24622d89190f37fa00d2200e88fb5/granian-2.3.1-cp313-cp313-win_amd64.whl", hash = "sha256:e58d2945ab1f99a5f2f804b9d552b337cccf77620dd17ddda94b0baaff0d78ef", size = 2766257 },
{ url = "https://files.pythonhosted.org/packages/b2/58/5ef889557401cd01d9f4380dc4a23ee679d835b51f84956d06b97b4bcb8d/granian-2.3.1-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:f662331ff69802ffdc02160feadb1a347c652afe8192739d4adf34da6cd1bbff", size = 2999112 },
{ url = "https://files.pythonhosted.org/packages/13/70/4363eb6ac164063af7d322691be221477d127c6c6986a339688c32dbd1d1/granian-2.3.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3941e11bcb018cab31ed9a597c67458d278db1469242775e28711d5a3c0be481", size = 2667844 },
{ url = "https://files.pythonhosted.org/packages/53/dc/bdcd9f18f7e070d17236c685cd56ec5834a088e8f885de671bd13c608176/granian-2.3.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6d53338bb064586e929cece3611d1a49b398ac328f87604e56eda7754c2d0c55", size = 3076031 },
{ url = "https://files.pythonhosted.org/packages/74/18/9dbc3c4b7a14c3eeecac702a8d351e4c1220c89e99ffe7f0211e856f3c54/granian-2.3.1-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:16941818186789804da8b0837130117ca310f88d419856d8df2655ccae27f670", size = 3028926 },
{ url = "https://files.pythonhosted.org/packages/9b/b5/7b72ada8a04203c4e9165a3ba7bf266568bf0507ea40c170f96566e0b390/granian-2.3.1-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:8265480fc80cf0546af509f9f9d27519af26038fbd586081cdd3641d4dd3f44e", size = 3130077 },
{ url = "https://files.pythonhosted.org/packages/79/34/2395be45fea5e818a5ba0b8871c0fb5776e23f664fdd05a9b00849757314/granian-2.3.1-cp313-cp313t-musllinux_1_1_armv7l.whl", hash = "sha256:0bf2ace1107950110e7da5cca88ecca1f97bb5ef7398c6bc9c95bd0787f0edac", size = 3450662 },
{ url = "https://files.pythonhosted.org/packages/44/c3/9a037020e26ede18b8570f559254911722875ae129293337dc0170ec7c0e/granian-2.3.1-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:4a08c567a2472fdabd49c1564c560ffe1734e8acd1e0fc3027907296f36434fc", size = 3264479 },
{ url = "https://files.pythonhosted.org/packages/cd/32/660601986c5b5815d13399e9c3b6e84a119be010c974d32d505eb1ef4c7e/granian-2.3.1-cp313-cp313t-win_amd64.whl", hash = "sha256:d828917559a53ff9581ac4a475a508d7b6de7abebb256a27543af8363eb7c844", size = 2792073 },
]
[[package]]
@@ -572,7 +596,6 @@ wheels = [
[[package]]
name = "pkmntrade-club"
version = "0.1.0"
source = { editable = "." }
dependencies = [
{ name = "asgiref" },
@@ -591,8 +614,10 @@ dependencies = [
{ name = "django-daisy" },
{ name = "django-debug-toolbar" },
{ name = "django-environ" },
{ name = "django-health-check" },
{ name = "django-linear-migrations" },
{ name = "django-meta" },
{ name = "django-parler" },
{ name = "django-tailwind-4", extra = ["reload"] },
{ name = "django-widget-tweaks" },
{ name = "gevent" },
@@ -603,6 +628,7 @@ dependencies = [
{ name = "packaging" },
{ name = "pillow" },
{ name = "playwright" },
{ name = "psutil" },
{ name = "psycopg" },
{ name = "psycopg-binary" },
{ name = "pycparser" },
@@ -611,6 +637,7 @@ dependencies = [
{ name = "redis" },
{ name = "requests" },
{ name = "requests-oauthlib" },
{ name = "setuptools-scm" },
{ name = "sqlparse" },
{ name = "typing-extensions" },
{ name = "urllib3" },
@@ -635,18 +662,21 @@ requires-dist = [
{ name = "django-daisy", specifier = "==1.0.13" },
{ name = "django-debug-toolbar", specifier = "==4.4.6" },
{ name = "django-environ", specifier = "==0.12.0" },
{ name = "django-health-check", specifier = ">=3.18.3" },
{ name = "django-linear-migrations", specifier = ">=2.17.0" },
{ name = "django-meta", specifier = "==2.4.2" },
{ name = "django-parler", specifier = ">=2.3" },
{ name = "django-tailwind-4", extras = ["reload"], specifier = "==0.1.4" },
{ name = "django-widget-tweaks", specifier = "==1.5.0" },
{ name = "gevent", specifier = "==25.4.1" },
{ name = "granian", specifier = "==2.2.5" },
{ name = "granian", specifier = "==2.3.1" },
{ name = "gunicorn", specifier = "==23.0.0" },
{ name = "idna", specifier = "==3.4" },
{ name = "oauthlib", specifier = "==3.2.2" },
{ name = "packaging", specifier = "==23.1" },
{ name = "pillow", specifier = ">=11.2.1" },
{ name = "playwright", specifier = "==1.52.0" },
{ name = "psutil", specifier = ">=7.0.0" },
{ name = "psycopg", specifier = "==3.2.3" },
{ name = "psycopg-binary", specifier = "==3.2.3" },
{ name = "pycparser", specifier = "==2.21" },
@@ -655,6 +685,7 @@ requires-dist = [
{ name = "redis", specifier = ">=6.1.0" },
{ name = "requests", specifier = "==2.28.2" },
{ name = "requests-oauthlib", specifier = "==1.3.1" },
{ name = "setuptools-scm", specifier = ">=8.3.1" },
{ name = "sqlparse", specifier = "==0.4.3" },
{ name = "typing-extensions", specifier = "==4.9.0" },
{ name = "urllib3", specifier = "==1.26.14" },
@@ -692,6 +723,21 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/ce/4f/5249960887b1fbe561d9ff265496d170b55a735b76724f10ef19f9e40716/prompt_toolkit-3.0.51-py3-none-any.whl", hash = "sha256:52742911fde84e2d423e2f9a4cf1de7d7ac4e51958f648d9540e0fb8db077b07", size = 387810 },
]
[[package]]
name = "psutil"
version = "7.0.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/2a/80/336820c1ad9286a4ded7e845b2eccfcb27851ab8ac6abece774a6ff4d3de/psutil-7.0.0.tar.gz", hash = "sha256:7be9c3eba38beccb6495ea33afd982a44074b78f28c434a1f51cc07fd315c456", size = 497003 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ed/e6/2d26234410f8b8abdbf891c9da62bee396583f713fb9f3325a4760875d22/psutil-7.0.0-cp36-abi3-macosx_10_9_x86_64.whl", hash = "sha256:101d71dc322e3cffd7cea0650b09b3d08b8e7c4109dd6809fe452dfd00e58b25", size = 238051 },
{ url = "https://files.pythonhosted.org/packages/04/8b/30f930733afe425e3cbfc0e1468a30a18942350c1a8816acfade80c005c4/psutil-7.0.0-cp36-abi3-macosx_11_0_arm64.whl", hash = "sha256:39db632f6bb862eeccf56660871433e111b6ea58f2caea825571951d4b6aa3da", size = 239535 },
{ url = "https://files.pythonhosted.org/packages/2a/ed/d362e84620dd22876b55389248e522338ed1bf134a5edd3b8231d7207f6d/psutil-7.0.0-cp36-abi3-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1fcee592b4c6f146991ca55919ea3d1f8926497a713ed7faaf8225e174581e91", size = 275004 },
{ url = "https://files.pythonhosted.org/packages/bf/b9/b0eb3f3cbcb734d930fdf839431606844a825b23eaf9a6ab371edac8162c/psutil-7.0.0-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4b1388a4f6875d7e2aff5c4ca1cc16c545ed41dd8bb596cefea80111db353a34", size = 277986 },
{ url = "https://files.pythonhosted.org/packages/eb/a2/709e0fe2f093556c17fbafda93ac032257242cabcc7ff3369e2cb76a97aa/psutil-7.0.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a5f098451abc2828f7dc6b58d44b532b22f2088f4999a937557b603ce72b1993", size = 279544 },
{ url = "https://files.pythonhosted.org/packages/50/e6/eecf58810b9d12e6427369784efe814a1eec0f492084ce8eb8f4d89d6d61/psutil-7.0.0-cp37-abi3-win32.whl", hash = "sha256:ba3fcef7523064a6c9da440fc4d6bd07da93ac726b5733c29027d7dc95b39d99", size = 241053 },
{ url = "https://files.pythonhosted.org/packages/50/1b/6921afe68c74868b4c9fa424dad3be35b095e16687989ebbb50ce4fceb7c/psutil-7.0.0-cp37-abi3-win_amd64.whl", hash = "sha256:4cf3d4eb1aa9b348dec30105c55cd9b7d4629285735a102beb4441e38db90553", size = 244885 },
]
[[package]]
name = "psycopg"
version = "3.2.3"
@@ -839,11 +885,24 @@ wheels = [
[[package]]
name = "setuptools"
version = "80.7.1"
version = "80.8.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/9e/8b/dc1773e8e5d07fd27c1632c45c1de856ac3dbf09c0147f782ca6d990cf15/setuptools-80.7.1.tar.gz", hash = "sha256:f6ffc5f0142b1bd8d0ca94ee91b30c0ca862ffd50826da1ea85258a06fd94552", size = 1319188 }
sdist = { url = "https://files.pythonhosted.org/packages/8d/d2/ec1acaaff45caed5c2dedb33b67055ba9d4e96b091094df90762e60135fe/setuptools-80.8.0.tar.gz", hash = "sha256:49f7af965996f26d43c8ae34539c8d99c5042fbff34302ea151eaa9c207cd257", size = 1319720 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a1/18/0e835c3a557dc5faffc8f91092f62fc337c1dab1066715842e7a4b318ec4/setuptools-80.7.1-py3-none-any.whl", hash = "sha256:ca5cc1069b85dc23070a6628e6bcecb3292acac802399c7f8edc0100619f9009", size = 1200776 },
{ url = "https://files.pythonhosted.org/packages/58/29/93c53c098d301132196c3238c312825324740851d77a8500a2462c0fd888/setuptools-80.8.0-py3-none-any.whl", hash = "sha256:95a60484590d24103af13b686121328cc2736bee85de8936383111e421b9edc0", size = 1201470 },
]
[[package]]
name = "setuptools-scm"
version = "8.3.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "packaging" },
{ name = "setuptools" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b9/19/7ae64b70b2429c48c3a7a4ed36f50f94687d3bfcd0ae2f152367b6410dff/setuptools_scm-8.3.1.tar.gz", hash = "sha256:3d555e92b75dacd037d32bafdf94f97af51ea29ae8c7b234cf94b7a5bd242a63", size = 78088 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ab/ac/8f96ba9b4cfe3e4ea201f23f4f97165862395e9331a424ed325ae37024a8/setuptools_scm-8.3.1-py3-none-any.whl", hash = "sha256:332ca0d43791b818b841213e76b1971b7711a960761c5bea5fc5cdb5196fbce3", size = 43935 },
]
[[package]]