diff --git a/.gitignore b/.gitignore index 71b2312..2566b3e 100644 --- a/.gitignore +++ b/.gitignore @@ -6,3 +6,4 @@ src/excalidraw .env src/frontend/.env.local .vscode/settings.json +dev/ diff --git a/docs/context.md b/docs/context.md deleted file mode 100644 index b5eb613..0000000 --- a/docs/context.md +++ /dev/null @@ -1,172 +0,0 @@ -# Pad.ws Developer Onboarding Guide - -## Project Overview - -Pad.ws is an innovative project that combines a whiteboard canvas with cloud development environments, delivering an "IDE-in-a-whiteboard" experience. The system allows users to seamlessly transition between visual thinking (drawing, diagramming) and coding directly in the browser. - -### Key Features - -- **Interactive Whiteboard**: Built on a fork of Excalidraw for drawing and visualizing ideas -- **Cloud Development Environment**: Complete development environment accessible from the browser -- **Seamless Workflow**: Switch between ideation and coding in a single interface -- **Collaboration**: Support for collaborative work with backup and versioning - -## Architecture Overview - -The system follows a microservices architecture with the following components: - -``` - ┌─────────────┐ - │ Client │ - │ (Browser) │ - └──────┬──────┘ - │ - ▼ -┌────────────────────────────────────────────────┐ -│ FastAPI Backend App │ -│ │ -│ ┌─────────────┐ ┌────────────────────┐ │ -│ │ Static File │ │ API Controllers │ │ -│ │ Serving │ │ (routers/*.py) │ │ -│ │ (Excalidraw)│ └────────────────────┘ │ -│ └─────────────┘ │ │ -└─────────────────────────┬──────┼──────────────┘ - │ │ - │ │ - ┌─────────────▼──────▼──────────────┐ - │ Services │ - │ │ -┌───────────▼───────┐ ┌─────────▼────────┐ ┌─▼───────────────┐ -│ Database │ │ Keycloak │ │ Coder │ -│ (PostgreSQL) │ │ (Auth/OIDC) │ │ (Dev Envs) │ -└───────────────────┘ └──────────────────┘ └──────────────────┘ - │ │ │ - │ │ │ - ▼ ▼ ▼ -┌────────────────┐ ┌────────────────┐ ┌────────────────┐ -│ Pad Data & │ │ User Auth & │ │ Dev Container │ -│ Backups │ │ Sessions │ │ Environments │ -└────────────────┘ └────────────────┘ └────────────────┘ -``` - -### Core Components - -1. **FastAPI Backend** - - Serves the frontend (Excalidraw fork) - - Handles API requests for pad management - - Manages authentication flow with Keycloak - - Interfaces with Coder API for workspace management - -2. **PostgreSQL Database** - - Stores user data, pad content, and backups - - Shared with Keycloak and Coder for their data - -3. **Redis** - - Manages user sessions - - Provides caching for performance - -4. **Keycloak** - - Provides OIDC authentication - - Manages user accounts and roles - -5. 
**Coder** - - Provisions and manages cloud development environments - - Accessed through the pad's interface - -## Code Structure - -The repository is structured as follows: - -### Backend (`/backend` directory) - -``` -backend/ -├── coder.py # Coder API integration -├── config.py # Configuration and environment variables -├── dependencies.py # FastAPI dependencies -├── main.py # Application entry point -├── requirements.txt # Python dependencies -├── database/ # Database models and operations -│ ├── database.py # Database connection -│ ├── models/ # SQLAlchemy models -│ ├── repository/ # Data access layer -│ └── service/ # Business logic layer -├── routers/ # API routes -│ ├── app_router.py # General app routes -│ ├── auth_router.py # Authentication routes -│ ├── pad_router.py # Pad management routes -│ └── workspace_router.py # Coder workspace routes -└── templates/ # Default pad templates -``` - -### Key Classes and Components - -#### Auth Flow - -1. Users authenticate via Keycloak OIDC -2. Session tokens are stored in Redis -3. The `UserSession` class in `dependencies.py` provides access to user information -4. The `auth_router.py` handles login, callback, and logout endpoints - -#### Pad Management - -1. `PadModel` represents a canvas instance -2. `BackupModel` stores point-in-time backups of pads -3. `TemplatePadModel` provides reusable templates for new pads -4. The Repository pattern is used for data access -5. Service classes implement business logic - -#### Coder Integration - -The `coder.py` module handles: -1. User management in Coder -2. Workspace creation and provisioning -3. Workspace state management (start/stop) - -## Database Schema - -The database uses a schema called `pad_ws` with the following tables: - -1. **users** - Stores user information - - Synced with Keycloak user data - -2. **pads** - Stores canvas/pad data - - Each pad belongs to a user - - Contains the complete state of the canvas - -3. **backups** - Stores point-in-time backups of pads - - Automatically created based on time intervals - - Limited to a maximum number per user - -4. **template_pads** - Stores reusable templates - - Used when creating new pads - -## Development Workflow - -1. The FastAPI app serves the Excalidraw frontend at the root route -2. Users interact with the whiteboard interface -3. Canvas data is periodically saved to the backend -4. When a user accesses development features, their Coder workspace is started -5. The UI integrates the dev environment within the whiteboard - -## Getting Started - -1. Follow the self-hosting instructions in the README to set up the development environment -2. The `.env` file contains configuration for all services -3. For local development, you can use `docker-compose` to run the dependencies (PostgreSQL, Redis, Keycloak, Coder) -4. 
Run the FastAPI app with `uvicorn main:app --reload` for local development - -## Key APIs and Endpoints - -- `/auth/*` - Authentication endpoints -- `/api/pad/*` - Canvas/pad management -- `/api/workspace/*` - Coder workspace management -- `/api/users/*` - User management -- `/api/templates/*` - Template management - -## Additional Resources - -- Excalidraw documentation: https://github.com/excalidraw/excalidraw -- Coder documentation: https://coder.com/docs/ -- FastAPI documentation: https://fastapi.tiangolo.com/ -- SQLAlchemy documentation: https://docs.sqlalchemy.org/ diff --git a/docs/frontend-backend-communication.md b/docs/frontend-backend-communication.md deleted file mode 100644 index 006318e..0000000 --- a/docs/frontend-backend-communication.md +++ /dev/null @@ -1,137 +0,0 @@ -# Frontend-Backend Communication (React Query Architecture) - -This document describes the current architecture and all communication points between the frontend and backend in the Pad.ws application, following the React Query refactor. All API interactions are now managed through React Query hooks, providing deduplication, caching, polling, and robust error handling. - ---- - -## 1. Overview of Communication Architecture - -- **All frontend-backend communication is handled via React Query hooks.** -- **API calls are centralized in `src/frontend/src/api/hooks.ts` and `apiUtils.ts`.** -- **No custom context providers for authentication or workspace state are used; hooks are called directly in components.** -- **Error and loading states are managed by React Query.** -- **Mutations (e.g., saving data, starting/stopping workspace) automatically invalidate relevant queries.** - ---- - -## 2. Authentication - -### 2.1. Authentication Status - -- **Hook:** `useAuthCheck` -- **Endpoint:** `GET /api/workspace/state` -- **Usage:** Determines if the user is authenticated. Returns `true` if authenticated, `false` if 401 Unauthorized. -- **Example:** - ```typescript - import { useAuthCheck } from "./api/hooks"; - const { data: isAuthenticated = true } = useAuthCheck(); - ``` -- **UI:** If `isAuthenticated` is `false`, the login modal (`AuthModal`) is displayed. - -### 2.2. Login/Logout - -- **Login:** Handled via OAuth redirects (e.g., `/auth/login?kc_idp_hint=google`). -- **Logout:** Handled via redirect to `/auth/logout`. -- **No direct API call from React Query; handled by browser navigation.** - ---- - -## 3. User Profile - -- **Hook:** `useUserProfile` -- **Endpoint:** `GET /api/user/me` -- **Usage:** Fetches the authenticated user's profile. -- **Example:** - ```typescript - import { useUserProfile } from "./api/hooks"; - const { data: userProfile, isLoading, error } = useUserProfile(); - ``` - ---- - -## 4. Workspace Management - -### 4.1. Workspace State - -- **Hook:** `useWorkspaceState` -- **Endpoint:** `GET /api/workspace/state` -- **Usage:** Polls workspace state every 5 seconds. -- **Example:** - ```typescript - import { useWorkspaceState } from "./api/hooks"; - const { data: workspaceState, isLoading, error } = useWorkspaceState(); - ``` - -### 4.2. Start/Stop Workspace - -- **Hooks:** `useStartWorkspace`, `useStopWorkspace` -- **Endpoints:** `POST /api/workspace/start`, `POST /api/workspace/stop` -- **Usage:** Mutations to start/stop the workspace. On success, invalidate and refetch workspace state. 
-- **Example:** - ```typescript - import { useStartWorkspace, useStopWorkspace } from "./api/hooks"; - const { mutate: startWorkspace } = useStartWorkspace(); - const { mutate: stopWorkspace } = useStopWorkspace(); - // Usage: startWorkspace(); stopWorkspace(); - ``` - ---- - -## 5. Canvas Data Management - -### 5.1. Load Canvas - -- **Hooks:** `useCanvas`, `useDefaultCanvas` -- **Endpoints:** `GET /api/canvas`, `GET /api/canvas/default` -- **Usage:** Loads user canvas data; falls back to default if not available or on error. -- **Example:** - ```typescript - import { useCanvas, useDefaultCanvas } from "./api/hooks"; - const { data: canvasData, isError } = useCanvas(); - const { data: defaultCanvasData } = useDefaultCanvas({ enabled: isError }); - ``` - -### 5.2. Save Canvas - -- **Hook:** `useSaveCanvas` -- **Endpoint:** `POST /api/canvas` -- **Usage:** Saves canvas data. Only called if user is authenticated. -- **Example:** - ```typescript - import { useSaveCanvas } from "./api/hooks"; - const { mutate: saveCanvas } = useSaveCanvas(); - // Usage: saveCanvas(canvasData); - ``` - ---- - -## 6. Error Handling - -- **All API errors are handled by React Query and the `fetchApi` utility.** -- **401 Unauthorized:** Triggers unauthenticated state; login modal is shown. -- **Other errors:** Exposed via `error` property in hook results; components can display error messages or fallback UI. -- **Example:** - ```typescript - const { data, error, isLoading } = useWorkspaceState(); - if (error) { /* Show error UI */ } - ``` - ---- - -## 7. API Utility Functions - -- **File:** `src/frontend/src/api/apiUtils.ts` -- **Functions:** `fetchApi`, `handleResponse` -- **Purpose:** Centralizes fetch logic, error handling, and credentials management for all API calls. - ---- - -## 8. Summary - -- **All frontend-backend communication is now declarative and managed by React Query hooks.** -- **No legacy context classes or direct fetches remain.** -- **API logic is centralized, maintainable, and testable.** -- **Error handling, caching, and polling are handled automatically.** -- **UI components react to hook state for loading, error, and data.** - -This architecture ensures robust, efficient, and maintainable communication between the frontend and backend. 
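> Editor's note: as a companion to the removed document above, here is a minimal sketch of the pattern it describes — a shared `fetchApi` helper plus a React Query mutation hook that invalidates cached canvas data on success. This is an illustration only: it assumes `@tanstack/react-query` and the endpoint paths named above, and the real `apiUtils.ts`/`hooks.ts` may differ in detail.

```typescript
import { useMutation, useQueryClient } from "@tanstack/react-query";

// Shared fetch wrapper: sends cookies and surfaces non-2xx responses as errors.
// A 401 thrown here is what the auth hooks interpret as "not authenticated".
export async function fetchApi<T>(url: string, options: RequestInit = {}): Promise<T> {
  const response = await fetch(url, { credentials: "include", ...options });
  if (!response.ok) {
    throw new Error(`API error ${response.status}`);
  }
  return response.json() as Promise<T>;
}

// Mutation hook for saving canvas data; refetches the cached canvas on success.
export function useSaveCanvas() {
  const queryClient = useQueryClient();
  return useMutation({
    mutationFn: (canvasData: unknown) =>
      fetchApi<unknown>("/api/canvas", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(canvasData),
      }),
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: ["canvas"] });
    },
  });
}
```

Query hooks such as `useCanvas` or `useWorkspaceState` follow the same shape with `useQuery` and a `queryKey`, which is what provides the deduplication, caching, and polling described above.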
diff --git a/src/backend/cache/__init__.py b/src/backend/cache/__init__.py
new file mode 100644
index 0000000..bbb771d
--- /dev/null
+++ b/src/backend/cache/__init__.py
@@ -0,0 +1,3 @@
+from .redis_client import RedisClient
+
+__all__ = ["RedisClient"]
\ No newline at end of file
diff --git a/src/backend/cache/redis_client.py b/src/backend/cache/redis_client.py
new file mode 100644
index 0000000..3dbce07
--- /dev/null
+++ b/src/backend/cache/redis_client.py
@@ -0,0 +1,44 @@
+import os
+from redis import asyncio as aioredis
+from dotenv import load_dotenv
+
+# Load environment variables
+load_dotenv()
+
+# Redis Configuration
+REDIS_HOST = os.getenv('REDIS_HOST', 'localhost')
+REDIS_PASSWORD = os.getenv('REDIS_PASSWORD', None)
+REDIS_PORT = int(os.getenv('REDIS_PORT', 6379))
+REDIS_URL = f"redis://{REDIS_HOST}:{REDIS_PORT}"
+
+class RedisClient:
+    """Service for managing Redis connections with proper lifecycle management."""
+
+    _instance = None
+
+    @classmethod
+    async def get_instance(cls) -> aioredis.Redis:
+        """Get or create a Redis client instance."""
+        if cls._instance is None:
+            cls._instance = cls()
+            await cls._instance.initialize()
+        return cls._instance.client
+
+    def __init__(self):
+        self.client = None
+
+    async def initialize(self) -> None:
+        """Initialize the Redis client."""
+        self.client = aioredis.from_url(
+            REDIS_URL,
+            password=REDIS_PASSWORD,
+            decode_responses=True,
+            health_check_interval=30
+        )
+
+    async def close(self) -> None:
+        """Close the Redis client connection."""
+        if self.client:
+            await self.client.close()
+            self.client = None
+            print("Redis client closed.")
\ No newline at end of file
diff --git a/src/backend/config.py b/src/backend/config.py
index bce429e..fe5ae23 100644
--- a/src/backend/config.py
+++ b/src/backend/config.py
@@ -2,12 +2,11 @@
 import json
 import time
 import httpx
-import redis
-from redis import ConnectionPool, Redis
 import jwt
 from jwt.jwks_client import PyJWKClient
 from typing import Optional, Dict, Any, Tuple
 from dotenv import load_dotenv
+from cache import RedisClient
 
 # Load environment variables once
 load_dotenv()
@@ -16,6 +15,8 @@
 STATIC_DIR = os.getenv("STATIC_DIR")
 ASSETS_DIR = os.getenv("ASSETS_DIR")
 FRONTEND_URL = os.getenv('FRONTEND_URL')
+PAD_DEV_MODE = os.getenv('PAD_DEV_MODE', 'false').lower() == 'true'
+DEV_FRONTEND_URL = os.getenv('DEV_FRONTEND_URL', 'http://localhost:3003')
 
 MAX_BACKUPS_PER_USER = 10 # Maximum number of backups to keep per user
 MIN_INTERVAL_MINUTES = 5 # Minimum interval in minutes between backups
@@ -33,30 +34,9 @@
 OIDC_REALM = os.getenv('OIDC_REALM')
 OIDC_REDIRECT_URI = os.getenv('REDIRECT_URI')
 
-# ===== Redis Configuration =====
-REDIS_HOST = os.getenv('REDIS_HOST', 'localhost')
-REDIS_PASSWORD = os.getenv('REDIS_PASSWORD', None)
-REDIS_PORT = int(os.getenv('REDIS_PORT', 6379))
-
-# Create a Redis connection pool
-redis_pool = ConnectionPool(
-    host=REDIS_HOST,
-    password=REDIS_PASSWORD,
-    port=REDIS_PORT,
-    db=0,
-    decode_responses=True,
-    max_connections=10, # Adjust based on your application's needs
-    socket_timeout=5.0,
-    socket_connect_timeout=1.0,
-    health_check_interval=30
-)
-
-# Create a Redis client that uses the connection pool
-redis_client = Redis(connection_pool=redis_pool)
-
-def get_redis_client():
-    """Get a Redis client from the connection pool"""
-    return Redis(connection_pool=redis_pool)
+default_pad = {}
+with open("templates/default.json", 'r') as f:
+    default_pad = json.load(f)
 
 # ===== Coder API Configuration =====
 CODER_API_KEY = os.getenv("CODER_API_KEY")
@@ -68,113 +48,6 @@
def get_redis_client(): # Cache for JWKS client _jwks_client = None -# Session management functions -def get_session(session_id: str) -> Optional[Dict[str, Any]]: - """Get session data from Redis""" - client = get_redis_client() - session_data = client.get(f"session:{session_id}") - if session_data: - return json.loads(session_data) - return None - -def set_session(session_id: str, data: Dict[str, Any], expiry: int) -> None: - """Store session data in Redis with expiry in seconds""" - client = get_redis_client() - client.setex( - f"session:{session_id}", - expiry, - json.dumps(data) - ) - -def delete_session(session_id: str) -> None: - """Delete session data from Redis""" - client = get_redis_client() - client.delete(f"session:{session_id}") - -def get_auth_url() -> str: - """Generate the authentication URL for Keycloak login""" - auth_url = f"{OIDC_SERVER_URL}/realms/{OIDC_REALM}/protocol/openid-connect/auth" - params = { - 'client_id': OIDC_CLIENT_ID, - 'response_type': 'code', - 'redirect_uri': OIDC_REDIRECT_URI, - 'scope': 'openid profile email' - } - return f"{auth_url}?{'&'.join(f'{k}={v}' for k,v in params.items())}" - -def get_token_url() -> str: - """Get the token endpoint URL""" - return f"{OIDC_SERVER_URL}/realms/{OIDC_REALM}/protocol/openid-connect/token" - -def is_token_expired(token_data: Dict[str, Any], buffer_seconds: int = 30) -> bool: - if not token_data or 'access_token' not in token_data: - return True - - try: - # Get the signing key - jwks_client = get_jwks_client() - signing_key = jwks_client.get_signing_key_from_jwt(token_data['access_token']) - - # Decode with verification - decoded = jwt.decode( - token_data['access_token'], - signing_key.key, - algorithms=["RS256"], # Common algorithm for OIDC - audience=OIDC_CLIENT_ID, - ) - - # Check expiration - exp_time = decoded.get('exp', 0) - current_time = time.time() - return current_time + buffer_seconds >= exp_time - except jwt.ExpiredSignatureError: - return True - except Exception as e: - print(f"Error checking token expiration: {str(e)}") - return True - -async def refresh_token(session_id: str, token_data: Dict[str, Any]) -> Tuple[bool, Dict[str, Any]]: - """ - Refresh the access token using the refresh token - - Args: - session_id: The session ID - token_data: The current token data containing the refresh token - - Returns: - Tuple[bool, Dict[str, Any]]: Success status and updated token data - """ - if not token_data or 'refresh_token' not in token_data: - return False, token_data - - try: - async with httpx.AsyncClient() as client: - refresh_response = await client.post( - get_token_url(), - data={ - 'grant_type': 'refresh_token', - 'client_id': OIDC_CLIENT_ID, - 'client_secret': OIDC_CLIENT_SECRET, - 'refresh_token': token_data['refresh_token'] - } - ) - - if refresh_response.status_code != 200: - print(f"Token refresh failed: {refresh_response.text}") - return False, token_data - - # Get new token data - new_token_data = refresh_response.json() - - # Update session with new tokens - expiry = new_token_data['expires_in'] - set_session(session_id, new_token_data, expiry) - - return True, new_token_data - except Exception as e: - print(f"Error refreshing token: {str(e)}") - return False, token_data - def get_jwks_client(): """Get or create a PyJWKClient for token verification""" global _jwks_client @@ -182,3 +55,4 @@ def get_jwks_client(): jwks_url = f"{OIDC_SERVER_URL}/realms/{OIDC_REALM}/protocol/openid-connect/certs" _jwks_client = PyJWKClient(jwks_url) return _jwks_client + diff --git 
a/src/backend/database/__init__.py b/src/backend/database/__init__.py index 6c5c0e7..1e81b82 100644 --- a/src/backend/database/__init__.py +++ b/src/backend/database/__init__.py @@ -6,26 +6,14 @@ from .database import ( init_db, - get_session, - get_user_repository, - get_pad_repository, - get_backup_repository, - get_template_pad_repository, - get_user_service, - get_pad_service, - get_backup_service, - get_template_pad_service + get_session, + engine, + async_session, ) __all__ = [ 'init_db', 'get_session', - 'get_user_repository', - 'get_pad_repository', - 'get_backup_repository', - 'get_template_pad_repository', - 'get_user_service', - 'get_pad_service', - 'get_backup_service', - 'get_template_pad_service', + 'engine', + 'async_session', ] diff --git a/src/backend/database/database.py b/src/backend/database/database.py index 87ca120..cd441d8 100644 --- a/src/backend/database/database.py +++ b/src/backend/database/database.py @@ -31,9 +31,7 @@ engine = create_async_engine(DATABASE_URL, echo=False) # Create async session factory -async_session = sessionmaker( - engine, class_=AsyncSession, expire_on_commit=False -) +async_session = sessionmaker(engine, class_=AsyncSession, expire_on_commit=False) async def init_db() -> None: @@ -59,45 +57,3 @@ async def get_session() -> AsyncGenerator[AsyncSession, None]: yield session finally: await session.close() - -# Dependency for getting repositories -async def get_user_repository(session: AsyncSession = Depends(get_session)): - """Get a user repository""" - from .repository import UserRepository - return UserRepository(session) - -async def get_pad_repository(session: AsyncSession = Depends(get_session)): - """Get a pad repository""" - from .repository import PadRepository - return PadRepository(session) - -async def get_backup_repository(session: AsyncSession = Depends(get_session)): - """Get a backup repository""" - from .repository import BackupRepository - return BackupRepository(session) - -async def get_template_pad_repository(session: AsyncSession = Depends(get_session)): - """Get a template pad repository""" - from .repository import TemplatePadRepository - return TemplatePadRepository(session) - -# Dependency for getting services -async def get_user_service(session: AsyncSession = Depends(get_session)): - """Get a user service""" - from .service import UserService - return UserService(session) - -async def get_pad_service(session: AsyncSession = Depends(get_session)): - """Get a pad service""" - from .service import PadService - return PadService(session) - -async def get_backup_service(session: AsyncSession = Depends(get_session)): - """Get a backup service""" - from .service import BackupService - return BackupService(session) - -async def get_template_pad_service(session: AsyncSession = Depends(get_session)): - """Get a template pad service""" - from .service import TemplatePadService - return TemplatePadService(session) diff --git a/src/backend/database/models/__init__.py b/src/backend/database/models/__init__.py index 41fe372..250a149 100644 --- a/src/backend/database/models/__init__.py +++ b/src/backend/database/models/__init__.py @@ -5,16 +5,13 @@ """ from .base_model import Base, BaseModel, SCHEMA_NAME -from .user_model import UserModel -from .pad_model import PadModel, TemplatePadModel -from .backup_model import BackupModel +from .user_model import UserStore +from .pad_model import PadStore __all__ = [ 'Base', 'BaseModel', - 'UserModel', - 'PadModel', - 'BackupModel', - 'TemplatePadModel', + 'UserStore', + 'PadStore', 
'SCHEMA_NAME', ] diff --git a/src/backend/database/models/backup_model.py b/src/backend/database/models/backup_model.py deleted file mode 100644 index 5e2f7d3..0000000 --- a/src/backend/database/models/backup_model.py +++ /dev/null @@ -1,42 +0,0 @@ -from typing import Dict, Any, TYPE_CHECKING - -from sqlalchemy import Column, ForeignKey, Index, UUID -from sqlalchemy.dialects.postgresql import JSONB -from sqlalchemy.orm import relationship, Mapped - -from .base_model import Base, BaseModel, SCHEMA_NAME - -if TYPE_CHECKING: - from .pad_model import PadModel - -class BackupModel(Base, BaseModel): - """Model for backups table in app schema""" - __tablename__ = "backups" - __table_args__ = ( - Index("ix_backups_source_id", "source_id"), - Index("ix_backups_created_at", "created_at"), - {"schema": SCHEMA_NAME} - ) - - # Backup-specific fields - source_id = Column( - UUID(as_uuid=True), - ForeignKey(f"{SCHEMA_NAME}.pads.id", ondelete="CASCADE"), - nullable=False - ) - data = Column(JSONB, nullable=False) - - # Relationships - pad: Mapped["PadModel"] = relationship("PadModel", back_populates="backups") - - def __repr__(self) -> str: - return f"" - - def to_dict(self) -> Dict[str, Any]: - """Convert model instance to dictionary with additional fields""" - result = super().to_dict() - # Convert data to dict if it's not already - if isinstance(result["data"], str): - import json - result["data"] = json.loads(result["data"]) - return result diff --git a/src/backend/database/models/pad_model.py b/src/backend/database/models/pad_model.py index fafddbc..3bc5367 100644 --- a/src/backend/database/models/pad_model.py +++ b/src/backend/database/models/pad_model.py @@ -1,17 +1,19 @@ -from typing import List, Dict, Any, TYPE_CHECKING +from typing import Dict, Any, Optional, List, TYPE_CHECKING +from uuid import UUID +from datetime import datetime -from sqlalchemy import Column, String, ForeignKey, Index, UUID +from sqlalchemy import Column, String, ForeignKey, Index, UUID as SQLUUID, select, update, delete, ARRAY from sqlalchemy.dialects.postgresql import JSONB from sqlalchemy.orm import relationship, Mapped +from sqlalchemy.ext.asyncio import AsyncSession from .base_model import Base, BaseModel, SCHEMA_NAME if TYPE_CHECKING: - from .backup_model import BackupModel - from .user_model import UserModel + from .user_model import UserStore -class PadModel(Base, BaseModel): - """Model for pads table in app schema""" +class PadStore(Base, BaseModel): + """Combined model and repository for pad storage""" __tablename__ = "pads" __table_args__ = ( Index("ix_pads_owner_id", "owner_id"), @@ -21,47 +23,100 @@ class PadModel(Base, BaseModel): # Pad-specific fields owner_id = Column( - UUID(as_uuid=True), + SQLUUID(as_uuid=True), ForeignKey(f"{SCHEMA_NAME}.users.id", ondelete="CASCADE"), nullable=False ) display_name = Column(String(100), nullable=False) data = Column(JSONB, nullable=False) + sharing_policy = Column(String(20), nullable=False, default="private") + whitelist = Column(ARRAY(SQLUUID(as_uuid=True)), nullable=True, default=[]) # Relationships - owner: Mapped["UserModel"] = relationship("UserModel", back_populates="pads") - backups: Mapped[List["BackupModel"]] = relationship( - "BackupModel", - back_populates="pad", - cascade="all, delete-orphan", - lazy="selectin" - ) + owner: Mapped["UserStore"] = relationship("UserStore", back_populates="pads") def __repr__(self) -> str: - return f"" - - def to_dict(self) -> Dict[str, Any]: - """Convert model instance to dictionary with additional fields""" - result = 
super().to_dict() - # Convert data to dict if it's not already - if isinstance(result["data"], str): - import json - result["data"] = json.loads(result["data"]) - return result + return f"" + @classmethod + async def create_pad( + cls, + session: AsyncSession, + owner_id: UUID, + display_name: str, + data: Dict[str, Any], + sharing_policy: str = "private", + whitelist: List[UUID] = [] + ) -> 'PadStore': + """Create a new pad""" + pad = cls( + owner_id=owner_id, + display_name=display_name, + data=data, + sharing_policy=sharing_policy, + whitelist=whitelist + ) + session.add(pad) + await session.commit() + await session.refresh(pad) + return pad -class TemplatePadModel(Base, BaseModel): - """Model for template pads table in app schema""" - __tablename__ = "template_pads" - __table_args__ = ( - Index("ix_template_pads_display_name", "display_name"), - Index("ix_template_pads_name", "name"), - {"schema": SCHEMA_NAME} - ) + @classmethod + async def get_by_id(cls, session: AsyncSession, pad_id: UUID) -> Optional['PadStore']: + """Get a pad by ID""" + stmt = select(cls).where(cls.id == pad_id) + result = await session.execute(stmt) + return result.scalars().first() - name = Column(String(100), nullable=False, unique=True) - display_name = Column(String(100), nullable=False) - data = Column(JSONB, nullable=False) + async def save(self, session: AsyncSession) -> 'PadStore': + """Update the pad in the database""" + self.updated_at = datetime.now() + try: + # Just execute the update statement without adding to session + stmt = update(self.__class__).where(self.__class__.id == self.id).values( + owner_id=self.owner_id, + display_name=self.display_name, + data=self.data, + sharing_policy=self.sharing_policy, + whitelist=self.whitelist, + updated_at=self.updated_at + ) + await session.execute(stmt) + await session.commit() + + # After update, get the fresh object from the database + refreshed = await self.get_by_id(session, self.id) + if refreshed: + # Update this object's attributes from the database + self.owner_id = refreshed.owner_id + self.display_name = refreshed.display_name + self.data = refreshed.data + self.sharing_policy = refreshed.sharing_policy + self.whitelist = refreshed.whitelist + self.created_at = refreshed.created_at + self.updated_at = refreshed.updated_at + + return self + except Exception as e: + print(f"Error saving pad {self.id}: {str(e)}", flush=True) + raise e - def __repr__(self) -> str: - return f"" \ No newline at end of file + async def delete(self, session: AsyncSession) -> bool: + """Delete the pad""" + stmt = delete(self.__class__).where(self.__class__.id == self.id) + result = await session.execute(stmt) + await session.commit() + return result.rowcount > 0 + + def to_dict(self) -> Dict[str, Any]: + """Convert to dictionary representation""" + return { + "id": str(self.id), + "owner_id": str(self.owner_id), + "display_name": self.display_name, + "data": self.data, + "sharing_policy": self.sharing_policy, + "whitelist": [str(uid) for uid in self.whitelist] if self.whitelist else [], + "created_at": self.created_at.isoformat(), + "updated_at": self.updated_at.isoformat() + } diff --git a/src/backend/database/models/user_model.py b/src/backend/database/models/user_model.py index 28526b5..6ce3217 100644 --- a/src/backend/database/models/user_model.py +++ b/src/backend/database/models/user_model.py @@ -1,15 +1,19 @@ -from typing import List, TYPE_CHECKING -from sqlalchemy import Column, Index, String, UUID, Boolean -from sqlalchemy.dialects.postgresql import JSONB +from 
typing import List, Optional, Dict, Any, TYPE_CHECKING +from uuid import UUID +from datetime import datetime + +from sqlalchemy import Column, Index, String, UUID as SQLUUID, Boolean, select, update, delete, func, ARRAY, and_, or_, text +from sqlalchemy.dialects.postgresql import JSONB, array from sqlalchemy.orm import relationship, Mapped +from sqlalchemy.ext.asyncio import AsyncSession from .base_model import Base, BaseModel, SCHEMA_NAME if TYPE_CHECKING: - from .pad_model import PadModel + from .pad_model import PadStore -class UserModel(Base, BaseModel): - """Model for users table in app schema""" +class UserStore(Base, BaseModel): + """Combined model and repository for user storage""" __tablename__ = "users" __table_args__ = ( Index("ix_users_username", "username"), @@ -18,7 +22,7 @@ class UserModel(Base, BaseModel): ) # Override the default id column to use Keycloak's UUID - id = Column(UUID(as_uuid=True), primary_key=True) + id = Column(SQLUUID(as_uuid=True), primary_key=True) # User-specific fields username = Column(String(254), nullable=False, unique=True) @@ -28,14 +32,171 @@ class UserModel(Base, BaseModel): given_name = Column(String(254), nullable=True) family_name = Column(String(254), nullable=True) roles = Column(JSONB, nullable=False, default=[]) + open_pads = Column(ARRAY(SQLUUID(as_uuid=True)), nullable=False, default=[]) + last_selected_pad = Column(SQLUUID(as_uuid=True), nullable=True) # Relationships - pads: Mapped[List["PadModel"]] = relationship( - "PadModel", + pads: Mapped[List["PadStore"]] = relationship( + "PadStore", back_populates="owner", cascade="all, delete-orphan", lazy="selectin" ) - + def __repr__(self) -> str: - return f"" + return f"" + + @classmethod + async def create_user( + cls, + session: AsyncSession, + id: UUID, + username: str, + email: str, + email_verified: bool = False, + name: Optional[str] = None, + given_name: Optional[str] = None, + family_name: Optional[str] = None, + roles: List[str] = None, + open_pads: List[UUID] = None, + last_selected_pad: Optional[UUID] = None + ) -> 'UserStore': + """Create a new user""" + user = cls( + id=id, + username=username, + email=email, + email_verified=email_verified, + name=name, + given_name=given_name, + family_name=family_name, + roles=roles or [], + open_pads=open_pads or [], + last_selected_pad=last_selected_pad + ) + session.add(user) + await session.commit() + await session.refresh(user) + return user + + @classmethod + async def get_by_id(cls, session: AsyncSession, user_id: UUID) -> Optional['UserStore']: + """Get a user by ID""" + stmt = select(cls).where(cls.id == user_id) + result = await session.execute(stmt) + return result.scalars().first() + + @classmethod + async def get_by_username(cls, session: AsyncSession, username: str) -> Optional['UserStore']: + """Get a user by username""" + stmt = select(cls).where(cls.username == username) + result = await session.execute(stmt) + return result.scalars().first() + + @classmethod + async def get_by_email(cls, session: AsyncSession, email: str) -> Optional['UserStore']: + """Get a user by email""" + stmt = select(cls).where(cls.email == email) + result = await session.execute(stmt) + return result.scalars().first() + + @classmethod + async def get_all(cls, session: AsyncSession) -> List['UserStore']: + """Get all users""" + stmt = select(cls) + result = await session.execute(stmt) + return result.scalars().all() + + @classmethod + async def get_open_pads(cls, session: AsyncSession, user_id: UUID) -> List[Dict[str, Any]]: + """Get all pad IDs 
that the user has access to (both open pads and owned pads)""" + from .pad_model import PadStore # Import here to avoid circular imports + + # Get the user to access their open_pads + user = await cls.get_by_id(session, user_id) + if not user: + return [] + + # Get both open_pads and owned pad IDs + open_pad_ids = user.open_pads or [] + owned_pad_ids = [pad.id for pad in user.pads] + + # Combine and deduplicate pad IDs + all_pad_ids = list(set(open_pad_ids + owned_pad_ids)) + + stmt = select( + PadStore.id, + PadStore.owner_id, + PadStore.display_name, + PadStore.created_at, + PadStore.updated_at, + PadStore.sharing_policy + ).where( + PadStore.id.in_(all_pad_ids) + ).order_by(PadStore.created_at) + + result = await session.execute(stmt) + pads = result.all() + + return [{ + "id": str(pad.id), + "owner_id": str(pad.owner_id), + "display_name": pad.display_name, + "created_at": pad.created_at.isoformat(), + "updated_at": pad.updated_at.isoformat(), + "sharing_policy": pad.sharing_policy + } for pad in pads] + + async def save(self, session: AsyncSession) -> 'UserStore': + """Save the current user state""" + if self.id is None: + session.add(self) + await session.commit() + await session.refresh(self) + return self + + async def update(self, session: AsyncSession, data: Dict[str, Any]) -> 'UserStore': + """Update user data""" + for key, value in data.items(): + setattr(self, key, value) + self.updated_at = datetime.now() + return await self.save(session) + + async def delete(self, session: AsyncSession) -> bool: + """Delete the user""" + stmt = delete(self.__class__).where(self.__class__.id == self.id) + result = await session.execute(stmt) + await session.commit() + return result.rowcount > 0 + + def to_dict(self) -> Dict[str, Any]: + """Convert to dictionary representation""" + return { + "id": str(self.id), + "username": self.username, + "email": self.email, + "email_verified": self.email_verified, + "name": self.name, + "given_name": self.given_name, + "family_name": self.family_name, + "roles": self.roles, + "open_pads": [str(pid) for pid in self.open_pads] if self.open_pads else [], + "last_selected_pad": str(self.last_selected_pad) if self.last_selected_pad else None, + "created_at": self.created_at.isoformat(), + "updated_at": self.updated_at.isoformat() + } + + async def remove_open_pad(self, session: AsyncSession, pad_id: UUID) -> 'UserStore': + """Remove a pad from the user's open_pads list""" + + if pad_id in self.open_pads: + pads = self.open_pads.copy() + pads.pop(pads.index(pad_id)) + await self.update(session, {"open_pads": pads}) + + return self + + async def set_last_selected_pad(self, session: AsyncSession, pad_id: UUID) -> 'UserStore': + """Set the last selected pad for the user""" + await self.update(session, {"last_selected_pad": pad_id}) + return self \ No newline at end of file diff --git a/src/backend/database/repository/__init__.py b/src/backend/database/repository/__init__.py deleted file mode 100644 index a2433d7..0000000 --- a/src/backend/database/repository/__init__.py +++ /dev/null @@ -1,17 +0,0 @@ -""" -Repository module for database operations. - -This module provides access to all repositories used for database operations. 
-""" - -from .user_repository import UserRepository -from .pad_repository import PadRepository -from .backup_repository import BackupRepository -from .template_pad_repository import TemplatePadRepository - -__all__ = [ - 'UserRepository', - 'PadRepository', - 'BackupRepository', - 'TemplatePadRepository', -] diff --git a/src/backend/database/repository/backup_repository.py b/src/backend/database/repository/backup_repository.py deleted file mode 100644 index dcae35a..0000000 --- a/src/backend/database/repository/backup_repository.py +++ /dev/null @@ -1,116 +0,0 @@ -""" -Backup repository for database operations related to backups. -""" - -from typing import List, Optional, Dict, Any -from uuid import UUID -from datetime import datetime - -from sqlalchemy.ext.asyncio import AsyncSession -from sqlalchemy.future import select -from sqlalchemy import delete, func, join - -from ..models import BackupModel, PadModel - -class BackupRepository: - """Repository for backup-related database operations""" - - def __init__(self, session: AsyncSession): - """Initialize the repository with a database session""" - self.session = session - - async def create(self, source_id: UUID, data: Dict[str, Any]) -> BackupModel: - """Create a new backup""" - backup = BackupModel(source_id=source_id, data=data) - self.session.add(backup) - await self.session.commit() - await self.session.refresh(backup) - return backup - - async def get_by_id(self, backup_id: UUID) -> Optional[BackupModel]: - """Get a backup by ID""" - stmt = select(BackupModel).where(BackupModel.id == backup_id) - result = await self.session.execute(stmt) - return result.scalars().first() - - async def get_by_source(self, source_id: UUID) -> List[BackupModel]: - """Get all backups for a specific source pad""" - stmt = select(BackupModel).where(BackupModel.source_id == source_id).order_by(BackupModel.created_at.desc()) - result = await self.session.execute(stmt) - return result.scalars().all() - - async def get_latest_by_source(self, source_id: UUID) -> Optional[BackupModel]: - """Get the most recent backup for a specific source pad""" - stmt = select(BackupModel).where(BackupModel.source_id == source_id).order_by(BackupModel.created_at.desc()).limit(1) - result = await self.session.execute(stmt) - return result.scalars().first() - - async def get_by_date_range(self, source_id: UUID, start_date: datetime, end_date: datetime) -> List[BackupModel]: - """Get backups for a specific source pad within a date range""" - stmt = select(BackupModel).where( - BackupModel.source_id == source_id, - BackupModel.created_at >= start_date, - BackupModel.created_at <= end_date - ).order_by(BackupModel.created_at.desc()) - result = await self.session.execute(stmt) - return result.scalars().all() - - async def delete(self, backup_id: UUID) -> bool: - """Delete a backup""" - stmt = delete(BackupModel).where(BackupModel.id == backup_id) - result = await self.session.execute(stmt) - await self.session.commit() - return result.rowcount > 0 - - async def delete_older_than(self, source_id: UUID, keep_count: int) -> int: - """Delete older backups, keeping only the most recent ones""" - # Get the created_at timestamp of the backup at position keep_count - subquery = select(BackupModel.created_at).where( - BackupModel.source_id == source_id - ).order_by(BackupModel.created_at.desc()).offset(keep_count).limit(1) - - result = await self.session.execute(subquery) - cutoff_date = result.scalar() - - if not cutoff_date: - return 0 # Not enough backups to delete any - - # Delete backups 
older than the cutoff date - stmt = delete(BackupModel).where( - BackupModel.source_id == source_id, - BackupModel.created_at < cutoff_date - ) - result = await self.session.execute(stmt) - await self.session.commit() - return result.rowcount - - async def count_by_source(self, source_id: UUID) -> int: - """Count the number of backups for a specific source pad""" - stmt = select(func.count()).select_from(BackupModel).where(BackupModel.source_id == source_id) - result = await self.session.execute(stmt) - return result.scalar() - - async def get_backups_by_user(self, user_id: UUID, limit: int = 10) -> List[BackupModel]: - """ - Get backups for a user's first pad directly using a join operation. - This eliminates the N+1 query problem by fetching the pad and its backups in a single query. - - Args: - user_id: The user ID to get backups for - limit: Maximum number of backups to return - - Returns: - List of backup models - """ - # Create a join between PadModel and BackupModel - stmt = select(BackupModel).join( - PadModel, - BackupModel.source_id == PadModel.id - ).where( - PadModel.owner_id == user_id - ).order_by( - BackupModel.created_at.desc() - ).limit(limit) - - result = await self.session.execute(stmt) - return result.scalars().all() diff --git a/src/backend/database/repository/pad_repository.py b/src/backend/database/repository/pad_repository.py deleted file mode 100644 index a9b6c67..0000000 --- a/src/backend/database/repository/pad_repository.py +++ /dev/null @@ -1,66 +0,0 @@ -""" -Pad repository for database operations related to pads. -""" - -from typing import List, Optional, Dict, Any -from uuid import UUID - -from sqlalchemy.ext.asyncio import AsyncSession -from sqlalchemy.future import select -from sqlalchemy import update, delete - -from ..models import PadModel - -class PadRepository: - """Repository for pad-related database operations""" - - def __init__(self, session: AsyncSession): - """Initialize the repository with a database session""" - self.session = session - - async def create(self, owner_id: UUID, display_name: str, data: Dict[str, Any]) -> PadModel: - """Create a new pad""" - pad = PadModel(owner_id=owner_id, display_name=display_name, data=data) - self.session.add(pad) - await self.session.commit() - await self.session.refresh(pad) - return pad - - async def get_by_id(self, pad_id: UUID) -> Optional[PadModel]: - """Get a pad by ID""" - stmt = select(PadModel).where(PadModel.id == pad_id) - result = await self.session.execute(stmt) - return result.scalars().first() - - async def get_by_owner(self, owner_id: UUID) -> List[PadModel]: - """Get all pads for a specific owner, sorted by created_at timestamp""" - stmt = select(PadModel).where(PadModel.owner_id == owner_id).order_by(PadModel.created_at) - result = await self.session.execute(stmt) - return result.scalars().all() - - async def get_by_name(self, owner_id: UUID, display_name: str) -> Optional[PadModel]: - """Get a pad by owner and display name""" - stmt = select(PadModel).where( - PadModel.owner_id == owner_id, - PadModel.display_name == display_name - ) - result = await self.session.execute(stmt) - return result.scalars().first() - - async def update(self, pad_id: UUID, data: Dict[str, Any]) -> Optional[PadModel]: - """Update a pad""" - stmt = update(PadModel).where(PadModel.id == pad_id).values(**data).returning(PadModel) - result = await self.session.execute(stmt) - await self.session.commit() - return result.scalars().first() - - async def update_data(self, pad_id: UUID, pad_data: Dict[str, Any]) -> 
Optional[PadModel]: - """Update just the data field of a pad""" - return await self.update(pad_id, {"data": pad_data}) - - async def delete(self, pad_id: UUID) -> bool: - """Delete a pad""" - stmt = delete(PadModel).where(PadModel.id == pad_id) - result = await self.session.execute(stmt) - await self.session.commit() - return result.rowcount > 0 diff --git a/src/backend/database/repository/template_pad_repository.py b/src/backend/database/repository/template_pad_repository.py deleted file mode 100644 index ff85e39..0000000 --- a/src/backend/database/repository/template_pad_repository.py +++ /dev/null @@ -1,63 +0,0 @@ -""" -Template pad repository for database operations related to template pads. -""" - -from typing import List, Optional, Dict, Any -from uuid import UUID - -from sqlalchemy.ext.asyncio import AsyncSession -from sqlalchemy.future import select -from sqlalchemy import update, delete - -from ..models import TemplatePadModel - -class TemplatePadRepository: - """Repository for template pad-related database operations""" - - def __init__(self, session: AsyncSession): - """Initialize the repository with a database session""" - self.session = session - - async def create(self, name: str, display_name: str, data: Dict[str, Any]) -> TemplatePadModel: - """Create a new template pad""" - template_pad = TemplatePadModel(name=name, display_name=display_name, data=data) - self.session.add(template_pad) - await self.session.commit() - await self.session.refresh(template_pad) - return template_pad - - async def get_by_id(self, template_id: UUID) -> Optional[TemplatePadModel]: - """Get a template pad by ID""" - stmt = select(TemplatePadModel).where(TemplatePadModel.id == template_id) - result = await self.session.execute(stmt) - return result.scalars().first() - - async def get_by_name(self, name: str) -> Optional[TemplatePadModel]: - """Get a template pad by name""" - stmt = select(TemplatePadModel).where(TemplatePadModel.name == name) - result = await self.session.execute(stmt) - return result.scalars().first() - - async def get_all(self) -> List[TemplatePadModel]: - """Get all template pads""" - stmt = select(TemplatePadModel).order_by(TemplatePadModel.display_name) - result = await self.session.execute(stmt) - return result.scalars().all() - - async def update(self, name: str, data: Dict[str, Any]) -> Optional[TemplatePadModel]: - """Update a template pad""" - stmt = update(TemplatePadModel).where(TemplatePadModel.name == name).values(**data).returning(TemplatePadModel) - result = await self.session.execute(stmt) - await self.session.commit() - return result.scalars().first() - - async def update_data(self, name: str, template_data: Dict[str, Any]) -> Optional[TemplatePadModel]: - """Update just the data field of a template pad""" - return await self.update(name, {"data": template_data}) - - async def delete(self, name: str) -> bool: - """Delete a template pad""" - stmt = delete(TemplatePadModel).where(TemplatePadModel.name == name) - result = await self.session.execute(stmt) - await self.session.commit() - return result.rowcount > 0 diff --git a/src/backend/database/repository/user_repository.py b/src/backend/database/repository/user_repository.py deleted file mode 100644 index 2e25e26..0000000 --- a/src/backend/database/repository/user_repository.py +++ /dev/null @@ -1,76 +0,0 @@ -""" -User repository for database operations related to users. 
-""" - -from typing import List, Optional, Dict, Any -from uuid import UUID - -from sqlalchemy.ext.asyncio import AsyncSession -from sqlalchemy.future import select -from sqlalchemy import update, delete - -from ..models import UserModel - -class UserRepository: - """Repository for user-related database operations""" - - def __init__(self, session: AsyncSession): - """Initialize the repository with a database session""" - self.session = session - - async def create(self, user_id: UUID, username: str, email: str, email_verified: bool = False, - name: str = None, given_name: str = None, family_name: str = None, - roles: list = None) -> UserModel: - """Create a new user with specified ID and optional fields""" - user = UserModel( - id=user_id, - username=username, - email=email, - email_verified=email_verified, - name=name, - given_name=given_name, - family_name=family_name, - roles=roles or [] - ) - self.session.add(user) - await self.session.commit() - await self.session.refresh(user) - return user - - async def get_by_id(self, user_id: UUID) -> Optional[UserModel]: - """Get a user by ID""" - stmt = select(UserModel).where(UserModel.id == user_id) - result = await self.session.execute(stmt) - return result.scalars().first() - - async def get_by_username(self, username: str) -> Optional[UserModel]: - """Get a user by username""" - stmt = select(UserModel).where(UserModel.username == username) - result = await self.session.execute(stmt) - return result.scalars().first() - - async def get_by_email(self, email: str) -> Optional[UserModel]: - """Get a user by email""" - stmt = select(UserModel).where(UserModel.email == email) - result = await self.session.execute(stmt) - return result.scalars().first() - - async def get_all(self) -> List[UserModel]: - """Get all users""" - stmt = select(UserModel) - result = await self.session.execute(stmt) - return result.scalars().all() - - async def update(self, user_id: UUID, data: Dict[str, Any]) -> Optional[UserModel]: - """Update a user""" - stmt = update(UserModel).where(UserModel.id == user_id).values(**data).returning(UserModel) - result = await self.session.execute(stmt) - await self.session.commit() - return result.scalars().first() - - async def delete(self, user_id: UUID) -> bool: - """Delete a user""" - stmt = delete(UserModel).where(UserModel.id == user_id) - result = await self.session.execute(stmt) - await self.session.commit() - return result.rowcount > 0 diff --git a/src/backend/database/service/__init__.py b/src/backend/database/service/__init__.py deleted file mode 100644 index d362c0b..0000000 --- a/src/backend/database/service/__init__.py +++ /dev/null @@ -1,17 +0,0 @@ -""" -Service module for business logic. - -This module provides access to all services used for business logic operations. -""" - -from .user_service import UserService -from .pad_service import PadService -from .backup_service import BackupService -from .template_pad_service import TemplatePadService - -__all__ = [ - 'UserService', - 'PadService', - 'BackupService', - 'TemplatePadService', -] diff --git a/src/backend/database/service/backup_service.py b/src/backend/database/service/backup_service.py deleted file mode 100644 index a9e44db..0000000 --- a/src/backend/database/service/backup_service.py +++ /dev/null @@ -1,176 +0,0 @@ -""" -Backup service for business logic related to backups. 
-""" - -from typing import List, Optional, Dict, Any -from uuid import UUID -from datetime import datetime, timezone - -from sqlalchemy.ext.asyncio import AsyncSession - -from ..repository import BackupRepository, PadRepository, UserRepository - -class BackupService: - """Service for backup-related business logic""" - - def __init__(self, session: AsyncSession): - """Initialize the service with a database session""" - self.session = session - self.repository = BackupRepository(session) - self.pad_repository = PadRepository(session) - - async def create_backup(self, source_id: UUID, data: Dict[str, Any]) -> Dict[str, Any]: - """Create a new backup""" - # Validate input - if not data: - raise ValueError("Backup data is required") - - # Check if source pad exists - source_pad = await self.pad_repository.get_by_id(source_id) - if not source_pad: - raise ValueError(f"Pad with ID '{source_id}' does not exist") - - # Create backup - backup = await self.repository.create(source_id, data) - return backup.to_dict() - - async def get_backup(self, backup_id: UUID) -> Optional[Dict[str, Any]]: - """Get a backup by ID""" - backup = await self.repository.get_by_id(backup_id) - return backup.to_dict() if backup else None - - async def get_backups_by_source(self, source_id: UUID) -> List[Dict[str, Any]]: - """Get all backups for a specific source pad""" - # Check if source pad exists - source_pad = await self.pad_repository.get_by_id(source_id) - if not source_pad: - raise ValueError(f"Pad with ID '{source_id}' does not exist") - - backups = await self.repository.get_by_source(source_id) - return [backup.to_dict() for backup in backups] - - async def get_latest_backup(self, source_id: UUID) -> Optional[Dict[str, Any]]: - """Get the most recent backup for a specific source pad""" - # Check if source pad exists - source_pad = await self.pad_repository.get_by_id(source_id) - if not source_pad: - raise ValueError(f"Pad with ID '{source_id}' does not exist") - - backup = await self.repository.get_latest_by_source(source_id) - return backup.to_dict() if backup else None - - async def get_backups_by_date_range(self, source_id: UUID, start_date: datetime, end_date: datetime) -> List[Dict[str, Any]]: - """Get backups for a specific source pad within a date range""" - # Check if source pad exists - source_pad = await self.pad_repository.get_by_id(source_id) - if not source_pad: - raise ValueError(f"Pad with ID '{source_id}' does not exist") - - # Validate date range - if start_date > end_date: - raise ValueError("Start date must be before end date") - - backups = await self.repository.get_by_date_range(source_id, start_date, end_date) - return [backup.to_dict() for backup in backups] - - async def delete_backup(self, backup_id: UUID) -> bool: - """Delete a backup""" - # Get the backup to check if it exists - backup = await self.repository.get_by_id(backup_id) - if not backup: - raise ValueError(f"Backup with ID '{backup_id}' does not exist") - - return await self.repository.delete(backup_id) - - async def manage_backups(self, source_id: UUID, max_backups: int = 10) -> int: - """Manage backups for a source pad, keeping only the most recent ones""" - # Check if source pad exists - source_pad = await self.pad_repository.get_by_id(source_id) - if not source_pad: - raise ValueError(f"Pad with ID '{source_id}' does not exist") - - # Validate max_backups - if max_backups < 1: - raise ValueError("Maximum number of backups must be at least 1") - - # Count current backups - backup_count = await 
self.repository.count_by_source(source_id) - - # If we have more backups than the maximum, delete the oldest ones - if backup_count > max_backups: - return await self.repository.delete_older_than(source_id, max_backups) - - return 0 # No backups deleted - - async def get_backups_by_user(self, user_id: UUID, limit: int = 10) -> List[Dict[str, Any]]: - """ - Get backups for a user's first pad directly using a join operation. - This eliminates the N+1 query problem by fetching the pad and its backups in a single query. - - Args: - user_id: The user ID to get backups for - limit: Maximum number of backups to return - - Returns: - List of backup dictionaries - """ - # Check if user exists - user_repository = UserRepository(self.session) - user = await user_repository.get_by_id(user_id) - if not user: - raise ValueError(f"User with ID '{user_id}' does not exist") - - # Get backups directly with a single query - backups = await self.repository.get_backups_by_user(user_id, limit) - return [backup.to_dict() for backup in backups] - - async def create_backup_if_needed(self, source_id: UUID, data: Dict[str, Any], - min_interval_minutes: int = 5, - max_backups: int = 10) -> Optional[Dict[str, Any]]: - """ - Create a backup only if needed: - - If there are no existing backups - - If the latest backup is older than the specified interval - - Args: - source_id: The ID of the source pad - data: The data to backup - min_interval_minutes: Minimum time between backups in minutes - max_backups: Maximum number of backups to keep - - Returns: - The created backup dict if a backup was created, None otherwise - """ - # Check if source pad exists - source_pad = await self.pad_repository.get_by_id(source_id) - if not source_pad: - raise ValueError(f"Pad with ID '{source_id}' does not exist") - - # Get the latest backup - latest_backup = await self.repository.get_latest_by_source(source_id) - - # Calculate the current time with timezone information - current_time = datetime.now(timezone.utc) - - # Determine if we need to create a backup - create_backup = False - - if not latest_backup: - # No backups exist yet, so create one - create_backup = True - else: - # Check if the latest backup is older than the minimum interval - backup_age = current_time - latest_backup.created_at - if backup_age.total_seconds() > (min_interval_minutes * 60): - create_backup = True - - # Create a backup if needed - if create_backup: - backup = await self.repository.create(source_id, data) - - # Manage backups (clean up old ones) - await self.manage_backups(source_id, max_backups) - - return backup.to_dict() - - return None diff --git a/src/backend/database/service/pad_service.py b/src/backend/database/service/pad_service.py deleted file mode 100644 index c1c382e..0000000 --- a/src/backend/database/service/pad_service.py +++ /dev/null @@ -1,125 +0,0 @@ -""" -Pad service for business logic related to pads. 
-""" - -from typing import List, Optional, Dict, Any, TYPE_CHECKING -from uuid import UUID - -from sqlalchemy.ext.asyncio import AsyncSession - -from ..repository import PadRepository, UserRepository -from .user_service import UserService -from ..models import PadModel -# Use TYPE_CHECKING to avoid circular imports -if TYPE_CHECKING: - from dependencies import UserSession - -class PadService: - """Service for pad-related business logic""" - - def __init__(self, session: AsyncSession): - """Initialize the service with a database session""" - self.session = session - self.repository = PadRepository(session) - self.user_repository = UserRepository(session) - - async def create_pad(self, owner_id: UUID, display_name: str, data: Dict[str, Any], user_session: "UserSession" = None) -> Dict[str, Any]: - """Create a new pad""" - # Validate input - if not display_name: - raise ValueError("Display name is required") - - if not data: - raise ValueError("Pad data is required") - - # Check if owner exists - owner = await self.user_repository.get_by_id(owner_id) - if not owner and user_session: - # Reaching here is an anomaly, this is a failsafe - # User should already exist in the database - - # Create a UserService instance - user_service = UserService(self.session) - - # Create token data dictionary from UserSession properties - token_data = { - "username": user_session.username, - "email": user_session.email, - "email_verified": user_session.email_verified, - "name": user_session.name, - "given_name": user_session.given_name, - "family_name": user_session.family_name, - "roles": user_session.roles - } - - # Use sync_user_with_token_data which handles race conditions - try: - await user_service.sync_user_with_token_data(owner_id, token_data) - # Get the user again to confirm it exists - owner = await self.user_repository.get_by_id(owner_id) - if not owner: - raise ValueError(f"Failed to create user with ID '{owner_id}'") - except Exception as e: - print(f"Error creating user as failsafe: {str(e)}") - raise ValueError(f"Failed to create user with ID '{owner_id}': {str(e)}") - - # Create pad - pad = await self.repository.create(owner_id, display_name, data) - return pad.to_dict() - - async def get_pad(self, pad_id: UUID) -> Optional[Dict[str, Any]]: - """Get a pad by ID""" - pad = await self.repository.get_by_id(pad_id) - return pad.to_dict() if pad else None - - async def get_pads_by_owner(self, owner_id: UUID) -> List[Dict[str, Any]]: - """Get all pads for a specific owner""" - # Check if owner exists - owner = await self.user_repository.get_by_id(owner_id) - if not owner: - # Return empty list instead of raising an error - # This allows the pad_router to handle the case where a user doesn't exist - return [] - - pads: list[PadModel] = await self.repository.get_by_owner(owner_id) - return [pad.to_dict() for pad in pads] - - async def get_pad_by_name(self, owner_id: UUID, display_name: str) -> Optional[Dict[str, Any]]: - """Get a pad by owner and display name""" - pad = await self.repository.get_by_name(owner_id, display_name) - return pad.to_dict() if pad else None - - async def update_pad(self, pad_id: UUID, data: Dict[str, Any]) -> Optional[Dict[str, Any]]: - """Update a pad""" - # Get the pad to check if it exists - pad = await self.repository.get_by_id(pad_id) - if not pad: - raise ValueError(f"Pad with ID '{pad_id}' does not exist") - - # Validate display_name if it's being updated - if 'display_name' in data and not data['display_name']: - raise ValueError("Display name cannot be empty") - - # 
Update pad - updated_pad = await self.repository.update(pad_id, data) - return updated_pad.to_dict() if updated_pad else None - - async def update_pad_data(self, pad_id: UUID, pad_data: Dict[str, Any]) -> Optional[Dict[str, Any]]: - """Update just the data field of a pad""" - # Get the pad to check if it exists - pad = await self.repository.get_by_id(pad_id) - if not pad: - raise ValueError(f"Pad with ID '{pad_id}' does not exist") - - # Update pad data - updated_pad = await self.repository.update_data(pad_id, pad_data) - return updated_pad.to_dict() if updated_pad else None - - async def delete_pad(self, pad_id: UUID) -> bool: - """Delete a pad""" - # Get the pad to check if it exists - pad = await self.repository.get_by_id(pad_id) - if not pad: - raise ValueError(f"Pad with ID '{pad_id}' does not exist") - - return await self.repository.delete(pad_id) diff --git a/src/backend/database/service/template_pad_service.py b/src/backend/database/service/template_pad_service.py deleted file mode 100644 index ead220b..0000000 --- a/src/backend/database/service/template_pad_service.py +++ /dev/null @@ -1,98 +0,0 @@ -""" -Template pad service for business logic related to template pads. -""" - -from typing import List, Optional, Dict, Any -from uuid import UUID - -from sqlalchemy.ext.asyncio import AsyncSession - -from ..repository import TemplatePadRepository - -class TemplatePadService: - """Service for template pad-related business logic""" - - def __init__(self, session: AsyncSession): - """Initialize the service with a database session""" - self.session = session - self.repository = TemplatePadRepository(session) - - async def create_template(self, name: str, display_name: str, data: Dict[str, Any]) -> Dict[str, Any]: - """Create a new template pad""" - # Validate input - if not name: - raise ValueError("Name is required") - - if not display_name: - raise ValueError("Display name is required") - - if not data: - raise ValueError("Template data is required") - - # Check if template with same name already exists - existing_template = await self.repository.get_by_name(name) - if existing_template: - raise ValueError(f"Template with name '{name}' already exists") - - # Create template pad - template_pad = await self.repository.create(name, display_name, data) - return template_pad.to_dict() - - async def get_template(self, template_id: UUID) -> Optional[Dict[str, Any]]: - """Get a template pad by ID""" - template_pad = await self.repository.get_by_id(template_id) - return template_pad.to_dict() if template_pad else None - - async def get_template_by_name(self, name: str) -> Optional[Dict[str, Any]]: - """Get a template pad by name""" - template_pad = await self.repository.get_by_name(name) - return template_pad.to_dict() if template_pad else None - - async def get_all_templates(self) -> List[Dict[str, Any]]: - """Get all template pads""" - template_pads = await self.repository.get_all() - return [template_pad.to_dict() for template_pad in template_pads] - - async def update_template(self, name: str, data: Dict[str, Any]) -> Optional[Dict[str, Any]]: - """Update a template pad""" - # Get the template pad to check if it exists - template_pad = await self.repository.get_by_name(name) - if not template_pad: - raise ValueError(f"Template pad with name '{name}' does not exist") - - # Validate name and display_name if they're being updated - if 'name' in data and not data['name']: - raise ValueError("Name cannot be empty") - - if 'display_name' in data and not data['display_name']: - raise 
ValueError("Display name cannot be empty") - - # Check if new name already exists (if being updated) - if 'name' in data and data['name'] != template_pad.name: - existing_template = await self.repository.get_by_name(data['name']) - if existing_template: - raise ValueError(f"Template with name '{data['name']}' already exists") - - # Update template pad - updated_template = await self.repository.update(name, data) - return updated_template.to_dict() if updated_template else None - - async def update_template_data(self, name: str, template_data: Dict[str, Any]) -> Optional[Dict[str, Any]]: - """Update just the data field of a template pad""" - # Get the template pad to check if it exists - template_pad = await self.repository.get_by_name(name) - if not template_pad: - raise ValueError(f"Template pad with name '{name}' does not exist") - - # Update template pad data - updated_template = await self.repository.update_data(name, template_data) - return updated_template.to_dict() if updated_template else None - - async def delete_template(self, name: str) -> bool: - """Delete a template pad""" - # Get the template pad to check if it exists - template_pad = await self.repository.get_by_name(name) - if not template_pad: - raise ValueError(f"Template pad with name '{name}' does not exist") - - return await self.repository.delete(name) diff --git a/src/backend/database/service/user_service.py b/src/backend/database/service/user_service.py deleted file mode 100644 index 00a834d..0000000 --- a/src/backend/database/service/user_service.py +++ /dev/null @@ -1,181 +0,0 @@ -""" -User service for business logic related to users. -""" - -from typing import List, Optional, Dict, Any -from uuid import UUID - -from sqlalchemy.ext.asyncio import AsyncSession - -from ..repository import UserRepository - -class UserService: - """Service for user-related business logic""" - - def __init__(self, session: AsyncSession): - """Initialize the service with a database session""" - self.session = session - self.repository = UserRepository(session) - - async def create_user(self, user_id: UUID, username: str, email: str, - email_verified: bool = False, name: str = None, - given_name: str = None, family_name: str = None, - roles: list = None) -> Dict[str, Any]: - """Create a new user with specified ID and optional fields""" - # Validate input - if not user_id or not username or not email: - raise ValueError("User ID, username, and email are required") - - # Check if user_id already exists - existing_id = await self.repository.get_by_id(user_id) - if existing_id: - raise ValueError(f"User with ID '{user_id}' already exists") - - # Check if username already exists - existing_user = await self.repository.get_by_username(username) - if existing_user: - raise ValueError(f"Username '{username}' is already taken") - - # Create user - user = await self.repository.create( - user_id=user_id, - username=username, - email=email, - email_verified=email_verified, - name=name, - given_name=given_name, - family_name=family_name, - roles=roles - ) - return user.to_dict() - - async def get_user(self, user_id: UUID) -> Optional[Dict[str, Any]]: - """Get a user by ID""" - user = await self.repository.get_by_id(user_id) - return user.to_dict() if user else None - - async def get_user_by_username(self, username: str) -> Optional[Dict[str, Any]]: - """Get a user by username""" - user = await self.repository.get_by_username(username) - return user.to_dict() if user else None - - async def get_user_by_email(self, email: str) -> Optional[Dict[str, 
Any]]: - """Get a user by email""" - user = await self.repository.get_by_email(email) - return user.to_dict() if user else None - - async def get_all_users(self) -> List[Dict[str, Any]]: - """Get all users""" - users = await self.repository.get_all() - return [user.to_dict() for user in users] - - async def update_user(self, user_id: UUID, data: Dict[str, Any]) -> Optional[Dict[str, Any]]: - """Update a user""" - # Validate input - if 'username' in data and not data['username']: - raise ValueError("Username cannot be empty") - - if 'email' in data and not data['email']: - raise ValueError("Email cannot be empty") - - # Check if username already exists (if being updated) - if 'username' in data: - existing_user = await self.repository.get_by_username(data['username']) - if existing_user and existing_user.id != user_id: - raise ValueError(f"Username '{data['username']}' is already taken") - - # Check if email already exists (if being updated) - if 'email' in data: - existing_email = await self.repository.get_by_email(data['email']) - if existing_email and existing_email.id != user_id: - raise ValueError(f"Email '{data['email']}' is already registered") - - # Update user - user = await self.repository.update(user_id, data) - return user.to_dict() if user else None - - async def delete_user(self, user_id: UUID) -> bool: - """Delete a user""" - return await self.repository.delete(user_id) - - async def update_user_if_needed(self, user_id: UUID, token_data: Dict[str, Any], user_data: Dict[str, Any]) -> Dict[str, Any]: - """ - Update user only if data has changed - - Args: - user_id: The user's UUID - token_data: Dictionary containing user data from the authentication token - user_data: Current user data from the database - - Returns: - The updated user data dictionary or the original if no update was needed - """ - # Check if user data needs to be updated - update_data = {} - fields_to_check = [ - "username", "email", "email_verified", - "name", "given_name", "family_name" - ] - - for field in fields_to_check: - token_value = token_data.get(field) - if token_value is not None and user_data.get(field) != token_value: - update_data[field] = token_value - - # Handle roles separately as they might have a different structure - if "roles" in token_data and user_data.get("roles") != token_data["roles"]: - update_data["roles"] = token_data["roles"] - - # Update user if any field has changed - if update_data: - return await self.update_user(user_id, update_data) - - return user_data - - async def sync_user_with_token_data(self, user_id: UUID, token_data: Dict[str, Any]) -> Optional[Dict[str, Any]]: - """ - Synchronize user data in the database with data from the authentication token. - If the user doesn't exist, it will be created. If it exists but has different data, - it will be updated to match the token data. - - Args: - user_id: The user's UUID - token_data: Dictionary containing user data from the authentication token - - Returns: - The user data dictionary or None if operation failed - """ - # Check if user exists - user_data = await self.get_user(user_id) - - # If user doesn't exist, create a new one - if not user_data: - try: - print(f"User with ID '{user_id}' does not exist. 
Creating user from token data.") - return await self.create_user( - user_id=user_id, - username=token_data.get("username", ""), - email=token_data.get("email", ""), - email_verified=token_data.get("email_verified", False), - name=token_data.get("name"), - given_name=token_data.get("given_name"), - family_name=token_data.get("family_name"), - roles=token_data.get("roles", []) - ) - except ValueError as e: - print(f"Error creating user: {e}") - # Handle case where user might have been created in a race condition - if "already exists" in str(e): - print(f"Race condition detected: User with ID '{user_id}' was created by another process.") - user_data = await self.get_user(user_id) - if user_data: - # User exists now, proceed with update if needed - return await self.update_user_if_needed(user_id, token_data, user_data) - else: - # This shouldn't happen - user creation failed but user doesn't exist - raise ValueError(f"Failed to create or find user with ID '{user_id}'") - else: - raise e - - # User exists, update if needed - return await self.update_user_if_needed(user_id, token_data, user_data) diff --git a/src/backend/dependencies.py b/src/backend/dependencies.py index 749c26b..26588ac 100644 --- a/src/backend/dependencies.py +++ b/src/backend/dependencies.py @@ -1,40 +1,60 @@ import jwt -from typing import Optional, Dict, Any +from typing import Optional, Dict, Any, Tuple from uuid import UUID +import os +import asyncio +from sqlalchemy.ext.asyncio import AsyncSession from fastapi import Request, HTTPException, Depends -from config import get_session, is_token_expired, refresh_token -from database.service import UserService +from cache import RedisClient +from domain.session import Session +from domain.user import User +from domain.pad import Pad from coder import CoderAPI +from database.database import get_session + +# oidc_config for session creation and user sessions +oidc_config = { + 'server_url': os.getenv('OIDC_SERVER_URL'), + 'realm': os.getenv('OIDC_REALM'), + 'client_id': os.getenv('OIDC_CLIENT_ID'), + 'client_secret': os.getenv('OIDC_CLIENT_SECRET'), + 'redirect_uri': os.getenv('REDIRECT_URI') +} + +async def get_session_domain() -> Session: + """Get a Session domain instance for the current request.""" + redis_client = await RedisClient.get_instance() + return Session(redis_client, oidc_config) class UserSession: """ Unified user session model that integrates authentication data with user information. This provides a single interface for accessing both token data and user details. 
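The service being deleted here tolerated concurrent user creation by catching the "already exists" error and re-reading. Condensed (names as in the removed UserService, error handling trimmed):

```
from uuid import UUID

async def sync_user(service, user_id: UUID, token_data: dict) -> dict:
    """Create-or-update sketch that survives a concurrent create."""
    user = await service.get_user(user_id)
    if user is None:
        try:
            return await service.create_user(user_id=user_id, **token_data)
        except ValueError as exc:
            if "already exists" not in str(exc):
                raise
            # Another request won the race; fall through to an update.
            user = await service.get_user(user_id)
    return await service.update_user_if_needed(user_id, token_data, user)
```

The new code keeps the same idea at login time: User.ensure_exists plus the duplicate-key catch in the auth callback further down.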
""" - def __init__(self, access_token: str, token_data: dict, user_id: UUID = None): + def __init__(self, access_token: str, token_data: dict, session_domain: Session, user_id: UUID = None): self.access_token = access_token self._user_data = None + self._session_domain = session_domain # Get the signing key and decode with verification - from config import get_jwks_client, OIDC_CLIENT_ID try: - jwks_client = get_jwks_client() + jwks_client = self._session_domain._get_jwks_client() signing_key = jwks_client.get_signing_key_from_jwt(access_token) self.token_data = jwt.decode( access_token, signing_key.key, algorithms=["RS256"], - audience=OIDC_CLIENT_ID + audience=oidc_config['client_id'] ) - + except jwt.InvalidTokenError as e: # Log the error and raise an appropriate exception print(f"Invalid token: {str(e)}") raise ValueError(f"Invalid authentication token: {str(e)}") - + @property def is_authenticated(self) -> bool: """Check if the session is authenticated""" @@ -84,12 +104,6 @@ def roles(self) -> list: def is_admin(self) -> bool: """Check if user has admin role""" return "admin" in self.roles - - async def get_user_data(self, user_service: UserService) -> Dict[str, Any]: - """Get user data from database, caching the result""" - if self._user_data is None and self.id: - self._user_data = await user_service.get_user(self.id) - return self._user_data class AuthDependency: """ @@ -104,27 +118,31 @@ async def __call__(self, request: Request) -> Optional[UserSession]: # Get session ID from cookies session_id = request.cookies.get('session_id') + # Get session domain instance + current_session_domain = await get_session_domain() + # Handle missing session ID if not session_id: return self._handle_auth_error("Not authenticated") # Get session data from Redis - session = get_session(session_id) - if not session: + session_data = await current_session_domain.get(session_id) + if not session_data: return self._handle_auth_error("Not authenticated") # Handle token expiration - if is_token_expired(session): + if current_session_domain.is_token_expired(session_data): # Try to refresh the token - success, new_session = await refresh_token(session_id, session) + success, new_session_data = await current_session_domain.refresh_token(session_id, session_data) if not success: return self._handle_auth_error("Session expired") - session = new_session + session_data = new_session_data # Create user session object user_session = UserSession( - access_token=session.get('access_token'), - token_data=session + access_token=session_data.get('access_token'), + token_data=session_data, + session_domain=current_session_domain ) # Check admin requirement if specified @@ -154,3 +172,47 @@ def get_coder_api(): Dependency that provides a CoderAPI instance. """ return CoderAPI() + +class PadAccess: + """ + Dependency for handling pad access control. 
+ Usage: + - require_pad_access = PadAccess() # For requiring any valid access + - require_pad_owner = PadAccess(require_owner=True) # For owner-only operations + """ + def __init__(self, require_owner: bool = False): + self.require_owner = require_owner + + async def __call__( + self, + pad_id: UUID, + user: UserSession = Depends(require_auth), + session: AsyncSession = Depends(get_session) + ) -> Tuple[Pad, UserSession]: + # Get the pad + pad = await Pad.get_by_id(session, pad_id) + if not pad: + raise HTTPException( + status_code=404, + detail="Pad not found" + ) + + # Check access permissions + if not pad.can_access(user.id): + raise HTTPException( + status_code=403, + detail="Not authorized to access this pad" + ) + + # Check owner requirement if specified + if self.require_owner and pad.owner_id != user.id: + raise HTTPException( + status_code=403, + detail="Only the pad owner can perform this operation" + ) + + return pad, user + +# Create dependency instances for pad access +require_pad_access = PadAccess() +require_pad_owner = PadAccess(require_owner=True) diff --git a/src/backend/domain/pad.py b/src/backend/domain/pad.py new file mode 100644 index 0000000..a482a1a --- /dev/null +++ b/src/backend/domain/pad.py @@ -0,0 +1,378 @@ +from uuid import UUID +from typing import Dict, Any, Optional, List +from datetime import datetime +from redis import RedisError +from sqlalchemy.ext.asyncio import AsyncSession +from config import default_pad +import json + +from cache import RedisClient +from database.models.pad_model import PadStore +from redis.asyncio import Redis as AsyncRedis + +class Pad: + """ + Domain entity representing a collaborative pad. + + This class contains the core business logic for pad manipulation, + manages the collaboration state, and provides methods for Redis + synchronization and database persistence. 
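The two module-level instances let routers declare access rules in one line. A sketch of how an endpoint would consume them (these example routes are illustrative; the real pad routes later in the diff follow the same shape):

```
from typing import Any, Dict, Tuple

from fastapi import APIRouter, Depends
from sqlalchemy.ext.asyncio import AsyncSession

from database.database import get_session
from dependencies import UserSession, require_pad_access, require_pad_owner
from domain.pad import Pad

router = APIRouter()

@router.get("/{pad_id}/sharing")
async def get_sharing(
    # Resolves the pad from the path, runs can_access(), and raises 403 otherwise.
    pad_access: Tuple[Pad, UserSession] = Depends(require_pad_access),
) -> Dict[str, Any]:
    pad, user = pad_access
    return {"policy": pad.sharing_policy, "is_owner": pad.owner_id == user.id}

@router.put("/{pad_id}/sharing")
async def set_sharing(
    update: Dict[str, str],
    pad_access: Tuple[Pad, UserSession] = Depends(require_pad_owner),
    session: AsyncSession = Depends(get_session),
) -> Dict[str, Any]:
    pad, _ = pad_access
    await pad.set_sharing_policy(session, update["policy"])
    return pad.to_dict()
```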
+ """ + + # Cache expiration time in seconds (1 hour) + CACHE_EXPIRY = 3600 + + def __init__( + self, + id: UUID, + owner_id: UUID, + display_name: str, + created_at: datetime, + updated_at: datetime, + store: PadStore, + redis: AsyncRedis, + data: Dict[str, Any] = None, + sharing_policy: str = "private", + whitelist: List[UUID] = None, + worker_id: Optional[str] = None, + ): + self.id = id + self.owner_id = owner_id + self.display_name = display_name + self.created_at = created_at or datetime.now() + self.updated_at = updated_at or datetime.now() + self._store = store + self._redis = redis + self.data = data or {} + self.sharing_policy = sharing_policy or "private" + self.whitelist = whitelist or [] + self.worker_id = worker_id # Cache-only field, not persisted to database + + @classmethod + async def create( + cls, + session: AsyncSession, + owner_id: UUID, + display_name: str, + data: Dict[str, Any] = default_pad, + sharing_policy: str = "private", + whitelist: List[UUID] = None, + ) -> 'Pad': + """Create a new pad with multi-user app state support""" + # Create a deep copy of the default template + pad_data = { + "files": data.get("files", {}), + "elements": data.get("elements", []), + "appState": { + str(owner_id): data.get("appState", {}) + } + } + + store = await PadStore.create_pad( + session=session, + owner_id=owner_id, + display_name=display_name, + data=pad_data, + sharing_policy=sharing_policy or "private", + whitelist=whitelist or [] + ) + redis = await RedisClient.get_instance() + pad = cls.from_store(store, redis) + + await pad.ensure_worker() + await pad.cache() + + return pad + + @classmethod + async def from_redis(cls, redis: AsyncRedis, pad_id: UUID) -> Optional['Pad']: + """Create a Pad instance from Redis cache data""" + cache_key = f"pad:{pad_id}" + + try: + if not await redis.exists(cache_key): + return None + + cached_data = await redis.hgetall(cache_key) + if not cached_data: + return None + + pad_id = UUID(cached_data['id']) + owner_id = UUID(cached_data['owner_id']) + display_name = cached_data['display_name'] + data = json.loads(cached_data['data']) + created_at = datetime.fromisoformat(cached_data['created_at']) + updated_at = datetime.fromisoformat(cached_data['updated_at']) + + # Get sharing_policy and whitelist (or use defaults if not in cache) + sharing_policy = cached_data.get('sharing_policy', 'private') + whitelist_str = cached_data.get('whitelist', '[]') + whitelist = [UUID(uid) for uid in json.loads(whitelist_str)] if whitelist_str else [] + # Get worker_id from cache (cache-only field) + worker_id = cached_data.get('worker_id', None) + + # Create a minimal PadStore instance + store = PadStore( + id=pad_id, + owner_id=owner_id, + display_name=display_name, + data=data, + created_at=created_at, + updated_at=updated_at, + sharing_policy=sharing_policy, + whitelist=whitelist + ) + + return cls( + id=pad_id, + owner_id=owner_id, + display_name=display_name, + data=data, + created_at=created_at, + updated_at=updated_at, + store=store, + redis=redis, + sharing_policy=sharing_policy, + whitelist=whitelist, + worker_id=worker_id + ) + except (json.JSONDecodeError, KeyError, ValueError, RedisError) as e: + return None + except Exception as e: + print(f"Unexpected error retrieving pad from cache: {str(e)}") + return None + + @classmethod + async def get_by_id(cls, session: AsyncSession, pad_id: UUID) -> Optional['Pad']: + """Get a pad by ID, first trying Redis cache then falling back to database""" + redis = await RedisClient.get_instance() + + # Try to get 
from cache first + pad = await cls.from_redis(redis, pad_id) + if pad: + await pad.ensure_worker() + return pad + + # Fall back to database + store = await PadStore.get_by_id(session, pad_id) + if store: + pad = cls.from_store(store, redis) + await pad.ensure_worker() + await pad.cache() + return pad + return None + + @classmethod + def from_store(cls, store: PadStore, redis: AsyncRedis) -> 'Pad': + """Create a Pad instance from a store""" + return cls( + id=store.id, + owner_id=store.owner_id, + display_name=store.display_name, + data=store.data, + created_at=store.created_at, + updated_at=store.updated_at, + store=store, + redis=redis, + sharing_policy=store.sharing_policy or "private", + whitelist=store.whitelist or [] + ) + + async def save(self, session: AsyncSession) -> 'Pad': + """Save the pad to the database and update cache""" + self._store.display_name = self.display_name + self._store.data = self.data + self._store.sharing_policy = self.sharing_policy + self._store.whitelist = self.whitelist + self._store.updated_at = datetime.now() + self._store = await self._store.save(session) + + await self.cache() + return self + + async def rename(self, session: AsyncSession, new_display_name: str) -> 'Pad': + """Rename the pad by updating its display name""" + self.display_name = new_display_name + self.updated_at = datetime.now() + if self._store: + self._store.display_name = new_display_name + self._store.updated_at = self.updated_at + self._store.sharing_policy = self.sharing_policy + self._store.whitelist = self.whitelist + self._store = await self._store.save(session) + + await self.cache() + + return self + + async def delete(self, session: AsyncSession) -> bool: + """Delete the pad from both database and cache""" + await self.release_worker() + + success = await self._store.delete(session) + if success: + await self.invalidate_cache() + else: + print(f"Failed to delete pad {self.id} from database") + return False + + print(f"Deleted pad {self.id} from database and cache") + return success + + async def cache(self) -> None: + """Cache the pad data in Redis using hash structure""" + + cache_key = f"pad:{self.id}" + + cache_data = { + 'id': str(self.id), + 'owner_id': str(self.owner_id), + 'display_name': self.display_name, + 'data': json.dumps(self.data), + 'created_at': self.created_at.isoformat(), + 'updated_at': self.updated_at.isoformat(), + 'sharing_policy': self.sharing_policy, + 'whitelist': json.dumps([str(uid) for uid in self.whitelist]) if self.whitelist else '[]', + 'worker_id': self.worker_id or '' # Cache-only field + } + try: + async with self._redis.pipeline() as pipe: + await pipe.hset(cache_key, mapping=cache_data) + await pipe.expire(cache_key, self.CACHE_EXPIRY) + await pipe.execute() + except Exception as e: + print(f"Error caching pad {self.id}: {str(e)}") + + async def invalidate_cache(self) -> None: + """Remove the pad from Redis cache""" + cache_key = f"pad:{self.id}" + await self._redis.delete(cache_key) + + async def set_sharing_policy(self, session: AsyncSession, policy: str) -> 'Pad': + """Update the sharing policy of the pad""" + if policy not in ["private", "whitelist", "public"]: + raise ValueError("Invalid sharing policy") + + print(f"Changing sharing policy for pad {self.id} from {self.sharing_policy} to {policy}") + self.sharing_policy = policy + self._store.sharing_policy = policy + self.updated_at = datetime.now() + self._store.updated_at = self.updated_at + + await self._store.save(session) + await self.cache() + + return self + + async def 
add_to_whitelist(self, session: AsyncSession, user_id: UUID) -> 'Pad': + """Add a user to the pad's whitelist""" + if user_id not in self.whitelist: + self.whitelist.append(user_id) + self._store.whitelist = self.whitelist + self.updated_at = datetime.now() + self._store.updated_at = self.updated_at + + await self._store.save(session) + await self.cache() + + return self + + async def remove_from_whitelist(self, session: AsyncSession, user_id: UUID) -> 'Pad': + """Remove a user from the pad's whitelist""" + if user_id in self.whitelist: + self.whitelist.remove(user_id) + self._store.whitelist = self.whitelist + self.updated_at = datetime.now() + self._store.updated_at = self.updated_at + + await self._store.save(session) + await self.cache() + + return self + + def can_access(self, user_id: UUID) -> bool: + """Check if a user can access the pad""" + if self.owner_id == user_id: + return True + if self.sharing_policy == "public": + return True + if self.sharing_policy == "whitelist": + return user_id in self.whitelist + return False + + async def ensure_worker(self) -> bool: + """Ensure a worker is assigned to this pad and processing updates""" + from workers.canvas_worker import CanvasWorker + + # If we already have a worker assigned, check if it's still active + if self.worker_id and self.worker_id.strip(): + # TODO: Add worker health check if needed + return True + + # Get the canvas worker instance and assign it to this pad + canvas_worker = await CanvasWorker.get_instance() + success = await canvas_worker.start_processing_pad(self.id) + + if success: + self.worker_id = canvas_worker.worker_id + # Update cache with new worker assignment + await self.cache() + print(f"Assigned worker {self.worker_id[:8]} to pad {self.id}") + return True + else: + print(f"Failed to assign worker to pad {self.id}") + return False + + async def assign_worker(self, worker_id: str) -> None: + """Assign a specific worker to this pad (cache-only)""" + self.worker_id = worker_id + await self.cache() + + async def release_worker(self) -> None: + """Release the worker from this pad""" + if self.worker_id: + from workers.canvas_worker import CanvasWorker + canvas_worker = await CanvasWorker.get_instance() + await canvas_worker.stop_processing_pad(self.id) + + old_worker_id = self.worker_id + self.worker_id = None + await self.cache() + print(f"Released worker {old_worker_id[:8]} from pad {self.id}") + + async def get_connected_users(self) -> List[Dict[str, str]]: + """Get all connected users from the pad users hash as a list of dicts with user_id and username.""" + key = f"pad:users:{self.id}" + try: + # Get all users from the hash + all_users = await self._redis.hgetall(key) + + # Convert to list of dicts with user_id and username + connected_users = [] + for user_id, user_data_str in all_users.items(): + user_id_str = user_id.decode() if isinstance(user_id, bytes) else user_id + user_data = json.loads(user_data_str.decode() if isinstance(user_data_str, bytes) else user_data_str) + connected_users.append({ + "user_id": user_id_str, + "username": user_data["username"] + }) + + return connected_users + except Exception as e: + print(f"Error getting connected users from Redis for pad {self.id}: {e}") + return [] + + def to_dict(self) -> Dict[str, Any]: + """Convert to dictionary representation""" + return { + "id": str(self.id), + "owner_id": str(self.owner_id), + "display_name": self.display_name, + "data": self.data, + "sharing_policy": self.sharing_policy, + "whitelist": [str(uid) for uid in self.whitelist], + 
"created_at": self.created_at.isoformat(), + "updated_at": self.updated_at.isoformat(), + "worker_id": self.worker_id if self.worker_id else None + } + \ No newline at end of file diff --git a/src/backend/domain/session.py b/src/backend/domain/session.py new file mode 100644 index 0000000..45fbe49 --- /dev/null +++ b/src/backend/domain/session.py @@ -0,0 +1,232 @@ +from typing import Optional, Dict, Any, Tuple +import json +import time +import jwt +from jwt.jwks_client import PyJWKClient +import httpx +from redis.asyncio import Redis as AsyncRedis + +class Session: + """Domain class for managing user sessions""" + + def __init__(self, redis_client: AsyncRedis, oidc_config: Dict[str, str]): + """ + Initialize a new Session instance. + + Args: + redis_client: The Redis client to use for session storage + oidc_config: Configuration for the OIDC provider + """ + self.redis_client = redis_client + self.oidc_config = oidc_config + self._jwks_client = None + + async def get(self, session_id: str) -> Optional[Dict[str, Any]]: + """ + Get session data from Redis. + + Args: + session_id: The session ID to retrieve + + Returns: + The session data or None if not found + """ + try: + session_data = await self.redis_client.get(f"session:{session_id}") + if session_data: + return json.loads(session_data) + except json.JSONDecodeError as e: + print(f"Error decoding session data for {session_id}: {str(e)}") + except Exception as e: + print(f"Error retrieving session {session_id}: {str(e)}") + return None + + async def set(self, session_id: str, data: Dict[str, Any], expiry: int) -> bool: + """ + Store session data in Redis with expiry in seconds. + + Args: + session_id: The session ID to store + data: The session data to store + expiry: Time to live in seconds + + Returns: + True if successful, False otherwise + """ + try: + await self.redis_client.setex( + f"session:{session_id}", + expiry, + json.dumps(data) + ) + return True + except Exception as e: + print(f"Error storing session {session_id}: {str(e)}") + return False + + async def delete(self, session_id: str) -> bool: + """ + Delete session data from Redis. + + Args: + session_id: The session ID to delete + + Returns: + True if successful, False otherwise + """ + try: + await self.redis_client.delete(f"session:{session_id}") + return True + except Exception as e: + print(f"Error deleting session {session_id}: {str(e)}") + return False + + def get_auth_url(self) -> str: + """ + Generate the authentication URL for OIDC login. + + Returns: + The authentication URL + """ + auth_url = f"{self.oidc_config['server_url']}/realms/{self.oidc_config['realm']}/protocol/openid-connect/auth" + params = { + 'client_id': self.oidc_config['client_id'], + 'response_type': 'code', + 'redirect_uri': self.oidc_config['redirect_uri'], + 'scope': 'openid profile email' + } + return f"{auth_url}?{'&'.join(f'{k}={v}' for k,v in params.items())}" + + def get_token_url(self) -> str: + """ + Get the token endpoint URL. + + Returns: + The token endpoint URL + """ + return f"{self.oidc_config['server_url']}/realms/{self.oidc_config['realm']}/protocol/openid-connect/token" + + def is_token_expired(self, token_data: Dict[str, Any], buffer_seconds: int = 30) -> bool: + """ + Check if the access token is expired. 
+ + Args: + token_data: The token data to check + buffer_seconds: Buffer time in seconds before actual expiration + + Returns: + True if the token is expired, False otherwise + """ + if not token_data or 'access_token' not in token_data: + return True + + try: + # Get the signing key + jwks_client = self._get_jwks_client() + signing_key = jwks_client.get_signing_key_from_jwt(token_data['access_token']) + + # Decode with verification + decoded = jwt.decode( + token_data['access_token'], + signing_key.key, + algorithms=["RS256"], + audience=self.oidc_config['client_id'], + ) + + # Check expiration + exp_time = decoded.get('exp', 0) + current_time = time.time() + return current_time + buffer_seconds >= exp_time + except jwt.ExpiredSignatureError: + return True + except Exception as e: + print(f"Error checking token expiration: {str(e)}") + return True + + async def refresh_token(self, session_id: str, token_data: Dict[str, Any]) -> Tuple[bool, Dict[str, Any]]: + """ + Refresh the access token using the refresh token. + + Args: + session_id: The session ID + token_data: The current token data containing the refresh token + + Returns: + Tuple of (success, token_data) + """ + if not token_data or 'refresh_token' not in token_data: + return False, token_data + + try: + async with httpx.AsyncClient() as client: + refresh_response = await client.post( + self.get_token_url(), + data={ + 'grant_type': 'refresh_token', + 'client_id': self.oidc_config['client_id'], + 'client_secret': self.oidc_config['client_secret'], + 'refresh_token': token_data['refresh_token'] + } + ) + + if refresh_response.status_code != 200: + print(f"Token refresh failed: {refresh_response.text}") + return False, token_data + + # Get new token data + new_token_data = refresh_response.json() + + # Update session with new tokens + expiry = new_token_data['refresh_expires_in'] + success = await self.set(session_id, new_token_data, expiry) + if not success: + return False, token_data + + return True, new_token_data + except Exception as e: + print(f"Error refreshing token: {str(e)}") + return False, token_data + + def _get_jwks_client(self) -> PyJWKClient: + """ + Get or create a PyJWKClient for token verification. + + Returns: + The JWKs client + """ + if self._jwks_client is None: + jwks_url = f"{self.oidc_config['server_url']}/realms/{self.oidc_config['realm']}/protocol/openid-connect/certs" + self._jwks_client = PyJWKClient(jwks_url) + return self._jwks_client + + async def track_event(self, session_id: str, event_type: str, metadata: Dict[str, Any] = None) -> bool: + """ + Track a session event (login, logout, etc.). 
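Combined with the dependency layer above, the per-request flow is: load the session, check exp with a small buffer, and refresh transparently when it has lapsed. Roughly:

```
from typing import Optional

async def resolve_session(session_domain, session_id: str) -> Optional[dict]:
    """Sketch of the check-then-refresh path AuthDependency runs per request."""
    data = await session_domain.get(session_id)
    if not data:
        return None  # no session: not authenticated
    if session_domain.is_token_expired(data):
        ok, data = await session_domain.refresh_token(session_id, data)
        if not ok:
            return None  # refresh rejected: surfaces as the "Session expired" auth error
    return data
```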
+ + Args: + session_id: The session ID + event_type: The type of event + metadata: Additional metadata for the event + + Returns: + True if successful, False otherwise + """ + try: + session_data = await self.get(session_id) + if session_data: + if 'events' not in session_data: + session_data['events'] = [] + + event = { + 'type': event_type, + 'timestamp': time.time(), + 'metadata': metadata or {} + } + session_data['events'].append(event) + + # Update session with new event + return await self.set(session_id, session_data, session_data.get('expires_in', 3600)) + return False + except Exception as e: + print(f"Error tracking event {event_type} for session {session_id}: {str(e)}") + return False \ No newline at end of file diff --git a/src/backend/domain/user.py b/src/backend/domain/user.py new file mode 100644 index 0000000..b139518 --- /dev/null +++ b/src/backend/domain/user.py @@ -0,0 +1,201 @@ +from uuid import UUID +from typing import Dict, Any, Optional, List +from datetime import datetime +from sqlalchemy.ext.asyncio import AsyncSession + +from database.models.user_model import UserStore + + +class User: + """ + Domain entity representing a user. + + This class contains the core business logic for user management + and provides methods for database persistence. + """ + + def __init__( + self, + id: UUID, + username: str, + email: str, + email_verified: bool = False, + name: Optional[str] = None, + given_name: Optional[str] = None, + family_name: Optional[str] = None, + roles: List[str] = None, + last_selected_pad: Optional[UUID] = None, + created_at: datetime = None, + updated_at: datetime = None, + store: UserStore = None + ): + self.id = id + self.username = username + self.email = email + self.email_verified = email_verified + self.name = name + self.given_name = given_name + self.family_name = family_name + self.roles = roles or [] + self.last_selected_pad = last_selected_pad + self.created_at = created_at or datetime.now() + self.updated_at = updated_at or datetime.now() + self._store = store + + @classmethod + async def create( + cls, + session: AsyncSession, + id: UUID, + username: str, + email: str, + email_verified: bool = False, + name: Optional[str] = None, + given_name: Optional[str] = None, + family_name: Optional[str] = None, + roles: List[str] = None, + last_selected_pad: Optional[UUID] = None + ) -> 'User': + """Create a new user""" + store = await UserStore.create_user( + session=session, + id=id, + username=username, + email=email, + email_verified=email_verified, + name=name, + given_name=given_name, + family_name=family_name, + roles=roles, + last_selected_pad=last_selected_pad + ) + return cls.from_store(store) + + @classmethod + async def get_by_id(cls, session: AsyncSession, user_id: UUID) -> Optional['User']: + """Get a user by ID""" + store = await UserStore.get_by_id(session, user_id) + return cls.from_store(store) if store else None + + @classmethod + def from_store(cls, store: UserStore) -> 'User': + """Create a User instance from a store""" + return cls( + id=store.id, + username=store.username, + email=store.email, + email_verified=store.email_verified, + name=store.name, + given_name=store.given_name, + family_name=store.family_name, + roles=store.roles, + last_selected_pad=store.last_selected_pad, + created_at=store.created_at, + updated_at=store.updated_at, + store=store + ) + + async def save(self, session: AsyncSession) -> 'User': + """Save the user to the database""" + if not self._store: + self._store = UserStore( + id=self.id, + 
username=self.username, + email=self.email, + email_verified=self.email_verified, + name=self.name, + given_name=self.given_name, + family_name=self.family_name, + roles=self.roles, + last_selected_pad=self.last_selected_pad, + created_at=self.created_at, + updated_at=self.updated_at + ) + else: + self._store.username = self.username + self._store.email = self.email + self._store.email_verified = self.email_verified + self._store.name = self.name + self._store.given_name = self.given_name + self._store.family_name = self.family_name + self._store.roles = self.roles + self._store.last_selected_pad = self.last_selected_pad + self._store.updated_at = datetime.now() + + self._store = await self._store.save(session) + self.id = self._store.id + self.created_at = self._store.created_at + self.updated_at = self._store.updated_at + return self + + async def update(self, session: AsyncSession, data: Dict[str, Any]) -> 'User': + """Update user data""" + for key, value in data.items(): + setattr(self, key, value) + self.updated_at = datetime.now() + if self._store: + self._store = await self._store.update(session, data) + return self + + async def delete(self, session: AsyncSession) -> bool: + """Delete the user""" + if self._store: + return await self._store.delete(session) + return False + + def to_dict(self) -> Dict[str, Any]: + """Convert to dictionary representation""" + return { + "id": str(self.id), + "username": self.username, + "email": self.email, + "email_verified": self.email_verified, + "name": self.name, + "given_name": self.given_name, + "family_name": self.family_name, + "roles": self.roles, + "last_selected_pad": str(self.last_selected_pad) if self.last_selected_pad else None, + "created_at": self.created_at.isoformat(), + "updated_at": self.updated_at.isoformat() + } + + @classmethod + async def get_open_pads(cls, session: AsyncSession, user_id: UUID) -> List[Dict[str, Any]]: + """Get just the metadata of pads owned by the user without loading full pad data""" + return await UserStore.get_open_pads(session, user_id) + + @classmethod + async def ensure_exists(cls, session: AsyncSession, user_info: dict) -> 'User': + """Ensure a user exists in the database, creating them if they don't""" + user_id = UUID(user_info['sub']) + user = await cls.get_by_id(session, user_id) + + if not user: + print(f"Creating user {user_id}, {user_info.get('preferred_username', '')}") + user = await cls.create( + session=session, + id=user_id, + username=user_info.get('preferred_username', ''), + email=user_info.get('email', ''), + email_verified=user_info.get('email_verified', False), + name=user_info.get('name'), + given_name=user_info.get('given_name'), + family_name=user_info.get('family_name'), + roles=user_info.get('realm_access', {}).get('roles', []), + last_selected_pad=None + ) + + return user + + async def remove_open_pad(self, session: AsyncSession, pad_id: UUID) -> 'User': + """Remove a pad from the user's open_pads list""" + if self._store and pad_id in self._store.open_pads: + self._store = await self._store.remove_open_pad(session, pad_id) + return self + + async def set_last_selected_pad(self, session: AsyncSession, pad_id: UUID) -> 'User': + """Set the last selected pad for the user""" + self.last_selected_pad = pad_id + if self._store: + self._store = await self._store.set_last_selected_pad(session, pad_id) + return self \ No newline at end of file diff --git a/src/backend/main.py b/src/backend/main.py index 6f07ee0..9c4fabe 100644 --- a/src/backend/main.py +++ b/src/backend/main.py @@ 
-2,104 +2,69 @@ import json from contextlib import asynccontextmanager from typing import Optional +from uuid import UUID +from sqlalchemy.ext.asyncio import AsyncSession import posthog -from fastapi import FastAPI, Request, Depends +import httpx +from fastapi import FastAPI, Request, Depends, Response from fastapi.responses import FileResponse from fastapi.middleware.cors import CORSMiddleware from fastapi.staticfiles import StaticFiles -from database import init_db -from config import STATIC_DIR, ASSETS_DIR, POSTHOG_API_KEY, POSTHOG_HOST, redis_client, redis_pool +from database import init_db, engine +from config import ( + STATIC_DIR, ASSETS_DIR, POSTHOG_API_KEY, POSTHOG_HOST, + PAD_DEV_MODE, DEV_FRONTEND_URL +) +from cache import RedisClient from dependencies import UserSession, optional_auth from routers.auth_router import auth_router -from routers.user_router import user_router +from routers.users_router import users_router from routers.workspace_router import workspace_router from routers.pad_router import pad_router -from routers.template_pad_router import template_pad_router from routers.app_router import app_router -from database.service import TemplatePadService -from database.database import async_session +from routers.ws_router import ws_router +from database.database import get_session +from database.models.user_model import UserStore +from domain.pad import Pad +from workers.canvas_worker import CanvasWorker +from domain.user import User # Initialize PostHog if API key is available if POSTHOG_API_KEY: posthog.project_api_key = POSTHOG_API_KEY posthog.host = POSTHOG_HOST -async def load_templates(): - """ - Load all templates from the templates directory into the database if they don't exist. +@asynccontextmanager +async def lifespan(app: FastAPI): + """Manage the lifecycle of the application and its services.""" - This function reads all JSON files in the templates directory, extracts the display name - from the "appState.pad.displayName" field, uses the filename as the name, and stores - the entire JSON as the data. 
- """ - try: - # Get a session and template service - async with async_session() as session: - template_service = TemplatePadService(session) - - # Get the templates directory path - templates_dir = os.path.join(os.path.dirname(__file__), "templates") - - # Iterate through all JSON files in the templates directory - for filename in os.listdir(templates_dir): - if filename.endswith(".json"): - # Use the filename without extension as the name - name = os.path.splitext(filename)[0] - - # Check if template already exists - existing_template = await template_service.get_template_by_name(name) - - if not existing_template: - - file_path = os.path.join(templates_dir, filename) - - # Read the JSON file - with open(file_path, 'r') as f: - template_data = json.load(f) - - # Extract the display name from the JSON - display_name = template_data.get("appState", {}).get("pad", {}).get("displayName", "Untitled") - - # Create the template if it doesn't exist - await template_service.create_template( - name=name, - display_name=display_name, - data=template_data - ) - print(f"Added template: {name} ({display_name})") - else: - print(f"Template already in database: '{name}'") - - except Exception as e: - print(f"Error loading templates: {str(e)}") + if PAD_DEV_MODE: + print("Starting in dev mode") -@asynccontextmanager -async def lifespan(_: FastAPI): # Initialize database await init_db() print("Database connection established successfully") - redis_client.ping() + # Initialize Redis client and verify connection + redis = await RedisClient.get_instance() + await redis.ping() print("Redis connection established successfully") - # Load all templates from the templates directory - await load_templates() - print("Templates loaded successfully") + # Initialize the canvas worker + canvas_worker = await CanvasWorker.get_instance() + print("Canvas worker started successfully") yield - # Clean up connections when shutting down - try: - redis_pool.disconnect() - print("Redis connections closed") - except Exception as e: - print(f"Error closing Redis connections: {str(e)}") + # Shutdown + await CanvasWorker.shutdown_instance() + await redis.close() + await engine.dispose() app = FastAPI(lifespan=lifespan) -# CORS middleware setup app.add_middleware( CORSMiddleware, allow_origins=["*"], @@ -111,17 +76,97 @@ async def lifespan(_: FastAPI): app.mount("/assets", StaticFiles(directory=ASSETS_DIR), name="assets") app.mount("/static", StaticFiles(directory=STATIC_DIR), name="static") +async def serve_index_html(request: Request = None, response: Response = None, pad_id: Optional[UUID] = None): + """ + Helper function to serve the index.html file or proxy to dev server based on PAD_DEV_MODE. + Optionally sets a pending_pad_id cookie if pad_id is provided. 
+ """ + + if PAD_DEV_MODE: + try: + # Proxy the request to the development server's root URL + url = f"{DEV_FRONTEND_URL}/" + # If request path is available, use it for proxying + if request and str(request.url).replace(str(request.base_url), ""): + url = f"{DEV_FRONTEND_URL}{request.url.path}" + + async with httpx.AsyncClient() as client: + proxy_response = await client.get(url) + # Create a new response with the proxied content + final_response = Response( + content=proxy_response.content, + status_code=proxy_response.status_code, + media_type=proxy_response.headers.get("content-type") + ) + + # Set cookie if pad_id is provided + if pad_id is not None: + final_response.set_cookie( + key="pending_pad_id", + value=str(pad_id), + httponly=True, + secure=True, + samesite="lax" + ) + + return final_response + except Exception as e: + error_message = f"Error proxying to dev server: {e}" + print(error_message) + return Response(content=error_message, status_code=500) + else: + # For production, serve the static build + file_response = FileResponse(os.path.join(STATIC_DIR, "index.html")) + + # Set cookie if pad_id is provided + if pad_id is not None: + file_response.set_cookie( + key="pending_pad_id", + value=str(pad_id), + httponly=True, + secure=True, + samesite="lax" + ) + + return file_response + +@app.get("/pad/{pad_id}") +async def read_pad( + pad_id: UUID, + request: Request, + response: Response, + user: Optional[UserSession] = Depends(optional_auth), + session: AsyncSession = Depends(get_session) +): + if not user: + return await serve_index_html(request, response, pad_id) + + try: + pad = await Pad.get_by_id(session, pad_id) + if not pad: + print("No pad found") + return await serve_index_html(request, response) + + if not pad.can_access(user.id): + print("No access to pad") + return await serve_index_html(request, response) + + # Just serve the page if user has access + return await serve_index_html(request, response, pad_id) + except Exception as e: + print(f"Error in read_pad endpoint: {e}") + return await serve_index_html(request, response, pad_id) + @app.get("/") async def read_root(request: Request, auth: Optional[UserSession] = Depends(optional_auth)): - return FileResponse(os.path.join(STATIC_DIR, "index.html")) + return await serve_index_html(request) -# Include routers in the main app with the /api prefix -app.include_router(auth_router, prefix="/auth") -app.include_router(user_router, prefix="/api/users") +app.include_router(auth_router, prefix="/api/auth") +app.include_router(users_router, prefix="/api/users") app.include_router(workspace_router, prefix="/api/workspace") app.include_router(pad_router, prefix="/api/pad") -app.include_router(template_pad_router, prefix="/api/templates") app.include_router(app_router, prefix="/api/app") +app.include_router(ws_router) if __name__ == "__main__": import uvicorn diff --git a/src/backend/requirements.txt b/src/backend/requirements.txt index d82093d..eb93c29 100644 --- a/src/backend/requirements.txt +++ b/src/backend/requirements.txt @@ -1,5 +1,5 @@ fastapi -uvicorn +uvicorn[standard] httpx jinja2 asyncpg @@ -11,4 +11,5 @@ posthog redis psycopg2-binary python-multipart -cryptography # Required for JWT key handling +websockets +cryptography diff --git a/src/backend/routers/app_router.py b/src/backend/routers/app_router.py index 2070224..f053808 100644 --- a/src/backend/routers/app_router.py +++ b/src/backend/routers/app_router.py @@ -7,21 +7,22 @@ app_router = APIRouter() -@app_router.get("/build-info") -async def get_build_info(): - 
""" - Return the current build information from the static assets - """ - try: - # Read the build-info.json file that will be generated during build - build_info_path = os.path.join(STATIC_DIR, "build-info.json") - with open(build_info_path, 'r') as f: - build_info = json.load(f) - return build_info - except Exception as e: - # Return a default response if file doesn't exist - print(f"Error reading build-info.json: {str(e)}") - return {"buildHash": "development", "timestamp": int(time.time())} +# TODO: Add build info back in +# @app_router.get("/build-info") +# async def get_build_info(): +# """ +# Return the current build information from the static assets +# """ +# try: +# # Read the build-info.json file that will be generated during build +# build_info_path = os.path.join(STATIC_DIR, "build-info.json") +# with open(build_info_path, 'r') as f: +# build_info = json.load(f) +# return build_info +# except Exception as e: +# # Return a default response if file doesn't exist +# print(f"Error reading build-info.json: {str(e)}") +# return {"buildHash": "development", "timestamp": int(time.time())} @app_router.get("/config") async def get_app_config(): diff --git a/src/backend/routers/auth_router.py b/src/backend/routers/auth_router.py index 2212124..f33968f 100644 --- a/src/backend/routers/auth_router.py +++ b/src/backend/routers/auth_router.py @@ -4,20 +4,30 @@ from fastapi import APIRouter, Request, HTTPException, Depends from fastapi.responses import RedirectResponse, FileResponse, JSONResponse import os +from typing import Optional +import time -from config import (get_auth_url, get_token_url, set_session, delete_session, get_session, - FRONTEND_URL, OIDC_CLIENT_ID, OIDC_CLIENT_SECRET, OIDC_SERVER_URL, OIDC_REALM, OIDC_REDIRECT_URI, STATIC_DIR) -from dependencies import get_coder_api +from config import (FRONTEND_URL, STATIC_DIR) +from dependencies import get_coder_api, get_session_domain from coder import CoderAPI +from dependencies import optional_auth, UserSession +from domain.session import Session +from database.database import async_session +from domain.user import User auth_router = APIRouter() @auth_router.get("/login") -async def login(request: Request, kc_idp_hint: str = None, popup: str = None): +async def login( + request: Request, + session_domain: Session = Depends(get_session_domain), + kc_idp_hint: str = None, + popup: str = None +): session_id = secrets.token_urlsafe(32) - auth_url = get_auth_url() + auth_url = session_domain.get_auth_url() state = "popup" if popup == "1" else "default" if kc_idp_hint: @@ -36,23 +46,23 @@ async def callback( request: Request, code: str, state: str = "default", - coder_api: CoderAPI = Depends(get_coder_api) + coder_api: CoderAPI = Depends(get_coder_api), + session_domain: Session = Depends(get_session_domain) ): session_id = request.cookies.get('session_id') if not session_id: - print("No session ID found") raise HTTPException(status_code=400, detail="No session") # Exchange code for token async with httpx.AsyncClient() as client: token_response = await client.post( - get_token_url(), + session_domain.get_token_url(), data={ 'grant_type': 'authorization_code', - 'client_id': OIDC_CLIENT_ID, - 'client_secret': OIDC_CLIENT_SECRET, + 'client_id': session_domain.oidc_config['client_id'], + 'client_secret': session_domain.oidc_config['client_secret'], 'code': code, - 'redirect_uri': OIDC_REDIRECT_URI + 'redirect_uri': session_domain.oidc_config['redirect_uri'] } ) @@ -60,11 +70,30 @@ async def callback( raise HTTPException(status_code=400, 
detail="Auth failed") token_data = token_response.json() - expiry = token_data['expires_in'] - set_session(session_id, token_data, expiry) + expiry = token_data['refresh_expires_in'] + + # Store the token data in Redis + success = await session_domain.set(session_id, token_data, expiry) + if not success: + raise HTTPException(status_code=500, detail="Failed to store session") + + # Track the login event + await session_domain.track_event(session_id, 'login') + access_token = token_data['access_token'] user_info = jwt.decode(access_token, options={"verify_signature": False}) + # Ensure user exists in database (only during login) + async with async_session() as db_session: + try: + await User.ensure_exists(db_session, user_info) + except Exception as e: + # Handle duplicate key violations gracefully - this means user already exists + if "duplicate key value violates unique constraint" in str(e) or "already exists" in str(e): + print(f"User {user_info.get('sub')} already exists in database (race condition handled)") + else: + raise e + try: user_data, _ = coder_api.ensure_user_exists( user_info @@ -80,23 +109,90 @@ async def callback( return RedirectResponse('/') @auth_router.get("/logout") -async def logout(request: Request): +async def logout(request: Request, session_domain: Session = Depends(get_session_domain)): session_id = request.cookies.get('session_id') - session_data = get_session(session_id) + if not session_id: + return RedirectResponse('/') + + session_data = await session_domain.get(session_id) if not session_data: return RedirectResponse('/') id_token = session_data.get('id_token', '') + # Track logout event before deleting session + await session_domain.track_event(session_id, 'logout') + # Delete the session from Redis - delete_session(session_id) + success = await session_domain.delete(session_id) + if not success: + print(f"Warning: Failed to delete session {session_id}") # Create the Keycloak logout URL with redirect back to our app - logout_url = f"{OIDC_SERVER_URL}/realms/{OIDC_REALM}/protocol/openid-connect/logout" + logout_url = f"{session_domain.oidc_config['server_url']}/realms/{session_domain.oidc_config['realm']}/protocol/openid-connect/logout" full_logout_url = f"{logout_url}?id_token_hint={id_token}&post_logout_redirect_uri={FRONTEND_URL}" - # Create a redirect response to Keycloak's logout endpoint + # Create a response with the logout URL and clear the session cookie response = JSONResponse({"status": "success", "logout_url": full_logout_url}) + response.delete_cookie( + key="session_id", + path="/", + secure=True, + httponly=True, + samesite="lax" + ) return response + +@auth_router.get("/status") +async def auth_status( + user_session: Optional[UserSession] = Depends(optional_auth) +): + """Check if the user is authenticated and return session information""" + if not user_session: + return JSONResponse({ + "authenticated": False, + "message": "Not authenticated" + }) + + try: + expires_in = user_session.token_data.get('exp') - time.time() + + return JSONResponse({ + "authenticated": True, + "user": { + "id": str(user_session.id), + "username": user_session.username, + "email": user_session.email, + "name": user_session.name + }, + "expires_in": expires_in + }) + except Exception as e: + return JSONResponse({ + "authenticated": False, + "message": f"Error processing session: {str(e)}" + }) + +@auth_router.post("/refresh") +async def refresh_session(request: Request, session_domain: Session = Depends(get_session_domain)): + """Refresh the current session's 
access token""" + session_id = request.cookies.get('session_id') + if not session_id: + raise HTTPException(status_code=401, detail="No session found") + + session_data = await session_domain.get(session_id) + if not session_data: + raise HTTPException(status_code=401, detail="Invalid session") + + # Try to refresh the token + success, new_token_data = await session_domain.refresh_token(session_id, session_data) + if not success: + raise HTTPException(status_code=401, detail="Failed to refresh session") + + # Return the new expiry time + return JSONResponse({ + "expires_in": new_token_data.get('expires_in'), + "authenticated": True + }) \ No newline at end of file diff --git a/src/backend/routers/pad_router.py b/src/backend/routers/pad_router.py index de3b669..23ef86e 100644 --- a/src/backend/routers/pad_router.py +++ b/src/backend/routers/pad_router.py @@ -1,381 +1,163 @@ from uuid import UUID -from typing import Dict, Any +from typing import Dict, Any, Tuple -from fastapi import APIRouter, HTTPException, Depends, Request -from fastapi.responses import JSONResponse +from fastapi import APIRouter, Depends, HTTPException +from sqlalchemy.ext.asyncio import AsyncSession +from pydantic import BaseModel + +from dependencies import UserSession, require_auth, require_pad_access, require_pad_owner +from database.models import PadStore +from database.database import get_session +from domain.pad import Pad +from domain.user import User -from dependencies import UserSession, require_auth -from database import get_pad_service, get_backup_service, get_template_pad_service -from database.service import PadService, BackupService, TemplatePadService -from config import MAX_BACKUPS_PER_USER, MIN_INTERVAL_MINUTES, DEFAULT_PAD_NAME, DEFAULT_TEMPLATE_NAME pad_router = APIRouter() -def ensure_pad_metadata(data: Dict[str, Any], pad_id: str, display_name: str) -> Dict[str, Any]: - """ - Ensure the pad metadata (uniqueId and displayName) is set in the data. - - Args: - data: The pad data to modify - pad_id: The pad ID to set as uniqueId - display_name: The display name to set - - Returns: - The modified data - """ - # Ensure the appState and pad objects exist - if "appState" not in data: - data["appState"] = {} - if "pad" not in data["appState"]: - data["appState"]["pad"] = {} - - # Set the uniqueId to match the database ID - data["appState"]["pad"]["uniqueId"] = str(pad_id) - data["appState"]["pad"]["displayName"] = display_name - - return data +# Request models +class RenameRequest(BaseModel): + display_name: str +class SharingPolicyUpdate(BaseModel): + policy: str # "private", "whitelist", or "public" -@pad_router.post("") -async def update_first_pad( - data: Dict[str, Any], +class WhitelistUpdate(BaseModel): + user_id: UUID + +@pad_router.post("/new") +async def create_new_pad( user: UserSession = Depends(require_auth), - pad_service: PadService = Depends(get_pad_service), - backup_service: BackupService = Depends(get_backup_service), - template_pad_service: TemplatePadService = Depends(get_template_pad_service), -): - """ - Update the first pad for the authenticated user. - - This is a backward compatibility endpoint that assumes the user is trying to update their first pad. - It will be deprecated in the future. Please use POST /api/pad/{pad_id} instead. 
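With the routers mounted under /api in main.py, the rewritten pad surface is small and JSON-bodied via the Pydantic models above. A client-side sketch against the new endpoints (base URL and cookie value are placeholders):

```
import httpx

async def demo(base_url: str, session_cookie: str) -> None:
    cookies = {"session_id": session_cookie}
    async with httpx.AsyncClient(base_url=base_url, cookies=cookies) as client:
        # POST /api/pad/new -> to_dict() of the freshly created pad
        pad = (await client.post("/api/pad/new")).json()
        # PUT /api/pad/{id}/rename, body matching RenameRequest (owner only)
        await client.put(f"/api/pad/{pad['id']}/rename", json={"display_name": "Roadmap"})
        # DELETE /api/pad/{id} (owner only)
        await client.delete(f"/api/pad/{pad['id']}")

# e.g. asyncio.run(demo("https://pad.example", "<session cookie>"))
```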
- """ + session: AsyncSession = Depends(get_session) +) -> Dict[str, Any]: + """Create a new pad for the authenticated user""" try: - # Get user's pads - user_pads = await pad_service.get_pads_by_owner(user.id) - - # If user has no pads, create a default one - if not user_pads: - new_pad = await create_pad_from_template( - name=DEFAULT_TEMPLATE_NAME, - display_name=DEFAULT_PAD_NAME, - user=user, - pad_service=pad_service, - template_pad_service=template_pad_service, - backup_service=backup_service - ) - pad_id = new_pad["id"] - else: - # Use the first pad - pad_id = user_pads[0]["id"] - - # Get the pad to verify ownership - pad = await pad_service.get_pad(pad_id) - - if not pad: - raise HTTPException(status_code=404, detail="Pad not found") - - # Verify the user owns this pad - if str(pad["owner_id"]) != str(user.id): - raise HTTPException(status_code=403, detail="You don't have permission to update this pad") - - # Ensure the uniqueId and displayName are set in the data - data = ensure_pad_metadata(data, str(pad_id), pad["display_name"]) - - # Update the pad - await pad_service.update_pad_data(pad_id, data) - - # Create a backup if needed - await backup_service.create_backup_if_needed( - source_id=pad_id, - data=data, - min_interval_minutes=MIN_INTERVAL_MINUTES, - max_backups=MAX_BACKUPS_PER_USER - ) - - # Return success with deprecation notice - return JSONResponse( - content={"status": "success", "message": "This endpoint is deprecated. Please use POST /api/pad/{pad_id} instead."}, - headers={"Deprecation": "true", "Sunset": "Mon, 10 May 2025 00:00:00 GMT"} + pad = await Pad.create( + session=session, + owner_id=user.id, + display_name="New pad" ) + return pad.to_dict() except Exception as e: - print(f"Error updating pad: {str(e)}") - raise HTTPException(status_code=500, detail=f"Failed to update pad: {str(e)}") - + raise HTTPException( + status_code=500, + detail=f"Failed to create new pad: {str(e)}" + ) -@pad_router.post("/{pad_id}") -async def update_specific_pad( - pad_id: UUID, - data: Dict[str, Any], - user: UserSession = Depends(require_auth), - pad_service: PadService = Depends(get_pad_service), - backup_service: BackupService = Depends(get_backup_service), -): - """Update a specific pad's data for the authenticated user""" +@pad_router.get("/{pad_id}") +async def get_pad( + pad_access: Tuple[Pad, UserSession] = Depends(require_pad_access), + session: AsyncSession = Depends(get_session) +) -> Dict[str, Any]: + """Get a specific pad for the authenticated user""" try: - # Get the pad to verify ownership - pad = await pad_service.get_pad(pad_id) - - if not pad: - raise HTTPException(status_code=404, detail="Pad not found") + pad, user = pad_access - # Verify the user owns this pad - if str(pad["owner_id"]) != str(user.id): - raise HTTPException(status_code=403, detail="You don't have permission to update this pad") - - # Ensure the uniqueId and displayName are set in the data - data = ensure_pad_metadata(data, str(pad_id), pad["display_name"]) - - # Update the pad - await pad_service.update_pad_data(pad_id, data) - - # Create a backup if needed - await backup_service.create_backup_if_needed( - source_id=pad_id, - data=data, - min_interval_minutes=MIN_INTERVAL_MINUTES, - max_backups=MAX_BACKUPS_PER_USER - ) - - return {"status": "success"} + # Update the user's last selected pad + user_obj = await User.get_by_id(session, user.id) + if user_obj: + await user_obj.set_last_selected_pad(session, pad.id) + + pad_dict = pad.to_dict() + # Get only this user's appState + user_app_state = 
pad_dict["data"]["appState"].get(str(user.id), {}) + pad_dict["data"]["appState"] = user_app_state + return pad_dict["data"] except Exception as e: - print(f"Error updating pad: {str(e)}") - raise HTTPException(status_code=500, detail=f"Failed to update pad: {str(e)}") - + raise HTTPException( + status_code=500, + detail=f"Failed to get pad: {str(e)}" + ) -@pad_router.patch("/{pad_id}") +@pad_router.put("/{pad_id}/rename") async def rename_pad( - pad_id: UUID, - data: Dict[str, str], - user: UserSession = Depends(require_auth), - pad_service: PadService = Depends(get_pad_service), -): - """Rename a pad for the authenticated user""" + rename_data: RenameRequest, + pad_access: Tuple[Pad, UserSession] = Depends(require_pad_owner), + session: AsyncSession = Depends(get_session) +) -> Dict[str, Any]: + """Rename a pad (owner only)""" try: - # Get the pad to verify ownership - pad = await pad_service.get_pad(pad_id) - - if not pad: - raise HTTPException(status_code=404, detail="Pad not found") - - # Verify the user owns this pad - if str(pad["owner_id"]) != str(user.id): - raise HTTPException(status_code=403, detail="You don't have permission to rename this pad") - - # Check if display_name is provided - if "display_name" not in data: - raise HTTPException(status_code=400, detail="display_name is required") - - # Update the pad's display name - update_data = {"display_name": data["display_name"]} - updated_pad = await pad_service.update_pad(pad_id, update_data) - - return {"status": "success", "pad": updated_pad} - except ValueError as e: - print(f"Error renaming pad: {str(e)}") - raise HTTPException(status_code=400, detail=str(e)) + pad, _ = pad_access + await pad.rename(session, rename_data.display_name) + return pad.to_dict() except Exception as e: - print(f"Error renaming pad: {str(e)}") - raise HTTPException(status_code=500, detail=f"Failed to rename pad: {str(e)}") - + raise HTTPException( + status_code=500, + detail=f"Failed to rename pad: {str(e)}" + ) @pad_router.delete("/{pad_id}") async def delete_pad( - pad_id: UUID, - user: UserSession = Depends(require_auth), - pad_service: PadService = Depends(get_pad_service), -): - """Delete a pad for the authenticated user""" + pad_access: Tuple[Pad, UserSession] = Depends(require_pad_owner), + session: AsyncSession = Depends(get_session) +) -> Dict[str, Any]: + """Delete a pad (owner only)""" try: - # Get the pad to verify ownership - pad = await pad_service.get_pad(pad_id) - - if not pad: - raise HTTPException(status_code=404, detail="Pad not found") - - # Verify the user owns this pad - if str(pad["owner_id"]) != str(user.id): - raise HTTPException(status_code=403, detail="You don't have permission to delete this pad") - - # Delete the pad - success = await pad_service.delete_pad(pad_id) - + pad, _ = pad_access + success = await pad.delete(session) if not success: - raise HTTPException(status_code=500, detail="Failed to delete pad") - - return {"status": "success"} - except ValueError as e: - print(f"Error deleting pad: {str(e)}") - raise HTTPException(status_code=400, detail=str(e)) - except Exception as e: - print(f"Error deleting pad: {str(e)}") - raise HTTPException(status_code=500, detail=f"Failed to delete pad: {str(e)}") - - -@pad_router.get("") -async def get_all_pads( - user: UserSession = Depends(require_auth), - pad_service: PadService = Depends(get_pad_service), - template_pad_service: TemplatePadService = Depends(get_template_pad_service), - backup_service: BackupService = Depends(get_backup_service) -): - """Get all pads for 
the authenticated user""" - try: - # Get user's pads - user_pads = await pad_service.get_pads_by_owner(user.id) - - if not user_pads: - # Create a default pad if user doesn't have any - new_pad = await create_pad_from_template( - name=DEFAULT_TEMPLATE_NAME, - display_name=DEFAULT_PAD_NAME, - user=user, - pad_service=pad_service, - template_pad_service=template_pad_service, - backup_service=backup_service + raise HTTPException( + status_code=500, + detail="Failed to delete pad" ) - - # Return the new pad in a list - return [new_pad] - - # Ensure each pad's data has the uniqueId and displayName set - for pad in user_pads: - pad_data = pad["data"] - - # Ensure the uniqueId and displayName are set in the data - pad_data = ensure_pad_metadata(pad_data, str(pad["id"]), pad["display_name"]) - # Return all pads - return user_pads + return {"success": True, "message": "Pad deleted successfully"} except Exception as e: - print(f"Error getting pad data: {str(e)}") - raise HTTPException(status_code=500, detail=f"Failed to get pad data: {str(e)}") - - -@pad_router.post("/from-template/{name}") -async def create_pad_from_template( - name: str, - display_name: str = DEFAULT_PAD_NAME, - user: UserSession = Depends(require_auth), - pad_service: PadService = Depends(get_pad_service), - template_pad_service: TemplatePadService = Depends(get_template_pad_service), - backup_service: BackupService = Depends(get_backup_service) -): - """Create a new pad from a template""" + raise HTTPException( + status_code=500, + detail=f"Failed to delete pad: {str(e)}" + ) +@pad_router.put("/{pad_id}/sharing") +async def update_sharing_policy( + policy_update: SharingPolicyUpdate, + pad_access: Tuple[Pad, UserSession] = Depends(require_pad_owner), + session: AsyncSession = Depends(get_session) +) -> Dict[str, Any]: + """Update the sharing policy of a pad (owner only)""" try: - # Get the template - template = await template_pad_service.get_template_by_name(name) - if not template: - raise HTTPException(status_code=404, detail="Template not found") - - # Get the template data - template_data = template["data"] - - # Before creating, ensure the pad object exists in the data - template_data = ensure_pad_metadata(template_data, "", "") - - # Create a new pad using the template data - pad = await pad_service.create_pad( - owner_id=user.id, - display_name=display_name, - data=template_data, - user_session=user - ) - - # Set the uniqueId and displayName to match the database ID and display name - template_data = ensure_pad_metadata(template_data, str(pad["id"]), display_name) - - # Update the pad with the modified data - await pad_service.update_pad_data(pad["id"], template_data) - - # Create an initial backup for the new pad - await backup_service.create_backup_if_needed( - source_id=pad["id"], - data=template_data, - min_interval_minutes=0, # Always create initial backup - max_backups=MAX_BACKUPS_PER_USER - ) - - return pad + pad, _ = pad_access + await pad.set_sharing_policy(session, policy_update.policy) + return pad.to_dict() except ValueError as e: - print(f"Error creating pad from template: {str(e)}") - raise HTTPException(status_code=400, detail=str(e)) + raise HTTPException( + status_code=400, + detail=str(e) + ) except Exception as e: - print(f"Error creating pad from template: {str(e)}") - raise HTTPException(status_code=500, detail=f"Failed to create pad from template: {str(e)}") - + raise HTTPException( + status_code=500, + detail=f"Failed to update sharing policy: {str(e)}" + ) -@pad_router.get("/{pad_id}/backups") 
-async def get_pad_backups( - pad_id: UUID, - limit: int = MAX_BACKUPS_PER_USER, - user: UserSession = Depends(require_auth), - pad_service: PadService = Depends(get_pad_service), - backup_service: BackupService = Depends(get_backup_service) -): - """Get backups for a specific pad""" - # Limit the number of backups to the maximum configured value - if limit > MAX_BACKUPS_PER_USER: - limit = MAX_BACKUPS_PER_USER - +@pad_router.post("/{pad_id}/whitelist") +async def add_to_whitelist( + whitelist_update: WhitelistUpdate, + pad_access: Tuple[Pad, UserSession] = Depends(require_pad_owner), + session: AsyncSession = Depends(get_session) +) -> Dict[str, Any]: + """Add a user to the pad's whitelist (owner only)""" try: - # Get the pad to verify ownership - pad = await pad_service.get_pad(pad_id) - - if not pad: - raise HTTPException(status_code=404, detail="Pad not found") - - # Verify the user owns this pad - if str(pad["owner_id"]) != str(user.id): - raise HTTPException(status_code=403, detail="You don't have permission to access this pad's backups") - - # Get backups for this specific pad - backups_data = await backup_service.get_backups_by_source(pad_id) - - # Limit the number of backups if needed - if len(backups_data) > limit: - backups_data = backups_data[:limit] - - # Format backups to match the expected response format - backups = [] - for backup in backups_data: - backups.append({ - "id": backup["id"], - "timestamp": backup["created_at"], - "data": backup["data"] - }) - - return {"backups": backups, "pad_name": pad["display_name"]} + pad, _ = pad_access + await pad.add_to_whitelist(session, whitelist_update.user_id) + return pad.to_dict() except Exception as e: - print(f"Error getting pad backups: {str(e)}") - raise HTTPException(status_code=500, detail=f"Failed to get pad backups: {str(e)}") - + raise HTTPException( + status_code=500, + detail=f"Failed to add user to whitelist: {str(e)}" + ) -@pad_router.get("/recent") -async def get_recent_canvas_backups( - limit: int = MAX_BACKUPS_PER_USER, - user: UserSession = Depends(require_auth), - backup_service: BackupService = Depends(get_backup_service) -): - """Get the most recent canvas backups for the authenticated user""" - # Limit the number of backups to the maximum configured value - if limit > MAX_BACKUPS_PER_USER: - limit = MAX_BACKUPS_PER_USER - +@pad_router.delete("/{pad_id}/whitelist/{user_id}") +async def remove_from_whitelist( + user_id: UUID, + pad_access: Tuple[Pad, UserSession] = Depends(require_pad_owner), + session: AsyncSession = Depends(get_session) +) -> Dict[str, Any]: + """Remove a user from the pad's whitelist (owner only)""" try: - # Get backups directly with a single query - backups_data = await backup_service.get_backups_by_user(user.id, limit) - - # Format backups to match the expected response format - backups = [] - for backup in backups_data: - backups.append({ - "id": backup["id"], - "timestamp": backup["created_at"], - "data": backup["data"] - }) - - return {"backups": backups} + pad, _ = pad_access + await pad.remove_from_whitelist(session, user_id) + return pad.to_dict() except Exception as e: - print(f"Error getting canvas backups: {str(e)}") - raise HTTPException(status_code=500, detail=f"Failed to get canvas backups: {str(e)}") + raise HTTPException( + status_code=500, + detail=f"Failed to remove user from whitelist: {str(e)}" + ) diff --git a/src/backend/routers/template_pad_router.py b/src/backend/routers/template_pad_router.py deleted file mode 100644 index 6b57786..0000000 --- 
a/src/backend/routers/template_pad_router.py +++ /dev/null @@ -1,118 +0,0 @@ -from typing import Dict, Any - -from fastapi import APIRouter, HTTPException, Depends - -from dependencies import UserSession, require_auth, require_admin -from database import get_template_pad_service -from database.service import TemplatePadService - -template_pad_router = APIRouter() - - -@template_pad_router.post("") -async def create_template_pad( - data: Dict[str, Any], - name: str, - display_name: str, - _: bool = Depends(require_admin), - template_pad_service: TemplatePadService = Depends(get_template_pad_service) -): - """Create a new template pad (admin only)""" - try: - template_pad = await template_pad_service.create_template( - name=name, - display_name=display_name, - data=data - ) - return template_pad - except ValueError as e: - print(f"Error creating template pad: {str(e)}") - raise HTTPException(status_code=400, detail=str(e)) - - -@template_pad_router.get("") -async def get_all_template_pads( - _: bool = Depends(require_admin), - template_pad_service: TemplatePadService = Depends(get_template_pad_service) -): - """Get all template pads""" - try: - template_pads = await template_pad_service.get_all_templates() - return template_pads - except Exception as e: - print(f"Error getting template pads: {str(e)}") - raise HTTPException(status_code=500, detail=f"Failed to get template pads: {str(e)}") - - -@template_pad_router.get("/{name}") -async def get_template_pad( - name: str, - _: UserSession = Depends(require_auth), - template_pad_service: TemplatePadService = Depends(get_template_pad_service) -): - """Get a specific template pad by name""" - template_pad = await template_pad_service.get_template_by_name(name) - if not template_pad: - print(f"Template pad not found: {name}") - raise HTTPException(status_code=404, detail="Template pad not found") - - return template_pad - - -@template_pad_router.put("/{name}") -async def update_template_pad( - name: str, - data: Dict[str, Any], - _: bool = Depends(require_admin), - template_pad_service: TemplatePadService = Depends(get_template_pad_service) -): - """Update a template pad (admin only)""" - try: - updated_template = await template_pad_service.update_template(name, data) - if not updated_template: - print(f"Template pad not found: {name}") - raise HTTPException(status_code=404, detail="Template pad not found") - - return updated_template - except ValueError as e: - print(f"Error updating template pad: {str(e)}") - raise HTTPException(status_code=400, detail=str(e)) - - -@template_pad_router.put("/{name}/data") -async def update_template_pad_data( - name: str, - data: Dict[str, Any], - _: bool = Depends(require_admin), - template_pad_service: TemplatePadService = Depends(get_template_pad_service) -): - """Update just the data field of a template pad (admin only)""" - try: - updated_template = await template_pad_service.update_template_data(name, data) - if not updated_template: - print(f"Template pad not found: {name}") - raise HTTPException(status_code=404, detail="Template pad not found") - - return updated_template - except ValueError as e: - print(f"Error updating template pad data: {str(e)}") - raise HTTPException(status_code=400, detail=str(e)) - - -@template_pad_router.delete("/{name}") -async def delete_template_pad( - name: str, - _: bool = Depends(require_admin), - template_pad_service: TemplatePadService = Depends(get_template_pad_service) -): - """Delete a template pad (admin only)""" - try: - success = await 
template_pad_service.delete_template(name) - if not success: - print(f"Template pad not found: {name}") - raise HTTPException(status_code=404, detail="Template pad not found") - - return {"status": "success", "message": "Template pad deleted successfully"} - except ValueError as e: - print(f"Error deleting template pad: {str(e)}") - raise HTTPException(status_code=400, detail=str(e)) diff --git a/src/backend/routers/user_router.py b/src/backend/routers/user_router.py deleted file mode 100644 index fe62a01..0000000 --- a/src/backend/routers/user_router.py +++ /dev/null @@ -1,161 +0,0 @@ -import os -import json -from uuid import UUID - -import posthog -import jwt -from fastapi import APIRouter, Depends, HTTPException - -from config import get_redis_client, get_jwks_client, OIDC_CLIENT_ID, FRONTEND_URL -from database import get_user_service -from database.service import UserService -from dependencies import UserSession, require_admin, require_auth - -user_router = APIRouter() - - -@user_router.post("") -async def create_user( - user_id: UUID, - username: str, - email: str, - email_verified: bool = False, - name: str = None, - given_name: str = None, - family_name: str = None, - roles: list = None, - _: bool = Depends(require_admin), - user_service: UserService = Depends(get_user_service) -): - """Create a new user (admin only)""" - try: - user = await user_service.create_user( - user_id=user_id, - username=username, - email=email, - email_verified=email_verified, - name=name, - given_name=given_name, - family_name=family_name, - roles=roles - ) - return user - except ValueError as e: - print(f"Error creating user: {str(e)}") - raise HTTPException(status_code=400, detail=str(e)) - - -@user_router.get("") -async def get_all_users( - _: bool = Depends(require_admin), - user_service: UserService = Depends(get_user_service) -): - """Get all users (admin only)""" - users = await user_service.get_all_users() - return users - - -@user_router.get("/me") -async def get_user_info( - user: UserSession = Depends(require_auth), - user_service: UserService = Depends(get_user_service), -): - """Get the current user's information and sync with token data""" - - # Create token data dictionary from UserSession properties - token_data = { - "username": user.username, - "email": user.email, - "email_verified": user.email_verified, - "name": user.name, - "given_name": user.given_name, - "family_name": user.family_name, - "roles": user.roles - } - - try: - # Sync user with token data - user_data = await user_service.sync_user_with_token_data(user.id, token_data) - except Exception as e: - print(f"Error syncing user data: {str(e)}") - raise HTTPException( - status_code=500, - detail=f"Error syncing user data: {e}" - ) - - if os.getenv("VITE_PUBLIC_POSTHOG_KEY"): - telemetry = user_data.copy() - telemetry["$current_url"] = FRONTEND_URL - posthog.identify(distinct_id=user_data["id"], properties=telemetry) - - return user_data - - -@user_router.get("/count") -async def get_user_count( - _: bool = Depends(require_admin), -): - """Get the number of active sessions (admin only)""" - client = get_redis_client() - session_count = len(client.keys("session:*")) - return {"active_sessions": session_count } - - -@user_router.get("/online") -async def get_online_users( - _: bool = Depends(require_admin), - user_service: UserService = Depends(get_user_service) -): - """Get all online users with their information (admin only)""" - client = get_redis_client() - - # Get all session keys - session_keys = client.keys("session:*") - - 
# Extract user IDs from sessions and fetch user data - online_users = [] - for key in session_keys: - session_data = client.get(key) - if session_data: - try: - # Parse session data - session_json = json.loads(session_data) - - # Extract user ID from token - token_data = session_json.get('access_token') - if token_data: - # Decode JWT token to get user ID - jwks_client = get_jwks_client() - signing_key = jwks_client.get_signing_key_from_jwt(token_data) - decoded = jwt.decode( - token_data, - signing_key.key, - algorithms=["RS256"], - audience=OIDC_CLIENT_ID, - ) - - # Get user ID from token - user_id = UUID(decoded.get('sub')) - - # Fetch user data from database - user_data = await user_service.get_user(user_id) - if user_data: - online_users.append(user_data) - except Exception as e: - print(f"Error processing session {key}: {str(e)}") - continue - - return {"online_users": online_users, "count": len(online_users)} - -@user_router.get("/{user_id}") -async def get_user( - user_id: UUID, - _: bool = Depends(require_admin), - user_service: UserService = Depends(get_user_service) -): - """Get a user by ID (admin only)""" - user = await user_service.get_user(user_id) - if not user: - raise HTTPException(status_code=404, detail="User not found") - - return user diff --git a/src/backend/routers/users_router.py b/src/backend/routers/users_router.py new file mode 100644 index 0000000..d4cea22 --- /dev/null +++ b/src/backend/routers/users_router.py @@ -0,0 +1,174 @@ +import os +import json +from uuid import UUID + +import posthog +import jwt +from fastapi import APIRouter, Depends, HTTPException, Request, Response +from sqlalchemy.ext.asyncio import AsyncSession + +from cache import RedisClient +from config import get_jwks_client, OIDC_CLIENT_ID, FRONTEND_URL +from dependencies import UserSession, require_admin, require_auth +from database.database import get_session +from domain.user import User +from domain.pad import Pad + +users_router = APIRouter() + + +@users_router.get("/me") +async def get_user_info( + request: Request, + response: Response, + user: UserSession = Depends(require_auth), + session: AsyncSession = Depends(get_session), +): + """Get the current user's information and their pads""" + + # Get the user from database to access last_selected_pad + user_obj = await User.get_by_id(session, user.id) + + # Check for pending pad cookie + pending_pad_id = request.cookies.get("pending_pad_id") + if pending_pad_id: + try: + pad_id = UUID(pending_pad_id) + pad = await Pad.get_by_id(session, pad_id) + + if pad and pad.can_access(user.id): + # Convert all UUIDs to strings for comparison + open_pads_str = [str(pid) for pid in user_obj._store.open_pads] + if str(pad_id) not in open_pads_str: + user_obj._store.open_pads = [UUID(pid) for pid in open_pads_str] + [pad_id] + await user_obj.save(session) + + # Update last selected pad + await user_obj.set_last_selected_pad(session, pad_id) + + except (ValueError, Exception) as e: + print(f"Error processing pending pad: {e}") + finally: + # Always clear the cookie with same settings as when setting it + response.delete_cookie( + key="pending_pad_id", + secure=True, + httponly=False, + samesite="lax" + ) + + # Create token data dictionary from UserSession properties + token_data = { + "id": user.id, + "username": user.username, + "email": user.email, + "email_verified": user.email_verified, + "name": user.name, + "given_name": user.given_name, + "family_name": user.family_name, + "roles": user.roles + } + + # Get user's pad metadata + pads = await 
User.get_open_pads(session, user.id) + + user_data = { + **token_data, + "pads": pads, + "last_selected_pad": str(user_obj.last_selected_pad) if user_obj and user_obj.last_selected_pad else None + } + + if os.getenv("VITE_PUBLIC_POSTHOG_KEY"): + telemetry = user_data.copy() + telemetry["$current_url"] = FRONTEND_URL + posthog.identify(distinct_id=user.id, properties=telemetry) + + return user_data + + +@users_router.get("/online") +async def get_online_users( + _: bool = Depends(require_admin), +): + """Get all online users with their information (admin only)""" + try: + client = await RedisClient.get_instance() + + # Get all session keys + session_keys = await client.keys("session:*") + + # Extract user IDs from sessions and fetch user data + online_users = [] + jwks_client = get_jwks_client() + + for key in session_keys: + try: + # Get session data + session_data_raw = await client.get(key) + if not session_data_raw: + continue + + # Parse session data + session_json = json.loads(session_data_raw) + + # Extract user ID from token + token_data = session_json.get('access_token') + if not token_data: + continue + + # Decode JWT token to get user ID + signing_key = jwks_client.get_signing_key_from_jwt(token_data) + decoded = jwt.decode( + token_data, + signing_key.key, + algorithms=["RS256"], + audience=OIDC_CLIENT_ID, + ) + + # Get user ID from token + user_id = UUID(decoded.get('sub')) + + # This endpoint is partially implemented - would need to fetch user data + raise NotImplementedError("/online Not implemented") + + except json.JSONDecodeError as e: + print(f"Error parsing session data: {str(e)}") + continue + except jwt.PyJWTError as e: + print(f"Error decoding JWT: {str(e)}") + continue + except Exception as e: + print(f"Error processing session {key}: {str(e)}") + continue + + return {"online_users": online_users, "count": len(online_users)} + + except Exception as e: + print(f"Error getting online users: {str(e)}") + raise HTTPException(status_code=500, detail="Failed to retrieve online users") + +@users_router.delete("/close/{pad_id}") +async def close_pad( + pad_id: UUID, + user: UserSession = Depends(require_auth), + session: AsyncSession = Depends(get_session) +): + """Remove a pad from the user's open_pads list""" + try: + # Get the user + user_obj = await User.get_by_id(session, user.id) + if not user_obj: + raise HTTPException( + status_code=404, + detail="User not found" + ) + await user_obj.remove_open_pad(session, pad_id) + + return {"success": True, "message": "Pad removed from open pads"} + except HTTPException: + raise + except Exception as e: + raise HTTPException( + status_code=500, + detail=f"Failed to remove pad from open pads: {str(e)}" + ) diff --git a/src/backend/routers/ws_router.py b/src/backend/routers/ws_router.py new file mode 100644 index 0000000..9a765ab --- /dev/null +++ b/src/backend/routers/ws_router.py @@ -0,0 +1,474 @@ +import json +import asyncio +import uuid +from uuid import UUID +from typing import Optional, Any, Dict, List, Tuple +from datetime import datetime, timezone + +from fastapi import APIRouter, WebSocket, WebSocketDisconnect, Depends, HTTPException +from pydantic import BaseModel, Field, field_validator +from redis import asyncio as aioredis +from sqlalchemy.ext.asyncio import AsyncSession + +from dependencies import UserSession, get_session_domain, PadAccess +from cache import RedisClient +from domain.pad import Pad +from database.database import async_session +ws_router = APIRouter() + +STREAM_EXPIRY = 3600 +PAD_USERS_EXPIRY = 3600 # 
Expiry time for the pad users hash +POINTER_CHANNEL_PREFIX = "pad:pointer:updates:" # Prefix for pointer update pub/sub channels + +class WebSocketMessage(BaseModel): + type: str + pad_id: Optional[str] = None + timestamp: datetime = Field(default_factory=lambda: datetime.now(timezone.utc)) + user_id: Optional[str] = None # ID of the user related to the event or sending the message + connection_id: Optional[str] = None # Connection ID related to the event or sending the message + data: Optional[Any] = None # Payload; structure depends entirely on 'type' + + @field_validator('timestamp', mode='before') + @classmethod + def ensure_datetime_object(cls, v): + if isinstance(v, str): + if v.endswith('Z'): + return datetime.fromisoformat(v[:-1] + '+00:00') + return datetime.fromisoformat(v) + if isinstance(v, datetime): + return v + raise ValueError("Timestamp must be a datetime object or an ISO 8601 string") + + class Config: + json_encoders = { + datetime: lambda dt: dt.isoformat().replace('+00:00', 'Z') if dt.tzinfo else dt.replace(tzinfo=timezone.utc).isoformat().replace('+00:00', 'Z') + } + +async def get_ws_user(websocket: WebSocket) -> Optional[UserSession]: + """WebSocket-specific authentication dependency""" + try: + session_id = websocket.cookies.get('session_id') + if not session_id: + return None + + current_session_domain = await get_session_domain() + + session_data = await current_session_domain.get(session_id) + if not session_data: + return None + + if current_session_domain.is_token_expired(session_data): + success, new_session_data = await current_session_domain.refresh_token(session_id, session_data) + if not success: + return None + session_data = new_session_data + + return UserSession( + access_token=session_data.get('access_token'), + token_data=session_data, + session_domain=current_session_domain + ) + except Exception as e: + print(f"Error in WebSocket authentication: {str(e)}") + return None + +async def publish_event_to_redis(redis_client: aioredis.Redis, stream_key: str, event_model: WebSocketMessage): + """Formats event data from WebSocketMessage model and publishes it to a Redis stream.""" + message_dict = event_model.model_dump() + + # Ensure all values are suitable for Redis stream (mostly strings, or numbers) + field_value_dict = {} + for k, v in message_dict.items(): + if isinstance(v, datetime): + field_value_dict[k] = v.isoformat().replace('+00:00', 'Z') + elif isinstance(v, (dict, list)): # Serialize complex data field to JSON string + field_value_dict[k] = json.dumps(v) + elif v is None: + continue # Optionally skip None values or convert to empty string + else: + field_value_dict[k] = str(v) + + try: + async with redis_client.pipeline() as pipe: + # Add message to stream + await pipe.xadd(stream_key, field_value_dict, maxlen=100, approximate=True) + # Set expiration on the stream key + await pipe.expire(stream_key, STREAM_EXPIRY) + await pipe.execute() + except Exception as e: + print(f"Error publishing event to Redis stream {stream_key}: {str(e)}") + +async def publish_pointer_update(redis_client: aioredis.Redis, pad_id: UUID, message: WebSocketMessage): + """ + Publish pointer updates through Redis pub/sub instead of streams. + Since we don't care about persistence or consuming history for pointer updates, + pub/sub is more efficient than streams for this high-frequency data. 
+ """ + try: + channel = f"{POINTER_CHANNEL_PREFIX}{pad_id}" + # Serialize the message and publish it + message_json = message.model_dump_json() + await redis_client.publish(channel, message_json) + except Exception as e: + print(f"Error publishing pointer update to Redis pub/sub {pad_id}: {str(e)}") + +async def check_pad_access(pad_id: UUID, user: UserSession, session: AsyncSession) -> Tuple[bool, Optional[str]]: + """Check if user still has access to the pad. Returns (has_access, error_reason).""" + try: + pad_access = PadAccess() + await pad_access(pad_id, user, session) + return True, None + except HTTPException as e: + return False, e.detail + except Exception as e: + return False, str(e) + +async def periodic_auth_check( + websocket: WebSocket, + pad_id: UUID, + user: UserSession, + redis_client: aioredis.Redis, + stream_key: str, + connection_id: str, + session: AsyncSession +): + """Periodically check if the user still has access to the pad.""" + while websocket.client_state.CONNECTED: + try: + # Check pad access + has_access, error_reason = await check_pad_access(pad_id, user, session) + + if not has_access: + # Create a disconnect message + disconnect_message = WebSocketMessage( + type="force_disconnect", + pad_id=str(pad_id), + user_id=str(user.id), + connection_id=connection_id, + data={"reason": error_reason} + ) + # Publish the disconnect message so other clients know + await publish_event_to_redis(redis_client, stream_key, disconnect_message) + # Close the websocket + await websocket.close(code=4003, reason=error_reason) + break + + except Exception as e: + print(f"Error in periodic auth check for {connection_id[:5]}: {e}") + break + + # Wait for 1 second before next check + await asyncio.sleep(1) + +async def _handle_received_data(raw_data: str, pad_id: UUID, user: UserSession, + redis_client: aioredis.Redis, stream_key: str, connection_id: str, + session: AsyncSession): + """Processes decoded message data, wraps it in WebSocketMessage, and publishes to Redis.""" + try: + + client_message_dict = json.loads(raw_data) + + # Create a WebSocketMessage instance from the client's data + processed_message = WebSocketMessage( + type=client_message_dict.get("type", "unknown_client_message"), + pad_id=str(pad_id), + user_id=str(user.id), + connection_id=connection_id, + timestamp=datetime.now(timezone.utc), + data=client_message_dict.get("data") + ) + + if processed_message.type == 'pointer_update': + await publish_pointer_update(redis_client, pad_id, processed_message) + else: + await publish_event_to_redis(redis_client, stream_key, processed_message) + + + except WebSocketDisconnect: + raise + except json.JSONDecodeError: + print(f"Invalid JSON received from {connection_id[:5]}") + except Exception as e: + print(f"Error processing message from {connection_id[:5]}: {e}") + +async def consume_redis_stream(redis_client: aioredis.Redis, stream_key: str, + websocket: WebSocket, connection_id: str, last_id: str = '$'): + """Consumes messages from Redis stream, parses to WebSocketMessage, and forwards them.""" + while websocket.client_state.CONNECTED: + try: + # Read from Redis stream + streams = await redis_client.xread({stream_key: last_id}, count=5, block=1000) + + if not streams: + await asyncio.sleep(0) + continue + + stream_name, stream_messages = streams[0] + for message_id, message_data_raw_redis in stream_messages: + if not websocket.client_state.CONNECTED: + return + + # Convert raw Redis data to a standard dict + redis_dict = {} + for k, v in message_data_raw_redis.items(): + 
key = k.decode() if isinstance(k, bytes) else k + value_str = v.decode() if isinstance(v, bytes) else v + + # Parse 'data' field if it's JSON + if key == 'data': + try: + redis_dict[key] = json.loads(value_str) + except json.JSONDecodeError: + redis_dict[key] = value_str + elif key == 'pad_id' and value_str == 'None': + redis_dict[key] = None + else: + redis_dict[key] = value_str + + try: + # Create WebSocketMessage and send to client (if not from this connection) + message_to_send = WebSocketMessage(**redis_dict) + + if message_to_send.connection_id != connection_id and websocket.client_state.CONNECTED and message_to_send.type != 'appstate_update': + await websocket.send_text(message_to_send.model_dump_json()) + except Exception as e: + print(f"Error sending message from Redis: {e}") + + last_id = message_id + + except Exception as e: + if websocket.client_state.CONNECTED: + print(f"Error in Redis stream consumer for {stream_key}: {e}") + return + + +async def consume_pointer_updates(redis_client: aioredis.Redis, pad_id: UUID, + websocket: WebSocket, connection_id: str): + """Consumes pointer updates from Redis pub/sub channel and forwards them to the client.""" + channel = f"{POINTER_CHANNEL_PREFIX}{pad_id}" + pubsub = redis_client.pubsub() + + try: + await pubsub.subscribe(channel) + + # Process messages as they arrive + while websocket.client_state.CONNECTED: + message = await pubsub.get_message(ignore_subscribe_messages=True, timeout=1.0) + + if message and message["type"] == "message": + try: + # Parse the message data + message_data = json.loads(message["data"]) + pointer_message = WebSocketMessage(**message_data) + + # Only forward messages from other connections + if pointer_message.connection_id != connection_id and websocket.client_state.CONNECTED: + await websocket.send_text(message["data"]) + except Exception as e: + print(f"Error processing pointer update: {e}") + + # Prevent CPU hogging + await asyncio.sleep(0) + + except Exception as e: + if websocket.client_state.CONNECTED: + print(f"Error in pointer update consumer for {pad_id}: {e}") + finally: + # Clean up the subscription + try: + await pubsub.unsubscribe(channel) + await pubsub.close() + except Exception: + pass + + +async def add_connection(redis_client: aioredis.Redis, pad_id: UUID, user_id: str, + username: str, connection_id: str) -> None: + """Add a user connection to the pad users hash in Redis.""" + key = f"pad:users:{pad_id}" + try: + # Get existing user data if any + user_data_str = await redis_client.hget(key, user_id) + + if user_data_str: + user_data = json.loads(user_data_str) + # Add the connection ID if it doesn't exist + if connection_id not in user_data["connections"]: + user_data["connections"].append(connection_id) + else: + # Create new user data + user_data = { + "username": username, + "connections": [connection_id] + } + + # Update the hash in Redis + await redis_client.hset(key, user_id, json.dumps(user_data)) + # Set expiry on the hash + await redis_client.expire(key, PAD_USERS_EXPIRY) + except Exception as e: + print(f"Error adding connection to Redis: {e}") + +async def remove_connection(redis_client: aioredis.Redis, pad_id: UUID, user_id: str, + connection_id: str) -> None: + """Remove a user connection from the pad users hash in Redis.""" + key = f"pad:users:{pad_id}" + try: + # Get existing user data + user_data_str = await redis_client.hget(key, user_id) + + if user_data_str: + user_data = json.loads(user_data_str) + + # Remove the connection + if connection_id in 
user_data["connections"]: + user_data["connections"].remove(connection_id) + + # If there are still connections, update the user data + if user_data["connections"]: + await redis_client.hset(key, user_id, json.dumps(user_data)) + else: + # If no connections left, remove the user from the hash + await redis_client.hdel(key, user_id) + + # Refresh expiry on the hash if it still exists + if await redis_client.exists(key): + await redis_client.expire(key, PAD_USERS_EXPIRY) + except Exception as e: + print(f"Error removing connection from Redis: {e}") + +@ws_router.websocket("/ws/pad/{pad_id}") +async def websocket_endpoint( + websocket: WebSocket, + pad_id: UUID, + user: Optional[UserSession] = Depends(get_ws_user) +): + """WebSocket endpoint for pad collaboration.""" + if not user: + await websocket.close(code=4001, reason="Authentication required") + return + + connection_id = None # Initialize connection_id before try block + redis_client = None # Initialize redis_client before try block + + try: + # Create a database session for the WebSocket connection + async with async_session() as session: + # Check initial pad access + pad_access = PadAccess() + try: + pad, _ = await pad_access(pad_id, user, session) + except HTTPException as e: + await websocket.close(code=e.status_code, reason=e.detail) + return + + # Accept the connection and set up + await websocket.accept() + connection_id = str(uuid.uuid4()) + stream_key = f"pad:stream:{pad_id}" + redis_client = await RedisClient.get_instance() + + await add_connection(redis_client, pad_id, str(user.id), user.username, connection_id) + + # Send connected message to client with connected users info + connected_users = await pad.get_connected_users() + connected_msg = WebSocketMessage( + type="connected", + pad_id=str(pad_id), + user_id=str(user.id), + connection_id=connection_id, + data={ + "collaboratorsList": connected_users + } + ) + await websocket.send_text(connected_msg.model_dump_json()) + + # Broadcast user joined message + join_event_data = {"username": user.username} + join_message = WebSocketMessage( + type="user_joined", + pad_id=str(pad_id), + user_id=str(user.id), + connection_id=connection_id, + data=join_event_data + ) + await publish_event_to_redis(redis_client, stream_key, join_message) + + # Handle incoming messages from client + async def handle_websocket_messages(): + while websocket.client_state.CONNECTED: + try: + data = await websocket.receive_text() + await _handle_received_data(data, pad_id, user, redis_client, stream_key, connection_id, session) + except WebSocketDisconnect as e: + print(f"WebSocket disconnected for user {str(user.id)[:5]} conn {connection_id[:5]}: {e.reason}") + break + except json.JSONDecodeError as e: + print(f"Invalid JSON received from {connection_id[:5]}: {e}") + await websocket.send_text(WebSocketMessage( + type="error", + pad_id=str(pad_id), + data={"message": "Invalid message format. 
Please send valid JSON."} + ).model_dump_json()) + except Exception as e: + print(f"Error in WebSocket connection for {connection_id[:5]}: {e}") + break + + # Set up tasks for message handling + ws_task = asyncio.create_task(handle_websocket_messages()) + redis_task = asyncio.create_task( + consume_redis_stream(redis_client, stream_key, websocket, connection_id, last_id='$') + ) + pointer_task = asyncio.create_task( + consume_pointer_updates(redis_client, pad_id, websocket, connection_id) + ) + auth_task = asyncio.create_task( + periodic_auth_check(websocket, pad_id, user, redis_client, stream_key, connection_id, session) + ) + + # Wait for any task to complete + done, pending = await asyncio.wait( + [ws_task, redis_task, pointer_task, auth_task], + return_when=asyncio.FIRST_COMPLETED + ) + + # Cancel pending tasks + for task in pending: + task.cancel() + try: + await task + except (asyncio.CancelledError, Exception): + pass + + except Exception as e: + print(f"Error in WebSocket connection: {e}") + + finally: + if connection_id: # Only try to clean up if connection_id was set + + # Remove the connection from Redis + if redis_client: + try: + await remove_connection(redis_client, pad_id, str(user.id), connection_id) + except Exception as e: + print(f"Error removing connection from Redis: {e}") + + # Send user left message + try: + leave_message = WebSocketMessage( + type="user_left", + pad_id=str(pad_id), + user_id=str(user.id), + connection_id=connection_id, + data={} + ) + await publish_event_to_redis(redis_client, stream_key, leave_message) + except Exception as e: + print(f"Error publishing leave message: {e}") + + # Close the WebSocket if still connected + if websocket.client_state.CONNECTED: + try: + await websocket.close() + except Exception: + pass diff --git a/src/backend/templates/default.json b/src/backend/templates/default.json index 0629611..17383ab 100644 --- a/src/backend/templates/default.json +++ b/src/backend/templates/default.json @@ -2180,7 +2180,13 @@ "backgroundColor": "#e9ecef", "customData": { "showHyperlinkIcon": false, - "showClickableHint": false + "showClickableHint": false, + "borderOffsets": { + "top": 40, + "right": 10, + "bottom": 10, + "left": 10 + } } } ] diff --git a/src/backend/templates/dev.json b/src/backend/templates/dev.json new file mode 100644 index 0000000..0a6e58d --- /dev/null +++ b/src/backend/templates/dev.json @@ -0,0 +1,163 @@ +{ + "files": {}, + "appState": { + "pad": { + "displayName": "Welcome!" 
+ }, + "name": "Pad.ws", + "zoom": { + "value": 1 + }, + "stats": { + "open": false, + "panels": 3 + }, + "theme": "dark", + "toast": null, + "width": 1920, + "height": 957, + "penMode": false, + "scrollX": 220, + "scrollY": 220, + "gridSize": 20, + "gridStep": 5, + "openMenu": null, + "isLoading": false, + "offsetTop": 0, + "openPopup": null, + "snapLines": [], + "activeTool": { + "type": "selection", + "locked": false, + "customType": null, + "lastActiveTool": { + "type": "selection", + "locked": false, + "customType": null, + "lastActiveTool": null + } + }, + "fileHandle": null, + "followedBy": {}, + "isCropping": false, + "isResizing": false, + "isRotating": false, + "newElement": null, + "offsetLeft": 0, + "openDialog": null, + "contextMenu": null, + "exportScale": 1, + "openSidebar": null, + "pasteDialog": { + "data": null, + "shown": false + }, + "penDetected": true, + "cursorButton": "up", + "editingFrame": null, + "errorMessage": null, + "multiElement": null, + "userToFollow": null, + "collaborators": {}, + "searchMatches": [], + "editingGroupId": null, + "frameRendering": { + "clip": true, + "name": true, + "enabled": true, + "outline": true + }, + "zenModeEnabled": false, + "gridModeEnabled": true, + "resizingElement": null, + "scrolledOutside": false, + "viewModeEnabled": false, + "activeEmbeddable": null, + "currentChartType": "bar", + "exportBackground": true, + "exportEmbedScene": false, + "frameToHighlight": null, + "isBindingEnabled": true, + "originSnapOffset": null, + "selectedGroupIds": {}, + "selectionElement": null, + "croppingElementId": null, + "hoveredElementIds": {}, + "showWelcomeScreen": true, + "startBoundElement": null, + "suggestedBindings": [], + "currentItemOpacity": 100, + "editingTextElement": null, + "exportWithDarkMode": false, + "selectedElementIds": {}, + "showHyperlinkPopup": false, + "currentItemFontSize": 20, + "elementsToHighlight": null, + "lastPointerDownWith": "mouse", + "viewBackgroundColor": "#ffffff", + "currentItemArrowType": "round", + "currentItemFillStyle": "solid", + "currentItemRoughness": 1, + "currentItemRoundness": "round", + "currentItemTextAlign": "left", + "editingLinearElement": null, + "currentItemFontFamily": 5, + "pendingImageElementId": null, + "selectedLinearElement": null, + "shouldCacheIgnoreZoom": false, + "currentItemStrokeColor": "#1e1e1e", + "currentItemStrokeStyle": "solid", + "currentItemStrokeWidth": 2, + "objectsSnapModeEnabled": false, + "currentItemEndArrowhead": "arrow", + "currentHoveredFontFamily": null, + "currentItemStartArrowhead": null, + "currentItemBackgroundColor": "transparent", + "previousSelectedElementIds": {}, + "defaultSidebarDockedPreference": false, + "selectedElementsAreBeingDragged": false + }, + "elements": [ + { + "x": 0, + "y": 0, + "id": "1igpgsvrsh3", + "link": "!dev", + "seed": 76441, + "type": "embeddable", + "angle": 0, + "index": "b0q", + "width": 700, + "height": 420, + "locked": false, + "frameId": null, + "opacity": 100, + "updated": 1747831905652, + "version": 2127, + "groupIds": [], + "fillStyle": "solid", + "isDeleted": false, + "roughness": 0, + "roundness": { + "type": 3 + }, + "customData": { + "borderOffsets": { + "top": 40, + "left": 10, + "right": 10, + "bottom": 10 + }, + "showClickableHint": false, + "showHyperlinkIcon": false + }, + "isSelected": false, + "strokeColor": "#ced4da", + "strokeStyle": "solid", + "strokeWidth": 2, + "versionNonce": 253042044, + "boundElements": [], + "backgroundColor": "#e9ecef" + } + ] +} \ No newline at end of file diff --git 
a/src/backend/workers/__init__.py b/src/backend/workers/__init__.py new file mode 100644 index 0000000..0c0309b --- /dev/null +++ b/src/backend/workers/__init__.py @@ -0,0 +1 @@ +"""Worker modules for background tasks.""" \ No newline at end of file diff --git a/src/backend/workers/canvas_worker.py b/src/backend/workers/canvas_worker.py new file mode 100644 index 0000000..94a0a48 --- /dev/null +++ b/src/backend/workers/canvas_worker.py @@ -0,0 +1,454 @@ +import asyncio +import uuid +import json +from typing import Dict, Any, List, Optional, Tuple, Set +from uuid import UUID +from datetime import datetime + +from database.database import async_session +from domain.pad import Pad + +SAVE_INTERVAL = 300 # 5 minutes in seconds + +class CanvasWorker: + """ + Background worker that processes canvas updates from Redis streams. + + This worker can handle multiple pads dynamically. + Uses singleton pattern for proper lifecycle management. + """ + + _instance = None + + @classmethod + async def get_instance(cls) -> 'CanvasWorker': + """Get or create a CanvasWorker instance.""" + if cls._instance is None: + cls._instance = cls() + await cls._instance.initialize() + return cls._instance + + @classmethod + async def shutdown_instance(cls) -> None: + """Shutdown the singleton instance.""" + if cls._instance is not None: + await cls._instance.stop() + cls._instance = None + + def __init__(self): + self._redis = None + self.worker_id = str(uuid.uuid4()) + self._active_pads: Set[UUID] = set() + self._pad_tasks: Dict[UUID, asyncio.Task] = {} + self._last_processed_ids: Dict[UUID, str] = {} + self._periodic_save_tasks: Dict[UUID, asyncio.Task] = {} + + async def initialize(self) -> None: + """Initialize the worker with Redis connection.""" + from cache import RedisClient + self._redis = await RedisClient.get_instance() + + async def stop(self) -> None: + """Stop the worker and all pad processing tasks.""" + print(f"Stopping Canvas worker {self.worker_id[:8]}") + + for pad_id in list(self._active_pads): + await self.stop_processing_pad(pad_id, graceful=True) + + async def start_processing_pad(self, pad_id: UUID) -> bool: + """Start processing updates for a specific pad.""" + if pad_id in self._active_pads: + return True # Already processing + + print(f"Worker {self.worker_id[:8]} starting to process pad {pad_id}") + + # Add to active pads and start task + self._active_pads.add(pad_id) + task = asyncio.create_task(self._process_pad_updates(pad_id)) + self._pad_tasks[pad_id] = task + + # Start periodic save task + save_task = asyncio.create_task(self._periodic_save_to_db(pad_id)) + self._periodic_save_tasks[pad_id] = save_task + + # Set up task cleanup on completion + def cleanup_task(task_ref): + if pad_id in self._active_pads: + self._active_pads.remove(pad_id) + if pad_id in self._pad_tasks: + del self._pad_tasks[pad_id] + + def cleanup_save_task(task_ref): + if pad_id in self._periodic_save_tasks: + del self._periodic_save_tasks[pad_id] + + task.add_done_callback(cleanup_task) + save_task.add_done_callback(cleanup_save_task) + return True + + async def stop_processing_pad(self, pad_id: UUID, graceful: bool = True) -> None: + """Stop processing updates for a specific pad.""" + if pad_id not in self._active_pads: + return + + print(f"Worker {self.worker_id[:8]} stopping processing for pad {pad_id} {'gracefully' if graceful else ''}") + + if graceful: + # Graceful shutdown: remove from active pads and let the task finish naturally + self._active_pads.discard(pad_id) + + # Stop the periodic save task + if 
pad_id in self._periodic_save_tasks: + save_task = self._periodic_save_tasks[pad_id] + save_task.cancel() + try: + await save_task + except asyncio.CancelledError: + pass + + # Perform final save to database before stopping + await self._save_pad(pad_id) + + # Wait for the task to complete naturally (it will exit the while loop) + if pad_id in self._pad_tasks: + task = self._pad_tasks[pad_id] + try: + # Give it a reasonable time to finish processing + await asyncio.wait_for(task, timeout=10.0) + print(f"Gracefully stopped processing for pad {pad_id}") + except asyncio.TimeoutError: + print(f"Timeout waiting for graceful shutdown of pad {pad_id}, forcing cancellation") + task.cancel() + try: + await task + except asyncio.CancelledError: + pass + except asyncio.CancelledError: + print(f"Processing task for pad {pad_id} was cancelled during graceful shutdown") + pass + else: + # Immediate shutdown: cancel the tasks + if pad_id in self._periodic_save_tasks: + save_task = self._periodic_save_tasks[pad_id] + save_task.cancel() + try: + await save_task + except asyncio.CancelledError: + pass + + if pad_id in self._pad_tasks: + task = self._pad_tasks[pad_id] + task.cancel() + try: + await task + except asyncio.CancelledError: + print(f"Processing task for pad {pad_id} was cancelled") + pass + + self._active_pads.discard(pad_id) + + # Clean up task references + self._pad_tasks.pop(pad_id, None) + self._periodic_save_tasks.pop(pad_id, None) + + await self._release_pad_worker(pad_id) + + async def _release_pad_worker(self, pad_id: UUID) -> None: + """Release the worker assignment for a pad.""" + try: + # Get the pad using proper session management and clear the worker assignment + async with async_session() as session: + pad = await Pad.get_by_id(session, pad_id) + + if pad and pad.worker_id == self.worker_id: + # Only clear if this worker is actually assigned to the pad + pad.worker_id = None + await pad.cache() + print(f"Released worker assignment for pad {pad_id}") + + # Clean up the in-memory tracking + self._last_processed_ids.pop(pad_id, None) + print(f"Cleaned up in-memory tracking for pad {pad_id}") + + except Exception as e: + print(f"Error releasing worker assignment for pad {pad_id}: {e}") + + async def _process_pad_updates(self, pad_id: UUID) -> None: + """Process updates for a specific pad.""" + stream_key = f"pad:stream:{pad_id}" + last_id = "$" # Only process new messages for this worker session + + try: + while pad_id in self._active_pads: + try: + # Read from Redis stream + streams = await self._redis.xread({stream_key: last_id}, count=10, block=5000) + + if not streams: + await asyncio.sleep(0) + continue + + stream_name, stream_messages = streams[0] + + for message_id, message_data in stream_messages: + try: + # Process the message + await self._process_message(pad_id, message_id, message_data) + + # Update last processed ID in memory + self._last_processed_ids[pad_id] = message_id.decode() if isinstance(message_id, bytes) else message_id + except Exception as e: + print(f"Error processing message for pad {pad_id}: {e}") + + # Update last ID + last_id = message_id + except asyncio.CancelledError: + print(f"Processing task for pad {pad_id} was cancelled") + raise + except Exception as e: + print(f"Error reading stream for pad {pad_id}: {e}") + await asyncio.sleep(1) + + # Graceful shutdown: process any remaining messages before exiting + try: + # Process any remaining messages with a shorter timeout + streams = await self._redis.xread({stream_key: last_id}, count=50, block=1000) + + 
if streams: + stream_name, stream_messages = streams[0] + print(f"Processing {len(stream_messages)} remaining messages for pad {pad_id}") + + for message_id, message_data in stream_messages: + try: + await self._process_message(pad_id, message_id, message_data) + # Update last processed ID for final messages too + self._last_processed_ids[pad_id] = message_id.decode() if isinstance(message_id, bytes) else message_id + except Exception as e: + print(f"Error processing final message for pad {pad_id}: {e}") + + except Exception as e: + print(f"Error processing remaining messages for pad {pad_id}: {e}") + + except asyncio.CancelledError: + print(f"Processing task for pad {pad_id} was cancelled") + finally: + print(f"Stopped processing updates for pad {pad_id}") + + async def _process_message(self, pad_id: UUID, message_id: bytes, message_data: Dict[bytes, bytes]) -> None: + """Process a message from the Redis stream.""" + # Convert bytes keys/values to strings + data = {} + for k, v in message_data.items(): + key = k.decode() if isinstance(k, bytes) else k + value = v.decode() if isinstance(v, bytes) else v + + # Parse 'data' field if it's JSON + if key == 'data': + try: + data[key] = json.loads(value) + except json.JSONDecodeError: + data[key] = value + else: + data[key] = value + + message_type = data.get('type') + message_data = data.get('data') + + if message_type == 'scene_update' and message_data: + await self.handle_scene_update( + pad_id=pad_id, + data=message_data + ) + elif message_type == 'appstate_update' and message_data: + await self.handle_appstate_update( + pad_id=pad_id, + user_id=data.get('user_id'), + data=message_data + ) + + async def handle_scene_update( + self, + pad_id: UUID, + data: Dict[str, Any] + ) -> None: + """Handle a scene_update message from a client.""" + try: + async with async_session() as session: + pad = await Pad.get_by_id(session, pad_id) + if not pad: + print(f"Pad {pad_id} not found for scene update") + return + + client_elements = data.get("elements", []) + client_files = data.get("files", {}) + + changes_made = False + + # Update files if needed + if client_files and client_files != pad.data.get("files", {}): + pad.data["files"] = client_files + changes_made = True + + # Reconcile elements if needed + if client_elements: + current_elements = pad.data.get("elements", []) + reconciled_elements, elements_changed = self._reconcile_elements(current_elements, client_elements) + if elements_changed: + pad.data["elements"] = reconciled_elements + changes_made = True + + if changes_made: + await pad.cache() + + except Exception as e: + print(f"Error handling scene update for pad {pad_id}: {e}") + + async def handle_appstate_update( + self, + pad_id: UUID, + user_id: str, + data: Dict[str, Any] + ) -> None: + """Handle an appstate_update message from a client. 
Last writer wins for entire appState.""" + try: + new_appstate = data.get("appState", {}) + + if not new_appstate: + return + + async with async_session() as session: + pad = await Pad.get_by_id(session, pad_id) + if not pad: + print(f"Pad {pad_id} not found for appstate update") + return + + # Update the user's appState (last writer wins - replace entirely) + if "appState" not in pad.data: + pad.data["appState"] = {} + + pad.data["appState"][user_id] = new_appstate + + await pad.cache() + + except Exception as e: + print(f"Error handling appstate update for pad {pad_id}, user {user_id}: {e}") + + def _reconcile_elements( + self, + server_elements: List[Dict[str, Any]], + client_elements: List[Dict[str, Any]] + ) -> Tuple[List[Dict[str, Any]], bool]: + """ + Reconcile incoming elements with current server state. + + Returns: + Tuple of (reconciled_elements, has_changes) + """ + # Map server elements by ID for quick lookup + server_elements_map = {elem["id"]: elem for elem in server_elements} + reconciled_elements = [] + processed_ids = set() + has_changes = False + + # Process client elements first + for client_elem in client_elements: + elem_id = client_elem.get("id") + if not elem_id or elem_id in processed_ids: + continue + + server_elem = server_elements_map.get(elem_id) + + # Determine if we should keep server or client version + if self._should_discard_client_element(server_elem, client_elem): + reconciled_elements.append(server_elem) + else: + reconciled_elements.append(client_elem) + # Check if there was an actual change + if not server_elem or client_elem != server_elem: + has_changes = True + + processed_ids.add(elem_id) + + # Add remaining server elements + for elem_id, server_elem in server_elements_map.items(): + if elem_id not in processed_ids: + reconciled_elements.append(server_elem) + + # Order elements by fractional index + ordered_elements = self._order_by_fractional_index(reconciled_elements) + + return ordered_elements, has_changes + + def _should_discard_client_element( + self, + server_element: Optional[Dict[str, Any]], + client_element: Dict[str, Any] + ) -> bool: + """ + Determine if a client element should be discarded in favor of server version. 
+ """ + if not server_element: + # No server version, accept client version + return False + + # Get versions for comparison + server_version = server_element.get("version", 0) + client_version = client_element.get("version", 0) + + # Compare versions - higher version wins + if client_version < server_version: + return True + + if client_version > server_version: + return False + + # If versions are equal, use versionNonce as tie-breaker + server_nonce = server_element.get("versionNonce", 0) + client_nonce = client_element.get("versionNonce", 0) + + # Lower nonce wins (same behavior as Excalidraw frontend) + return client_nonce > server_nonce + + def _order_by_fractional_index(self, elements: List[Dict[str, Any]]) -> List[Dict[str, Any]]: + """Sort elements by their fractional index.""" + def get_sort_key(elem): + index = elem.get("index") + if not index: + return ("", elem.get("id", "")) + return (index, elem.get("id", "")) + + return sorted(elements, key=get_sort_key) + + async def _periodic_save_to_db(self, pad_id: UUID) -> None: + """Periodically save pad data to database every 5 minutes.""" + try: + while pad_id in self._active_pads: + await asyncio.sleep(SAVE_INTERVAL) + + # Only save if pad is still active + if pad_id in self._active_pads: + await self._save_pad(pad_id) + + except asyncio.CancelledError: + print(f"Periodic save task for pad {pad_id} was cancelled") + except Exception as e: + print(f"Error in periodic save for pad {pad_id}: {e}") + + async def _save_pad(self, pad_id: UUID) -> bool: + """Save pad data using the Pad domain class.""" + try: + # Create database session and save using Pad domain + async with async_session() as session: + # Get the pad from database (this will also check cache first) + pad = await Pad.get_by_id(session, pad_id) + + if not pad: + print(f"Pad {pad_id} not found in database, skipping save") + return False + + await pad.save(session) + return True + + except Exception as e: + print(f"Error saving pad {pad_id} to database via domain: {e}") + return False \ No newline at end of file diff --git a/src/frontend/index.html b/src/frontend/index.html index 52a2c82..ec6fabc 100644 --- a/src/frontend/index.html +++ b/src/frontend/index.html @@ -8,19 +8,13 @@ /> - Pad.ws + Whiteboard IDE — pad.ws - +
- - diff --git a/src/frontend/index.scss b/src/frontend/index.scss index af2312b..1032ce2 100644 --- a/src/frontend/index.scss +++ b/src/frontend/index.scss @@ -1,87 +1,13 @@ -@font-face { - font-family: 'Roboto'; - src: url('/assets/fonts/Roboto-VariableFont_wdth,wght.ttf') format('truetype-variations'); - font-weight: 100 900; - font-stretch: 75% 100%; - font-style: normal; - font-display: swap; -} - -/* CSS Variables */ -:root { - --embeddable-pointer-events: all; -} - -/* Override Excalidraw styles */ - -body { - margin: 0; -} +@import 'src/css/_colors.scss'; +@import 'src/css/_excalidraw-overrides.scss'; +@import 'src/css/_fonts.scss'; +/* Makes excalidraw fill the entire screen */ #root { - height: 100%; - width: 100%; - position: absolute; - z-index: 1; - background-color: #111111; + height: 100vb; } -.excalidraw-wrapper { - height: 100%; - width: 100%; - display: flex; - z-index: 1; -} - -.excalidraw.theme--dark { - --color-primary-light: #a4571b !important; - --color-on-primary-container: #e0dfff !important; - --color-surface-primary-container: #cc6d24 !important; - --color-selection: #a4571b !important; -} - -.excalidraw .sidebar-trigger { - display: none !important; -} - -.excalidraw .layer-ui__wrapper__top-right { - gap: 0rem; -} - -.dropdown-menu-group-title { - font-weight: 1000 !important; - color: #fa8933 !important; - margin-left: 11px !important; - font-size: 15px !important; -} - -.excalidraw .Modal__background { - background-color: rgba(0, 0, 0, 0); - backdrop-filter: blur(0px); - animation: Modal__background__fade-in 0.3s ease-out forwards !important; - - &.animations-disabled { - animation: none !important; - } -} - -.excalidraw .Modal__content { - animation: Modal__content_fade-in 0.3s ease-out forwards !important; -} - -/* Embeddables */ - -.excalidraw__embeddable-container { /* 1st layer */ - display: block; -} - -.excalidraw__embeddable-container__inner { /* 2nd layer */ - border-color: #525252 !important; - overflow: visible !important; -} - -.excalidraw__embeddable__outer { /* 3rd layer */ - padding: 0px !important; - pointer-events: var(--embeddable-pointer-events, all) !important; - display: flex; +/* Removes the default margin */ +body { + margin: 0; } diff --git a/src/frontend/index.tsx b/src/frontend/index.tsx index 2d4d0b3..1401655 100644 --- a/src/frontend/index.tsx +++ b/src/frontend/index.tsx @@ -1,50 +1,34 @@ import React, { StrictMode } from "react"; import { createRoot } from "react-dom/client"; - -import posthog from "./src/utils/posthog"; -import { PostHogProvider } from 'posthog-js/react'; - -import { QueryClientProvider } from '@tanstack/react-query'; +import { QueryClient, QueryClientProvider } from '@tanstack/react-query'; import { ReactQueryDevtools } from '@tanstack/react-query-devtools'; -import { queryClient } from './src/api/queryClient'; + +// import posthog from "./src/lib/posthog"; +// import { PostHogProvider } from 'posthog-js/react'; import "@atyrode/excalidraw/index.css"; import "./index.scss"; -import type * as TExcalidraw from "@atyrode/excalidraw"; - import App from "./src/App"; import AuthGate from "./src/AuthGate"; -import { BuildVersionCheck } from "./src/BuildVersionCheck"; -declare global { - interface Window { - ExcalidrawLib: typeof TExcalidraw; - } -} +// Create a client +const queryClient = new QueryClient(); async function initApp() { const rootElement = document.getElementById("root")!; const root = createRoot(rootElement); - const { Excalidraw } = window.ExcalidrawLib; root.render( - - - - - - { }} - 
excalidrawLib={window.ExcalidrawLib} - > - - - - - - - , + // + + {/* */} + + + {/* */} + + + // , ); } diff --git a/src/frontend/package.json b/src/frontend/package.json index 707a206..ca1f4ab 100644 --- a/src/frontend/package.json +++ b/src/frontend/package.json @@ -1,9 +1,9 @@ { - "name": "with-script-in-browser", + "name": "pad.ws", "version": "1.0.0", "private": true, "dependencies": { - "@atyrode/excalidraw": "^0.18.0-9", + "@atyrode/excalidraw": "^0.18.0-15", "@monaco-editor/react": "^4.7.0", "@tanstack/react-query": "^5.74.3", "@tanstack/react-query-devtools": "^5.74.3", @@ -11,10 +11,14 @@ "browser-fs-access": "0.29.1", "clsx": "^2.1.1", "crypto-js": "^4.2.0", + "lodash.isequal": "^4.5.0", + "lodash.throttle": "^4.1.1", "lucide-react": "^0.488.0", "posthog-js": "^1.236.0", "react": "19.0.0", - "react-dom": "19.0.0" + "react-dom": "19.0.0", + "react-use-websocket": "^4.13.0", + "zod": "^3.24.4" }, "resolutions": { "cytoscape": "3.31.2", diff --git a/src/frontend/public/auth/popup-close.html b/src/frontend/public/auth/popup-close.html index f86d8b2..419ea47 100644 --- a/src/frontend/public/auth/popup-close.html +++ b/src/frontend/public/auth/popup-close.html @@ -10,4 +10,4 @@ window.close() - + \ No newline at end of file diff --git a/src/frontend/src/App.tsx b/src/frontend/src/App.tsx index 2630c5a..fd1c2b4 100644 --- a/src/frontend/src/App.tsx +++ b/src/frontend/src/App.tsx @@ -1,163 +1,131 @@ -import React, { useState, useCallback, useEffect, useRef } from "react"; -import { useAllPads, useUserProfile } from "./api/hooks"; -import { ExcalidrawWrapper } from "./ExcalidrawWrapper"; -import { debounce } from "./utils/debounce"; -import posthog from "./utils/posthog"; -import { - normalizeCanvasData, - getPadData, - storePadData, - setActivePad, - getActivePad, - getStoredActivePad, - loadPadData -} from "./utils/canvasUtils"; -import { useSaveCanvas } from "./api/hooks"; -import type * as TExcalidraw from "@atyrode/excalidraw"; -import type { NonDeletedExcalidrawElement } from "@atyrode/excalidraw/element/types"; +import React, { useState } from "react"; +import { Excalidraw, MainMenu, Footer } from "@atyrode/excalidraw"; import type { ExcalidrawImperativeAPI, AppState } from "@atyrode/excalidraw/types"; -import { useAuthCheck } from "./api/hooks"; - -export interface AppProps { - useCustom: (api: ExcalidrawImperativeAPI | null, customArgs?: any[]) => void; - customArgs?: any[]; - children?: React.ReactNode; - excalidrawLib: typeof TExcalidraw; -} - -export default function App({ - useCustom, - customArgs, - children, - excalidrawLib, -}: AppProps) { - const { useHandleLibrary, MainMenu } = excalidrawLib; - - const { data: isAuthenticated, isLoading: isAuthLoading } = useAuthCheck(); - const { data: userProfile } = useUserProfile(); - - // Only enable pad queries if authenticated and not loading - const { data: pads } = useAllPads({ - queryKey: ['allPads'], - enabled: isAuthenticated === true && !isAuthLoading, - retry: 1, - }); - - // Get the first pad's data to use as the canvas data - const canvasData = pads && pads.length > 0 ? 
pads[0].data : null; - - // Excalidraw API ref - const [excalidrawAPI, setExcalidrawAPI] = useState(null); - useCustom(excalidrawAPI, customArgs); - useHandleLibrary({ excalidrawAPI }); - - // Using imported functions from canvasUtils.ts - - useEffect(() => { - if (excalidrawAPI && pads && pads.length > 0) { - // Check if there's a stored active pad ID - const storedActivePadId = getStoredActivePad(); - - // Find the pad that matches the stored ID, or use the first pad if no match - let padToActivate = pads[0]; - - if (storedActivePadId) { - // Try to find the pad with the stored ID - const matchingPad = pads.find(pad => pad.id === storedActivePadId); - if (matchingPad) { - console.debug(`[pad.ws] Found stored active pad in App.tsx: ${storedActivePadId}`); - padToActivate = matchingPad; - } else { - console.debug(`[pad.ws] Stored active pad ${storedActivePadId} not found in available pads`); - } - } - - // Set the active pad ID globally - setActivePad(padToActivate.id); - - // Load the pad data for the selected pad - loadPadData(excalidrawAPI, padToActivate.id, padToActivate.data); - } - }, [excalidrawAPI, pads]); - - const { mutate: saveCanvas } = useSaveCanvas({ - onSuccess: () => { - console.debug("[pad.ws] Canvas saved to database successfully"); - }, - onError: (error) => { - console.error("[pad.ws] Failed to save canvas to database:", error); - } - }); - - - useEffect(() => { - if (excalidrawAPI) { - (window as any).excalidrawAPI = excalidrawAPI; - } - return () => { - (window as any).excalidrawAPI = null; - }; - }, [excalidrawAPI]); - - const lastSentCanvasDataRef = useRef(""); - - const debouncedLogChange = useCallback( - debounce( - (elements: NonDeletedExcalidrawElement[], state: AppState, files: any) => { - if (!isAuthenticated) return; - - // Get the active pad ID using the imported function - const activePadId = getActivePad(); - if (!activePadId) return; - - const canvasData = { - elements, - appState: state, - files - }; - - const serialized = JSON.stringify(canvasData); - if (serialized !== lastSentCanvasDataRef.current) { - lastSentCanvasDataRef.current = serialized; - - // Store the canvas data in local storage - storePadData(activePadId, canvasData); - - // Save the canvas data to the server - saveCanvas(canvasData); - } - }, - 1200 - ), - [saveCanvas, isAuthenticated, storePadData] - ); - - useEffect(() => { - if (userProfile?.id) { - posthog.identify(userProfile.id); - if (posthog.people && typeof posthog.people.set === "function") { - const { - id, // do not include in properties - ...personProps - } = userProfile; - posthog.people.set(personProps); - } - } - }, [userProfile]); +import type { ExcalidrawEmbeddableElement, NonDeleted } from "@atyrode/excalidraw/element/types"; + +// Hooks +import { useAuthStatus } from "./hooks/useAuthStatus"; +import { usePadTabs } from "./hooks/usePadTabs"; +import { useCallbackRefState } from "./hooks/useCallbackRefState"; + +// Components +import DiscordButton from './ui/DiscordButton'; +import { MainMenuConfig } from './ui/MainMenu'; +import AuthDialog from './ui/AuthDialog'; +import SettingsDialog from './ui/SettingsDialog'; +import Collab from './lib/collab/Collab'; + +// Utils +import { initializePostHog } from "./lib/posthog"; +import { lockEmbeddables, renderCustomEmbeddable } from './CustomEmbeddableRenderer'; +import Tabs from "./ui/Tabs"; +import { INITIAL_APP_DATA, HIDDEN_UI_ELEMENTS } from "./constants"; + +export default function App() { + const { isAuthenticated, isLoading: isLoadingAuth, user } = useAuthStatus(); + + 
const { + tabs, + selectedTabId, + isLoading: isLoadingTabs, + createNewPadAsync, + isCreating: isCreatingPad, + renamePad, + deletePad, + selectTab, + updateSharingPolicy, + leaveSharedPad + } = usePadTabs(isAuthenticated); + + const [showSettingsModal, setShowSettingsModal] = useState(false); + const [excalidrawAPI, excalidrawRefCallback] = useCallbackRefState(); + + + const handleCloseSettingsModal = () => { + setShowSettingsModal(false); + }; + + const handleOnScrollChange = (scrollX: number, scrollY: number) => { + lockEmbeddables(excalidrawAPI?.getAppState()); + }; + + // useEffect(() => { + // if (appConfig?.posthogKey && appConfig?.posthogHost) { + // initializePostHog({ + // posthogKey: appConfig.posthogKey, + // posthogHost: appConfig.posthogHost, + // }); + // } else if (configError) { + // console.error('[pad.ws] Failed to load app config:', configError); + // } + // }, [appConfig, configError]); return ( <> - , appState: AppState) => { + return renderCustomEmbeddable(element, appState, excalidrawAPI); + }} + renderTopRightUI={() => ( +
+ +
+ )} > - {children} -
- + + + {!isLoadingAuth && !isAuthenticated && ( + { }} /> + )} + + {showSettingsModal && ( + + )} + + {excalidrawAPI && ( +
+ {isAuthenticated && ( + + )} +
+ )} + {excalidrawAPI && user && ( + + )} + ); } diff --git a/src/frontend/src/AuthGate.tsx b/src/frontend/src/AuthGate.tsx index 6434020..d10d9c5 100644 --- a/src/frontend/src/AuthGate.tsx +++ b/src/frontend/src/AuthGate.tsx @@ -1,6 +1,6 @@ import React, { useEffect, useRef, useState } from "react"; -import { useAuthCheck } from "./api/hooks"; -import { getAppConfig } from "./api/configService"; +import { useAppConfig } from "./hooks/useAppConfig"; // Import useAppConfig +import { useAuthStatus } from "./hooks/useAuthStatus"; /** * If unauthenticated, it shows the AuthModal as an overlay, but still renders the app behind it. @@ -11,22 +11,20 @@ import { getAppConfig } from "./api/configService"; * * The iframe is removed as soon as it loads, or after a fallback timeout. */ -export default function AuthGate({ children }: { children: React.ReactNode }) { - const { data: isAuthenticated, isLoading } = useAuthCheck(); +export default function AuthGate() { const [coderAuthDone, setCoderAuthDone] = useState(false); const iframeRef = useRef(null); const timeoutRef = useRef(null); + const { config, isLoadingConfig, configError } = useAppConfig(); // Use the hook + const { isAuthenticated, isLoading: isLoadingAuth } = useAuthStatus(); useEffect(() => { - // Only run the Coder OIDC priming once per session, after auth is confirmed - if (isAuthenticated === true && !coderAuthDone) { + if (isAuthenticated && !isLoadingAuth && !coderAuthDone && config && !isLoadingConfig && !configError) { + console.debug('[pad.ws] Priming Coder OIDC session'); const setupIframe = async () => { try { - // Get config from API - const config = await getAppConfig(); - if (!config.coderUrl) { - console.warn('[pad.ws] Coder URL not found, skipping OIDC priming'); + console.warn('[pad.ws] Coder URL not found in config, skipping OIDC priming'); setCoderAuthDone(true); return; } @@ -65,10 +63,12 @@ export default function AuthGate({ children }: { children: React.ReactNode }) { clearTimeout(timeoutRef.current); } }; + } else if (configError) { + console.error('[pad.ws] Failed to load app config for OIDC priming:', configError); + setCoderAuthDone(true); // Mark as done to prevent retries if config fails } // eslint-disable-next-line react-hooks/exhaustive-deps - }, [isAuthenticated, coderAuthDone]); + }, [isAuthenticated, isLoadingAuth, coderAuthDone, config, isLoadingConfig, configError]); - // Just render children - AuthModal is now handled by ExcalidrawWrapper - return <>{children}; + return null; } diff --git a/src/frontend/src/BuildVersionCheck.tsx b/src/frontend/src/BuildVersionCheck.tsx deleted file mode 100644 index 162732c..0000000 --- a/src/frontend/src/BuildVersionCheck.tsx +++ /dev/null @@ -1,61 +0,0 @@ -import { useEffect, useState, useCallback } from 'react'; -import { useBuildInfo, useSaveCanvas } from './api/hooks'; -import { saveCurrentCanvas } from './utils/canvasUtils'; - -/** - * Component that checks for application version changes and refreshes the page when needed. - * This component doesn't render anything visible. 
- */ -export function BuildVersionCheck() { - // Store the initial build hash when the component first loads - const [initialBuildHash, setInitialBuildHash] = useState(null); - - // Query for the current build info from the server - const { data: buildInfo } = useBuildInfo(); - - // Get the saveCanvas mutation - const { mutate: saveCanvas } = useSaveCanvas({ - onSuccess: () => { - console.debug("[pad.ws] Canvas saved before refresh"); - // Refresh the page immediately after saving - window.location.reload(); - }, - onError: (error) => { - console.error("[pad.ws] Failed to save canvas before refresh:", error); - // Refresh anyway even if save fails - window.location.reload(); - } - }); - - // Function to handle version update - const handleVersionUpdate = useCallback(() => { - // Save the canvas and then refresh - saveCurrentCanvas( - saveCanvas, - undefined, // No success callback needed as it's handled in the useSaveCanvas hook - () => window.location.reload() // On error, just refresh - ); - }, [saveCanvas]); - - useEffect(() => { - // On first load, store the initial build hash - if (buildInfo?.buildHash && initialBuildHash === null) { - console.debug('[pad.ws] Initial build hash:', buildInfo.buildHash); - setInitialBuildHash(buildInfo.buildHash); - } - - // If we have both values and they don't match, a new version is available - if (initialBuildHash !== null && - buildInfo?.buildHash && - initialBuildHash !== buildInfo.buildHash) { - - console.debug('[pad.ws] New version detected. Current:', initialBuildHash, 'New:', buildInfo.buildHash); - - // Save the canvas and then refresh - handleVersionUpdate(); - } - }, [buildInfo, initialBuildHash, handleVersionUpdate]); - - // This component doesn't render anything - return null; -} diff --git a/src/frontend/src/CustomEmbeddableRenderer.tsx b/src/frontend/src/CustomEmbeddableRenderer.tsx index 6a34312..4023add 100644 --- a/src/frontend/src/CustomEmbeddableRenderer.tsx +++ b/src/frontend/src/CustomEmbeddableRenderer.tsx @@ -1,6 +1,6 @@ import React, { useEffect, useState } from 'react'; import { Lock } from 'lucide-react'; -import { debounce } from './utils/debounce'; +import { debounce } from './lib/debounce'; import type { NonDeleted, ExcalidrawEmbeddableElement } from '@atyrode/excalidraw/element/types'; import type { AppState } from '@atyrode/excalidraw/types'; import { @@ -9,6 +9,7 @@ import { ControlButton, Editor, Terminal, + DevTools, } from './pad'; import { ActionButton } from './pad/buttons'; import "./CustomEmbeddableRenderer.scss"; @@ -63,6 +64,10 @@ export const renderCustomEmbeddable = ( content = ; title = "Dashboard"; break; + case 'dev': + content = ; + title = "Dev Tools"; + break; default: title = "Untitled"; return null; diff --git a/src/frontend/src/ExcalidrawWrapper.tsx b/src/frontend/src/ExcalidrawWrapper.tsx deleted file mode 100644 index 267e548..0000000 --- a/src/frontend/src/ExcalidrawWrapper.tsx +++ /dev/null @@ -1,168 +0,0 @@ -import React, { Children, cloneElement, useState, useEffect } from 'react'; -import DiscordButton from './ui/DiscordButton'; -import GitHubButton from './ui/GitHubButton'; -import type { ExcalidrawImperativeAPI } from '@atyrode/excalidraw/types'; -import type { NonDeletedExcalidrawElement } from '@atyrode/excalidraw/element/types'; -import type { AppState } from '@atyrode/excalidraw/types'; -import { MainMenuConfig } from './ui/MainMenu'; -import { lockEmbeddables, renderCustomEmbeddable } from './CustomEmbeddableRenderer'; -import AuthDialog from './ui/AuthDialog'; -import BackupsModal 
from './ui/BackupsDialog'; -import PadsDialog from './ui/PadsDialog'; -import SettingsDialog from './ui/SettingsDialog'; -import { capture } from './utils/posthog'; -import { Footer } from '@atyrode/excalidraw'; -import Tabs from './ui/Tabs'; - -const defaultInitialData = { - elements: [], - appState: { - gridModeEnabled: true, - gridSize: 20, - gridStep: 5, - }, - files: {}, -}; - -interface ExcalidrawWrapperProps { - children: React.ReactNode; - excalidrawAPI: ExcalidrawImperativeAPI | null; - setExcalidrawAPI: (api: ExcalidrawImperativeAPI) => void; - initialData?: any; - onChange: (elements: NonDeletedExcalidrawElement[], state: AppState) => void; - onScrollChange: (scrollX: number, scrollY: number) => void; - MainMenu: any; - renderTopRightUI?: () => React.ReactNode; - isAuthenticated?: boolean | null; - isAuthLoading?: boolean; -} - -export const ExcalidrawWrapper: React.FC = ({ - children, - excalidrawAPI, - setExcalidrawAPI, - initialData, - onChange, - onScrollChange, - MainMenu, - renderTopRightUI, - isAuthenticated = null, - isAuthLoading = false, -}) => { - // Add state for modal animation - const [isExiting, setIsExiting] = useState(false); - - // State for modals - const [showBackupsModal, setShowBackupsModal] = useState(false); - const [showPadsModal, setShowPadsModal] = useState(false); - const [showSettingsModal, setShowSettingsModal] = useState(false); - - // Handle auth state changes - useEffect(() => { - if (isAuthenticated === true) { - setIsExiting(true); - capture('signed_in'); - } - }, [isAuthenticated]); - - - // Handlers for closing modals - const handleCloseBackupsModal = () => { - setShowBackupsModal(false); - }; - - const handleClosePadsModal = () => { - setShowPadsModal(false); - }; - - const handleCloseSettingsModal = () => { - setShowSettingsModal(false); - }; - - const renderExcalidraw = (children: React.ReactNode) => { - const Excalidraw = Children.toArray(children).find( - (child: any) => - React.isValidElement(child) && - typeof child.type !== "string" && - child.type.displayName === "Excalidraw", - ); - - if (!Excalidraw) { - return null; - } - - return cloneElement( - Excalidraw as React.ReactElement, - { - excalidrawAPI: (api: ExcalidrawImperativeAPI) => setExcalidrawAPI(api), - theme: "dark", - initialData: initialData ?? defaultInitialData, - onChange: onChange, - name: "Pad.ws", - onScrollChange: (scrollX, scrollY) => { - lockEmbeddables(excalidrawAPI?.getAppState()); - if (onScrollChange) onScrollChange(scrollX, scrollY); - }, - validateEmbeddable: true, - renderEmbeddable: (element, appState) => renderCustomEmbeddable(element, appState, excalidrawAPI), - renderTopRightUI: renderTopRightUI ?? (() => ( -
- - -
- )), - }, - <> - {excalidrawAPI && ( -
- -
- )} - - {!isAuthLoading && isAuthenticated === false && ( - {}} - /> - )} - - {showBackupsModal && ( - - )} - - {showPadsModal && ( - - )} - - {showSettingsModal && ( - - )} - - ); - }; - - return ( -
- {renderExcalidraw(children)} -
- ); -}; diff --git a/src/frontend/src/api/apiUtils.ts b/src/frontend/src/api/apiUtils.ts deleted file mode 100644 index 5c6f6bf..0000000 --- a/src/frontend/src/api/apiUtils.ts +++ /dev/null @@ -1,51 +0,0 @@ -import { queryClient } from './queryClient'; - -/** - * Handle unauthorized errors by updating the auth state in the query cache - * This will trigger the AuthModal to appear - */ -export function handleUnauthorized() { - // Set auth state to false to trigger the AuthModal - queryClient.setQueryData(['auth'], false); -} - -// Common error handling for API responses -export async function handleResponse(response: Response) { - if (!response.ok) { - if (response.status === 401) { - // Update auth state when 401 is encountered - handleUnauthorized(); - throw new Error('Unauthorized'); - } - - const errorText = await response.text(); - throw new Error(errorText || `API error: ${response.status}`); - } - - // For endpoints that return no content - if (response.status === 204) { - return null; - } - - // For endpoints that return JSON - return response.json(); -} - -// Base fetch function with error handling -export async function fetchApi(url: string, options?: RequestInit) { - try { - const response = await fetch(url, { - ...options, - credentials: 'include', - headers: { - 'Content-Type': 'application/json', - ...options?.headers, - }, - }); - - return handleResponse(response); - } catch (error) { - // Re-throw the error after handling it - throw error; - } -} diff --git a/src/frontend/src/api/configService.ts b/src/frontend/src/api/configService.ts deleted file mode 100644 index 66050cf..0000000 --- a/src/frontend/src/api/configService.ts +++ /dev/null @@ -1,39 +0,0 @@ -import { fetchApi } from './apiUtils'; - -/** - * Application configuration interface - */ -export interface AppConfig { - coderUrl: string; - posthogKey: string; - posthogHost: string; -} - -// Cache the config to avoid unnecessary API calls -let cachedConfig: AppConfig | null = null; - -/** - * Get the application configuration from the API - * @returns The application configuration - */ -export async function getAppConfig(): Promise { - // Return cached config if available - if (cachedConfig) { - return cachedConfig; - } - - try { - // Fetch config from API - const config = await fetchApi('/api/app/config'); - cachedConfig = config; - return config; - } catch (error) { - console.error('[pad.ws] Failed to load application configuration:', error); - // Return default values as fallback - return { - coderUrl: '', - posthogKey: '', - posthogHost: '' - }; - } -} diff --git a/src/frontend/src/api/hooks.ts b/src/frontend/src/api/hooks.ts deleted file mode 100644 index a4870fa..0000000 --- a/src/frontend/src/api/hooks.ts +++ /dev/null @@ -1,330 +0,0 @@ -import { useQuery, useMutation, UseQueryOptions, UseMutationOptions } from '@tanstack/react-query'; -import { fetchApi } from './apiUtils'; -import { queryClient } from './queryClient'; - -// Types -export interface WorkspaceState { - status: 'running' | 'starting' | 'stopping' | 'stopped' | 'error'; - username: string | null; - name: string | null; - base_url: string | null; - agent: string | null; - id: string | null; - error?: string; -} - -export interface UserProfile { - id: string; - email: string; - username: string; - name: string; - given_name: string; - family_name: string; - email_verified: boolean; -} - -export interface CanvasData { - elements: any[]; - appState: any; - files: any; -} - -export interface PadData { - id: string; - owner_id: string; - display_name: 
string; - data: CanvasData; - created_at: string; - updated_at: string; -} - -export interface CanvasBackup { - id: number; - timestamp: string; - data: CanvasData; -} - -export interface CanvasBackupsResponse { - backups: CanvasBackup[]; - pad_name?: string; -} - -export interface BuildInfo { - buildHash: string; - timestamp: number; -} - -// API functions -export const api = { - // Authentication - checkAuth: async (): Promise => { - try { - await fetchApi('/api/workspace/state'); - return true; - } catch (error) { - if (error instanceof Error && error.message === 'Unauthorized') { - return false; - } - throw error; - } - }, - - // User profile - getUserProfile: async (): Promise => { - try { - const result = await fetchApi('/api/users/me'); - return result; - } catch (error) { - throw error; - } - }, - - // Workspace - getWorkspaceState: async (): Promise => { - try { - const result = await fetchApi('/api/workspace/state'); - // Map backend 'state' property to frontend 'status' - return { ...result, status: result.state }; - } catch (error) { - // Let the error propagate to be handled by the global error handler - throw error; - } - }, - - startWorkspace: async (): Promise => { - try { - const result = await fetchApi('/api/workspace/start', { method: 'POST' }); - return result; - } catch (error) { - throw error; - } - }, - - stopWorkspace: async (): Promise => { - try { - const result = await fetchApi('/api/workspace/stop', { method: 'POST' }); - return result; - } catch (error) { - throw error; - } - }, - - // Canvas functions are now handled through getAllPads - - getAllPads: async (): Promise => { - try { - const result = await fetchApi('/api/pad'); - return result; - } catch (error) { - throw error; - } - }, - - saveCanvas: async (data: CanvasData): Promise => { - try { - // Get the active pad ID from the global variable - const activePadId = (window as any).activePadId; - - // We must have an active pad ID to save - if (!activePadId) { - throw new Error("No active pad ID found. 
Cannot save canvas."); - } - - // Use the specific pad endpoint - const endpoint = `/api/pad/${activePadId}`; - - const result = await fetchApi(endpoint, { - method: 'POST', - body: JSON.stringify(data), - }); - return result; - } catch (error) { - throw error; - } - }, - - renamePad: async (padId: string, newName: string): Promise => { - try { - const endpoint = `/api/pad/${padId}`; - const result = await fetchApi(endpoint, { - method: 'PATCH', - body: JSON.stringify({ display_name: newName }), - }); - return result; - } catch (error) { - throw error; - } - }, - - deletePad: async (padId: string): Promise => { - try { - const endpoint = `/api/pad/${padId}`; - const result = await fetchApi(endpoint, { - method: 'DELETE', - }); - return result; - } catch (error) { - throw error; - } - }, - - getDefaultCanvas: async (): Promise => { - try { - const result = await fetchApi('/api/templates/default'); - return result.data; - } catch (error) { - throw error; - } - }, - - // Canvas Backups - getCanvasBackups: async (limit: number = 10): Promise => { - try { - const result = await fetchApi(`/api/pad/recent?limit=${limit}`); - return result; - } catch (error) { - throw error; - } - }, - - getPadBackups: async (padId: string, limit: number = 10): Promise => { - try { - const result = await fetchApi(`/api/pad/${padId}/backups?limit=${limit}`); - return result; - } catch (error) { - throw error; - } - }, - - // Build Info - getBuildInfo: async (): Promise => { - try { - const result = await fetchApi('/api/app/build-info'); - return result; - } catch (error) { - throw error; - } - }, -}; - -// Query hooks -export function useAuthCheck(options?: UseQueryOptions) { - return useQuery({ - queryKey: ['auth'], - queryFn: api.checkAuth, - ...options, - }); -} - -export function useUserProfile(options?: UseQueryOptions) { - return useQuery({ - queryKey: ['userProfile'], - queryFn: api.getUserProfile, - ...options, - }); -} - -export function useWorkspaceState(options?: UseQueryOptions) { - // Get the current auth state from the query cache - const authState = queryClient.getQueryData(['auth']); - - return useQuery({ - queryKey: ['workspaceState'], - queryFn: api.getWorkspaceState, - // Only poll if authenticated - refetchInterval: authState === true ? 5000 : false, // Poll every 5 seconds if authenticated, otherwise don't poll - // Don't retry on error if not authenticated - retry: authState === true ? 1 : false, - ...options, - }); -} - -export function useAllPads(options?: UseQueryOptions) { - return useQuery({ - queryKey: ['allPads'], - queryFn: api.getAllPads, - ...options, - }); -} - -export function useCanvasBackups(limit: number = 10, options?: UseQueryOptions) { - return useQuery({ - queryKey: ['canvasBackups', limit], - queryFn: () => api.getCanvasBackups(limit), - ...options, - }); -} - -export function usePadBackups(padId: string | null, limit: number = 10, options?: UseQueryOptions) { - return useQuery({ - queryKey: ['padBackups', padId, limit], - queryFn: () => padId ? 
api.getPadBackups(padId, limit) : Promise.reject('No pad ID provided'), - enabled: !!padId, // Only run the query if padId is provided - ...options, - }); -} - -export function useBuildInfo(options?: UseQueryOptions) { - return useQuery({ - queryKey: ['buildInfo'], - queryFn: api.getBuildInfo, - refetchInterval: 60000, // Check every minute - ...options, - }); -} - -// Mutation hooks -export function useStartWorkspace(options?: UseMutationOptions) { - return useMutation({ - mutationFn: api.startWorkspace, - onSuccess: () => { - // Invalidate workspace state query to trigger refetch - queryClient.invalidateQueries({ queryKey: ['workspaceState'] }); - }, - ...options, - }); -} - -export function useStopWorkspace(options?: UseMutationOptions) { - return useMutation({ - mutationFn: api.stopWorkspace, - onSuccess: () => { - // Invalidate workspace state query to trigger refetch - queryClient.invalidateQueries({ queryKey: ['workspaceState'] }); - }, - ...options, - }); -} - -export function useSaveCanvas(options?: UseMutationOptions) { - return useMutation({ - mutationFn: api.saveCanvas, - onSuccess: () => { - // Get the active pad ID from the global variable - const activePadId = (window as any).activePadId; - - // Invalidate canvas backups queries to trigger refetch - queryClient.invalidateQueries({ queryKey: ['canvasBackups'] }); - if (activePadId) { - queryClient.invalidateQueries({ queryKey: ['padBackups', activePadId] }); - } - }, - ...options, - }); -} - -export function useRenamePad(options?: UseMutationOptions) { - return useMutation({ - mutationFn: ({ padId, newName }) => api.renamePad(padId, newName), - // No automatic invalidation - we'll update the cache manually - ...options, - }); -} - -export function useDeletePad(options?: UseMutationOptions) { - return useMutation({ - mutationFn: (padId) => api.deletePad(padId), - // No automatic invalidation - we'll update the cache manually - ...options, - }); -} diff --git a/src/frontend/src/api/queryClient.ts b/src/frontend/src/api/queryClient.ts deleted file mode 100644 index bc91540..0000000 --- a/src/frontend/src/api/queryClient.ts +++ /dev/null @@ -1,14 +0,0 @@ -import { QueryClient } from '@tanstack/react-query'; - -// Create a client -export const queryClient = new QueryClient({ - defaultOptions: { - queries: { - retry: 1, - refetchOnWindowFocus: true, - staleTime: 30000, // 30 seconds - gcTime: 1000 * 60 * 5, // 5 minutes (formerly cacheTime) - refetchOnMount: true, - }, - }, -}); diff --git a/src/frontend/src/constants.ts b/src/frontend/src/constants.ts new file mode 100644 index 0000000..9e8bf94 --- /dev/null +++ b/src/frontend/src/constants.ts @@ -0,0 +1,26 @@ +// Default app values +export const INITIAL_APP_DATA = { + appState: { + theme: "dark", + gridModeEnabled: true, + gridSize: 20, + gridStep: 5, + }, + elements: [], + files: [], +}; + +// UI elements +export const HIDDEN_UI_ELEMENTS = { + toolbar: false, + zoomControls: false, + undoRedo: false, + helpButton: false, + mainMenu: false, + sidebar: true, +}; + +// Collab constants +export const POINTER_MOVE_THROTTLE_MS = 30; // Throttle pointer move events to reduce the number of updates sent to the server +export const ENABLE_PERIODIC_FULL_SYNC = false; // Set to false to disable periodic scene_update full sync +export const PERIODIC_FULL_SYNC_INTERVAL_MS = 60000; // Sync scene_update every 60 seconds if ENABLE_PERIODIC_FULL_SYNC is true diff --git a/src/frontend/src/css/_colors.scss b/src/frontend/src/css/_colors.scss new file mode 100644 index 0000000..7841564 --- /dev/null 
+++ b/src/frontend/src/css/_colors.scss @@ -0,0 +1,24 @@ +// Black & White +$black: #000; +$white: #fff; + +// Greys (https://coolors.co/fafafa-f6f6f6-eeeeee-dddddd-cccccc-999999-666666-595959-333333) +$grey-100: #fafafa; +$grey-200: #f6f6f6; +$grey-300: #eee; +$grey-400: #ddd; +$grey-500: #ccc; +$grey-600: #999; +$grey-700: #666; +$grey-800: #595959; +$grey-900: #333; + + +// Accent colors (https://coolors.co/ff9747-ff7d1a-f06800-e06100-cc5800) +$accent-faded-light: hsl(26, 100%, 90%); +$accent-light: hsl(26, 100%, 64%); +$accent-lighter: hsl(26, 100%, 55%); +$accent: hsl(26, 100%, 47%); +$accent-darker: hsl(26, 100%, 44%); +$accent-dark: hsl(26, 100%, 40%); +$accent-faded-dark: hsl(26, 20%, 20%); \ No newline at end of file diff --git a/src/frontend/src/css/_excalidraw-overrides.scss b/src/frontend/src/css/_excalidraw-overrides.scss new file mode 100644 index 0000000..19bf3aa --- /dev/null +++ b/src/frontend/src/css/_excalidraw-overrides.scss @@ -0,0 +1,79 @@ +/* Excalidraw overrides */ +.excalidraw { + + /* Dropdown "main" menu's titles */ + .dropdown-menu-group-title { + font-weight: 1000 !important; + font-size: 15px !important; + margin-left: 11px !important; + } + + .Modal { + + /* When a modal is opening */ + &__content { + animation: Modal__content_fade-in 0.3s ease-out forwards !important; /* Fades in the modal */ + } + + /* When a modal is open */ + &__background { + background-color: rgba(0, 0, 0, 0.2); /* Fades out the background */ + backdrop-filter: blur(1px); /* Blurs the background */ + } + + } + + /* Excalidraw embeddables container */ + &__embeddable-container { + &__inner { /* 2nd layer */ + border-color: $grey-700 !important; /* Border color */ + overflow: visible !important; /* Allows display content on the border (title bar, etc) */ + } + } + + /* Excalidraw embeddable (inside the container) */ + &__embeddable { + &__outer { /* 3rd layer */ + padding: 0px !important; /* Removes the default padding */ + pointer-events: var(--embeddable-pointer-events, all) !important; /* Allows conditional interaction with the embeddable */ + } + } +} + +/* Overriding Excalidraw's default (light theme) colors */ +.excalidraw { + + --select-highlight-color: #{$accent-lighter} !important; /* Context menu highlight color */ + --color-on-primary-container: #{$black} !important; + --color-surface-primary-container: #{$accent-light} !important; + --color-selection: #{$accent-lighter} !important; + + .dropdown-menu-button { + &:hover { + background-color: $accent-faded-light !important; + } + } + + .dropdown-menu-group-title { + color: $accent-lighter !important; + } +} + +/* Overriding Excalidraw's default (dark theme) colors */ +.excalidraw.theme--dark { + + --select-highlight-color: #{$accent-darker} !important; /* Context menu highlight color */ + --color-on-primary-container: #{$white} !important; + --color-surface-primary-container: #{$accent} !important; + --color-selection: #{$accent-darker} !important; + + .dropdown-menu-button { + &:hover { + background-color: $accent-faded-dark !important; + } + } + .dropdown-menu-group-title { + color: $accent-darker !important; + } +} + \ No newline at end of file diff --git a/src/frontend/src/css/_fonts.scss b/src/frontend/src/css/_fonts.scss new file mode 100644 index 0000000..b611d26 --- /dev/null +++ b/src/frontend/src/css/_fonts.scss @@ -0,0 +1,8 @@ +@font-face { + font-family: 'Roboto'; + src: url('/assets/fonts/Roboto-VariableFont_wdth,wght.ttf') format('truetype-variations'); + font-weight: 100 900; + font-stretch: 75% 100%; + 
font-style: normal; + font-display: swap; +} \ No newline at end of file diff --git a/src/frontend/src/env.d.ts deleted file mode 100644 index fc443ba..0000000 --- a/src/frontend/src/env.d.ts +++ /dev/null @@ -1,11 +0,0 @@ -/// <reference types="vite/client" /> - -interface ImportMetaEnv { - readonly VITE_PUBLIC_POSTHOG_KEY: string - readonly VITE_PUBLIC_POSTHOG_HOST: string - readonly CODER_URL: string -} - -interface ImportMeta { - readonly env: ImportMetaEnv -} \ No newline at end of file diff --git a/src/frontend/src/global.d.ts deleted file mode 100644 index 70713f3..0000000 --- a/src/frontend/src/global.d.ts +++ /dev/null @@ -1,3 +0,0 @@ -interface Window { - ExcalidrawLib: any; -} diff --git a/src/frontend/src/hooks/useAppConfig.ts new file mode 100644 index 0000000..5a52f95 --- /dev/null +++ b/src/frontend/src/hooks/useAppConfig.ts @@ -0,0 +1,40 @@ +import { useQuery } from '@tanstack/react-query'; + +interface AppConfig { + coderUrl: string; + posthogKey: string; + posthogHost: string; +} + +const fetchAppConfig = async (): Promise<AppConfig> => { + const response = await fetch('/api/app/config'); + if (!response.ok) { + let errorMessage = 'Failed to fetch app configuration.'; + try { + const errorData = await response.json(); + if (errorData && errorData.message) { + errorMessage = errorData.message; + } + } catch (e) { + // Ignore if error response is not JSON or empty + } + throw new Error(errorMessage); + } + return response.json(); +}; + +export const useAppConfig = () => { + const { data, isLoading, error, isError } = useQuery({ + queryKey: ['appConfig'], + queryFn: fetchAppConfig, + staleTime: Infinity, // Config is not expected to change during a session + gcTime: Infinity, // Renamed from cacheTime in v5 + }); + + return { + config: data, + isLoadingConfig: isLoading, + configError: error, + isConfigError: isError, + }; +}; diff --git a/src/frontend/src/hooks/useAuthStatus.ts new file mode 100644 index 0000000..cdfb400 --- /dev/null +++ b/src/frontend/src/hooks/useAuthStatus.ts @@ -0,0 +1,92 @@ +import { useQuery, useQueryClient } from '@tanstack/react-query'; +import { useEffect } from 'react'; +import { scheduleTokenRefresh, AUTH_STATUS_KEY } from '../lib/authRefreshManager'; + +export interface UserInfo { + id?: string; + username?: string; + email?: string; + name?: string; +} + +interface AuthStatusResponse { + authenticated: boolean; + user?: UserInfo; + expires_in?: number; + message?: string; +} + +// API function for getting status +const getAuthStatus = async (): Promise<AuthStatusResponse> => { + const response = await fetch('/api/auth/status'); + if (!response.ok) { + throw new Error('Failed to fetch authentication status'); + } + return response.json(); +}; + +export const useAuthStatus = () => { + const queryClient = useQueryClient(); + + // Main auth status query + const { + data, + isLoading, + error, + isError, + refetch + } = useQuery({ + queryKey: [AUTH_STATUS_KEY], + queryFn: getAuthStatus, + staleTime: 4 * 60 * 1000, // 4 minutes + }); + + // Schedule refresh when auth data changes + useEffect(() => { + if (!data?.authenticated || !data?.expires_in) return; + + scheduleTokenRefresh( + data.expires_in, + // Success callback + (refreshedData) => { + queryClient.setQueryData([AUTH_STATUS_KEY], (initialData: AuthStatusResponse | undefined) => { + if (refreshedData.authenticated) { + return { + ...initialData, + authenticated: refreshedData.authenticated, + expires_in:
refreshedData.expires_in, + }; + } + // If refresh resulted in not authenticated, return the new (unauthenticated) data. + return refreshedData; + }); + }, + // Error callback + () => { + queryClient.invalidateQueries({ queryKey: [AUTH_STATUS_KEY] }); + } + ); + }, [data?.authenticated, data?.expires_in, queryClient]); + + // Handle auth events from popup windows + useEffect(() => { + const handleStorageChange = (event: StorageEvent) => { + if (event.key === 'auth_completed') { + queryClient.invalidateQueries({ queryKey: [AUTH_STATUS_KEY] }); + } + }; + + window.addEventListener('storage', handleStorageChange); + return () => window.removeEventListener('storage', handleStorageChange); + }, [queryClient]); + + return { + isAuthenticated: data?.authenticated, + user: data?.user, + expires_in: data?.expires_in, + isLoading, + error: error || (data?.authenticated === false && data?.message ? new Error(data.message) : null), + isError, + refetchAuthStatus: refetch, + }; +}; diff --git a/src/frontend/src/hooks/useCallbackRefState.ts new file mode 100644 index 0000000..4a8552b --- /dev/null +++ b/src/frontend/src/hooks/useCallbackRefState.ts @@ -0,0 +1,7 @@ +import { useCallback, useState } from "react"; + +export const useCallbackRefState = <T,>() => { + const [refValue, setRefValue] = useState<T | null>(null); + const refCallback = useCallback((value: T | null) => setRefValue(value), []); + return [refValue, refCallback] as const; +}; diff --git a/src/frontend/src/hooks/useLogout.ts new file mode 100644 index 0000000..11f5d9c --- /dev/null +++ b/src/frontend/src/hooks/useLogout.ts @@ -0,0 +1,49 @@ +import { useMutation, useQueryClient } from '@tanstack/react-query'; + +interface LogoutResponse { + status: string; + logout_url: string; +} + +interface LogoutError extends Error {} + +const logoutUser = async (): Promise<LogoutResponse> => { + const response = await fetch('/api/auth/logout', { + method: 'GET', + credentials: 'include', + }); + + if (!response.ok) { + let errorMessage = `Logout failed with status: ${response.status}`; + try { + const errorData = await response.json(); + if (errorData && (errorData.detail || errorData.message)) { + errorMessage = errorData.detail || errorData.message; + } + } catch (e) { + console.warn('[pad.ws] Could not parse JSON from logout error response.'); + } + throw new Error(errorMessage); + } + + return response.json(); +}; + +export const useLogout = () => { + const queryClient = useQueryClient(); + + return useMutation({ + mutationFn: logoutUser, + onSuccess: (data) => { + console.debug('[pad.ws] Logout mutation successful, Keycloak URL:', data.logout_url); + + // Invalidate authStatus query to trigger a re-fetch and update UI. + // This will make useAuthStatus re-evaluate, and isAuthenticated should become false.
+ // TODO + queryClient.invalidateQueries({ queryKey: ['authStatus'] }); + }, + onError: (error) => { + console.error('[pad.ws] Logout mutation failed:', error.message); + }, + }); +}; diff --git a/src/frontend/src/hooks/usePadData.ts b/src/frontend/src/hooks/usePadData.ts new file mode 100644 index 0000000..5b0a4db --- /dev/null +++ b/src/frontend/src/hooks/usePadData.ts @@ -0,0 +1,73 @@ +import { useQuery } from '@tanstack/react-query'; +import { useEffect } from 'react'; +import type { ExcalidrawImperativeAPI, AppState } from "@atyrode/excalidraw/types"; +import type { ExcalidrawElement } from "@atyrode/excalidraw/element/types"; +import { normalizeCanvasData } from '../lib/canvas'; +import { INITIAL_APP_DATA } from '../constants'; + +interface PadData { + elements?: readonly ExcalidrawElement[]; + appState?: Pick; + files?: Record; +} + +const fetchPadById = async (padId: string): Promise => { + const response = await fetch(`/api/pad/${padId}`); + if (!response.ok) { + let errorMessage = 'Failed to fetch pad data.'; + try { + const errorData = await response.json(); + if (errorData && errorData.detail) { + errorMessage = errorData.detail; + } + } catch (e) { + // Ignore if error response is not JSON or empty + } + throw new Error(errorMessage); + } + return response.json(); +}; + +export const usePad = (padId: string | null, excalidrawAPI: ExcalidrawImperativeAPI | null) => { + const isTemporaryPad = padId?.startsWith('temp-'); + + const { data, isLoading, error, isError } = useQuery({ + queryKey: ['pad', padId], + queryFn: () => { + if (!padId) throw new Error("padId is required"); + return fetchPadById(padId); + }, + enabled: !!padId && !isTemporaryPad, + }); + + useEffect(() => { + if (isTemporaryPad && excalidrawAPI) { + console.debug(`[pad.ws] Initializing new temporary pad ${padId}`); + const normalizedData = normalizeCanvasData(INITIAL_APP_DATA); + excalidrawAPI.updateScene(normalizedData); + return; + } + + if (data && excalidrawAPI && !isTemporaryPad) { + const normalizedData = normalizeCanvasData(data); + console.debug(`[pad.ws] Loading pad ${padId}`); + excalidrawAPI.updateScene(normalizedData); + } + }, [data, excalidrawAPI, padId, isTemporaryPad]); + + if (isTemporaryPad) { + return { + padData: INITIAL_APP_DATA, + isLoading: false, + error: null, + isError: false, + }; + } + + return { + padData: data, + isLoading, + error, + isError + }; +}; diff --git a/src/frontend/src/hooks/usePadTabs.ts b/src/frontend/src/hooks/usePadTabs.ts new file mode 100644 index 0000000..57267e9 --- /dev/null +++ b/src/frontend/src/hooks/usePadTabs.ts @@ -0,0 +1,441 @@ +import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query'; +import { useState, useEffect } from 'react'; +import { capture } from "../lib/posthog"; + +export enum SharingPolicy { + PRIVATE = 'private', + WHITELIST = 'whitelist', + PUBLIC = 'public', +} + +export interface Tab { + id: string; + title: string; + ownerId: string; + sharingPolicy: SharingPolicy; + createdAt: string; + updatedAt: string; +} + +interface PadResponse { + tabs: Tab[]; + activeTabId: string; +} + +interface UserResponse { + username: string; + email: string; + email_verified: boolean; + name: string; + given_name: string; + family_name: string; + roles: string[]; + last_selected_pad: string | null; + pads: { + id: string; + display_name: string; + owner_id: string; + sharing_policy: string; + created_at: string; + updated_at: string; + }[]; +} + +const fetchUserPads = async (): Promise => { + const response = await 
fetch('/api/users/me'); + if (!response.ok) { + let errorMessage = 'Failed to fetch user pads.'; + try { + const errorData = await response.json(); + if (errorData && errorData.message) { + errorMessage = errorData.message; + } + } catch (e) { + // Ignore if error response is not JSON or empty + } + throw new Error(errorMessage); + } + const userData: UserResponse = await response.json(); + + // Transform pads into tabs format + const tabs = userData.pads.map(pad => ({ + id: pad.id, + title: pad.display_name, + ownerId: pad.owner_id, + sharingPolicy: pad.sharing_policy as SharingPolicy, + createdAt: pad.created_at, + updatedAt: pad.updated_at + })); + + // Use last_selected_pad if it exists and is in the current tabs, otherwise use first tab + let activeTabId = ''; + if (userData.last_selected_pad && tabs.some(tab => tab.id === userData.last_selected_pad)) { + activeTabId = userData.last_selected_pad; + } else if (tabs.length > 0) { + activeTabId = tabs[0].id; + } + + return { + tabs, + activeTabId + }; +}; + +interface NewPadApiResponse { + id: string; + display_name: string; + owner_id: string; + sharing_policy: SharingPolicy; + created_at: string; + updated_at: string; +} + +const createNewPad = async (): Promise => { + const response = await fetch('/api/pad/new', { + method: 'POST', + }); + if (!response.ok) { + let errorMessage = 'Failed to create new pad'; + try { + const errorData = await response.json(); + if (errorData && errorData.detail) { + errorMessage = errorData.detail; + } else if (errorData && errorData.message) { + errorMessage = errorData.message; + } + } catch (e) { + // Ignore if error response is not JSON or empty + } + throw new Error(errorMessage); + } + const newPadResponse: NewPadApiResponse = await response.json(); + return { + id: newPadResponse.id, + title: newPadResponse.display_name, + ownerId: newPadResponse.owner_id, + sharingPolicy: newPadResponse.sharing_policy as SharingPolicy, + createdAt: newPadResponse.created_at, + updatedAt: newPadResponse.updated_at, + }; +}; + +export const usePadTabs = (isAuthenticated?: boolean) => { + const queryClient = useQueryClient(); + const [selectedTabId, setSelectedTabId] = useState(''); + + const { data, isLoading, error, isError } = useQuery({ + queryKey: ['padTabs'], + queryFn: fetchUserPads, + enabled: isAuthenticated === true, + }); + + // Effect to manage tab selection based on data changes and selectedTabId validity + useEffect(() => { + if (isLoading || !data?.tabs) { + return; + } + + // If we don't have a selectedTabId yet, use the server's activeTabId + if (!selectedTabId && data.activeTabId) { + setSelectedTabId(data.activeTabId); + return; + } + + // Only set a tab if we don't have a valid selection + if (data.tabs.length > 0 && (!selectedTabId || !data.tabs.some(tab => tab.id === selectedTabId))) { + setSelectedTabId(data.tabs[0].id); + } else if (data.tabs.length === 0) { + setSelectedTabId(''); + } + }, [data, isLoading]); + + const createPadMutation = useMutation({ + mutationFn: createNewPad, + onMutate: async () => { + await queryClient.cancelQueries({ queryKey: ['padTabs'] }); + const previousTabsResponse = queryClient.getQueryData(['padTabs']); + + const tempTabId = `temp-${Date.now()}`; + const tempTab: Tab = { + id: tempTabId, + title: 'New pad', + ownerId: '', + sharingPolicy: SharingPolicy.PRIVATE, + createdAt: new Date().toISOString(), + updatedAt: new Date().toISOString(), + }; + + queryClient.setQueryData(['padTabs'], (old) => { + const newTabs = old ? 
[...old.tabs, tempTab] : [tempTab]; + return { + tabs: newTabs, + activeTabId: old?.activeTabId || tempTab.id, // Keep old active or use new if first + }; + }); + setSelectedTabId(tempTabId); + + return { previousTabsResponse, tempTabId }; + }, + onError: (err, variables, context) => { + if (context?.previousTabsResponse) { + queryClient.setQueryData(['padTabs'], context.previousTabsResponse); + } + // Revert selectedTabId if it was the temporary one + if (selectedTabId === context?.tempTabId && context?.previousTabsResponse?.activeTabId) { + setSelectedTabId(context.previousTabsResponse.activeTabId); + } else if (selectedTabId === context?.tempTabId && context?.previousTabsResponse?.tabs && context.previousTabsResponse.tabs.length > 0) { + setSelectedTabId(context.previousTabsResponse.tabs[0].id); + } else if (selectedTabId === context?.tempTabId) { + setSelectedTabId(''); + } + }, + onSuccess: (newlyCreatedTab, variables, context) => { + queryClient.setQueryData(['padTabs'], (old) => { + if (!old) return { tabs: [newlyCreatedTab], activeTabId: newlyCreatedTab.id }; + const newTabs = old.tabs.map(tab => + tab.id === context?.tempTabId ? newlyCreatedTab : tab + ); + if (!newTabs.find(tab => tab.id === newlyCreatedTab.id)) { + newTabs.push(newlyCreatedTab); + } + return { + tabs: newTabs, + activeTabId: old.activeTabId === context?.tempTabId ? newlyCreatedTab.id : old.activeTabId, + }; + }); + if (selectedTabId === context?.tempTabId) { + setSelectedTabId(newlyCreatedTab.id); + } + }, + onSettled: () => { + queryClient.invalidateQueries({ queryKey: ['padTabs'] }); + }, + }); + + const renamePadAPI = async ({ padId, newName }: { padId: string, newName: string }): Promise => { + const response = await fetch(`/api/pad/${padId}/rename`, { + method: 'PUT', + headers: { + 'Content-Type': 'application/json', + }, + body: JSON.stringify({ display_name: newName }), + }); + if (!response.ok) { + throw new Error('Failed to rename pad'); + } + }; + + const renamePadMutation = useMutation({ + mutationFn: renamePadAPI, + onMutate: async ({ padId, newName }) => { + await queryClient.cancelQueries({ queryKey: ['padTabs'] }); + const previousTabsResponse = queryClient.getQueryData(['padTabs']); + let oldName: string | undefined; + + queryClient.setQueryData(['padTabs'], (old) => { + if (!old) return undefined; + const newTabs = old.tabs.map(tab => { + if (tab.id === padId) { + oldName = tab.title; + return { ...tab, title: newName, updatedAt: new Date().toISOString() }; + } + return tab; + }); + return { ...old, tabs: newTabs }; + }); + return { previousTabsResponse, padId, oldName }; + }, + onError: (err, variables, context) => { + if (context?.previousTabsResponse) { + queryClient.setQueryData(['padTabs'], context.previousTabsResponse); + } + }, + onSettled: (data, error, variables, context) => { + queryClient.invalidateQueries({ queryKey: ['padTabs'] }); + }, + }); + + const deletePadAPI = async (padId: string): Promise => { + const response = await fetch(`/api/pad/${padId}`, { + method: 'DELETE', + }); + if (!response.ok) { + throw new Error('Failed to delete pad'); + } + }; + + const deletePadMutation = useMutation({ + mutationFn: deletePadAPI, // padId is the variable passed to mutate + onMutate: async (padIdToDelete) => { + await queryClient.cancelQueries({ queryKey: ['padTabs'] }); + const previousTabsResponse = queryClient.getQueryData(['padTabs']); + const previousSelectedTabId = selectedTabId; + let deletedTab: Tab | undefined; + + queryClient.setQueryData(['padTabs'], (old) => { + if (!old) 
return { tabs: [], activeTabId: '' }; + deletedTab = old.tabs.find(tab => tab.id === padIdToDelete); + const newTabs = old.tabs.filter(tab => tab.id !== padIdToDelete); + + let newSelectedTabId = selectedTabId; + if (selectedTabId === padIdToDelete) { + if (newTabs.length > 0) { + const currentIndex = old.tabs.findIndex(tab => tab.id === padIdToDelete); + newSelectedTabId = newTabs[Math.max(0, currentIndex - 1)]?.id || newTabs[0]?.id; + } else { + newSelectedTabId = ''; + } + setSelectedTabId(newSelectedTabId); + } + + return { + tabs: newTabs, + activeTabId: newSelectedTabId, + }; + }); + return { previousTabsResponse, previousSelectedTabId, deletedTab }; + }, + onError: (err, padId, context) => { + if (context?.previousTabsResponse) { + queryClient.setQueryData(['padTabs'], context.previousTabsResponse); + } + if (context?.previousSelectedTabId) { + setSelectedTabId(context.previousSelectedTabId); + } + }, + onSettled: () => { + queryClient.invalidateQueries({ queryKey: ['padTabs'] }); + }, + }); + + const updateSharingPolicyAPI = async ({ padId, policy }: { padId: string, policy: string }): Promise => { + const response = await fetch(`/api/pad/${padId}/sharing`, { + method: 'PUT', + headers: { + 'Content-Type': 'application/json', + }, + body: JSON.stringify({ policy }), + }); + if (!response.ok) { + throw new Error('Failed to update sharing policy'); + } + }; + + const updateSharingPolicyMutation = useMutation({ + mutationFn: updateSharingPolicyAPI, + onMutate: async ({ padId, policy }) => { + await queryClient.cancelQueries({ queryKey: ['padTabs'] }); + const previousTabsResponse = queryClient.getQueryData(['padTabs']); + + queryClient.setQueryData(['padTabs'], (old) => { + if (!old) return undefined; + const newTabs = old.tabs.map(tab => { + if (tab.id === padId) { + return { ...tab, updatedAt: new Date().toISOString() }; + } + return tab; + }); + return { ...old, tabs: newTabs }; + }); + return { previousTabsResponse }; + }, + onError: (err, variables, context) => { + if (context?.previousTabsResponse) { + queryClient.setQueryData(['padTabs'], context.previousTabsResponse); + } + }, + onSettled: () => { + queryClient.invalidateQueries({ queryKey: ['padTabs'] }); + }, + }); + + const leaveSharedPadAPI = async (padId: string): Promise => { + const response = await fetch(`/api/users/close/${padId}`, { + method: 'DELETE', + // Add headers if necessary, e.g., Authorization + }); + if (!response.ok) { + let errorMessage = 'Failed to leave shared pad.'; + try { + const errorData = await response.json(); + if (errorData && (errorData.detail || errorData.message)) { + errorMessage = errorData.detail || errorData.message; + } + } catch (e) { /* Ignore if response is not JSON */ } + throw new Error(errorMessage); + } + }; + + const leaveSharedPadMutation = useMutation({ + mutationFn: leaveSharedPadAPI, + onMutate: async (padIdToLeave) => { + await queryClient.cancelQueries({ queryKey: ['padTabs'] }); + const previousTabsResponse = queryClient.getQueryData(['padTabs']); + const previousSelectedTabId = selectedTabId; + + queryClient.setQueryData(['padTabs'], (old) => { + if (!old) return { tabs: [], activeTabId: '' }; + const newTabs = old.tabs.filter(tab => tab.id !== padIdToLeave); + + let newSelectedTabId = selectedTabId; + if (selectedTabId === padIdToLeave) { + if (newTabs.length > 0) { + const currentIndex = old.tabs.findIndex(tab => tab.id === padIdToLeave); + newSelectedTabId = newTabs[Math.max(0, currentIndex - 1)]?.id || newTabs[0]?.id; + } else { + newSelectedTabId = ''; + } + 
setSelectedTabId(newSelectedTabId); + } + + return { + tabs: newTabs, + activeTabId: newSelectedTabId, + }; + }); + return { previousTabsResponse, previousSelectedTabId, leftPadId: padIdToLeave }; + }, + onSuccess: (data, padId, context) => { + const tabLeft = context?.previousTabsResponse?.tabs.find(t => t.id === context.leftPadId); + if (typeof capture !== 'undefined') { + capture("pad_left", { padId: context.leftPadId, padName: tabLeft?.title || "" }); + } + }, + onError: (err, padId, context) => { + if (context?.previousTabsResponse) { + queryClient.setQueryData(['padTabs'], context.previousTabsResponse); + } + if (context?.previousSelectedTabId) { + setSelectedTabId(context.previousSelectedTabId); + } + alert(`Error leaving pad: ${err.message}`); + }, + onSettled: () => { + queryClient.invalidateQueries({ queryKey: ['padTabs'] }); + }, + }); + + const selectTab = async (tabId: string) => { + setSelectedTabId(tabId); + }; + + return { + tabs: data?.tabs ?? [], + selectedTabId: selectedTabId || data?.activeTabId || '', + isLoading, + error, + isError, + createNewPad: createPadMutation.mutate, // Standard mutate for fire-and-forget + createNewPadAsync: createPadMutation.mutateAsync, // For components needing the result + isCreating: createPadMutation.isPending, + renamePad: renamePadMutation.mutate, + isRenaming: renamePadMutation.isPending, + deletePad: deletePadMutation.mutate, + isDeleting: deletePadMutation.isPending, + leaveSharedPad: leaveSharedPadMutation.mutate, + isLeavingSharedPad: leaveSharedPadMutation.isPending, + updateSharingPolicy: updateSharingPolicyMutation.mutate, + isUpdatingSharingPolicy: updateSharingPolicyMutation.isPending, + selectTab + }; +}; diff --git a/src/frontend/src/hooks/useWorkspace.ts b/src/frontend/src/hooks/useWorkspace.ts new file mode 100644 index 0000000..a853aac --- /dev/null +++ b/src/frontend/src/hooks/useWorkspace.ts @@ -0,0 +1,134 @@ +import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query'; + +// Matches the Pydantic model in workspace_router.py +export interface WorkspaceState { + id: string; + state: string; // e.g., "pending", "starting", "running", "stopping", "stopped", "failed", "canceling", "canceled", "deleting", "deleted" + name: string; + username: string; + base_url: string; + agent: string; +} + +const WORKSPACE_QUERY_KEY = ['workspaceState']; + +// API function to fetch workspace state (https://pkg.go.dev/github.com/coder/coder/codersdk#WorkspaceStatus) +const fetchWorkspaceState = async (): Promise => { + try { + const response = await fetch('/api/workspace/state'); + + if (!response.ok) { + let errorMessage = `Failed to fetch workspace state. 
Status: ${response.status}`; + try { + const errorData = await response.json(); + if (errorData && errorData.detail) { + errorMessage = errorData.detail; + } + } catch (e) { + // Ignore if error response cannot be parsed + } + throw new Error(errorMessage); + } + + const jsonData = await response.json(); + + if (!jsonData || typeof jsonData.state !== 'string') { + throw new Error('Invalid data structure received for workspace state.'); + } + + return jsonData as WorkspaceState; + + } catch (error) { + throw error; + } +}; + +// API function to start the workspace +const callStartWorkspace = async (): Promise => { + const response = await fetch('/api/workspace/start', { + method: 'POST', + }); + if (!response.ok) { + let errorMessage = 'Failed to start workspace.'; + try { + const errorData = await response.json(); + if (errorData && errorData.detail) { + errorMessage = errorData.detail; + } + } catch (e) { + // Error response parsing failed + } + throw new Error(errorMessage); + } + return response.json(); +}; + +// API function to stop the workspace +const callStopWorkspace = async (): Promise => { + const response = await fetch('/api/workspace/stop', { + method: 'POST', + }); + if (!response.ok) { + let errorMessage = 'Failed to stop workspace.'; + try { + const errorData = await response.json(); + if (errorData && errorData.detail) { + errorMessage = errorData.detail; + } + } catch (e) { + // Error response parsing failed + } + throw new Error(errorMessage); + } + return response.json(); +}; + +export const useWorkspace = () => { + const queryClient = useQueryClient(); + + const { + data: workspaceState, + isLoading: isLoadingState, + error: stateError, + isError: isStateError, + refetch: refetchWorkspaceState, + } = useQuery({ + queryKey: WORKSPACE_QUERY_KEY, + queryFn: fetchWorkspaceState, + refetchInterval: 5000, // Poll every 5 seconds + }); + + const startMutation = useMutation({ + mutationFn: callStartWorkspace, + onSuccess: () => { + // Invalidate and refetch workspace state after starting + queryClient.invalidateQueries({ queryKey: WORKSPACE_QUERY_KEY }); + }, + }); + + const stopMutation = useMutation({ + mutationFn: callStopWorkspace, + onSuccess: () => { + // Invalidate and refetch workspace state after stopping + queryClient.invalidateQueries({ queryKey: WORKSPACE_QUERY_KEY }); + }, + }); + + return { + workspaceState, + isLoadingState, + stateError, + isStateError, + refetchWorkspaceState, + + startWorkspace: startMutation.mutate, + isStarting: startMutation.isPending, + startError: startMutation.error, + isStartError: startMutation.isError, + + stopWorkspace: stopMutation.mutate, + isStopping: stopMutation.isPending, + stopError: stopMutation.error, + isStopError: stopMutation.isError, + }; +}; diff --git a/src/frontend/src/icons/NewPadIcon.tsx b/src/frontend/src/icons/NewPadIcon.tsx new file mode 100644 index 0000000..51c561d --- /dev/null +++ b/src/frontend/src/icons/NewPadIcon.tsx @@ -0,0 +1,34 @@ +import React from "react"; + +interface NewPadIconProps { + className?: string; + width?: number; + height?: number; +} + +export const NewPadIcon: React.FC = ({ + className = "", + width = 20, + height = 20, +}) => { + return ( + + + + + + + ); +}; + +export default NewPadIcon; \ No newline at end of file diff --git a/src/frontend/src/icons/index.ts b/src/frontend/src/icons/index.ts index 05d4ba1..9f70ef4 100644 --- a/src/frontend/src/icons/index.ts +++ b/src/frontend/src/icons/index.ts @@ -1,3 +1,4 @@ export { default as GoogleIcon } from './GoogleIcon'; export { default as 
GithubIcon } from './GithubIcon'; export { default as DiscordIcon } from './DiscordIcon'; +export { default as NewPadIcon } from './NewPadIcon'; diff --git a/src/frontend/src/lib/authRefreshManager.ts b/src/frontend/src/lib/authRefreshManager.ts new file mode 100644 index 0000000..d3a1a2e --- /dev/null +++ b/src/frontend/src/lib/authRefreshManager.ts @@ -0,0 +1,86 @@ +// Auth refresh singleton manager +interface UserInfo { + username?: string; + email?: string; + name?: string; +} + +interface AuthStatusResponse { + authenticated: boolean; + user?: UserInfo; + expires_in?: number; + message?: string; +} + +// Refresh API function +const refreshAuth = async (): Promise => { + const response = await fetch('/api/auth/refresh', { + method: 'POST', + credentials: 'include', + }); + if (!response.ok) { + throw new Error('Failed to refresh session'); + } + return response.json(); +}; + +// Singleton state +let refreshTimer: NodeJS.Timeout | null = null; +let isRefreshScheduled = false; + +/** + * Schedule a token refresh operation. + * Only one refresh will be scheduled at a time across the application. + */ +export const scheduleTokenRefresh = ( + expiresIn: number, + onRefresh: (data: AuthStatusResponse) => void, + onError: (err: Error) => void +): void => { + // Don't schedule if already scheduled + if (isRefreshScheduled) { + return; + } + + const msUntilExpiry = expiresIn * 1000; + const refreshTime = msUntilExpiry - (5 * 60 * 1000); // 5 minutes before expiry + + // Don't schedule if token expires too soon + if (refreshTime <= 0) { + return; + } + + isRefreshScheduled = true; + + // Clear any existing timer first + if (refreshTimer) { + clearTimeout(refreshTimer); + } + + // Set up new timer + refreshTimer = setTimeout(async () => { + try { + const refreshData = await refreshAuth(); + onRefresh(refreshData); + isRefreshScheduled = false; + } catch (err) { + console.error('[pad.ws] Auth refresh failed:', err); + onError(err instanceof Error ? err : new Error(String(err))); + isRefreshScheduled = false; + } + }, refreshTime); +}; + +/** + * Cancel any scheduled token refresh + */ +export const cancelTokenRefresh = (): void => { + if (refreshTimer) { + clearTimeout(refreshTimer); + refreshTimer = null; + } + isRefreshScheduled = false; +}; + +// Export auth status query key for consistency +export const AUTH_STATUS_KEY = 'authStatus'; \ No newline at end of file diff --git a/src/frontend/src/lib/canvas.ts b/src/frontend/src/lib/canvas.ts new file mode 100644 index 0000000..3c25701 --- /dev/null +++ b/src/frontend/src/lib/canvas.ts @@ -0,0 +1,42 @@ +import { DEFAULT_SETTINGS } from '../types/settings'; + +/** + * + * @param data The canvas data to normalize + * @returns Normalized canvas data + */ +export function normalizeCanvasData(data: any) { + if (!data) return data; + + const appState = { ...data.appState }; + + // Remove width and height properties + if ("width" in appState) { + delete appState.width; + } + if ("height" in appState) { + delete appState.height; + } + + // Preserve existing pad settings if they exist, otherwise create new ones + const existingPad = appState.pad || {}; + const existingUserSettings = existingPad.userSettings || {}; + + // Merge existing pad properties with our updates + appState.pad = { + ...existingPad, // Preserve all existing properties (uniqueId, displayName, etc.) 
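            // Editor's note: spread order below is load-bearing — any value already present in
            // existingUserSettings overrides the matching DEFAULT_SETTINGS entry, so saved user
            // preferences survive normalization.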
+ // Merge existing user settings with default settings + userSettings: { + ...DEFAULT_SETTINGS, + ...existingUserSettings + } + }; + + // Reset collaborators (https://github.com/excalidraw/excalidraw/issues/8637) + appState.collaborators = new Map(); + + // Support new appState key default value (https://github.com/excalidraw/excalidraw/commit/a30e1b25c60a9c5c6f049daada0443df874a5266#diff-b7eb4d88c1bc5b4756a01281478e2105db6502e96c2a4b855496c508cef05397L124-R124) + appState.searchMatches = null; + + return { ...data, appState }; +} \ No newline at end of file diff --git a/src/frontend/src/lib/collab/Collab.tsx b/src/frontend/src/lib/collab/Collab.tsx new file mode 100644 index 0000000..f1e66b7 --- /dev/null +++ b/src/frontend/src/lib/collab/Collab.tsx @@ -0,0 +1,579 @@ +import React, { PureComponent } from 'react'; +import type { ExcalidrawImperativeAPI, AppState, SocketId, Collaborator as ExcalidrawCollaboratorType } from '@atyrode/excalidraw/types'; +import type { ExcalidrawElement as ExcalidrawElementType } from '@atyrode/excalidraw/element/types'; +import { + viewportCoordsToSceneCoords, + getSceneVersion, + reconcileElements, + restoreElements +} from '@atyrode/excalidraw'; +import throttle from 'lodash.throttle'; +import isEqual from 'lodash.isequal'; + +import Portal from './Portal'; +import type { WebSocketMessage, ConnectionStatus } from './Portal'; +import type { UserInfo } from '../../hooks/useAuthStatus'; +import { debounce, type DebouncedFunction } from '../debounce'; +import { POINTER_MOVE_THROTTLE_MS, ENABLE_PERIODIC_FULL_SYNC, PERIODIC_FULL_SYNC_INTERVAL_MS } from '../../constants'; + +interface PointerData { + x: number; + y: number; + tool: 'laser' | 'pointer'; + button?: 'up' | 'down'; +} + +export interface Collaborator { + id: SocketId; + pointer?: PointerData; + button?: 'up' | 'down'; + selectedElementIds?: AppState['selectedElementIds']; + username?: string; + userState?: 'active' | 'away' | 'idle'; + color?: { background: string; stroke: string }; + avatarUrl?: string; +} + +const getRandomCollaboratorColor = () => { + const colors = [ + { background: "#5C2323", stroke: "#FF6B6B" }, { background: "#1E4620", stroke: "#6BCB77" }, + { background: "#1A3A5F", stroke: "#4F9CF9" }, { background: "#5F4D1C", stroke: "#FFC83D" }, + { background: "#3A1E5C", stroke: "#C56CF0" }, { background: "#5F3A1C", stroke: "#FF9F43" }, + { background: "#1E4647", stroke: "#5ECED4" }, { background: "#4E1A3A", stroke: "#F368BC" }, + ]; + return colors[Math.floor(Math.random() * colors.length)]; +}; + +interface CollabProps { + excalidrawAPI: ExcalidrawImperativeAPI | null; + user: UserInfo | null; + isOnline: boolean; + isLoadingAuth: boolean; + padId: string | null; +} + +interface CollabState { + errorMessage: string | null; + connectionStatus: ConnectionStatus; + username: string; + collaborators: Map; + lastProcessedSceneVersion: number; +} + +class Collab extends PureComponent { + [x: string]: any; + readonly state: CollabState; + private portal: Portal; + private debouncedBroadcastAppState: DebouncedFunction<[AppState]>; + private lastSentAppState: AppState | null = null; + + private throttledOnPointerMove: any; + private unsubExcalidrawPointerDown: (() => void) | null = null; + private unsubExcalidrawPointerUp: (() => void) | null = null; + private unsubExcalidrawSceneChange: (() => void) | null = null; + private lastBroadcastedSceneVersion: number = -1; + private isInitialLoad: boolean = true; + + props: any; + + constructor(props: CollabProps) { + super(props); + this.state = { + 
errorMessage: null, + connectionStatus: 'Uninstantiated', + username: props.user?.username || props.user?.id || '', + collaborators: new Map(), + lastProcessedSceneVersion: -1, + }; + + this.portal = new Portal( + this, + props.padId, + props.user, + props.isOnline, // Passing isOnline as isAuthenticated + props.isLoadingAuth, + this.handlePortalStatusChange, + this.handlePortalMessage + ); + + this.throttledOnPointerMove = throttle((event: PointerEvent) => { + this.handlePointerMove(event); + }, POINTER_MOVE_THROTTLE_MS); + + this.debouncedBroadcastAppState = debounce((appState: AppState) => { + if (this.portal.isOpen() && this.props.isOnline) { + if (!this.lastSentAppState || !isEqual(this.lastSentAppState, appState)) { + if (this.lastSentAppState) { + const changes = this.detectAppStateChanges(this.lastSentAppState, appState); + if (Object.keys(changes).length > 0) { + console.debug('[pad.ws] AppState changes detected:', changes); + this.portal.broadcastAppStateUpdate(appState); + this.lastSentAppState = { ...appState }; + } + } else { + this.portal.broadcastAppStateUpdate(appState); + this.lastSentAppState = { ...appState }; + } + } + } + }, 500); + } + + /* AppState Change Detection */ + + private detectAppStateChanges = (oldState: AppState, newState: AppState): Record => { + const changes: Record = {}; + + // Get all unique keys from both old and new state + const allKeys = new Set([ + ...Object.keys(oldState), + ...Object.keys(newState) + ]); + + // Compare each field dynamically, but exclude collaborators field + allKeys.forEach(field => { + if (field === 'collaborators') return; + + const oldValue = oldState[field as keyof AppState]; + const newValue = newState[field as keyof AppState]; + + if (this.hasChanged(oldValue, newValue)) { + changes[field] = { + old: this.serializeValue(oldValue), + new: this.serializeValue(newValue) + }; + } + }); + + return changes; + }; + + private hasChanged = (oldValue: any, newValue: any): boolean => { + // Handle Maps (like selectedElementIds) + if (oldValue instanceof Map && newValue instanceof Map) { + if (oldValue.size !== newValue.size) return true; + const oldEntries = Array.from(oldValue.entries()); + for (const [key, value] of oldEntries) { + if (!newValue.has(key) || newValue.get(key) !== value) return true; + } + return false; + } + + // Handle objects with zoom value + if (typeof oldValue === 'object' && typeof newValue === 'object' && oldValue !== null && newValue !== null) { + // Special handling for zoom object + if ('value' in oldValue && 'value' in newValue) { + return oldValue.value !== newValue.value; + } + return JSON.stringify(oldValue) !== JSON.stringify(newValue); + } + + // Handle primitives + return oldValue !== newValue; + }; + + private serializeValue = (value: any): any => { + if (value instanceof Map) { + return Object.fromEntries(value); + } + if (typeof value === 'object' && value !== null) { + return { ...value }; + } + return value; + }; + + /* Component Lifecycle */ + + componentDidMount() { + if (this.portal) { + this.portal.initiate(); + } + + if (this.props.user) { + this.updateUsername(this.props.user); + } + this.updateExcalidrawCollaborators(); // Initial update for collaborators + this.addPointerEventListeners(); + this.addSceneChangeListeners(); + + // Initialize lastBroadcastedSceneVersion + if (this.props.excalidrawAPI) { + const initialElements = this.props.excalidrawAPI.getSceneElementsIncludingDeleted(); + // Set initial broadcast version. 
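      // Editor's note: getSceneVersion derives a single number from the elements' own version
      // counters, so subsequent local edits normally produce a strictly larger value to compare against.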
+ this.lastBroadcastedSceneVersion = getSceneVersion(initialElements); + // Also set the initial processed version from local state + this.setState({ lastProcessedSceneVersion: this.lastBroadcastedSceneVersion }); + } + if (this.props.isOnline && this.props.padId) { + // Potentially call a method to broadcast initial scene if this client is the first or needs to sync + // this.broadcastFullSceneUpdate(true); // Example: true for SCENE_INIT + } + + // Start periodic full sync if enabled + if (ENABLE_PERIODIC_FULL_SYNC) { + this.startPeriodicFullSync(); + } + + this.isInitialLoad = false; + } + + componentDidUpdate(prevProps: CollabProps, prevState: CollabState) { + if ( + this.props.user !== prevProps.user || + this.props.isOnline !== prevProps.isOnline || + this.props.isLoadingAuth !== prevProps.isLoadingAuth + ) { + this.updateUsername(this.props.user); // Update username if user object changed + this.portal.updateAuthInfo(this.props.user, this.props.isOnline, this.props.isLoadingAuth); + } + + if (this.props.padId !== prevProps.padId) { + // Portal's updatePadId will handle disconnection from old and connection to new + this.debouncedBroadcastAppState.cancel(); // Cancel any pending app state updates for the old pad + this.lastSentAppState = null; // Reset last sent app state for the new pad + + // Reset versions immediately when switching pads + this.lastBroadcastedSceneVersion = -1; + // Mark as initial load for the new pad + this.isInitialLoad = true; + + this.portal.updatePadId(this.props.padId); + this.setState({ + collaborators: new Map(), + lastProcessedSceneVersion: -1, + username: this.props.user?.username || this.props.user?.id || '', + // connectionStatus will be updated by portal's callbacks + }); + } + + if (this.state.collaborators !== prevState.collaborators) { + if (this.updateExcalidrawCollaborators) this.updateExcalidrawCollaborators(); + } + } + + componentWillUnmount() { + this.portal.closePortal(); // Changed from close() + this.removePointerEventListeners(); + if (this.throttledOnPointerMove && typeof this.throttledOnPointerMove.cancel === 'function') { + this.throttledOnPointerMove.cancel(); + } + this.debouncedBroadcastAppState.cancel(); + this.removeSceneChangeListeners(); + this.stopPeriodicFullSync(); // Stop periodic sync on unmount + } + + /* Periodic Full Sync */ + + private periodicFullSyncIntervalId: ReturnType | null = null; + + private startPeriodicFullSync = () => { + if (this.periodicFullSyncIntervalId !== null) { + // Already running + return; + } + this.periodicFullSyncIntervalId = setInterval(() => { + if (this.props.excalidrawAPI && this.portal.isOpen() && this.props.isOnline) { + console.debug('[pad.ws] Performing periodic full scene sync.'); + const allCurrentElements = this.props.excalidrawAPI.getSceneElementsIncludingDeleted(); + this.portal.broadcastSceneUpdate('SCENE_UPDATE', allCurrentElements, true); + this.lastBroadcastedSceneVersion = getSceneVersion(allCurrentElements); + } + }, PERIODIC_FULL_SYNC_INTERVAL_MS); + }; + + private stopPeriodicFullSync = () => { + if (this.periodicFullSyncIntervalId !== null) { + clearInterval(this.periodicFullSyncIntervalId); + this.periodicFullSyncIntervalId = null; + console.debug('[pad.ws] Stopped periodic full scene sync.'); + } + }; + + + /* Pointer */ + + private addPointerEventListeners = () => { + if (!this.props.excalidrawAPI) return; + document.addEventListener('pointermove', this.throttledOnPointerMove); + this.unsubExcalidrawPointerDown = this.props.excalidrawAPI.onPointerDown( + (_activeTool, 
_pointerDownState, event) => this.handlePointerInteraction('down', event) + ); + this.unsubExcalidrawPointerUp = this.props.excalidrawAPI.onPointerUp( + (_activeTool, _pointerUpState, event) => this.handlePointerInteraction('up', event) + ); + }; + + private removePointerEventListeners = () => { + document.removeEventListener('pointermove', this.throttledOnPointerMove); + if (this.unsubExcalidrawPointerDown) this.unsubExcalidrawPointerDown(); + if (this.unsubExcalidrawPointerUp) this.unsubExcalidrawPointerUp(); + this.unsubExcalidrawPointerDown = null; + this.unsubExcalidrawPointerUp = null; + }; + + private handlePointerInteraction = (button: 'down' | 'up', event: MouseEvent | PointerEvent) => { + if (!this.props.excalidrawAPI || !this.portal.isOpen() || !this.props.isOnline) return; + const appState = this.props.excalidrawAPI.getAppState(); + const sceneCoords = viewportCoordsToSceneCoords({ clientX: event.clientX, clientY: event.clientY }, appState); + const currentTool = appState.activeTool.type; + const displayTool: 'laser' | 'pointer' = currentTool === 'laser' ? 'laser' : 'pointer'; + const pointerData: PointerData = { x: sceneCoords.x, y: sceneCoords.y, tool: displayTool, button: button }; + this.portal.broadcastMouseLocation(pointerData, button); + }; + + private handlePointerMove = (event: PointerEvent) => { + if (!this.props.excalidrawAPI || !this.portal.isOpen() || !this.props.isOnline) return; + const appState = this.props.excalidrawAPI.getAppState(); + const sceneCoords = viewportCoordsToSceneCoords({ clientX: event.clientX, clientY: event.clientY }, appState); + const currentTool = appState.activeTool.type; + const displayTool: 'laser' | 'pointer' = currentTool === 'laser' ? 'laser' : 'pointer'; + const pointerData: PointerData = { x: sceneCoords.x, y: sceneCoords.y, tool: displayTool }; + this.portal.broadcastMouseLocation(pointerData, appState.cursorButton || 'up'); + }; + + /* Scene */ + + private addSceneChangeListeners = () => { + if (!this.props.excalidrawAPI) return; + // The onChange callback from Excalidraw provides elements and appState, + // but we'll fetch the latest scene directly to ensure we have deleted elements for versioning. 
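    // Editor's note: deleted elements keep their version counters, so including them means a
    // deletion still bumps the scene version and is picked up by the broadcast logic below.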
+ this.unsubExcalidrawSceneChange = this.props.excalidrawAPI.onChange( + (_elements, appState, _files) => { + this.handleSceneChange(appState); + } + ); + }; + + private removeSceneChangeListeners = () => { + if (this.unsubExcalidrawSceneChange) { + this.unsubExcalidrawSceneChange(); + this.unsubExcalidrawSceneChange = null; + } + }; + + private handleSceneChange = (currentAppState: AppState) => { + if (!this.props.excalidrawAPI || !this.portal.isOpen() || !this.props.isOnline) { + return; + } + + // Broadcast AppState update + if (currentAppState) { + this.debouncedBroadcastAppState(currentAppState); + } + + // Broadcast Scene (elements) update + const allCurrentElements = this.props.excalidrawAPI.getSceneElementsIncludingDeleted(); + const currentSceneVersion = getSceneVersion(allCurrentElements); + + if (allCurrentElements.length === 0 && this.isInitialLoad) { + // No elements in the scene is the temporary scene + return; + } + + // Handle version initialization (either fresh pad or loading from backend) + if (this.lastBroadcastedSceneVersion === -1 || this.state.lastProcessedSceneVersion === -1) { + // Initialize versions + this.lastBroadcastedSceneVersion = currentSceneVersion; + this.setState({ lastProcessedSceneVersion: currentSceneVersion }); + + // Only broadcast if this is NOT an initial load (i.e., it's a genuinely fresh pad with user changes) + // and there are elements to broadcast + if (!this.isInitialLoad && allCurrentElements.length > 0) { + this.portal.broadcastSceneUpdate('SCENE_UPDATE', allCurrentElements, false); + } + + // Mark initial load as complete if it was an initial load + if (this.isInitialLoad) { + this.isInitialLoad = false; + } + + return; + } + + // Avoid broadcasting if the scene version hasn't actually increased from what this client last broadcasted + // and isn't newer than what this client last processed from a remote update (prevents echo). + if (currentSceneVersion > this.lastBroadcastedSceneVersion && currentSceneVersion > this.state.lastProcessedSceneVersion) { + // Send only changed elements (syncAll: false) + this.portal.broadcastSceneUpdate('SCENE_UPDATE', allCurrentElements, false); + this.lastBroadcastedSceneVersion = currentSceneVersion; + } + // Note: If currentSceneVersion <= this.lastBroadcastedSceneVersion but > this.state.lastProcessedSceneVersion, + // it might indicate an undo/redo or a local change that didn't increment element versions. + // The current logic avoids broadcasting in this specific case to prevent potential loops, + // relying on the periodic full sync to eventually correct any minor inconsistencies. 
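    // Editor's note (hedged sketch, not part of the patch): the broadcast guard above can be read
    // as a small pure predicate. shouldBroadcastScene is an illustrative name only; the real check
    // lives inline in handleSceneChange.
    //
    //   const shouldBroadcastScene = (
    //     currentVersion: number,
    //     lastBroadcastedVersion: number,
    //     lastProcessedRemoteVersion: number,
    //   ): boolean =>
    //     // send only when the scene moved past what this client last sent AND past what it last
    //     // applied from a remote peer — the second condition is what prevents echoing remote updates
    //     currentVersion > lastBroadcastedVersion && currentVersion > lastProcessedRemoteVersion;
    //
    //   // e.g. shouldBroadcastScene(12, 10, 11) === true, shouldBroadcastScene(11, 10, 11) === false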
+ }; + + /* Collaborators */ + + private updateUsername = (user: UserInfo | null) => { + const newUsername = user?.username || user?.id || ""; + if (this.state.username !== newUsername) { + this.setState({ username: newUsername }); + } + }; + + private updateExcalidrawCollaborators = () => { + if (!this.props.excalidrawAPI) return; + const excalidrawCollaborators = new Map(); + if (this.props.isOnline) { + this.state.collaborators.forEach((collab, id) => { + if (this.props.user && this.props.user.id === collab.id) return; + + excalidrawCollaborators.set(id, { + id: collab.id, + pointer: collab.pointer, + username: collab.username, + button: collab.button, + selectedElementIds: + collab.selectedElementIds, + color: collab.color, + avatarUrl: collab.avatarUrl, + }); + }); + } + this.props.excalidrawAPI.updateScene({ collaborators: excalidrawCollaborators }); + }; + + /* Portal & Core logic */ + + handlePortalStatusChange = (status: ConnectionStatus, message?: string) => { + this.setState({ connectionStatus: status }); + // Potentially update UI or take actions based on status + if (status === 'Failed' || (status === 'Closed' && !this.portal.isOpen())) { + // Clear collaborators if connection is definitively lost + this.setState({ collaborators: new Map() }, () => { + if (this.updateExcalidrawCollaborators) this.updateExcalidrawCollaborators(); + }); + } + }; + + public handlePortalMessage = (message: WebSocketMessage) => { + const { type, connection_id, user_id, data: messageData } = message; + const senderIdString = connection_id || user_id; + + if (this.props.user?.id && senderIdString === this.props.user.id) return; + if (!senderIdString) return; + const senderId = senderIdString as SocketId; + + switch (type) { + case 'user_joined': { + const username = messageData?.username || senderIdString; + console.debug(`[pad.ws] User joined: ${username}`); + this.setState(prevState => { + if (prevState.collaborators.has(senderId) || (this.props.user?.id && senderIdString === this.props.user.id)) return null; + const newCollaborator: Collaborator = { + id: user_id as SocketId, + username: username, + pointer: { x: 0, y: 0, tool: 'pointer' }, + color: getRandomCollaboratorColor(), + userState: 'active', + }; + const newCollaborators = new Map(prevState.collaborators); + newCollaborators.set(user_id as SocketId, newCollaborator); + return { collaborators: newCollaborators }; + }); + break; + } + case 'user_left': { + console.debug(`[pad.ws] User left: ${user_id}`); + this.setState(prevState => { + if (!prevState.collaborators.has(user_id as SocketId) || (this.props.user?.id && user_id === this.props.user.id)) return null; + const newCollaborators = new Map(prevState.collaborators); + newCollaborators.delete(user_id as SocketId); + return { collaborators: newCollaborators }; + }); + break; + } + case 'pointer_update': { + if (!messageData?.pointer) return; + const pointerDataIn = messageData.pointer as PointerData; + if (messageData.button) pointerDataIn.button = messageData.button; + this.setState(prevState => { + const newCollaborators = new Map(prevState.collaborators); + const existing = newCollaborators.get(user_id); + const updatedCollaborator: Collaborator = { + ...(existing as Collaborator), + pointer: pointerDataIn, + button: pointerDataIn.button + }; + newCollaborators.set(user_id, updatedCollaborator); + return { collaborators: newCollaborators }; + }); + break; + } + case 'scene_update': { + const remoteElements = messageData?.elements as ExcalidrawElementType[] | undefined; + + if 
(remoteElements !== undefined && this.props.excalidrawAPI) { + console.debug(`[pad.ws] Received scene update. Elements count: ${remoteElements.length}`, remoteElements); + const localElements = this.props.excalidrawAPI.getSceneElementsIncludingDeleted(); + const currentAppState = this.props.excalidrawAPI.getAppState(); + + // Ensure elements are properly restored (e.g., if they are plain objects from JSON) + const restoredRemoteElements = restoreElements(remoteElements, null); + + const reconciled = reconcileElements( + localElements, + restoredRemoteElements as any[], // Cast as any if type conflicts, ensure it matches Excalidraw's expected RemoteExcalidrawElement[] + currentAppState + ); + + this.props.excalidrawAPI.updateScene({ elements: reconciled as ExcalidrawElementType[], commitToHistory: false }); + + const reconciledVersion = getSceneVersion(reconciled); + this.setState({ lastProcessedSceneVersion: reconciledVersion }); + + // If this is a fresh pad (lastBroadcastedSceneVersion is -1), initialize it + if (this.lastBroadcastedSceneVersion === -1) { + console.debug('[pad.ws] Initializing lastBroadcastedSceneVersion from remote scene update'); + this.lastBroadcastedSceneVersion = reconciledVersion; + // Mark initial load as complete since we received remote data + this.isInitialLoad = false; + } + } + break; + } + case 'connected': { + const collaboratorsList = messageData?.collaboratorsList as any | undefined; + + if (collaboratorsList && Array.isArray(collaboratorsList)) { + console.debug(`[pad.ws] Received 'connected' message with ${collaboratorsList.length} collaborators.`); + this.setState(prevState => { + const newCollaborators = new Map(); + collaboratorsList.forEach(collabData => { + + console.debug(`[pad.ws] Collaborator data: ${JSON.stringify(collabData)}`); + if (collabData.user_id && collabData.user_id !== this.props.user?.id) { + + const newCollaborator: Collaborator = { + id: collabData.user_id as SocketId, + username: collabData.username, + pointer: collabData.pointer || { x: 0, y: 0, tool: 'pointer' }, + button: collabData.button || 'up', + selectedElementIds: collabData.selectedElementIds || {}, + userState: collabData.userState || 'active', + color: collabData.color || getRandomCollaboratorColor(), + avatarUrl: collabData.avatarUrl || '', + }; + newCollaborators.set(collabData.user_id as SocketId, newCollaborator); + } + }); + + return { collaborators: newCollaborators }; + }); + } else { + console.warn("[pad.ws] 'connected' message received without valid collaboratorsList.", messageData); + } + break; + } + default: + console.warn(`Unknown message type received: ${type}`, messageData); + } + }; + + render() { + return null; + } +} + +export default Collab; diff --git a/src/frontend/src/lib/collab/Portal.tsx b/src/frontend/src/lib/collab/Portal.tsx new file mode 100644 index 0000000..7f0f715 --- /dev/null +++ b/src/frontend/src/lib/collab/Portal.tsx @@ -0,0 +1,374 @@ +import { z } from 'zod'; +import type Collab from './Collab'; +import type { OrderedExcalidrawElement } from '@atyrode/excalidraw/element/types'; +import type { AppState } from '@atyrode/excalidraw/types'; +import type { UserInfo } from '../../hooks/useAuthStatus'; // For user details + +export const WebSocketMessageSchema = z.object({ + type: z.string(), + pad_id: z.string().nullable(), + timestamp: z.string().datetime({ offset: true, message: "Invalid timestamp format" }), + user_id: z.string().optional(), + connection_id: z.string().optional(), + data: z.any().optional(), +}); +export type 
WebSocketMessage = z.infer; + +export type ConnectionStatus = 'Uninstantiated' | 'Connecting' | 'Open' | 'Closing' | 'Closed' | 'Reconnecting' | 'Failed'; + +const MAX_RECONNECT_ATTEMPTS = 5; +const INITIAL_RECONNECT_DELAY = 1000; // 1 second + +class Portal { + private collab: Collab; + private socket: WebSocket | null = null; + private roomId: string | null = null; // This will be the padId + + // Auth and connection state + private user: UserInfo | null = null; + private isAuthenticated: boolean = false; + private isLoadingAuth: boolean = true; // Start with true until first auth update + + private reconnectAttemptCount: number = 0; + private isPermanentlyDisconnected: boolean = false; + private reconnectTimeoutId: ReturnType | null = null; + private currentConnectionStatus: ConnectionStatus = 'Uninstantiated'; + + // Callback for Collab to react to status changes + private onStatusChange: ((status: ConnectionStatus, message?: string) => void) | null = null; + private onMessage: ((message: WebSocketMessage) => void) | null = null; + + + broadcastedElementVersions: Map = new Map(); + + constructor( + collab: Collab, + padId: string | null, + user: UserInfo | null, + isAuthenticated: boolean, + isLoadingAuth: boolean, + onStatusChange?: (status: ConnectionStatus, message?: string) => void, + onMessage?: (message: WebSocketMessage) => void, + ) { + this.collab = collab; + this.roomId = padId; + this.user = user; + this.isAuthenticated = isAuthenticated; + this.isLoadingAuth = isLoadingAuth; + if (onStatusChange) this.onStatusChange = onStatusChange; + if (onMessage) this.onMessage = onMessage; } + + public initiate(): void { + if (!this.onStatusChange && (this.roomId || this.currentConnectionStatus !== 'Uninstantiated')) { + console.warn("[Portal] initiate called before onStatusChange callback was set, or logic error."); + } + + if (this.roomId) { + this.connect(); + } else { + // If Collab's initial state.connectionStatus is 'Uninstantiated' + // and Portal's this.currentConnectionStatus is also 'Uninstantiated' (its default), + // this call to _updateStatus will not trigger onStatusChange due to the + // "if (this.currentConnectionStatus !== status)" check within _updateStatus. + // This is safe and ensures status consistency if padId is null. + this._updateStatus('Uninstantiated'); + } + } + + private _updateStatus(status: ConnectionStatus, message?: string) { + if (this.currentConnectionStatus !== status) { + this.currentConnectionStatus = status; + console.debug(`[pad.ws] Status changed to: ${status}${message ? ` (${message})` : ''}`); + if (this.onStatusChange) { + this.onStatusChange(status, message); + } + } + } + + public getStatus(): ConnectionStatus { + return this.currentConnectionStatus; + } + + private getSocketUrl(): string | null { + if (!this.roomId || this.roomId.startsWith('temp-')) { + return null; + } + const protocol = window.location.protocol === 'https:' ? 
'wss:' : 'ws:'; + return `${protocol}//${window.location.host}/ws/pad/${this.roomId}`; + } + + private shouldBeConnected(): boolean { + return !!this.getSocketUrl() && this.isAuthenticated && !this.isLoadingAuth && !this.isPermanentlyDisconnected; + } + + public connect(): void { + if (this.socket && this.socket.readyState === WebSocket.OPEN) { + console.debug('[pad.ws] Already connected.'); + return; + } + if (this.socket && this.socket.readyState === WebSocket.CONNECTING) { + console.debug('[pad.ws] Already connecting.'); + return; + } + + if (this.reconnectTimeoutId) { + clearTimeout(this.reconnectTimeoutId); + this.reconnectTimeoutId = null; + } + + if (!this.shouldBeConnected()) { + console.debug('[pad.ws] Conditions not met for connection.'); + this._updateStatus(this.isPermanentlyDisconnected ? 'Failed' : 'Closed'); + return; + } + + const socketUrl = this.getSocketUrl(); + if (!socketUrl) { + console.error('[pad.ws] Cannot connect: Socket URL is invalid.'); + this._updateStatus('Failed', 'Invalid URL'); + return; + } + + this._updateStatus(this.reconnectAttemptCount > 0 ? 'Reconnecting' : 'Connecting'); + console.debug(`[pad.ws] Attempting to connect to: ${socketUrl}`); + this.socket = new WebSocket(socketUrl); + + this.socket.onopen = () => { + console.debug(`[pad.ws] Connection established for pad: ${this.roomId}`); + this.isPermanentlyDisconnected = false; + this.reconnectAttemptCount = 0; + if (this.reconnectTimeoutId) clearTimeout(this.reconnectTimeoutId); + this._updateStatus('Open'); + }; + + this.socket.onmessage = (event: MessageEvent) => { + try { + const parsedData = JSON.parse(event.data as string); + const validationResult = WebSocketMessageSchema.safeParse(parsedData); + if (validationResult.success) { + if (this.onMessage) { + this.onMessage(validationResult.data); + } else { + // Fallback to direct call if onMessage prop not set by Collab + this.collab.handlePortalMessage(validationResult.data); + } + } else { + console.error(`[pad.ws] Incoming message validation failed for pad ${this.roomId}:`, validationResult.error.issues); + console.error(`[pad.ws] Raw message: ${event.data}`); + } + } catch (error) { + console.error(`[pad.ws] Error parsing incoming JSON message for pad ${this.roomId}:`, error); + } + }; + + this.socket.onclose = (event: CloseEvent) => { + console.debug(`[pad.ws] Connection closed for pad: ${this.roomId}. Code: ${event.code}, Reason: '${event.reason}'`); + this.socket = null; // Clear the socket instance + + const isAbnormalClosure = event.code !== 1000 && event.code !== 1001; // 1000 = Normal, 1001 = Going Away + + if (this.isPermanentlyDisconnected) { + this._updateStatus('Failed', `Permanently disconnected. Code: ${event.code}`); + return; + } + + if (isAbnormalClosure && this.shouldBeConnected()) { + this.reconnectAttemptCount++; + if (this.reconnectAttemptCount > MAX_RECONNECT_ATTEMPTS) { + console.warn(`[pad.ws] Failed to reconnect to pad ${this.roomId} after ${this.reconnectAttemptCount -1} attempts. 
Stopping.`); + this.isPermanentlyDisconnected = true; + this._updateStatus('Failed', `Max reconnect attempts reached.`); + } else { + const delay = INITIAL_RECONNECT_DELAY * Math.pow(2, this.reconnectAttemptCount -1); + console.debug(`[pad.ws] Reconnecting attempt ${this.reconnectAttemptCount}/${MAX_RECONNECT_ATTEMPTS} in ${delay}ms for pad: ${this.roomId}`); + this._updateStatus('Reconnecting', `Attempt ${this.reconnectAttemptCount}`); + this.reconnectTimeoutId = setTimeout(() => this.connect(), delay); + } + } else { + this._updateStatus('Closed', `Code: ${event.code}`); + } + }; + + this.socket.onerror = (event: Event) => { + console.error(`[pad.ws] WebSocket error for pad: ${this.roomId}:`, event); + this._updateStatus('Failed', 'WebSocket error'); + }; + } + + public disconnect(): void { + console.debug(`[pad.ws] Disconnecting from pad: ${this.roomId}`); + this.isPermanentlyDisconnected = true; // Mark intent to disconnect this session + + if (this.reconnectTimeoutId) { + clearTimeout(this.reconnectTimeoutId); + this.reconnectTimeoutId = null; + } + + const socketToClose = this.socket; // Capture the current socket reference + this.socket = null; // Nullify the instance's main socket reference immediately + + if (socketToClose) { + // Detach all handlers from the old socket. + // This is crucial to prevent its onclose (and other) handlers from + // executing and potentially interfering with the state of the Portal instance, + // which is now focused on a new connection or a definitive closed state. + socketToClose.onopen = null; + socketToClose.onmessage = null; + socketToClose.onclose = null; // <--- Key change: prevent our generic onclose + socketToClose.onerror = null; + + // Only attempt to close if it's in a state that can be closed. + if (socketToClose.readyState === WebSocket.OPEN || socketToClose.readyState === WebSocket.CONNECTING) { + try { + socketToClose.close(1000, 'Client initiated disconnect'); + } catch (e) { + console.warn(`[pad.ws] Error while closing socket for pad ${this.roomId}:`, e); + } + } else { + console.debug(`[pad.ws] Socket for pad ${this.roomId} was not OPEN or CONNECTING. Current state: ${socketToClose.readyState}. No explicit close call needed.`); + } + } + + // This status update reflects the client's *action* to disconnect. + // The actual closure of the socket on the wire is handled by socketToClose.close(). + this._updateStatus('Closed', 'Client initiated disconnect'); + } + + public closePortal(): void { // Renamed from 'close' to avoid conflict with WebSocket.close + this.disconnect(); // For now, closing the portal means disconnecting. 
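    // Editor's note (hedged sketch, not part of the patch): with INITIAL_RECONNECT_DELAY = 1000
    // and MAX_RECONNECT_ATTEMPTS = 5, the exponential backoff in the onclose handler above retries
    // after 1s, 2s, 4s, 8s and 16s before the portal is marked permanently failed.
    //
    //   const reconnectDelaysMs = Array.from(
    //     { length: 5 },
    //     (_, i) => 1000 * Math.pow(2, i), // i is 0-based here; the class uses reconnectAttemptCount - 1
    //   );
    //   // reconnectDelaysMs -> [1000, 2000, 4000, 8000, 16000]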
+ this.roomId = null; + this.broadcastedElementVersions.clear(); + this.onStatusChange = null; + this.onMessage = null; + } + + public updatePadId(padId: string | null): void { + if (this.roomId === padId) return; + + this.disconnect(); // Disconnect from the old pad + this.roomId = padId; + this.isPermanentlyDisconnected = false; // Reset for new pad + this.reconnectAttemptCount = 0; + + if (this.roomId) { + this.connect(); + } else { + this._updateStatus('Uninstantiated'); + } + } + + public updateAuthInfo(user: UserInfo | null, isAuthenticated: boolean, isLoadingAuth: boolean): void { + const oldShouldBeConnected = this.shouldBeConnected(); + this.user = user; + this.isAuthenticated = isAuthenticated; + this.isLoadingAuth = isLoadingAuth; + const newShouldBeConnected = this.shouldBeConnected(); + + if (oldShouldBeConnected !== newShouldBeConnected) { + if (newShouldBeConnected) { + console.debug('[pad.ws] Auth state changed, attempting to connect/reconnect.'); + this.isPermanentlyDisconnected = false; // Allow reconnection attempts if auth is now valid + this.reconnectAttemptCount = 0; // Reset attempts + this.connect(); + } else { + console.debug('[pad.ws] Auth state changed, disconnecting.'); + this.disconnect(); // Disconnect if auth conditions no longer met + } + } + } + + public isOpen(): boolean { + return this.socket !== null && this.socket.readyState === WebSocket.OPEN; + } + + private sendJsonMessage(payload: WebSocketMessage): void { + if (!this.isOpen()) { + console.warn('[pad.ws] Cannot send message: WebSocket is not open.', payload.type); + return; + } + const validationResult = WebSocketMessageSchema.safeParse(payload); + if (!validationResult.success) { + console.error(`[pad.ws] Outgoing message validation failed for pad ${this.roomId}:`, validationResult.error.issues); + return; + } + this.socket?.send(JSON.stringify(payload)); + } + + public sendMessage(type: string, data?: any): void { + const messagePayload: WebSocketMessage = { + type, + pad_id: this.roomId, + timestamp: new Date().toISOString(), + user_id: this.user?.id, + data: data, + }; + if (messagePayload.type != 'pointer_update') { + console.debug(`[pad.ws] Sending message of type: ${messagePayload.type} for pad ${this.roomId}`, messagePayload); + } + this.sendJsonMessage(messagePayload); + } + + public broadcastMouseLocation = ( + pointerData: { x: number; y: number; tool: 'laser' | 'pointer' }, + button?: 'up' | 'down', + ) => { + const payload = { + pointer: pointerData, + button: button || 'up', + }; + this.sendMessage('pointer_update', payload); + }; + + public broadcastSceneUpdate = ( + updateType: 'SCENE_INIT' | 'SCENE_UPDATE', + elements: ReadonlyArray, + syncAll: boolean + ) => { + let elementsToSend = elements; + + if (!syncAll) { + // Filter elements to send only those that have changed since last broadcast + elementsToSend = elements.filter(element => { + const lastBroadcastedVersion = this.broadcastedElementVersions.get(element.id) || -1; + return element.version > lastBroadcastedVersion; + }); + } + + const payload = { + update_subtype: updateType, + elements: elementsToSend, + // appState: if sending app state changes + }; + + if (elementsToSend.length > 0 || syncAll) { + this.sendMessage('scene_update', payload); + + // Update broadcasted versions for the elements that were actually sent + elementsToSend.forEach(element => { + if (element && typeof element.id === 'string' && typeof element.version === 'number') { + this.broadcastedElementVersions.set(element.id, element.version); + } + }); + } 
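    // Editor's note (hedged sketch, not part of the patch): the incremental path above reduces to a
    // filter over the per-element version map. pickChangedElements and VersionedElement are
    // illustrative names only.
    //
    //   interface VersionedElement { id: string; version: number }
    //
    //   function pickChangedElements<T extends VersionedElement>(
    //     elements: readonly T[],
    //     lastSentVersions: Map<string, number>,
    //   ): T[] {
    //     // an element is (re)sent only when its version is newer than what this client last broadcast
    //     return elements.filter((el) => el.version > (lastSentVersions.get(el.id) ?? -1));
    //   }
    //
    //   // e.g. with lastSentVersions = new Map([["a", 3]]), { id: "a", version: 4 } is sent while
    //   // { id: "a", version: 3 } is skipped; on a full sync the map is simply rebuilt from scratch.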
+ + + if (syncAll) { + // Clear versions on full sync to ensure all elements are tracked again + this.broadcastedElementVersions.clear(); + elements.forEach(element => { + if (element && typeof element.id === 'string' && typeof element.version === 'number') { + this.broadcastedElementVersions.set(element.id, element.version); + } + }); + } + }; + + public broadcastAppStateUpdate = (appState: AppState): void => { + const payload = { + appState: appState, + }; + this.sendMessage('appstate_update', payload); + }; +} + +export default Portal; diff --git a/src/frontend/src/utils/debounce.ts b/src/frontend/src/lib/debounce.ts similarity index 97% rename from src/frontend/src/utils/debounce.ts rename to src/frontend/src/lib/debounce.ts index b48d503..d4a8dd0 100644 --- a/src/frontend/src/utils/debounce.ts +++ b/src/frontend/src/lib/debounce.ts @@ -21,7 +21,7 @@ */ // Define the interface for the debounced function -interface DebouncedFunction { +export interface DebouncedFunction { (...args: T): void; /** * Immediately executes the debounced function with the last provided arguments diff --git a/src/frontend/src/lib/ExcalidrawElementFactory.ts b/src/frontend/src/lib/elementFactory.ts similarity index 95% rename from src/frontend/src/lib/ExcalidrawElementFactory.ts rename to src/frontend/src/lib/elementFactory.ts index 6fe6ccd..bfa7ef4 100644 --- a/src/frontend/src/lib/ExcalidrawElementFactory.ts +++ b/src/frontend/src/lib/elementFactory.ts @@ -10,10 +10,10 @@ import type { ExcalidrawImperativeAPI } from '@atyrode/excalidraw/types'; import { PlacementMode, placeElement -} from '../utils/elementPlacement'; +} from './elementPlacement'; // Re-export PlacementMode to maintain backward compatibility -export { PlacementMode } from '../utils/elementPlacement'; +export { PlacementMode } from './elementPlacement'; // Base interface with common properties for all Excalidraw elements interface BaseElementOptions { @@ -122,7 +122,13 @@ export class ExcalidrawElementFactory { link: options.link, customData: { showHyperlinkIcon: false, - showClickableHint: false + showClickableHint: false, + borderOffsets: { + left: 10, + right: 10, + top: 40, + bottom: 10 + } } }; } @@ -213,7 +219,7 @@ export class ExcalidrawElementFactory { excalidrawAPI: ExcalidrawImperativeAPI, scrollToView: boolean = true ): void { - const elements = excalidrawAPI.getSceneElements(); + const elements = excalidrawAPI.getSceneElementsIncludingDeleted(); excalidrawAPI.updateScene({ elements: [...elements, element], diff --git a/src/frontend/src/utils/elementPlacement.ts b/src/frontend/src/lib/elementPlacement.ts similarity index 99% rename from src/frontend/src/utils/elementPlacement.ts rename to src/frontend/src/lib/elementPlacement.ts index 04aadb0..621622b 100644 --- a/src/frontend/src/utils/elementPlacement.ts +++ b/src/frontend/src/lib/elementPlacement.ts @@ -277,7 +277,7 @@ export function placeElement( targetPosition.y, element.width, element.height, - excalidrawAPI.getSceneElements(), + excalidrawAPI.getSceneElementsIncludingDeleted(), bufferPercentage ); } diff --git a/src/frontend/src/lib/posthog.ts b/src/frontend/src/lib/posthog.ts new file mode 100644 index 0000000..75af816 --- /dev/null +++ b/src/frontend/src/lib/posthog.ts @@ -0,0 +1,39 @@ +import posthog from 'posthog-js'; + +let isPostHogInitialized = false; + +interface PostHogConfigKeys { + posthogKey: string; + posthogHost: string; +} + +export const initializePostHog = (config: PostHogConfigKeys) => { + if (isPostHogInitialized) { + console.warn('[pad.ws] PostHog is 
already initialized. Skipping re-initialization.'); + return; + } + if (!config || !config.posthogKey || !config.posthogHost) { + console.warn('[pad.ws] PostHog initialization skipped due to missing or invalid config.'); + return; + } + + try { + posthog.init(config.posthogKey, { + api_host: config.posthogHost, + }); + isPostHogInitialized = true; + console.debug('[pad.ws] PostHog initialized successfully from posthog.ts with config:', config); + } catch (error) { + console.error('[pad.ws] Error initializing PostHog:', error); + } +}; + +export const capture = (eventName: string, properties?: Record) => { + if (!isPostHogInitialized) { + console.warn(`[pad.ws] PostHog not yet initialized. Event "${eventName}" was not captured.`); + return; + } + posthog.capture(eventName, properties); +}; + +export default posthog; diff --git a/src/frontend/src/pad/Dashboard.tsx b/src/frontend/src/pad/Dashboard.tsx index 0538ac9..b19724d 100644 --- a/src/frontend/src/pad/Dashboard.tsx +++ b/src/frontend/src/pad/Dashboard.tsx @@ -4,7 +4,7 @@ import type { AppState } from '@atyrode/excalidraw/types'; import StateIndicator from './StateIndicator'; import ControlButton from './buttons/ControlButton'; import { ActionButtonGrid } from './buttons'; -import { useWorkspaceState } from '../api/hooks'; +import { useWorkspace } from '../hooks/useWorkspace'; import './Dashboard.scss'; // Direct import from types @@ -32,7 +32,43 @@ export const Dashboard: React.FC = ({ appState, excalidrawAPI }) => { - const { data: workspaceState } = useWorkspaceState(); + + const { workspaceState, isLoadingState, stateError } = useWorkspace(); + + const getWelcomeMessage = () => { + if (isLoadingState) { + return 'Loading workspace status...'; + } + if (stateError) { + return 'Error loading workspace status. Please try again.'; + } + if (!workspaceState) { + return 'Loading workspace status...'; // Or a more specific "Unknown state" + } + + switch (workspaceState.state) { + case 'pending': + return 'Your workspace is pending...'; + case 'starting': + return 'Your workspace is starting...'; + case 'stopping': + return 'Your workspace is stopping...'; + case 'stopped': + return 'Your workspace is stopped. Start it again to continue.'; + case 'failed': + return 'A workspace error occurred. Please check logs or try restarting.'; + case 'canceling': + return 'Your workspace is being canceled...'; + case 'canceled': + return 'Your workspace has been canceled.'; + case 'deleting': + return 'Your workspace is being deleted...'; + case 'deleted': + return 'Your workspace has been deleted.'; + default: + return `Workspace status: ${workspaceState.state}`; + } + }; const buttonConfigs: ActionButtonConfig[] = [ // First row: Terminal buttons @@ -131,7 +167,7 @@ export const Dashboard: React.FC = ({ {showBottomSection && (
- {workspaceState?.status === 'running' ? ( + {workspaceState?.state === 'running' ? ( = ({ ) : (

- {workspaceState?.status === 'starting' ? 'Your workspace is starting...' : - workspaceState?.status === 'stopping' ? 'Your workspace is stopping...' : - workspaceState?.status === 'stopped' ? 'Your workspace is stopped, start it again to continue' : - workspaceState?.status === 'error' ? 'Workspace error occurred' : - 'Loading workspace status...'} + {getWelcomeMessage()}

)} diff --git a/src/frontend/src/pad/DevTools.scss b/src/frontend/src/pad/DevTools.scss new file mode 100644 index 0000000..faaa2f7 --- /dev/null +++ b/src/frontend/src/pad/DevTools.scss @@ -0,0 +1,86 @@ +.dev-tools { + display: flex; + flex-direction: column; + height: 100%; + width: 100%; + overflow: hidden; + + &__content { + flex: 1; + overflow: hidden; + position: relative; + } + + &__collab-container { + display: flex; + height: 100%; + width: 100%; + gap: 8px; + } + + &__collab-events-wrapper { + display: flex; + flex-direction: column; + flex: 1; + gap: 8px; + overflow: hidden; + } + + &__collab-events { + border: 1px solid #333; + border-radius: 4px; + display: flex; + flex-direction: column; + background-color: #1e1e1e; + overflow: hidden; + + .dev-tools__collab-events-wrapper > & { + width: auto; + flex-basis: 0; + flex-grow: 1; + flex-shrink: 1; + min-width: 0; + } + } + + &__collab-events-header { + padding: 6px 10px; + background-color: #2d2d2d; + border-bottom: 1px solid #444; + font-size: 12px; + font-weight: 500; + color: #e0e0e0; + } + + &__collab-events-list { + flex: 1; + overflow-y: auto; + padding: 4px; + } + + &__collab-empty { + padding: 12px; + color: #a0a0a0; + font-size: 12px; + text-align: center; + font-style: italic; + } + + &__collab-details { + flex: 1; + display: flex; + flex-direction: column; + border: 1px solid #333; + border-radius: 4px; + overflow: hidden; + } + + &__editor-header { + padding: 6px 10px; + background-color: #2d2d2d; + border-bottom: 1px solid #444; + font-size: 12px; + font-weight: 500; + color: #e0e0e0; + } +} diff --git a/src/frontend/src/pad/DevTools.tsx b/src/frontend/src/pad/DevTools.tsx new file mode 100644 index 0000000..d0080b6 --- /dev/null +++ b/src/frontend/src/pad/DevTools.tsx @@ -0,0 +1,47 @@ +import React from 'react'; +import MonacoEditor from '@monaco-editor/react'; +import './DevTools.scss'; + +interface DevToolsProps {} + +const DevTools: React.FC = () => { + return ( +
+
+
+
+
+
+ Events +
+
+
+ Event display area. +
+
+
+
+
+
Event Details
+ +
+
+
+
+ ); +}; + +export default DevTools; diff --git a/src/frontend/src/pad/StateIndicator.scss b/src/frontend/src/pad/StateIndicator.scss index 5ee2091..cbf7b3a 100644 --- a/src/frontend/src/pad/StateIndicator.scss +++ b/src/frontend/src/pad/StateIndicator.scss @@ -22,7 +22,7 @@ } &--starting { - background-color: #3498db; // Blue for starting + background-color: #e67e22; // Orange for starting, pending, loading } &--stopping { @@ -33,9 +33,7 @@ background-color: #e74c3c; // Red for stopped } - &--loading { - background-color: #9b59b6; // Purple for loading - } + // &--loading style removed as it will use 'starting' modifier &--unauthenticated { background-color: #34495e; // Dark blue for unauthenticated diff --git a/src/frontend/src/pad/StateIndicator.tsx b/src/frontend/src/pad/StateIndicator.tsx index 93fe304..048d352 100644 --- a/src/frontend/src/pad/StateIndicator.tsx +++ b/src/frontend/src/pad/StateIndicator.tsx @@ -1,44 +1,44 @@ import React from 'react'; -import { useWorkspaceState, useAuthCheck } from '../api/hooks'; import './StateIndicator.scss'; +import { useWorkspace } from '../hooks/useWorkspace'; export const StateIndicator: React.FC = () => { - const { data: isAuthenticated, isLoading: isAuthLoading } = useAuthCheck(); - - // Only fetch workspace state if authenticated - const { data: workspaceState, isLoading: isWorkspaceLoading } = useWorkspaceState({ - queryKey: ['workspaceState'], - enabled: isAuthenticated === true && !isAuthLoading, - // Explicitly set refetchInterval to false when not authenticated - refetchInterval: isAuthenticated === true ? undefined : false, - }); + const { workspaceState, isLoadingState, stateError } = useWorkspace(); const getState = () => { - if (isAuthLoading || isWorkspaceLoading) { - return { modifier: 'loading', text: 'Loading...' }; + if (isLoadingState) { + return { modifier: 'starting', text: 'Loading...' 
}; // Orange } - - if (isAuthenticated === false) { - return { modifier: 'unauthenticated', text: 'Not Authenticated' }; + if (stateError) { + return { modifier: 'error', text: 'Error Loading State' }; // Light gray } - if (!workspaceState) { - return { modifier: 'unknown', text: 'Unknown' }; + return { modifier: 'unknown', text: 'Unknown' }; // Dark gray } - switch (workspaceState.status) { - case 'running': - return { modifier: 'running', text: 'Running' }; + switch (workspaceState.state) { + case 'pending': + return { modifier: 'starting', text: 'Pending' }; // Orange case 'starting': - return { modifier: 'starting', text: 'Starting' }; + return { modifier: 'starting', text: 'Starting' }; // Orange + case 'running': + return { modifier: 'running', text: 'Running' }; // Green case 'stopping': - return { modifier: 'stopping', text: 'Stopping' }; + return { modifier: 'stopping', text: 'Stopping' }; // Orange case 'stopped': - return { modifier: 'stopped', text: 'Stopped' }; - case 'error': - return { modifier: 'error', text: 'Error' }; + return { modifier: 'stopped', text: 'Stopped' }; // Red + case 'failed': + return { modifier: 'error', text: 'Failed' }; // Light gray + case 'canceling': + return { modifier: 'stopping', text: 'Canceling' }; // Orange + case 'canceled': + return { modifier: 'stopped', text: 'Canceled' }; // Red + case 'deleting': + return { modifier: 'stopping', text: 'Deleting' }; // Orange + case 'deleted': + return { modifier: 'stopped', text: 'Deleted' }; // Red default: - return { modifier: 'unknown', text: 'Unknown' }; + return { modifier: 'unknown', text: `Unknown (${workspaceState.state})` }; // Dark gray } }; diff --git a/src/frontend/src/pad/Terminal.tsx b/src/frontend/src/pad/Terminal.tsx index 94d1d1a..4a62171 100644 --- a/src/frontend/src/pad/Terminal.tsx +++ b/src/frontend/src/pad/Terminal.tsx @@ -1,8 +1,8 @@ import React, { useState, useEffect, useCallback, useRef } from 'react'; -import { useWorkspaceState } from '../api/hooks'; import type { NonDeleted, ExcalidrawEmbeddableElement } from '@atyrode/excalidraw/element/types'; import type { AppState } from '@atyrode/excalidraw/types'; import './Terminal.scss'; +import { useWorkspace } from '../hooks/useWorkspace'; interface TerminalProps { element: NonDeleted; @@ -24,7 +24,9 @@ export const Terminal: React.FC = ({ appState, excalidrawAPI }) => { - const { data: workspaceState } = useWorkspaceState(); + + const { workspaceState } = useWorkspace(); + const [terminalId, setTerminalId] = useState(null); const [iframeLoaded, setIframeLoaded] = useState(false); const [shouldRenderIframe, setShouldRenderIframe] = useState(false); @@ -45,13 +47,14 @@ export const Terminal: React.FC = ({ if (!element || !excalidrawAPI || !workspaceState || !terminalId) return; try { - // Get all elements from the scene - const elements = excalidrawAPI.getSceneElements(); + // Get all scene elements, including deleted ones + const allElements = excalidrawAPI.getSceneElementsIncludingDeleted(); - // Find and update the element - const updatedElements = elements.map(el => { + let elementFound = false; + // Map over all elements to find and update the target terminal element + const updatedElements = allElements.map((el: ExcalidrawEmbeddableElement) => { if (el.id === element.id) { - // Create a new customData object with the terminal connection info + elementFound = true; const connectionInfo: TerminalConnectionInfo = { terminalId, baseUrl: workspaceState.base_url, @@ -60,12 +63,12 @@ export const Terminal: React.FC = ({ agent: 
workspaceState.agent }; - const customData = { + const newCustomData = { ...(el.customData || {}), terminalConnectionInfo: connectionInfo }; - - return { ...el, customData }; + // Return a new object for the updated element, preserving all its other properties + return { ...el, customData: newCustomData }; } return el; }); diff --git a/src/frontend/src/pad/buttons/ActionButton.tsx b/src/frontend/src/pad/buttons/ActionButton.tsx index c752e05..66c6a3b 100644 --- a/src/frontend/src/pad/buttons/ActionButton.tsx +++ b/src/frontend/src/pad/buttons/ActionButton.tsx @@ -1,11 +1,10 @@ import React, { useState, useEffect, useRef } from 'react'; -import { useWorkspaceState } from '../../api/hooks'; -// Import SVGs as modules - using relative paths from the action button location import { Terminal, Braces, Settings, Plus, ExternalLink, Monitor } from 'lucide-react'; +import { useWorkspace } from '../../hooks/useWorkspace'; import { ActionType, TargetType, CodeVariant, ActionButtonProps } from './types'; import './ActionButton.scss'; -import { capture } from '../../utils/posthog'; -import { ExcalidrawElementFactory, PlacementMode } from '../../lib/ExcalidrawElementFactory'; +import { capture } from '../../lib/posthog'; +import { ExcalidrawElementFactory, PlacementMode } from '../../lib/elementFactory'; // Interface for button settings stored in customData interface ButtonSettings { @@ -28,7 +27,7 @@ const ActionButton: React.FC = ({ settingsEnabled = true, // Default to enabled for backward compatibility backgroundColor // Custom background color }) => { - const { data: workspaceState } = useWorkspaceState(); + const { workspaceState } = useWorkspace(); // Parse settings from parent element's customData if available const parseElementSettings = (): { @@ -119,7 +118,7 @@ const ActionButton: React.FC = ({ currentSettingsRef.current = newSettings; // Get all elements from the scene - const elements = excalidrawAPI.getSceneElements(); + const elements = excalidrawAPI.getSceneElementsIncludingDeleted(); // Find and update the parent element const updatedElements = elements.map(el => { @@ -299,9 +298,9 @@ const ActionButton: React.FC = ({ }); if (selectedAction === 'embed') { - const excalidrawAPI = (window as any).excalidrawAPI; + // Use the excalidrawAPI prop passed to the component if (!excalidrawAPI) { - console.error('Excalidraw API not available'); + console.error('Excalidraw API not available from props'); return; } diff --git a/src/frontend/src/pad/buttons/ControlButton.tsx b/src/frontend/src/pad/buttons/ControlButton.tsx index f765663..c28562c 100644 --- a/src/frontend/src/pad/buttons/ControlButton.tsx +++ b/src/frontend/src/pad/buttons/ControlButton.tsx @@ -1,30 +1,62 @@ import React from 'react'; -import { useWorkspaceState, useStartWorkspace, useStopWorkspace } from '../../api/hooks'; import './ControlButton.scss'; import { Play, Square, LoaderCircle } from 'lucide-react'; +import { useWorkspace } from '../../hooks/useWorkspace'; export const ControlButton: React.FC = () => { - const { data: workspaceState } = useWorkspaceState({ - queryKey: ['workspaceState'], - enabled: true, - }); + const { + workspaceState, + isLoadingState, + stateError, + startWorkspace, + isStarting, + stopWorkspace, + isStopping, + } = useWorkspace(); - const { mutate: startWorkspace, isPending: isStarting } = useStartWorkspace(); - const { mutate: stopWorkspace, isPending: isStopping } = useStopWorkspace(); - - // Determine current status - const currentStatus = workspaceState?.status || 'unknown'; + let currentUiStatus 
= 'unknown'; + if (isLoadingState) { + currentUiStatus = 'loading'; + } else if (stateError) { + currentUiStatus = 'error'; + } else if (workspaceState) { + switch (workspaceState.state) { + case 'pending': + case 'starting': + currentUiStatus = 'starting'; + break; + case 'running': + currentUiStatus = 'running'; + break; + case 'stopping': + case 'canceling': + case 'deleting': + currentUiStatus = 'stopping'; + break; + case 'stopped': + case 'canceled': + case 'deleted': + currentUiStatus = 'stopped'; + break; + case 'failed': + currentUiStatus = 'error'; + break; + default: + currentUiStatus = 'unknown'; + } + } const handleClick = () => { - if (isStarting || isStopping) return; - if (currentStatus === 'running') { + if (isStarting || isStopping || isLoadingState) return; + + if (currentUiStatus === 'running') { stopWorkspace(); - } else if (currentStatus === 'stopped' || currentStatus === 'error') { + } else if (currentUiStatus === 'stopped' || currentUiStatus === 'error' || currentUiStatus === 'unknown') { startWorkspace(); } }; - if (currentStatus === 'starting' || currentStatus === 'stopping' || isStarting || isStopping) { + if (currentUiStatus === 'loading' || currentUiStatus === 'starting' || currentUiStatus === 'stopping' || isStarting || isStopping) { return ( ); - } else if (currentStatus === 'running' && (!workspaceState || !workspaceState.error)) { + } else if (currentUiStatus === 'running') { return ( ); - } else { + } else { // Covers 'stopped', 'error', 'unknown' return ( -
- {/* Footer */}
- {/* Warning message */} -
+
{warningText}
@@ -135,42 +84,29 @@ export const AuthDialog = ({ aria-label="Star pad-ws/pad.ws on GitHub"> Star - -
-
); return ( - <> - {modalIsShown && ( -
-
- pad.ws logo -
- {randomMessage} -
+
+
+ pad.ws logo +
{randomMessage}
+
+ { }} + title={ + - -

pad.ws

-
- } - closeOnClickOutside={false} - children={children || dialogContent} - /> -
- )} - + } + closeOnClickOutside={false} + children={children || dialogContent} + /> +
); }; diff --git a/src/frontend/src/ui/BackupsDialog.scss b/src/frontend/src/ui/BackupsDialog.scss deleted file mode 100644 index 4c8b072..0000000 --- a/src/frontend/src/ui/BackupsDialog.scss +++ /dev/null @@ -1,233 +0,0 @@ -/* Backups Modal Styles */ - -.excalidraw .Dialog--fullscreen { - &.backups-modal { - .Dialog { - &__content { - margin-top: 0 !important; - } - } - .Island { - padding-left: 8px !important; - padding-right: 10px !important; - } - } -} - -.backups-modal { - - .Island { - padding-top: 15px !important; - padding-bottom: 20px !important; - } - - &__wrapper { - position: absolute; - top: 0; - left: 0; - width: 100%; - height: 100%; - z-index: 5; - background-color: rgba(0, 0, 0, 0.2); - backdrop-filter: blur(1px); - } - - &__title-container { - display: flex; - align-items: center; - } - - &__title { - margin: 0 auto; - font-size: 1.5rem; - font-weight: 600; - color: white; - text-align: center; - } - - &__content { - display: flex; - flex-direction: column; - align-items: center; - padding: 20px; - max-height: 80vh; - overflow-y: auto; - } - - &__header { - display: flex; - justify-content: space-between; - align-items: center; - margin-bottom: 1rem; - width: 100%; - } - - &__close-button { - background: none; - border: none; - color: #ffffff; - font-size: 1.5rem; - cursor: pointer; - padding: 0; - width: 30px; - height: 30px; - display: flex; - align-items: center; - justify-content: center; - border-radius: 50%; - transition: background-color 0.2s ease; - - &:hover { - background-color: rgba(255, 255, 255, 0.1); - } - } - - &__loading, - &__error, - &__empty { - display: flex; - align-items: center; - justify-content: center; - padding: 2rem; - color: #a0a0a9; - font-style: italic; - font-size: 18px; - animation: fadeIn 0.5s cubic-bezier(0.00, 1.26, 0.64, 0.95) forwards; - } - - &__error { - color: #f44336; - } - - &__list { - list-style: none; - padding: 0; - margin: 0; - max-height: 100%; - overflow-y: auto; - width: 100%; - } - - &__item { - display: flex; - align-items: center; - justify-content: space-between; - padding: 12px 15px; - margin-bottom: 8px; - background-color: #464652; - border: 2px solid #727279; - border-radius: 6px; - cursor: pointer; - transition: all 0.2s ease; - position: relative; - overflow: hidden; - - &:hover { - border: 2px solid #cc6d24; - } - - &:last-child { - margin-bottom: 0; - } - } - - &__item-content { - display: flex; - align-items: center; - gap: 10px; - } - - &__number { - font-size: 0.9rem; - font-weight: 600; - color: #fa8933; - background-color: rgba(250, 137, 51, 0.1); - padding: 4px 8px; - border-radius: 4px; - min-width: 28px; - text-align: center; - } - - &__timestamp { - font-size: 0.9rem; - color: #ffffff; - } - - &__restore-button { - background-color: transparent; - border: none; - color: #fa8933; - font-size: 0.9rem; - font-weight: 500; - cursor: pointer; - padding: 0.25rem 0.5rem; - border-radius: 4px; - transition: all 0.2s ease; - - &:hover { - background-color: rgba(250, 137, 51, 0.1); - } - } - - &__confirmation { - display: flex; - flex-direction: column; - align-items: center; - justify-content: center; - padding: 20px; - background-color: #464652; - border: 2px solid #727279; - border-radius: 6px; - text-align: center; - color: #ffffff; - animation: fadeIn 0.4s cubic-bezier(0.00, 1.26, 0.64, 0.95) forwards; - width: 80%; - max-width: 500px; - } - - &__warning { - color: #f44336; - font-weight: 500; - margin: 0.5rem 0 1rem; - } - - &__actions { - display: flex; - gap: 1rem; - margin-top: 20px; - } - - &__button 
{ - display: flex; - align-items: center; - justify-content: center; - padding: 10px 16px; - height: 44px; - border-radius: 6px; - border: 2px solid #727279; - font-size: 15px; - font-weight: 500; - transition: all 0.2s ease; - cursor: pointer; - min-width: 120px; - - &:hover { - border: 2px solid #cc6d24; - } - - &--restore { - background-color: #464652; - color: white; - } - - &--cancel { - background-color: #464652; - color: #ffffff; - } - } -} - -@keyframes fadeIn { - from { opacity: 0; transform: translateY(10px); } - to { opacity: 1; transform: translateY(0); } -} diff --git a/src/frontend/src/ui/BackupsDialog.tsx b/src/frontend/src/ui/BackupsDialog.tsx deleted file mode 100644 index 6cfe627..0000000 --- a/src/frontend/src/ui/BackupsDialog.tsx +++ /dev/null @@ -1,131 +0,0 @@ -import React, { useState, useCallback } from "react"; -import { Dialog } from "@atyrode/excalidraw"; -import { usePadBackups, CanvasBackup } from "../api/hooks"; -import { normalizeCanvasData, getActivePad } from "../utils/canvasUtils"; -import "./BackupsDialog.scss"; - -interface BackupsModalProps { - excalidrawAPI?: any; - onClose?: () => void; -} - -const BackupsModal: React.FC = ({ - excalidrawAPI, - onClose, -}) => { - const [modalIsShown, setModalIsShown] = useState(true); - const activePadId = getActivePad(); - const { data, isLoading, error } = usePadBackups(activePadId); - const [selectedBackup, setSelectedBackup] = useState(null); - - // Functions from CanvasBackups.tsx - const handleBackupSelect = (backup: CanvasBackup) => { - setSelectedBackup(backup); - }; - - const handleRestoreBackup = () => { - if (selectedBackup && excalidrawAPI) { - // Load the backup data into the canvas - const normalizedData = normalizeCanvasData(selectedBackup.data); - excalidrawAPI.updateScene(normalizedData); - setSelectedBackup(null); - handleClose(); - } - }; - - const handleCancel = () => { - setSelectedBackup(null); - }; - - // Format date function from CanvasBackups.tsx - const formatDate = (dateString: string): string => { - const date = new Date(dateString); - return date.toLocaleString(undefined, { - year: 'numeric', - month: 'short', - day: 'numeric', - hour: '2-digit', - minute: '2-digit' - }); - }; - - const handleClose = useCallback(() => { - setModalIsShown(false); - if (onClose) { - onClose(); - } - }, [onClose]); - - // Dialog content - const dialogContent = ( -
- {isLoading ? ( -
Loading backups...
- ) : error ? ( -
Error loading backups
- ) : !data || data.backups.length === 0 ? ( -
No backups available
- ) : selectedBackup ? ( -
-

Restore canvas from backup #{data.backups.findIndex(b => b.id === selectedBackup.id) + 1} created on {formatDate(selectedBackup.timestamp)}?

-

This will replace your current canvas!

-
- - -
-
- ) : ( -
    - {data.backups.map((backup, index) => ( -
  • handleBackupSelect(backup)} - > -
    - #{index + 1} - {formatDate(backup.timestamp)} -
    - -
  • - ))} -
- )} -
- ); - - return ( - <> - {modalIsShown && ( -
- -

- {data?.pad_name ? `${data.pad_name} (this pad) - Backups` : 'Canvas Backups'} -

-
- } - closeOnClickOutside={true} - children={dialogContent} - /> - - )} - - ); -}; - -export default BackupsModal; diff --git a/src/frontend/src/ui/MainMenu.tsx b/src/frontend/src/ui/MainMenu.tsx index 1058282..5fe4ad2 100644 --- a/src/frontend/src/ui/MainMenu.tsx +++ b/src/frontend/src/ui/MainMenu.tsx @@ -3,15 +3,15 @@ import React, { useState } from 'react'; import type { ExcalidrawImperativeAPI } from '@atyrode/excalidraw/types'; import type { MainMenu as MainMenuType } from '@atyrode/excalidraw'; -import { LogOut, SquarePlus, LayoutDashboard, SquareCode, Eye, Coffee, Grid2x2, User, Text, ArchiveRestore, Settings, Terminal, FileText } from 'lucide-react'; +import { LogOut, SquarePlus, LayoutDashboard, User, Text, Settings, Terminal, FileText, FlaskConical } from 'lucide-react'; import AccountDialog from './AccountDialog'; import md5 from 'crypto-js/md5'; -import { capture } from '../utils/posthog'; -import { ExcalidrawElementFactory, PlacementMode } from '../lib/ExcalidrawElementFactory'; -import { useUserProfile } from "../api/hooks"; -import { queryClient } from "../api/queryClient"; +import { useLogout } from '../hooks/useLogout'; +import { useAuthStatus } from '../hooks/useAuthStatus'; +import { ExcalidrawElementFactory, PlacementMode } from '../lib/elementFactory'; import "./MainMenu.scss"; - +import { INITIAL_APP_DATA } from '../constants'; +import { capture } from '../lib/posthog'; // Function to generate gravatar URL const getGravatarUrl = (email: string, size = 32) => { const hash = md5(email.toLowerCase().trim()).toString(); @@ -20,8 +20,6 @@ const getGravatarUrl = (email: string, size = 32) => { interface MainMenuConfigProps { MainMenu: typeof MainMenuType; excalidrawAPI: ExcalidrawImperativeAPI | null; - showBackupsModal: boolean; - setShowBackupsModal: (show: boolean) => void; showPadsModal: boolean; setShowPadsModal: (show: boolean) => void; showSettingsModal?: boolean; @@ -31,22 +29,22 @@ interface MainMenuConfigProps { export const MainMenuConfig: React.FC = ({ MainMenu, excalidrawAPI, - setShowBackupsModal, setShowPadsModal, setShowSettingsModal = (show: boolean) => {}, }) => { const [showAccountModal, setShowAccountModal] = useState(false); - const { data, isLoading, isError } = useUserProfile(); + const { mutate: logoutMutation, isPending: isLoggingOut } = useLogout(); + const { user, isLoading, isError } = useAuthStatus(); let username = ""; let email = ""; if (isLoading) { username = "Loading..."; - } else if (isError || !data?.username) { + } else if (isError || !user?.username) { username = "Unknown"; } else { - username = data.username; - email = data.email || ""; + username = user.username; + email = user.email || ""; } const handleDashboardButtonClick = () => { @@ -65,6 +63,22 @@ export const MainMenuConfig: React.FC = ({ }); }; + const handleDevToolsClick = () => { + if (!excalidrawAPI) return; + + const devToolsElement = ExcalidrawElementFactory.createEmbeddableElement({ + link: "!dev", + width: 800, + height: 500 + }); + + ExcalidrawElementFactory.placeInScene(devToolsElement, excalidrawAPI, { + mode: PlacementMode.NEAR_VIEWPORT_CENTER, + bufferPercentage: 10, + scrollToView: true + }); + }; + const handleInsertButtonClick = () => { if (!excalidrawAPI) return; @@ -113,10 +127,6 @@ export const MainMenuConfig: React.FC = ({ }); }; - const handleCanvasBackupsClick = () => { - setShowBackupsModal(true); - }; - const handleManagePadsClick = () => { setShowPadsModal(true); }; @@ -129,67 +139,57 @@ export const MainMenuConfig: React.FC = ({ 
setShowAccountModal(true); }; - const handleLogout = async () => { + const handleLogout = () => { capture('logout_clicked'); - - try { - // Call the logout endpoint and get the logout_url - const response = await fetch('/auth/logout', { - method: 'GET', - credentials: 'include' - }); - const data = await response.json(); - const keycloakLogoutUrl = data.logout_url; - - // Create a function to create an iframe and return a promise that resolves when it loads or times out - const createIframeLoader = (url: string, debugName: string): Promise => { - return new Promise((resolve) => { - const iframe = document.createElement("iframe"); - iframe.style.display = "none"; - iframe.src = url; - console.debug(`[pad.ws] (Silently) Priming ${debugName} logout for ${url}`); - const cleanup = () => { - if (iframe.parentNode) iframe.parentNode.removeChild(iframe); - resolve(); - }; + logoutMutation(undefined, { + onSuccess: (data) => { + const keycloakLogoutUrl = data.logout_url; - iframe.onload = cleanup; - // Fallback: remove iframe after 2s if onload doesn't fire - const timeoutId = window.setTimeout(cleanup, 2000); + const createIframeLoader = (url: string, debugName: string): Promise => { + return new Promise((resolve, reject) => { // Added reject for error handling + const iframe = document.createElement("iframe"); + iframe.style.display = "none"; + iframe.src = url; + console.debug(`[pad.ws] (Silently) Priming ${debugName} logout for ${url}`); - // Also clean up if the iframe errors - iframe.onerror = () => { - clearTimeout(timeoutId); - cleanup(); - }; + let timeoutId: number | undefined; - // Add the iframe to the DOM - document.body.appendChild(iframe); - }); - }; + const cleanup = (error?: any) => { + if (timeoutId) clearTimeout(timeoutId); + if (iframe.parentNode) iframe.parentNode.removeChild(iframe); + if (error) reject(error); else resolve(); + }; - // Create a promise for Keycloak logout iframe - const promises = []; + iframe.onload = () => cleanup(); + // Fallback: remove iframe after 2s if onload doesn't fire + timeoutId = window.setTimeout(() => cleanup(new Error(`${debugName} iframe load timed out`)), 5000); - // Add Keycloak logout iframe - promises.push(createIframeLoader(keycloakLogoutUrl, "Keycloak")); + iframe.onerror = (event) => { // event can be an ErrorEvent or string + const errorMsg = typeof event === 'string' ? event : (event instanceof ErrorEvent ? 
event.message : `Error loading ${debugName} iframe`); + cleanup(new Error(errorMsg)); + }; + document.body.appendChild(iframe); + }); + }; - // Wait for both iframes to complete - await Promise.all(promises); + const promises = [createIframeLoader(keycloakLogoutUrl, "Keycloak")]; - // Wait for the iframe to complete - await Promise.all(promises); - - // Invalidate auth query to show the AuthModal - queryClient.invalidateQueries({ queryKey: ['auth'] }); - queryClient.invalidateQueries({ queryKey: ['userProfile'] }); - - // No need to redirect to the logout URL since we're already handling it via iframe - console.debug("[pad.ws] Logged out successfully"); - } catch (error) { - console.error("[pad.ws] Logout failed:", error); - } + Promise.all(promises) + .then(() => { + console.debug("[pad.ws] Keycloak iframe logout process completed successfully."); + }) + .catch(err => { + console.error("[pad.ws] Error during iframe logout process:", err); + }); + }, + onError: (error) => { + console.error("[pad.ws] Logout failed in MainMenu component:", error.message); + } + }); + + excalidrawAPI.updateScene(INITIAL_APP_DATA); + }; return ( @@ -228,12 +228,6 @@ export const MainMenuConfig: React.FC = ({ > Manage pads... - } - onClick={handleCanvasBackupsClick} - > - Load backup... - @@ -264,6 +258,12 @@ export const MainMenuConfig: React.FC = ({ > Action Button + } + onClick={handleDevToolsClick} + > + Dev. Tools + @@ -285,8 +285,9 @@ export const MainMenuConfig: React.FC = ({ } onClick={handleLogout} + disabled={isLoggingOut} // Disable button while logout is in progress > - Logout + {isLoggingOut ? "Logging out..." : "Logout"} diff --git a/src/frontend/src/ui/PadsDialog.scss b/src/frontend/src/ui/PadsDialog.scss deleted file mode 100644 index 033e0b6..0000000 --- a/src/frontend/src/ui/PadsDialog.scss +++ /dev/null @@ -1,215 +0,0 @@ -.pads-dialog { - &__wrapper { - position: fixed; - top: 0; - left: 0; - right: 0; - bottom: 0; - z-index: 1000; - background-color: rgba(0, 0, 0, 0.2); - backdrop-filter: blur(1px); - } - - &__title-container { - display: flex; - align-items: center; - justify-content: space-between; - } - - &__title { - margin: 0; - font-size: 1.2rem; - font-weight: 600; - } - - &__new-pad-button { - display: flex; - align-items: center; - gap: 0.5rem; - padding: 0.4rem 0.8rem; - border-radius: 4px; - border: none; - background-color: #cc6d24; - color: white; - font-size: 0.9rem; - cursor: pointer; - transition: background-color 0.2s ease; - - &:hover { - background-color: #b05e1f; - } - - &:disabled { - opacity: 0.6; - cursor: not-allowed; - } - - &--creating { - opacity: 0.7; - cursor: wait; - } - } - - &__content { - padding: 1rem; - max-height: 70vh; - overflow-y: auto; - } - - &__loading, - &__error, - &__empty { - text-align: center; - padding: 2rem 0; - color: var(--text-primary-color); - } - - &__list { - list-style: none; - padding: 0; - margin: 0; - } - - &__item { - display: flex; - justify-content: space-between; - align-items: center; - padding: 0.75rem; - border-radius: 6px; - margin-bottom: 0.5rem; - background-color: var(--island-bg-color); - transition: background-color 0.2s ease; - - &:hover { - background-color: var(--button-hover-bg); - } - - &--active { - border-left: 3px solid #cc6d24; - } - } - - &__item-content { - display: flex; - flex-direction: column; - flex: 1; - padding: 0.25rem; - border-radius: 4px; - transition: background-color 0.2s ease; - - &--clickable { - cursor: pointer; - - &:hover { - background-color: var(--button-hover-bg); - } - } - - &--current { 
- cursor: default; - } - } - - &__name { - font-weight: 500; - margin-bottom: 0.25rem; - } - - &__current { - font-weight: normal; - font-style: italic; - color: #cc6d24; - } - - &__timestamps { - display: flex; - flex-direction: column; - gap: 0.2rem; - } - - &__timestamp { - font-size: 0.8rem; - color: var(--text-secondary-color); - } - - &__actions { - display: flex; - gap: 0.5rem; - } - - &__icon-button { - display: flex; - align-items: center; - justify-content: center; - background: none; - border: none; - border-radius: 4px; - padding: 0.4rem; - cursor: pointer; - color: var(--text-primary-color); - transition: background-color 0.2s ease, color 0.2s ease; - - &:hover { - background-color: var(--button-hover-bg); - color: #cc6d24; - } - - &:disabled { - opacity: 0.5; - cursor: not-allowed; - - &:hover { - background: none; - color: var(--text-primary-color); - } - } - } - - &__edit-form { - display: flex; - flex-direction: column; - width: 100%; - gap: 0.5rem; - - input { - padding: 0.5rem; - border-radius: 4px; - border: 1px solid var(--button-gray-2); - background-color: var(--input-bg-color); - color: var(--text-primary-color); - font-size: 1rem; - } - } - - &__edit-actions { - display: flex; - gap: 0.5rem; - } - - &__button { - padding: 0.4rem 0.8rem; - border-radius: 4px; - border: none; - cursor: pointer; - font-size: 0.9rem; - transition: background-color 0.2s ease; - - &--save { - background-color: #cc6d24; - color: white; - - &:hover { - background-color: #b05e1f; - } - } - - &--cancel { - background-color: var(--button-gray-2); - color: var(--text-primary-color); - - &:hover { - background-color: var(--button-gray-3); - } - } - } -} diff --git a/src/frontend/src/ui/PadsDialog.tsx b/src/frontend/src/ui/PadsDialog.tsx deleted file mode 100644 index c269afb..0000000 --- a/src/frontend/src/ui/PadsDialog.tsx +++ /dev/null @@ -1,320 +0,0 @@ -import React, { useState, useCallback } from "react"; -import { Dialog } from "@atyrode/excalidraw"; -import { Pencil, Trash2, FilePlus2 } from "lucide-react"; -import { useAllPads, useRenamePad, useDeletePad, PadData } from "../api/hooks"; -import { loadPadData, getActivePad, setActivePad, saveCurrentPadBeforeSwitching, createNewPad } from "../utils/canvasUtils"; -import { queryClient } from "../api/queryClient"; -import { capture } from "../utils/posthog"; -import "./PadsDialog.scss"; - -interface PadsDialogProps { - excalidrawAPI?: any; - onClose?: () => void; -} - -const PadsDialog: React.FC = ({ - excalidrawAPI, - onClose, -}) => { - const [modalIsShown, setModalIsShown] = useState(true); - const { data: pads, isLoading, error } = useAllPads(); - const activePadId = getActivePad(); - const [editingPadId, setEditingPadId] = useState(null); - const [newPadName, setNewPadName] = useState(""); - const [isCreatingPad, setIsCreatingPad] = useState(false); - - // Get the renamePad mutation - const { mutate: renamePad } = useRenamePad({ - onSuccess: (data, variables) => { - console.debug("[pad.ws] Pad renamed successfully"); - - // Update the cache directly instead of refetching - const { padId, newName } = variables; - - // Get the current pads from the query cache - const currentPads = queryClient.getQueryData(['allPads']); - - if (currentPads) { - // Create a new array with the updated pad name - const updatedPads = currentPads.map(pad => - pad.id === padId - ? 
{ ...pad, display_name: newName } - : pad - ); - - // Update the query cache with the new data - queryClient.setQueryData(['allPads'], updatedPads); - } - - // Reset editing state - setEditingPadId(null); - }, - onError: (error) => { - console.error("[pad.ws] Failed to rename pad:", error); - setEditingPadId(null); - } - }); - - // Get the deletePad mutation - const { mutate: deletePad } = useDeletePad({ - onSuccess: (data, padId) => { - console.debug("[pad.ws] Pad deleted successfully"); - - // Update the cache directly instead of refetching - // Get the current pads from the query cache - const currentPads = queryClient.getQueryData(['allPads']); - - if (currentPads) { - // Create a new array without the deleted pad - const updatedPads = currentPads.filter(pad => pad.id !== padId); - - // Update the query cache with the new data - queryClient.setQueryData(['allPads'], updatedPads); - } - }, - onError: (error) => { - console.error("[pad.ws] Failed to delete pad:", error); - } - }); - - const handleCreateNewPad = async () => { - if (isCreatingPad || !excalidrawAPI) return; // Prevent multiple clicks or if API not available - - try { - setIsCreatingPad(true); - - // Create a new pad using the imported function - // Note: createNewPad already updates the query cache internally - const newPad = await createNewPad(excalidrawAPI, activePadId, (data) => { - console.debug("[pad.ws] Canvas saved before creating new pad"); - }); - - // Track pad creation event - capture("pad_created", { - padId: newPad.id, - padName: newPad.display_name - }); - - // Set the new pad as active and load it - setActivePad(newPad.id); - loadPadData(excalidrawAPI, newPad.id, newPad.data); - - // Close the dialog - handleClose(); - } catch (error) { - console.error('[pad.ws] Error creating new pad:', error); - } finally { - setIsCreatingPad(false); - } - }; - - const handleClose = useCallback(() => { - setModalIsShown(false); - if (onClose) { - onClose(); - } - }, [onClose]); - - const handleRenameClick = (pad: PadData) => { - setEditingPadId(pad.id); - setNewPadName(pad.display_name); - }; - - const handleRenameSubmit = (padId: string) => { - if (newPadName.trim() === "") return; - - // Track pad rename event - capture("pad_renamed", { - padId, - newName: newPadName - }); - - // Call the renamePad mutation - renamePad({ padId, newName: newPadName }); - }; - - const handleDeleteClick = (pad: PadData) => { - // Don't allow deleting the last pad - if (pads && pads.length <= 1) { - alert("Cannot delete the last pad"); - return; - } - - // Confirm deletion - if (!window.confirm(`Are you sure you want to delete "${pad.display_name}"?`)) { - return; - } - - // Track pad deletion event - capture("pad_deleted", { - padId: pad.id, - padName: pad.display_name - }); - - // If deleting the active pad, switch to another pad first but keep dialog open - if (pad.id === activePadId && pads) { - const otherPad = pads.find(p => p.id !== pad.id); - if (otherPad && excalidrawAPI) { - handleLoadPad(otherPad, true); // Pass true to keep dialog open - } - } - - // Call the deletePad mutation - deletePad(pad.id); - }; - - const handleLoadPad = (pad: PadData, keepDialogOpen: boolean = false) => { - if (!excalidrawAPI) return; - - // Save the current canvas before switching tabs - if (activePadId) { - saveCurrentPadBeforeSwitching(excalidrawAPI, activePadId, (data) => { - console.debug("[pad.ws] Canvas saved before switching"); - }); - } - - // Set the new active pad ID - setActivePad(pad.id); - - // Load the pad data - loadPadData(excalidrawAPI, 
pad.id, pad.data); - - // Close the dialog only if keepDialogOpen is false - if (!keepDialogOpen) { - handleClose(); - } - }; - - // Format date function - const formatDate = (dateString: string): string => { - const date = new Date(dateString); - return date.toLocaleString(undefined, { - year: 'numeric', - month: 'short', - day: 'numeric', - hour: '2-digit', - minute: '2-digit' - }); - }; - - // Dialog content - const dialogContent = ( -
- {isLoading ? ( -
Loading pads...
- ) : error ? ( -
Error loading pads
- ) : !pads || pads.length === 0 ? ( -
No pads available
- ) : ( -
    - {pads.map((pad) => ( -
  • - {editingPadId === pad.id ? ( -
    - setNewPadName(e.target.value)} - autoFocus - onKeyDown={(e) => { - if (e.key === 'Enter') { - handleRenameSubmit(pad.id); - } else if (e.key === 'Escape') { - setEditingPadId(null); - } - }} - /> -
    - - -
    -
    - ) : ( - <> -
    pad.id !== activePadId && handleLoadPad(pad)} - > - - {pad.display_name} - {pad.id === activePadId && (current)} - -
    - Created: {formatDate(pad.created_at)} - Last updated: {formatDate(pad.updated_at || pad.created_at)} -
    -
    -
    - - -
    - - )} -
  • - ))} -
- )} -
- ); - - return ( - <> - {modalIsShown && ( -
- -

- Manage Pads -

- -
- } - closeOnClickOutside={true} - children={dialogContent} - /> - - )} - - ); -}; - -export default PadsDialog; diff --git a/src/frontend/src/ui/SettingsDialog.scss b/src/frontend/src/ui/SettingsDialog.scss index 53968ff..1a71496 100644 --- a/src/frontend/src/ui/SettingsDialog.scss +++ b/src/frontend/src/ui/SettingsDialog.scss @@ -7,8 +7,6 @@ right: 0; bottom: 0; z-index: 1000; - background-color: rgba(0, 0, 0, 0.2); - backdrop-filter: blur(1px); } &__title-container { @@ -57,79 +55,4 @@ &__range-container { margin: 1rem 0; } - - &__restore-button { - display: flex; - align-items: center; - gap: 8px; - padding: 8px 16px; - background-color: var(--button-bg-color, #cc6d24); - border: 1px solid var(--button-border-color, #ddd); - border-radius: 4px; - color: var(--text-primary-color); - cursor: pointer; - font-size: 0.9rem; - transition: background-color 0.2s ease; - - &:hover { - background-color: var(--button-hover-bg-color, #a4571b); - } - } - - &__confirmation { - padding: 1rem; - border-radius: 4px; - background-color: var(--dialog-bg-color); - border: 1px solid var(--warning-color, #f0ad4e); - } - - &__warning { - color: var(--warning-color, #f0ad4e); - font-weight: bold; - margin: 1rem 0; - } - - &__actions { - display: flex; - gap: 8px; - margin-top: 1rem; - } - - &__button { - padding: 8px 16px; - border-radius: 4px; - cursor: pointer; - font-size: 0.9rem; - border: none; - transition: background-color 0.2s ease; - - &--restore { - background-color: var(--danger-color, #d9534f); - color: white; - - &:hover { - background-color: var(--danger-hover-color, #c9302c); - } - - &:disabled { - background-color: var(--disabled-color, #cccccc); - cursor: not-allowed; - } - } - - &--cancel { - background-color: var(--button-bg-color, #cc6d24); - border: 1px solid var(--button-border-color, #ddd); - color: var(--text-primary-color); - - &:hover { - background-color: var(--button-hover-bg-color, #a4571b); - } - - &:disabled { - background-color: var(--disabled-color, #cccccc); - cursor: not-allowed; - } - } - } } diff --git a/src/frontend/src/ui/SettingsDialog.tsx b/src/frontend/src/ui/SettingsDialog.tsx index 4876f5c..6c67deb 100644 --- a/src/frontend/src/ui/SettingsDialog.tsx +++ b/src/frontend/src/ui/SettingsDialog.tsx @@ -2,10 +2,7 @@ import React, { useState, useCallback, useEffect } from "react"; import { Dialog } from "@atyrode/excalidraw"; import { Range } from "./Range"; import { UserSettings, DEFAULT_SETTINGS } from "../types/settings"; -import { RefreshCw } from "lucide-react"; -import { normalizeCanvasData } from "../utils/canvasUtils"; -import { capture } from "../utils/posthog"; -import { api } from "../api/hooks"; +// import { capture } from "../lib/posthog"; import "./SettingsDialog.scss"; interface SettingsDialogProps { @@ -19,8 +16,6 @@ const SettingsDialog: React.FC = ({ }) => { const [modalIsShown, setModalIsShown] = useState(true); const [settings, setSettings] = useState(DEFAULT_SETTINGS); - const [showRestoreConfirmation, setShowRestoreConfirmation] = useState(false); - const [isRestoring, setIsRestoring] = useState(false); // Get current settings from excalidrawAPI when component mounts useEffect(() => { @@ -41,36 +36,6 @@ const SettingsDialog: React.FC = ({ } }, [onClose]); - const handleRestoreTutorialCanvas = async () => { - if (!excalidrawAPI) return; - - try { - setIsRestoring(true); - capture('restore_tutorial_canvas_clicked'); - - // Use the API function from hooks.ts to fetch the default canvas - const defaultCanvasData = await api.getDefaultCanvas(); - - 
console.debug("Default canvas data:", defaultCanvasData); - - // Normalize the canvas data before updating the scene - const normalizedData = normalizeCanvasData(defaultCanvasData); - - // Update the canvas with the normalized default data - excalidrawAPI.updateScene(normalizedData); - - console.debug("Canvas reset to default successfully"); - - // Close the dialog after successful restore - handleClose(); - } catch (error) { - console.error("Failed to reset canvas:", error); - } finally { - setIsRestoring(false); - setShowRestoreConfirmation(false); - } - }; - /** * Updates a specific setting and syncs it with the excalidraw app state * @param key The setting key to update @@ -128,42 +93,6 @@ const SettingsDialog: React.FC = ({ - -
-

Canvas Management

- {showRestoreConfirmation ? ( -
-

Are you sure you want to restore the tutorial canvas?

-

This will replace your current canvas and cannot be undone!

-
- - -
-
- ) : ( -
- -
- )} -
); diff --git a/src/frontend/src/ui/TabBar.scss b/src/frontend/src/ui/TabBar.scss new file mode 100644 index 0000000..6801038 --- /dev/null +++ b/src/frontend/src/ui/TabBar.scss @@ -0,0 +1,58 @@ +.tabs-bar { + margin-inline-start: 0.6rem; + height: var(--lg-button-size); + position: relative; + display: flex; + gap: 8px; + align-items: center; + padding: 0 8px; + + .tab { + height: var(--lg-button-size) !important; + width: 100px !important; + min-width: 100px !important; + margin-right: 0.6rem; + text-overflow: ellipsis; + overflow: hidden; + white-space: nowrap; + display: flex; + align-items: center; + justify-content: center; + padding: 0 16px; + border: none; + border-radius: var(--border-radius-lg); + background: var(--island-bg-color); + color: #bdbdbd; + font-size: 14px; + font-weight: 500; + cursor: pointer; + box-shadow: 0 1px 3px rgba(0, 0, 0, 0.1); + transition: all 0.2s ease; + outline: none; + + &:hover { + background: #4a4a54; + color: #ffffff; + } + + &.active-pad { + background-color: #cc6d24 !important; + color: var(--color-on-primary) !important; + font-weight: bold; + border: 1px solid #cccccc !important; + } + + &.new-tab { + min-width: var(--lg-button-size) !important; + width: var(--lg-button-size) !important; + padding: 0; + background: var(--island-bg-color); + color: #bdbdbd; + + &:hover { + color: #ffffff; + background: #4a4a54; + } + } + } +} \ No newline at end of file diff --git a/src/frontend/src/ui/TabBar.tsx b/src/frontend/src/ui/TabBar.tsx new file mode 100644 index 0000000..a161553 --- /dev/null +++ b/src/frontend/src/ui/TabBar.tsx @@ -0,0 +1,42 @@ +import React from 'react'; +import { NewPadIcon } from '../icons'; +import './TabBar.scss'; + +interface Tab { + id: string; + label: string; +} + +interface TabBarProps { + tabs: Tab[]; + activeTabId: string; + onTabSelect: (tabId: string) => void; + onNewTab?: () => void; +} + +const TabBar: React.FC = ({ tabs, activeTabId, onTabSelect, onNewTab }) => { + return ( +
+    <div className="tabs-bar">
+      {tabs.map((tab) => (
+        <button
+          key={tab.id}
+          type="button"
+          className={`tab${tab.id === activeTabId ? ' active-pad' : ''}`}
+          onClick={() => onTabSelect(tab.id)}
+        >
+          {tab.label}
+        </button>
+      ))}
+      {onNewTab && (
+        <button type="button" className="tab new-tab" onClick={onNewTab}>
+          <NewPadIcon />
+        </button>
+      )}
+    </div>
+ ); +}; + +export default TabBar; \ No newline at end of file diff --git a/src/frontend/src/ui/TabContextMenu.scss b/src/frontend/src/ui/TabContextMenu.scss index a840665..c3c3f0b 100644 --- a/src/frontend/src/ui/TabContextMenu.scss +++ b/src/frontend/src/ui/TabContextMenu.scss @@ -157,4 +157,4 @@ .tab-context-menu .menu-item .menu-item__label { margin-inline-end: 0; } -} +} \ No newline at end of file diff --git a/src/frontend/src/ui/TabContextMenu.tsx b/src/frontend/src/ui/TabContextMenu.tsx index 9e8b9a2..e81a3f4 100644 --- a/src/frontend/src/ui/TabContextMenu.tsx +++ b/src/frontend/src/ui/TabContextMenu.tsx @@ -37,8 +37,13 @@ interface TabContextMenuProps { padId: string; padName: string; onRename: (padId: string, newName: string) => void; - onDelete: (padId: string) => void; + onDelete: (padId: string) => void; // For deleting owned pads + onUpdateSharingPolicy: (padId: string, policy: string) => void; onClose: () => void; + currentUserId?: string; + tabOwnerId?: string; + sharingPolicy?: string; + onLeaveSharedPad: (padId: string) => void; // For leaving shared pads } // Popover component @@ -52,69 +57,69 @@ const Popover: React.FC<{ viewportWidth?: number; viewportHeight?: number; children: React.ReactNode; -}> = ({ - onCloseRequest, - top, - left, - children, +}> = ({ + onCloseRequest, + top, + left, + children, fitInViewport = false, offsetLeft = 0, offsetTop = 0, viewportWidth = window.innerWidth, viewportHeight = window.innerHeight }) => { - const popoverRef = useRef(null); - - // Handle clicks outside the popover to close it - useEffect(() => { - const handleClickOutside = (event: MouseEvent) => { - if (popoverRef.current && !popoverRef.current.contains(event.target as Node)) { - onCloseRequest(); - } - }; - - document.addEventListener('mousedown', handleClickOutside); - return () => { - document.removeEventListener('mousedown', handleClickOutside); - }; - }, [onCloseRequest]); - - // Adjust position if needed to fit in viewport - useEffect(() => { - if (fitInViewport && popoverRef.current) { - const rect = popoverRef.current.getBoundingClientRect(); - const adjustedLeft = Math.min(left, viewportWidth - rect.width); - const adjustedTop = Math.min(top, viewportHeight - rect.height); - - if (popoverRef.current) { - popoverRef.current.style.left = `${adjustedLeft}px`; - popoverRef.current.style.top = `${adjustedTop}px`; + const popoverRef = useRef(null); + + // Handle clicks outside the popover to close it + useEffect(() => { + const handleClickOutside = (event: MouseEvent) => { + if (popoverRef.current && !popoverRef.current.contains(event.target as Node)) { + onCloseRequest(); + } + }; + + document.addEventListener('mousedown', handleClickOutside); + return () => { + document.removeEventListener('mousedown', handleClickOutside); + }; + }, [onCloseRequest]); + + // Adjust position if needed to fit in viewport + useEffect(() => { + if (fitInViewport && popoverRef.current) { + const rect = popoverRef.current.getBoundingClientRect(); + const adjustedLeft = Math.min(left, viewportWidth - rect.width); + const adjustedTop = Math.min(top, viewportHeight - rect.height); + + if (popoverRef.current) { + popoverRef.current.style.left = `${adjustedLeft}px`; + popoverRef.current.style.top = `${adjustedTop}px`; + } } - } - }, [fitInViewport, left, top, viewportWidth, viewportHeight]); + }, [fitInViewport, left, top, viewportWidth, viewportHeight]); - return ( -
- {children} -
- ); -}; + return ( +
+ {children} +
+ ); + }; // ContextMenu component -const ContextMenu: React.FC = ({ - actionManager, - items, - top, - left, - onClose +const ContextMenu: React.FC = ({ + actionManager, + items, + top, + left, + onClose }) => { // Filter items based on predicate const filteredItems = items.reduce((acc: ContextMenuItem[], item) => { @@ -170,14 +175,11 @@ const ContextMenu: React.FC = ({ key={idx} data-testid={actionName} onClick={() => { - // Log the click - console.debug('[pad.ws] Menu item clicked:', item.name); - // Store the callback to execute after closing const callback = () => { actionManager.executeAction(item, "contextMenu"); }; - + // Close the menu and execute the callback onClose(callback); }} @@ -205,36 +207,58 @@ class TabActionManager implements ActionManager { padId: string; padName: string; onRename: (padId: string, newName: string) => void; - onDelete: (padId: string) => void; + onDelete: (padId: string) => void; // For deleteOwnedPad + onUpdateSharingPolicy: (padId: string, policy: string) => void; app: any; + sharingPolicy?: string; + onLeaveSharedPad: (padId: string) => void; // For leaveSharedPad constructor( padId: string, padName: string, onRename: (padId: string, newName: string) => void, - onDelete: (padId: string) => void + onDelete: (padId: string) => void, // This is for deleteOwnedPad + onUpdateSharingPolicy: (padId: string, policy: string) => void, + onLeaveSharedPad: (padId: string) => void, // Moved before optional param + sharingPolicy?: string ) { this.padId = padId; this.padName = padName; this.onRename = onRename; - this.onDelete = onDelete; + this.onDelete = onDelete; // Will be called by 'deleteOwnedPad' + this.onUpdateSharingPolicy = onUpdateSharingPolicy; + this.onLeaveSharedPad = onLeaveSharedPad; // Will be called by 'leaveSharedPad' + this.sharingPolicy = sharingPolicy; this.app = { props: {} }; } executeAction(action: Action, source: string) { - console.debug('[pad.ws] Executing action:', action.name, 'from source:', source); - if (action.name === 'rename') { const newName = window.prompt('Rename pad', this.padName); if (newName && newName.trim() !== '') { this.onRename(this.padId, newName); } - } else if (action.name === 'delete') { - console.debug('[pad.ws] Attempting to delete pad:', this.padId, this.padName); + } else if (action.name === 'deleteOwnedPad') { // Renamed from 'delete' + console.debug('[pad.ws] Attempting to delete owned pad:', this.padId, this.padName); if (window.confirm(`Are you sure you want to delete "${this.padName}"?`)) { console.debug('[pad.ws] User confirmed delete, calling onDelete'); - this.onDelete(this.padId); + this.onDelete(this.padId); // Calls original onDelete for owned pads } + } else if (action.name === 'leaveSharedPad') { // New action for leaving + console.debug('[pad.ws] Attempting to leave shared pad:', this.padId, this.padName); + if (window.confirm(`Are you sure you want to leave "${this.padName}"? This will remove it from your list of open pads.`)) { + this.onLeaveSharedPad(this.padId); // Calls the new handler + } + } else if (action.name === 'toggleSharingPolicy') { + const newPolicy = this.sharingPolicy === 'public' ? 
'private' : 'public'; + this.onUpdateSharingPolicy(this.padId, newPolicy); + } else if (action.name === 'copyUrl') { + const url = `${window.location.origin}/pad/${this.padId}`; + navigator.clipboard.writeText(url).then(() => { + console.debug('[pad.ws] URL copied to clipboard:', url); + }).catch(err => { + console.error('[pad.ws] Failed to copy URL:', err); + }); } } } @@ -247,35 +271,71 @@ const TabContextMenu: React.FC = ({ padName, onRename, onDelete, - onClose + onUpdateSharingPolicy, + onClose, + currentUserId, + tabOwnerId, + sharingPolicy, + onLeaveSharedPad // Destructure new prop }) => { + const isOwner = currentUserId && tabOwnerId && currentUserId === tabOwnerId; + const isPadPublic = sharingPolicy === 'public'; + // Create an action manager instance - const actionManager = new TabActionManager(padId, padName, onRename, onDelete); + const actionManager = new TabActionManager(padId, padName, onRename, onDelete, onUpdateSharingPolicy, onLeaveSharedPad, sharingPolicy); // Define menu items - const menuItems = [ - { + const menuItemsResult: ContextMenuItems = []; + + if (isOwner) { + menuItemsResult.push({ name: 'rename', label: 'Rename', - predicate: () => true, - }, - CONTEXT_MENU_SEPARATOR, // Add separator between rename and delete - { - name: 'delete', - label: 'Delete', - predicate: () => true, - dangerous: true, + }); + // No separator needed here if toggleSharingPolicy directly follows + } + + // Always show Copy URL + menuItemsResult.push({ + name: 'copyUrl', + label: 'Copy URL', + }); + + if (isOwner) { + // Add separator if rename was added, before toggle policy + const renameItemIndex = menuItemsResult.findIndex(item => item && typeof item !== 'string' && item.name === 'rename'); + const copyUrlItemIndex = menuItemsResult.findIndex(item => item && typeof item !== 'string' && item.name === 'copyUrl'); + + if (renameItemIndex !== -1 && copyUrlItemIndex !== -1 && copyUrlItemIndex > renameItemIndex) { + menuItemsResult.splice(copyUrlItemIndex, 0, CONTEXT_MENU_SEPARATOR); + } else if (renameItemIndex !== -1 && copyUrlItemIndex === -1) { + // If copyUrl is not there for some reason, but rename is, add separator after rename + menuItemsResult.push(CONTEXT_MENU_SEPARATOR); } - ]; + + menuItemsResult.push({ + name: 'toggleSharingPolicy', + label: () => isPadPublic ? 'Set Private' : 'Set Public', + }); + } + + // Separator before delete/leave + if (menuItemsResult.length > 0 && menuItemsResult[menuItemsResult.length -1] !== CONTEXT_MENU_SEPARATOR) { + menuItemsResult.push(CONTEXT_MENU_SEPARATOR); + } + + menuItemsResult.push({ + name: !isOwner ? 'leaveSharedPad' : 'deleteOwnedPad', // Dynamically set action name + label: () => (!isOwner ? 
'Leave shared pad' : 'Delete'), + dangerous: true, + }); + + const menuItems = menuItemsResult.filter(Boolean) as ContextMenuItems; + // Create a wrapper for onClose that handles the callback const handleClose = (callback?: () => void) => { - console.debug('[pad.ws] TabContextMenu handleClose called, has callback:', !!callback); - - // First call the original onClose onClose(); - - // Then execute the callback if provided if (callback) { callback(); } diff --git a/src/frontend/src/ui/Tabs.scss b/src/frontend/src/ui/Tabs.scss index fa12aa6..af0f70d 100644 --- a/src/frontend/src/ui/Tabs.scss +++ b/src/frontend/src/ui/Tabs.scss @@ -1,3 +1,16 @@ +// Add these at a suitable top-level scope in your SCSS file +@property --a { + syntax: ''; + initial-value: 0deg; + inherits: false; +} + +@keyframes a { + to { + --a: 1turn; + } +} + .tabs-bar { margin-inline-start: 0.6rem; height: var(--lg-button-size); @@ -5,38 +18,35 @@ Button { height: var(--lg-button-size) !important; - width: 100px !important; - min-width: 100px !important; + width: 120px !important; + min-width: 120px !important; margin-right: 0.6rem; - text-overflow: ellipsis; overflow: hidden; white-space: nowrap; - - &.active-pad { - background-color: #cc6d24 !important; - color: var(--color-on-primary) !important; - font-weight: bold; - border: 1px solid #cccccc !important; + position: relative; - .tab-position { - color: var(--color-on-primary) !important; - } - } - - &.creating-pad { - opacity: 0.6; - cursor: not-allowed; - } - .tab-content { position: relative; width: 100%; height: 100%; display: flex; - flex-direction: column; + flex-direction: row; + align-items: center; justify-content: center; - + .tab-title { + flex-grow: 1; + text-align: center; + overflow: hidden; + white-space: nowrap; + + /* Only apply fadeout effect when content is overflowing */ + &.tab-title-overflow { + mask-image: linear-gradient(to right, black 75%, transparent 95%); + -webkit-mask-image: linear-gradient(to right, black 75%, transparent 95%); + } + } + .tab-position { position: absolute; bottom: -7px; @@ -46,35 +56,146 @@ color: var(--keybinding-color); font-weight: normal; } + + .tab-users-icon.tab-position { + width: 16px; + height: 16px; + font-size: 16px; + bottom: -7px; + right: -4px; + opacity: 0.5; + z-index: 0; + } + } + + &.active-pad { + color: var(--color-on-primary); + font-weight: bold; + + .tab-position { + color: var(--color-on-primary); + } + } + + &.creating-pad { + opacity: 0.6; + cursor: not-allowed; + } + + + /* Styles for tabs with 'public' sharing policy */ + &.tab-sharing-public { + --b-width: 0px; + --tab-border-radius: var(--border-radius-lg); + --l-colors: #de9457, #f7b538, #ff7b00, #c37434, #c39427, #de9457; + + --tab-inner-bg: var(--button-bg, var(--island-bg-color)); + + --tab-glow-dilate-radius: 1; + --tab-glow-blur-std-deviation: 6; + --tab-glow-opacity-slope: 0.5; + + box-sizing: border-box; + border-width: var(--b-width); + border-color: transparent; + + background-color: var(--tab-inner-bg); + background-image: none; + background-clip: padding-box; + + &:hover:not(.active-pad) { + --b-width: 1px; + --tab-inner-bg: #2E2D39; + border-color: transparent; + + .tab-users-icon.tab-position { + opacity: 1; + } + + background-image: linear-gradient(var(--tab-inner-bg), var(--tab-inner-bg)), + repeating-conic-gradient(from var(--a, 0deg), var(--l-colors)); + background-clip: padding-box, + border-box; + filter: url(#glow-1); + animation: a 1s linear infinite; + } + + &.active-pad { + --tab-inner-bg: #cc6d24; + border: 1px solid 
#c6c6c6; + background-color: var(--tab-inner-bg); + background-image: none; + background-clip: padding-box; + + .tab-users-icon.tab-position { + opacity: 1; + } + + .tab-position { + color: var(--color-on-primary) !important; + } + + &:hover { + + + + background-image: linear-gradient(var(--tab-inner-bg), var(--tab-inner-bg)), + repeating-conic-gradient(from var(--a, 0deg), var(--l-colors)); + background-clip: padding-box, + border-box; + filter: url(#glow-1); + animation: a 5s linear infinite; + } + + + } + } + + /* Styles for tabs with 'whitelist' sharing policy */ + /* &.tab-sharing-whitelist { + /* TODO: Add styles for whitelisted tabs */ + /* } */ + + /* Styles for tabs with 'private' sharing policy (default) */ + &.tab-sharing-private { + + &.active-pad { + background-color: #cc6d24; + border: 1px solid #c6c6c6; + } + + &:hover { + border: 1px solid #c6c6c6; + } } } - + .tabs-wrapper { display: flex; flex-direction: row; align-items: center; position: relative; } - + .tabs-container { display: flex; flex-direction: row; align-items: center; position: relative; - + .loading-indicator { font-size: 0.8rem; color: var(--color-muted); margin-right: 0.5rem; } } - + .scroll-buttons-container { display: flex; flex-direction: row; align-items: center; } - + .scroll-button { height: var(--lg-button-size) !important; width: var(--lg-button-size) !important; // Square button @@ -96,23 +217,23 @@ &:hover:not(.disabled) { color: #ffffff; } - + &:active:not(.disabled) { - color: #ffffff; + color: #ffffff; } - + &.disabled { color: #575757; // Light gray color for the icons opacity: 1; cursor: default; } - + &.left { margin-right: 4px; // Add a small margin between left button and tabs } - + } - + .new-tab-button-container { Button { border: none !important; @@ -120,4 +241,4 @@ width: var(--lg-button-size) !important; } } -} +} \ No newline at end of file diff --git a/src/frontend/src/ui/Tabs.tsx b/src/frontend/src/ui/Tabs.tsx index 0997cfa..809b454 100644 --- a/src/frontend/src/ui/Tabs.tsx +++ b/src/frontend/src/ui/Tabs.tsx @@ -1,42 +1,99 @@ -import React, { useState, useEffect, useRef, useLayoutEffect } from "react"; +import React, { useState, useEffect, useRef, useLayoutEffect, useCallback } from "react"; import type { ExcalidrawImperativeAPI } from "@atyrode/excalidraw/types"; import { Stack, Button, Section, Tooltip } from "@atyrode/excalidraw"; -import { FilePlus2, ChevronLeft, ChevronRight } from "lucide-react"; -import { useAllPads, useSaveCanvas, useRenamePad, useDeletePad, PadData } from "../api/hooks"; -import { queryClient } from "../api/queryClient"; -import { capture } from "../utils/posthog"; -import { - getPadData, - storePadData, - setActivePad, - getStoredActivePad, - loadPadData, - saveCurrentPadBeforeSwitching, - createNewPad, - setScrollIndex, - getStoredScrollIndex -} from "../utils/canvasUtils"; +import { FilePlus2, ChevronLeft, ChevronRight, Users } from "lucide-react"; + +import { usePad } from "../hooks/usePadData"; +import { useAuthStatus } from "../hooks/useAuthStatus"; +import type { Tab } from "../hooks/usePadTabs"; +import { capture } from "../lib/posthog"; import TabContextMenu from "./TabContextMenu"; import "./Tabs.scss"; interface TabsProps { - excalidrawAPI: ExcalidrawImperativeAPI; + excalidrawAPI: ExcalidrawImperativeAPI; + tabs: Tab[]; + selectedTabId: string | null; + isLoading: boolean; + isCreatingPad: boolean; + createNewPadAsync: () => Promise; + renamePad: (args: { padId: string; newName: string }) => void; + deletePad: (padId: string) => void; + 
leaveSharedPad: (padId: string) => void; // Added prop + updateSharingPolicy: (args: { padId: string; policy: string }) => void; + selectTab: (tabId: string) => void; } +// Custom hook to check if text is overflowing its container +const useTextOverflow = () => { + const [overflowMap, setOverflowMap] = useState>({}); + + const checkOverflow = useCallback((element: HTMLElement | null, id: string) => { + if (element) { + const isOverflowing = element.scrollWidth > element.clientWidth; + setOverflowMap(prev => { + if (prev[id] !== isOverflowing) { + return { ...prev, [id]: isOverflowing }; + } + return prev; + }); + return isOverflowing; + } + return false; + }, []); + + return { overflowMap, checkOverflow }; +}; + const Tabs: React.FC = ({ - excalidrawAPI, -}: { - excalidrawAPI: ExcalidrawImperativeAPI; + excalidrawAPI, + tabs, + selectedTabId, + isLoading, + isCreatingPad, + createNewPadAsync, + renamePad, + deletePad, + leaveSharedPad, // Destructure new prop + updateSharingPolicy, + selectTab, }) => { - const { data: pads, isLoading } = useAllPads(); + const { user: currentUser } = useAuthStatus(); + const { isLoading: isPadLoading, error: padError } = usePad(selectedTabId, excalidrawAPI); + const [displayPadLoadingIndicator, setDisplayPadLoadingIndicator] = useState(false); + const { overflowMap, checkOverflow } = useTextOverflow(); + + // Create refs for tab titles + const titleRefs = useRef>({}); + const appState = excalidrawAPI.getAppState(); - const [isCreatingPad, setIsCreatingPad] = useState(false); - const [activePadId, setActivePadId] = useState(null); - const [startPadIndex, setStartPadIndex] = useState(getStoredScrollIndex()); - const PADS_PER_PAGE = 5; // Show 5 pads at a time + const [startPadIndex, setStartPadIndex] = useState(0); + const PADS_PER_PAGE = 5; - // Context menu state + // Check for overflow when tabs change or window resizes + useEffect(() => { + if (!tabs) return; + + // Check overflow for visible tabs + const visibleTabs = tabs.slice(startPadIndex, startPadIndex + PADS_PER_PAGE); + visibleTabs.forEach(tab => { + checkOverflow(titleRefs.current[tab.id], tab.id); + }); + + // Also check on window resize + const handleResize = () => { + visibleTabs.forEach(tab => { + checkOverflow(titleRefs.current[tab.id], tab.id); + }); + }; + + window.addEventListener('resize', handleResize); + return () => { + window.removeEventListener('resize', handleResize); + }; + }, [tabs, startPadIndex, checkOverflow]); + const [contextMenu, setContextMenu] = useState<{ visible: boolean; x: number; @@ -50,279 +107,209 @@ const Tabs: React.FC = ({ padId: '', padName: '' }); - - // Get the saveCanvas mutation - const { mutate: saveCanvas } = useSaveCanvas({ - onSuccess: () => { - console.debug("[pad.ws] Canvas saved to database successfully"); - }, - onError: (error) => { - console.error("[pad.ws] Failed to save canvas to database:", error); - } - }); - - // Get the renamePad mutation - const { mutate: renamePad } = useRenamePad({ - onSuccess: (data, variables) => { - console.debug("[pad.ws] Pad renamed successfully"); - - // Update the cache directly instead of refetching - const { padId, newName } = variables; - - // Get the current pads from the query cache - const currentPads = queryClient.getQueryData(['allPads']); - - if (currentPads) { - // Create a new array with the updated pad name - const updatedPads = currentPads.map(pad => - pad.id === padId - ? 
{ ...pad, display_name: newName } - : pad - ); - - // Update the query cache with the new data - queryClient.setQueryData(['allPads'], updatedPads); - } - }, - onError: (error) => { - console.error("[pad.ws] Failed to rename pad:", error); - } - }); - - // Get the deletePad mutation - const { mutate: deletePad } = useDeletePad({ - onSuccess: (data, padId) => { - console.debug("[pad.ws] Pad deleted successfully"); - - // Update the cache directly instead of refetching - // Get the current pads from the query cache - const currentPads = queryClient.getQueryData(['allPads']); - - if (currentPads) { - // Create a new array without the deleted pad - const updatedPads = currentPads.filter(pad => pad.id !== padId); - - // Update the query cache with the new data - queryClient.setQueryData(['allPads'], updatedPads); - - // Recompute the startPadIndex to avoid visual artifacts - // If deleting a pad would result in an empty space at the end of the tab bar - if (startPadIndex > 0 && startPadIndex + PADS_PER_PAGE > updatedPads.length) { - // Calculate the new index that ensures the tab bar is filled properly - // but never goes below 0 - const newIndex = Math.max(0, updatedPads.length - PADS_PER_PAGE); - setStartPadIndex(newIndex); - setScrollIndex(newIndex); - } - } - }, - onError: (error) => { - console.error("[pad.ws] Failed to delete pad:", error); - } - }); - const handlePadSelect = (pad: any) => { - // Save the current canvas before switching tabs - if (activePadId) { - saveCurrentPadBeforeSwitching(excalidrawAPI, activePadId, saveCanvas); - } - - // Set the new active pad ID - setActivePadId(pad.id); - // Store the active pad ID globally - setActivePad(pad.id); - - // Load the pad data - loadPadData(excalidrawAPI, pad.id, pad.data); + const handlePadSelect = (pad: Tab) => { + selectTab(pad.id); }; - // Listen for active pad change events - useEffect(() => { - const handleActivePadChange = (event: Event) => { - const customEvent = event as CustomEvent; - const newActivePadId = customEvent.detail; - console.debug(`[pad.ws] Received activePadChanged event with padId: ${newActivePadId}`); - setActivePadId(newActivePadId); - }; - - // Add event listener - window.addEventListener('activePadChanged', handleActivePadChange); - - // Clean up - return () => { - window.removeEventListener('activePadChanged', handleActivePadChange); - }; - }, []); - - // Set the active pad ID when the component mounts and when the pads data changes - useEffect(() => { - if (!isLoading && pads && pads.length > 0) { - // Check if there's a stored active pad ID - const storedActivePadId = getStoredActivePad(); - - if (!activePadId || !pads.some(pad => pad.id === activePadId)) { - // Find the pad that matches the stored ID, or use the first pad if no match - let padToActivate = pads[0]; - - if (storedActivePadId) { - // Try to find the pad with the stored ID - const matchingPad = pads.find(pad => pad.id === storedActivePadId); - if (matchingPad) { - console.debug(`[pad.ws] Found stored active pad: ${storedActivePadId}`); - padToActivate = matchingPad; - } else { - console.debug(`[pad.ws] Stored active pad ${storedActivePadId} not found in available pads`); - } - } - - // Set the active pad ID - setActivePadId(padToActivate.id); - // Store the active pad ID globally - setActivePad(padToActivate.id); - - // If the current canvas is empty, load the pad data - const currentElements = excalidrawAPI.getSceneElements(); - if (currentElements.length === 0) { - // Load the pad data using the imported function - 
loadPadData(excalidrawAPI, padToActivate.id, padToActivate.data); - } - } else if (storedActivePadId && storedActivePadId !== activePadId) { - // Update local state to match global state - setActivePadId(storedActivePadId); - } - - // Store all pads in local storage for the first time - pads.forEach(pad => { - // Only store if not already in local storage - if (!getPadData(pad.id)) { - storePadData(pad.id, pad.data); - } - }); - } - }, [pads, isLoading, activePadId, excalidrawAPI]); - const handleCreateNewPad = async () => { - if (isCreatingPad) return; // Prevent multiple clicks - + if (isCreatingPad) return; + try { - setIsCreatingPad(true); - - // Create a new pad using the imported function - const newPad = await createNewPad(excalidrawAPI, activePadId, saveCanvas); - - // Track pad creation event - capture("pad_created", { - padId: newPad.id, - padName: newPad.display_name - }); - - // Set the active pad ID in the component state - setActivePadId(newPad.id); - - // Get the current pads from the query cache - const currentPads = queryClient.getQueryData(['allPads']); - - if (currentPads) { - // Find the index of the newly created pad - const newPadIndex = currentPads.findIndex(pad => pad.id === newPad.id); - + const newPad = await createNewPadAsync(); + + if (newPad) { + capture("pad_created", { + padId: newPad.id, + padName: newPad.title + }); + + const newPadIndex = tabs.findIndex((tab: { id: any; }) => tab.id === newPad.id); if (newPadIndex !== -1) { - const newStartIndex = Math.max(0, Math.min(newPadIndex - PADS_PER_PAGE + 1, currentPads.length - PADS_PER_PAGE)); + const newStartIndex = Math.max(0, Math.min(newPadIndex - PADS_PER_PAGE + 1, tabs.length - PADS_PER_PAGE)); setStartPadIndex(newStartIndex); - setScrollIndex(newStartIndex); + } else { + if (tabs.length >= PADS_PER_PAGE) { + setStartPadIndex(Math.max(0, tabs.length + 1 - PADS_PER_PAGE)); + } } } } catch (error) { console.error('Error creating new pad:', error); - } finally { - setIsCreatingPad(false); } }; - // Navigation functions - move by 1 pad at a time + useEffect(() => { + if (tabs && startPadIndex > 0 && startPadIndex + PADS_PER_PAGE > tabs.length) { + const newIndex = Math.max(0, tabs.length - PADS_PER_PAGE); + setStartPadIndex(newIndex); + } + }, [tabs, startPadIndex, PADS_PER_PAGE]); + + useEffect(() => { + let timer: NodeJS.Timeout; + if (isPadLoading && selectedTabId) { + if (!displayPadLoadingIndicator) { + timer = setTimeout(() => { + if (isPadLoading) { + setDisplayPadLoadingIndicator(true); + } + }, 200); + } + } else { + setDisplayPadLoadingIndicator(false); + } + + return () => { + clearTimeout(timer); + }; + }, [isPadLoading, selectedTabId, displayPadLoadingIndicator]); + + const showPreviousPads = () => { const newIndex = Math.max(0, startPadIndex - 1); setStartPadIndex(newIndex); - setScrollIndex(newIndex); }; const showNextPads = () => { - if (pads) { - const newIndex = Math.min(startPadIndex + 1, Math.max(0, pads.length - PADS_PER_PAGE)); + if (tabs) { + const newIndex = Math.min(startPadIndex + 1, Math.max(0, tabs.length - PADS_PER_PAGE)); setStartPadIndex(newIndex); - setScrollIndex(newIndex); } }; - // Create a ref for the tabs wrapper to handle wheel events const tabsWrapperRef = useRef(null); - - // Track last wheel event time to throttle scrolling const lastWheelTimeRef = useRef(0); - const wheelThrottleMs = 70; // Minimum time between wheel actions in milliseconds - - // Set up wheel event listener with passive: false to properly prevent default behavior + const wheelThrottleMs = 70; + 
useLayoutEffect(() => { const handleWheel = (e: WheelEvent) => { - // Always prevent default to stop page navigation e.preventDefault(); e.stopPropagation(); - - // Throttle wheel events to prevent too rapid scrolling + const now = Date.now(); if (now - lastWheelTimeRef.current < wheelThrottleMs) { return; } - - // Update last wheel time lastWheelTimeRef.current = now; - - // Prioritize horizontal scrolling (deltaX) if present + if (Math.abs(e.deltaX) > Math.abs(e.deltaY)) { - // Horizontal scrolling - if (e.deltaX > 0 && pads && startPadIndex < pads.length - PADS_PER_PAGE) { + if (e.deltaX > 0 && tabs && startPadIndex < tabs.length - PADS_PER_PAGE) { showNextPads(); } else if (e.deltaX < 0 && startPadIndex > 0) { showPreviousPads(); } } else { - // Vertical scrolling - treat down as right, up as left (common convention) - if (e.deltaY > 0 && pads && startPadIndex < pads.length - PADS_PER_PAGE) { + if (e.deltaY > 0 && tabs && startPadIndex < tabs.length - PADS_PER_PAGE) { showNextPads(); } else if (e.deltaY < 0 && startPadIndex > 0) { showPreviousPads(); } } }; - - const tabsWrapper = tabsWrapperRef.current; - if (tabsWrapper) { - // Add wheel event listener with passive: false option - tabsWrapper.addEventListener('wheel', handleWheel, { passive: false }); - - // Clean up the event listener when component unmounts + + const localTabsWrapperRef = tabsWrapperRef.current; + if (localTabsWrapperRef) { + localTabsWrapperRef.addEventListener('wheel', handleWheel, { passive: false }); return () => { - tabsWrapper.removeEventListener('wheel', handleWheel); + localTabsWrapperRef.removeEventListener('wheel', handleWheel); }; } - }, [pads, startPadIndex, PADS_PER_PAGE]); // Dependencies needed for boundary checks + }, [tabs, startPadIndex, PADS_PER_PAGE, showNextPads, showPreviousPads]); + + useEffect(() => { + // Update SVG filter attributes based on CSS variables + const publicTabElement = document.querySelector('.tab-sharing-public'); + if (publicTabElement) { + const computedStyle = getComputedStyle(publicTabElement); + + const dilateRadius = computedStyle.getPropertyValue('--tab-glow-dilate-radius').trim(); + const blurStdDeviation = computedStyle.getPropertyValue('--tab-glow-blur-std-deviation').trim(); + const opacitySlope = computedStyle.getPropertyValue('--tab-glow-opacity-slope').trim(); + + const filterGlow1 = document.getElementById('glow-1'); + if (filterGlow1) { + const feMorphology = filterGlow1.querySelector('feMorphology'); + if (feMorphology) { + feMorphology.setAttribute('radius', dilateRadius); + } + + const feGaussianBlur = filterGlow1.querySelector('feGaussianBlur[result="glow"]'); + if (feGaussianBlur) { + feGaussianBlur.setAttribute('stdDeviation', blurStdDeviation); + } + + const feComponentTransfers = filterGlow1.querySelectorAll('feComponentTransfer'); + let targetFeFuncA: SVGFEFuncAElement | null = null; + feComponentTransfers.forEach(feComp => { + if (feComp.getAttribute('in') !== 'SourceGraphic') { + const feFunc = feComp.querySelector('feFuncA[type="linear"]'); + if (feFunc) { + targetFeFuncA = feFunc as SVGFEFuncAElement; + } + } + }); + + if (targetFeFuncA) { + targetFeFuncA.setAttribute('slope', opacitySlope); + } + } + } + // Rerun if tabs change, as a public tab might become visible/active + }, [tabs, selectedTabId]); + return ( <> +
{!appState.viewModeEnabled && ( <> -
- {/* New pad button */}
{} : handleCreateNewPad} + onSelect={isCreatingPad ? () => { } : handleCreateNewPad} className={isCreatingPad ? "creating-pad" : ""} children={
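The hunk above also wires a delayed loading indicator: `displayPadLoadingIndicator` only turns on if `isPadLoading` has stayed true for roughly 200 ms, so quick pad switches never flash a spinner. The diff keeps that timer inline in a `useEffect`; the snippet below is a minimal standalone sketch of the same idea, extracted into a hypothetical `useDelayedFlag` hook (the hook name and the 200 ms default are illustrative, not part of the change).

```ts
// Sketch of the 200 ms "delayed spinner" technique used in the hunk above,
// pulled out into a hypothetical reusable hook (name and default are illustrative).
import { useEffect, useState } from "react";

export function useDelayedFlag(flag: boolean, delayMs = 200): boolean {
  const [delayed, setDelayed] = useState(false);

  useEffect(() => {
    if (!flag) {
      setDelayed(false); // hide immediately once loading finishes
      return;
    }
    // Only surface the flag if it is still set after the delay,
    // so fast pad switches never flash the indicator.
    const timer = setTimeout(() => setDelayed(true), delayMs);
    return () => clearTimeout(timer);
  }, [flag, delayMs]);

  return delayed;
}

// Usage, mirroring the names in the diff:
//   const displayPadLoadingIndicator = useDelayedFlag(isPadLoading && !!selectedTabId);
```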
@@ -332,77 +319,90 @@ const Tabs: React.FC = ({ /> } />
- +
- {/* Loading indicator */} - {isLoading && ( -
- Loading pads... -
- )} - - {/* List visible pads (5 at a time) */} - {!isLoading && pads && pads.slice(startPadIndex, startPadIndex + PADS_PER_PAGE).map((pad) => ( -
{ - e.preventDefault(); - setContextMenu({ - visible: true, - x: e.clientX, - y: e.clientY, - padId: pad.id, - padName: pad.display_name - }); - }} - > - {/* Show tooltip for all active tabs or truncated names */} - {(activePadId === pad.id || pad.display_name.length > 11) ? ( - 11 - ? `${pad.display_name} (current pad)` - : "Current pad") - : pad.display_name - } - children={ + {isLoading && !isPadLoading && ( +
+ Loading pads... +
+ )} + + {!isLoading && tabs && tabs.slice(startPadIndex, startPadIndex + PADS_PER_PAGE).map((tab: Tab, index: any) => ( +
void; clientX: any; clientY: any; }) => { + e.preventDefault(); + setContextMenu({ + visible: true, + x: e.clientX, + y: e.clientY, + padId: tab.id, + padName: tab.title + }); + }} + > + {(selectedTabId === tab.id || tab.title.length > 8) ? ( + 8 + ? `${tab.title} (current pad)` + : "Current pad") + : tab.title + } + children={ +
- ))} - + )} +
+ ))} +
- - {/* Left scroll button - only visible when there are more pads than can fit in the view */} - {pads && pads.length > PADS_PER_PAGE && ( + + {tabs && tabs.length > PADS_PER_PAGE && ( - 0 ? `\n(${startPadIndex} more)` : ''}`} + 0 ? `\n(${startPadIndex} more)` : ''}`} children={ - - } + } /> )} - - {/* Right scroll button - only visible when there are more pads than can fit in the view */} - {pads && pads.length > PADS_PER_PAGE && ( + + {tabs && tabs.length > PADS_PER_PAGE && ( - 0 ? `\n(${Math.max(0, pads.length - (startPadIndex + PADS_PER_PAGE))} more)` : ''}`} + 0 ? `\n(${Math.max(0, tabs.length - (startPadIndex + PADS_PER_PAGE))} more)` : ''}`} children={ - - } + } /> )} @@ -439,54 +438,47 @@ const Tabs: React.FC = ({
- - {/* Context Menu */} + {contextMenu.visible && ( { - // Track pad rename event - capture("pad_renamed", { - padId, - newName - }); - - // Call the renamePad mutation + sharingPolicy={tabs.find(tab => tab.id === contextMenu.padId)?.sharingPolicy} + currentUserId={currentUser?.id} + tabOwnerId={tabs.find(tab => tab.id === contextMenu.padId)?.ownerId} + onRename={(padId: any, newName: any) => { + capture("pad_renamed", { padId, newName }); renamePad({ padId, newName }); }} - onDelete={(padId) => { - // Don't allow deleting the last pad - if (pads && pads.length <= 1) { + onDelete={(padId: any) => { // This is for 'deleteOwnedPad' + if (tabs && tabs.length <= 1) { alert("Cannot delete the last pad"); return; } - - // Find the pad name before deletion for analytics - const padToDelete = pads?.find(p => p.id === padId); - const padName = padToDelete?.display_name || ""; - - // Track pad deletion event - capture("pad_deleted", { - padId, - padName - }); - - // If deleting the active pad, switch to another pad first - if (padId === activePadId && pads) { - const otherPad = pads.find(p => p.id !== padId); - if (otherPad) { - handlePadSelect(otherPad); + + const tabToDelete = tabs?.find((t: { id: any; }) => t.id === padId); + const padName = tabToDelete?.title || ""; + capture("pad_deleted", { padId, padName }); + + if (padId === selectedTabId && tabs) { + const otherTab = tabs.find((t: { id: any; }) => t.id !== padId); + if (otherTab) { + selectTab(otherTab.id); } } - - // Call the deletePad mutation - deletePad(padId); + deletePad(padId); // Calls the prop for deleting owned pad + }} + onLeaveSharedPad={(padId: string) => { // New prop for 'leaveSharedPad' + leaveSharedPad(padId); // Calls the prop for leaving shared pad + }} + onUpdateSharingPolicy={(padId: string, policy: string) => { + capture("pad_sharing_policy_updated", { padId, policy }); + updateSharingPolicy({ padId, policy }); }} onClose={() => { - setContextMenu(prev => ({ ...prev, visible: false })); + setContextMenu((prev: any) => ({ ...prev, visible: false })); }} /> )} diff --git a/src/frontend/src/utils/canvasUtils.ts b/src/frontend/src/utils/canvasUtils.ts deleted file mode 100644 index ca9a68b..0000000 --- a/src/frontend/src/utils/canvasUtils.ts +++ /dev/null @@ -1,332 +0,0 @@ -import { DEFAULT_SETTINGS } from '../types/settings'; -import type { ExcalidrawImperativeAPI } from "@atyrode/excalidraw/types"; -import type { NonDeletedExcalidrawElement } from "@atyrode/excalidraw/element/types"; -import type { AppState } from "@atyrode/excalidraw/types"; -import { CanvasData, PadData } from '../api/hooks'; -import { fetchApi } from '../api/apiUtils'; -import { queryClient } from '../api/queryClient'; - -/** - * - * @param data The canvas data to normalize - * @returns Normalized canvas data - */ -export function normalizeCanvasData(data: any) { - if (!data) return data; - - const appState = { ...data.appState }; - - // Remove width and height properties - if ("width" in appState) { - delete appState.width; - } - if ("height" in appState) { - delete appState.height; - } - - // Preserve existing pad settings if they exist, otherwise create new ones - const existingPad = appState.pad || {}; - const existingUserSettings = existingPad.userSettings || {}; - - // Merge existing pad properties with our updates - appState.pad = { - ...existingPad, // Preserve all existing properties (uniqueId, displayName, etc.) 
- moduleBorderOffset: { - left: 10, - right: 10, - top: 40, - bottom: 10, - }, - // Merge existing user settings with default settings - userSettings: { - ...DEFAULT_SETTINGS, - ...existingUserSettings - } - }; - - // Reset collaborators (https://github.com/excalidraw/excalidraw/issues/8637) - appState.collaborators = new Map(); - - return { ...data, appState }; -} - -// Local storage keys -export const LOCAL_STORAGE_PADS_KEY = 'pad_ws_pads'; -export const LOCAL_STORAGE_ACTIVE_PAD_KEY = 'pad_ws_active_pad'; -export const LOCAL_STORAGE_SCROLL_INDEX_KEY = 'pad_ws_scroll_index'; - -/** - * Stores pad data in local storage - * @param padId The ID of the pad to store - * @param data The pad data to store - */ -export function storePadData(padId: string, data: any): void { - try { - // Get existing pads data from local storage - const storedPadsString = localStorage.getItem(LOCAL_STORAGE_PADS_KEY); - const storedPads = storedPadsString ? JSON.parse(storedPadsString) : {}; - - // Update the pad data - storedPads[padId] = data; - - // Save back to local storage - localStorage.setItem(LOCAL_STORAGE_PADS_KEY, JSON.stringify(storedPads)); - - console.debug(`[pad.ws] Stored pad ${padId} data in local storage`); - } catch (error) { - console.error('[pad.ws] Error storing pad data in local storage:', error); - } -} - -/** - * Gets pad data from local storage - * @param padId The ID of the pad to retrieve - * @returns The pad data or null if not found - */ -export function getPadData(padId: string): any | null { - try { - // Get pads data from local storage - const storedPadsString = localStorage.getItem(LOCAL_STORAGE_PADS_KEY); - if (!storedPadsString) return null; - - const storedPads = JSON.parse(storedPadsString); - - // Return the pad data if it exists - return storedPads[padId] || null; - } catch (error) { - console.error('[pad.ws] Error getting pad data from local storage:', error); - return null; - } -} - -/** - * Sets the active pad ID globally and stores it in local storage - * @param padId The ID of the pad to set as active - */ -export function setActivePad(padId: string): void { - (window as any).activePadId = padId; - - // Store the active pad ID in local storage - try { - localStorage.setItem(LOCAL_STORAGE_ACTIVE_PAD_KEY, padId); - console.debug(`[pad.ws] Stored active pad ID ${padId} in local storage`); - } catch (error) { - console.error('[pad.ws] Error storing active pad ID in local storage:', error); - } - - // Dispatch a custom event to notify components of the active pad change - const event = new CustomEvent('activePadChanged', { detail: padId }); - window.dispatchEvent(event); - - console.debug(`[pad.ws] Set active pad to ${padId}`); -} - -/** - * Gets the current active pad ID from the global variable - * @returns The active pad ID or null if not set - */ -export function getActivePad(): string | null { - return (window as any).activePadId || null; -} - -/** - * Gets the stored active pad ID from local storage - * @returns The stored active pad ID or null if not found - */ -export function getStoredActivePad(): string | null { - try { - const storedActivePadId = localStorage.getItem(LOCAL_STORAGE_ACTIVE_PAD_KEY); - return storedActivePadId; - } catch (error) { - console.error('[pad.ws] Error getting active pad ID from local storage:', error); - return null; - } -} - -/** - * Sets the scroll index in local storage - * @param index The scroll index to store - */ -export function setScrollIndex(index: number): void { - try { - localStorage.setItem(LOCAL_STORAGE_SCROLL_INDEX_KEY, 
index.toString()); - console.debug(`[pad.ws] Stored scroll index ${index} in local storage`); - } catch (error) { - console.error('[pad.ws] Error storing scroll index in local storage:', error); - } -} - -/** - * Gets the stored scroll index from local storage - * @returns The stored scroll index or 0 if not found - */ -export function getStoredScrollIndex(): number { - try { - const storedScrollIndex = localStorage.getItem(LOCAL_STORAGE_SCROLL_INDEX_KEY); - return storedScrollIndex ? parseInt(storedScrollIndex, 10) : 0; - } catch (error) { - console.error('[pad.ws] Error getting scroll index from local storage:', error); - return 0; - } -} - -/** - * Saves the current pad data before switching to another pad - * @param excalidrawAPI The Excalidraw API instance - * @param activePadId The current active pad ID - * @param saveCanvas The saveCanvas mutation function - */ -export function saveCurrentPadBeforeSwitching( - excalidrawAPI: ExcalidrawImperativeAPI, - activePadId: string | null, - saveCanvas: (data: CanvasData) => void -): void { - if (!activePadId) return; - - // Get the current elements, state, and files - const elements = excalidrawAPI.getSceneElements(); - const appState = excalidrawAPI.getAppState(); - const files = excalidrawAPI.getFiles(); - - // Create the canvas data object - const canvasData = { - elements: [...elements] as any[], // Convert readonly array to mutable array - appState, - files - }; - - // Save the canvas data to local storage - storePadData(activePadId, canvasData); - - // Save the canvas data to the server - saveCanvas(canvasData); - - console.debug("[pad.ws] Saved canvas before switching"); -} - -/** - * Loads pad data into the Excalidraw canvas - * @param excalidrawAPI The Excalidraw API instance - * @param padId The ID of the pad to load - * @param serverData The server data to use as fallback - */ -export function loadPadData( - excalidrawAPI: ExcalidrawImperativeAPI, - padId: string, - serverData: any -): void { - // Try to get the pad data from local storage first - const localPadData = getPadData(padId); - - if (localPadData) { - // Use the local data if available - console.debug(`[pad.ws] Loading pad ${padId} data from local storage`); - excalidrawAPI.updateScene(normalizeCanvasData(localPadData)); - } else if (serverData) { - // Fall back to the server data - console.debug(`[pad.ws] No local data found for pad ${padId}, using server data`); - excalidrawAPI.updateScene(normalizeCanvasData(serverData)); - } -} - -/** - * Creates a new pad from the default template - * @param excalidrawAPI The Excalidraw API instance - * @param activePadId The current active pad ID - * @param saveCanvas The saveCanvas mutation function - * @returns Promise resolving to the new pad data - */ -export async function createNewPad( - excalidrawAPI: ExcalidrawImperativeAPI, - activePadId: string | null, - saveCanvas: (data: CanvasData) => void -): Promise { - // Save the current canvas before creating a new pad - if (activePadId) { - saveCurrentPadBeforeSwitching(excalidrawAPI, activePadId, saveCanvas); - } - - // Create a new pad from the default template - const newPad = await fetchApi('/api/pad/from-template/default', { - method: 'POST', - body: JSON.stringify({ - display_name: `New Pad ${new Date().toLocaleTimeString()}`, - }), - }); - - // Manually update the pads list instead of refetching - // Get the current pads from the query cache - const currentPads = queryClient.getQueryData(['allPads']) || []; - - // Add the new pad to the list - 
queryClient.setQueryData(['allPads'], [...currentPads, newPad]); - - // Store the new pad data in local storage - storePadData(newPad.id, newPad.data); - - // Update the canvas with the new pad's data - // Normalize the data before updating the scene - excalidrawAPI.updateScene(normalizeCanvasData(newPad.data)); - console.debug("[pad.ws] Loaded new pad data"); - - // Set the active pad ID globally - setActivePad(newPad.id); - - return newPad; -} - -/** - * Saves the current canvas state using the Excalidraw API - * @param saveCanvas The saveCanvas mutation function from useSaveCanvas hook - * @param onSuccess Optional callback to run after successful save - * @param onError Optional callback to run if save fails - */ -export function saveCurrentCanvas( - saveCanvas: (data: CanvasData) => void, - onSuccess?: () => void, - onError?: (error: any) => void -) { - try { - // Get the excalidrawAPI from the window object - const excalidrawAPI = (window as any).excalidrawAPI as ExcalidrawImperativeAPI | null; - - if (excalidrawAPI) { - // Get the current elements, state, and files - const elements = excalidrawAPI.getSceneElements(); - const appState = excalidrawAPI.getAppState(); - const files = excalidrawAPI.getFiles(); - - // Save the canvas data - saveCanvas({ - elements: [...elements] as any[], // Convert readonly array to mutable array - appState, - files - }); - - // Call onSuccess callback if provided - if (onSuccess) { - onSuccess(); - } - - return true; - } else { - console.warn("[pad.ws] ExcalidrawAPI not available"); - - // Call onError callback if provided - if (onError) { - onError(new Error("ExcalidrawAPI not available")); - } - - return false; - } - } catch (error) { - console.error("[pad.ws] Error saving canvas:", error); - - // Call onError callback if provided - if (onError) { - onError(error); - } - - return false; - } -} diff --git a/src/frontend/src/utils/posthog.ts b/src/frontend/src/utils/posthog.ts deleted file mode 100644 index 212eb14..0000000 --- a/src/frontend/src/utils/posthog.ts +++ /dev/null @@ -1,25 +0,0 @@ -import posthog from 'posthog-js'; -import { getAppConfig } from '../api/configService'; - -// Initialize PostHog with empty values first -posthog.init('', { api_host: '' }); - -// Then update with real values when config is loaded -getAppConfig().then(config => { - if (config.posthogKey) { - posthog.init(config.posthogKey, { - api_host: config.posthogHost, - }); - console.debug('[pad.ws] PostHog initialized successfully'); - } else { - console.warn('[pad.ws] PostHog API key not found. 
Analytics will not be tracked.'); - } -}); - -// Helper function to track custom events -export const capture = (eventName: string, properties?: Record) => { - posthog.capture(eventName, properties); -}; - -// Export PostHog instance for direct use -export default posthog; diff --git a/src/frontend/vite.config.mts b/src/frontend/vite.config.mts index 856bb0d..e1f10be 100644 --- a/src/frontend/vite.config.mts +++ b/src/frontend/vite.config.mts @@ -1,32 +1,4 @@ -import { defineConfig, loadEnv, Plugin } from "vite"; -import fs from "fs"; -import path from "path"; - -// Create a plugin to generate build-info.json during build -const generateBuildInfoPlugin = (): Plugin => ({ - name: 'generate-build-info', - closeBundle() { - // Generate a unique build hash (timestamp + random string) - const buildInfo = { - buildHash: Date.now().toString(36) + Math.random().toString(36).substring(2), - timestamp: Date.now() - }; - - // Ensure the dist directory exists - const distDir = path.resolve(__dirname, 'dist'); - if (!fs.existsSync(distDir)) { - fs.mkdirSync(distDir, { recursive: true }); - } - - // Write to the output directory - fs.writeFileSync( - path.resolve(distDir, 'build-info.json'), - JSON.stringify(buildInfo, null, 2) - ); - - console.debug('[pad.ws] Generated build-info.json with hash:', buildInfo.buildHash); - } -}); +import { defineConfig, loadEnv } from "vite"; // https://vitejs.dev/config/ export default defineConfig(({ mode }) => { @@ -47,14 +19,7 @@ export default defineConfig(({ mode }) => { }, }, }, - define: { - // Make non-prefixed CODER_URL available to import.meta.env - 'import.meta.env.CODER_URL': JSON.stringify(env.CODER_URL), - }, publicDir: "public", - plugins: [ - generateBuildInfoPlugin(), - ], optimizeDeps: { esbuildOptions: { // Bumping to 2022 due to "Arbitrary module namespace identifier names" not being diff --git a/src/frontend/yarn.lock b/src/frontend/yarn.lock index 73f29bc..c00abf4 100644 --- a/src/frontend/yarn.lock +++ b/src/frontend/yarn.lock @@ -2,10 +2,10 @@ # yarn lockfile v1 -"@atyrode/excalidraw@^0.18.0-9": - version "0.18.0-9" - resolved "https://registry.yarnpkg.com/@atyrode/excalidraw/-/excalidraw-0.18.0-9.tgz#6a69b5d0b44b902c10ba9b27e46803231a628d95" - integrity sha512-Wej+UFAemSTHrLTcOOYCYxnZ4DTsYgRmE+NKMLtCOXLAI69nCQ7eyIMeKdI01rfNetjzT7OOZLHi/AIhx6GYGg== +"@atyrode/excalidraw@^0.18.0-15": + version "0.18.0-15" + resolved "https://registry.yarnpkg.com/@atyrode/excalidraw/-/excalidraw-0.18.0-15.tgz#c330633ff8b60473aa30668b4b0c3196f6ad90f3" + integrity sha512-Fn1+oQHgPv1O9wSa6x6DO1zuAQzY59IyhUwgTEOsIzcbwYjpGaztW6XksfmQxvX79SkQrs21YwUVpCHkhgIh7w== dependencies: "@braintree/sanitize-url" "6.0.2" "@excalidraw/laser-pointer" "1.3.1" @@ -542,17 +542,17 @@ resolved "https://registry.yarnpkg.com/@tanstack/query-core/-/query-core-5.74.9.tgz#35d5b1075663072bea22aa3ce21508b195306ecd" integrity sha512-qmjXpWyigDw4SfqdSBy24FzRvpBPXlaSbl92N77lcrL+yvVQLQkf0T6bQNbTxl9IEB/SvVFhhVZoIlQvFnNuuw== -"@tanstack/query-devtools@5.74.7": - version "5.74.7" - resolved "https://registry.yarnpkg.com/@tanstack/query-devtools/-/query-devtools-5.74.7.tgz#c9b022b386ac86e6395228b5d6912e6444b3b971" - integrity sha512-nSNlfuGdnHf4yB0S+BoNYOE1o3oAH093weAYZolIHfS2stulyA/gWfSk/9H4ZFk5mAAHb5vNqAeJOmbdcGPEQw== +"@tanstack/query-devtools@5.76.0": + version "5.76.0" + resolved "https://registry.yarnpkg.com/@tanstack/query-devtools/-/query-devtools-5.76.0.tgz#ba43754ed8d23a265ed72f17de618fa9f9c7649d" + integrity 
sha512-1p92nqOBPYVqVDU0Ua5nzHenC6EGZNrLnB2OZphYw8CNA1exuvI97FVgIKON7Uug3uQqvH/QY8suUKpQo8qHNQ== "@tanstack/react-query-devtools@^5.74.3": - version "5.74.11" - resolved "https://registry.yarnpkg.com/@tanstack/react-query-devtools/-/react-query-devtools-5.74.11.tgz#81c078d4f202c51065de1735415360b80f2e1e12" - integrity sha512-vx8MzH4WUUk4ZW8uHq7T45XNDgePF5ecRoa7haWJZxDMQyAHM80GGMhEW/yRz6TeyS9UlfTUz2OLPvgGRvvVOA== + version "5.76.1" + resolved "https://registry.yarnpkg.com/@tanstack/react-query-devtools/-/react-query-devtools-5.76.1.tgz#20157a5880df5fd4d4fe8fd4fca2c8663d8dfa3e" + integrity sha512-LFVWgk/VtXPkerNLfYIeuGHh0Aim/k9PFGA+JxLdRaUiroQ4j4eoEqBrUpQ1Pd/KXoG4AB9vVE/M6PUQ9vwxBQ== dependencies: - "@tanstack/query-devtools" "5.74.7" + "@tanstack/query-devtools" "5.76.0" "@tanstack/react-query@^5.74.3": version "5.74.11" @@ -1264,7 +1264,12 @@ lodash.debounce@4.0.8: resolved "https://registry.yarnpkg.com/lodash.debounce/-/lodash.debounce-4.0.8.tgz#82d79bff30a67c4005ffd5e2515300ad9ca4d7af" integrity sha512-FT1yDzDYEoYWhnSGnpE/4Kj1fLZkDFyqRb7fNt6FdYOSxlUWAtp42Eh6Wb0rGIv/m9Bgo7x4GhQbm5Ys4SG5ow== -lodash.throttle@4.1.1: +lodash.isequal@^4.5.0: + version "4.5.0" + resolved "https://registry.yarnpkg.com/lodash.isequal/-/lodash.isequal-4.5.0.tgz#415c4478f2bcc30120c22ce10ed3226f7d3e18e0" + integrity sha512-pDo3lu8Jhfjqls6GkMgpahsF9kCyayhgykjyLMNFTKWrpVdAQtYyB4muAMWozBB4ig/dtWAmsMxLEI8wuz+DYQ== + +lodash.throttle@4.1.1, lodash.throttle@^4.1.1: version "4.1.1" resolved "https://registry.yarnpkg.com/lodash.throttle/-/lodash.throttle-4.1.1.tgz#c23e91b710242ac70c37f1e1cda9274cc39bf2f4" integrity sha512-wIkUCfVKpVsWo3JSZlc+8MB5it+2AN5W8J7YVMST30UrvcQNZ1Okbj+rbVniijTWE6FGYy4XJq/rHkas8qJMLQ== @@ -1714,6 +1719,11 @@ react-style-singleton@^2.2.2, react-style-singleton@^2.2.3: get-nonce "^1.0.0" tslib "^2.0.0" +react-use-websocket@^4.13.0: + version "4.13.0" + resolved "https://registry.yarnpkg.com/react-use-websocket/-/react-use-websocket-4.13.0.tgz#9db1dbac6dc8ba2fdc02a5bba06205fbf6406736" + integrity sha512-anMuVoV//g2N76Wxqvqjjo1X48r9Np3y1/gMl7arX84tAPXdy5R7sB5lO5hvCzQRYjqXwV8XMAiEBOUbyrZFrw== + react@19.0.0: version "19.0.0" resolved "https://registry.yarnpkg.com/react/-/react-19.0.0.tgz#6e1969251b9f108870aa4bff37a0ce9ddfaaabdd" @@ -1942,6 +1952,11 @@ which@^2.0.1: dependencies: isexe "^2.0.0" +zod@^3.24.4: + version "3.24.4" + resolved "https://registry.yarnpkg.com/zod/-/zod-3.24.4.tgz#e2e2cca5faaa012d76e527d0d36622e0a90c315f" + integrity sha512-OdqJE9UDRPwWsrHjLN2F8bPxvwJBK22EHLWtanu0LSYr5YqzsaaW3RMgmjwr8Rypg5k+meEJdSPXJZXE/yqOMg== + zustand@^4.3.2: version "4.5.6" resolved "https://registry.yarnpkg.com/zustand/-/zustand-4.5.6.tgz#6857d52af44874a79fb3408c9473f78367255c96"
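The lockfile additions (`react-use-websocket`, `zod`, `lodash.isequal`, `lodash.throttle`) line up with the new shared-pad plumbing in `Tabs.tsx` (`sharingPolicy`, `leaveSharedPad`, the `usePad` hook). Purely as a hedged illustration of how the two new libraries are typically combined — the endpoint path, message shape, and hook name below are assumptions, not something this diff defines — a pad-update subscription might look like:

```ts
// Hypothetical sketch: the /ws/pad endpoint, message shape, and hook name are
// assumptions for illustration; only the libraries themselves come from this diff.
import useWebSocket from "react-use-websocket";
import { z } from "zod";

// Runtime schema for incoming messages; zod rejects anything that doesn't match.
const PadUpdateSchema = z.object({
  padId: z.string(),
  sharingPolicy: z.enum(["private", "whitelist", "public"]),
  updatedAt: z.string(),
});
export type PadUpdate = z.infer<typeof PadUpdateSchema>;

export function usePadUpdates(padId: string, onUpdate: (update: PadUpdate) => void) {
  // Derive a ws:// or wss:// URL from the current origin (assumed endpoint).
  const socketUrl = `${window.location.origin.replace(/^http/, "ws")}/ws/pad/${padId}`;

  return useWebSocket(socketUrl, {
    shouldReconnect: () => true,
    onMessage: (event) => {
      let payload: unknown;
      try {
        payload = JSON.parse(event.data);
      } catch {
        return; // ignore non-JSON frames
      }
      const parsed = PadUpdateSchema.safeParse(payload);
      if (parsed.success) {
        onUpdate(parsed.data);
      }
    },
  });
}
```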