# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Development Commands

**Dependencies & Setup:**
- `uv sync` - Install dependencies using uv (recommended)
- `cp .env.example .env` - Set up environment variables, then edit `.env` with your API keys

**Running the Application:**
- `streamlit run src/streamlit_app.py` - Start the Streamlit web application (available at http://localhost:8501)
- `uv run streamlit run src/streamlit_app.py` - Alternative using uv

**Testing:**
- `pytest -q` - Run tests quietly
- `pytest src/` - Run tests for a specific directory
- `pytest -v` - Run tests with verbose output

**Development:**
- `uv add package-name` - Add new dependencies
- `uv run python -m streamlit run src/streamlit_app.py` - Alternative run command

## Architecture Overview

This is a Streamlit-based AI resume and cover letter generator that analyzes job descriptions against the user's background to create tailored documents.

**Core Architecture:**
- **Frontend**: Streamlit app (`src/streamlit_app.py`) with sidebar inputs and a main preview area
- **Services Layer**: Business logic split into focused services
  - `analyse_service.py` - Job description (JD) analysis and user-feedback refinement
  - `generation_service.py` - Resume/cover letter generation
  - `pdf_service.py` - PDF export functionality
  - `llm_service.py` - LLM interaction wrapper
- **LLM Layer**: LiteLLM integration for multi-provider support
  - `litellm_client.py` - API client wrapper
  - `prompt_templates.py` - Template management

**Data Flow:**
1. User inputs JD + background → Analysis service → LLM analysis
2. Analysis summary → Generation service → Resume (Markdown) + Cover letter (text)
3. Generated content → PDF service → Exportable PDFs

**LLM Integration:**
- Uses LiteLLM for multi-provider support (OpenAI, Azure, Gemini)
- Responses are parsed as JSON for structured analysis data (sketch at the end of this file)
- Templates are centralized in `llm/prompt_templates.py`

**PDF Generation:**
- WeasyPrint for Markdown → PDF conversion
- Exports to timestamped files in the `exports/` directory (sketch at the end of this file)

**Key Dependencies:**
- `streamlit` - Web UI framework
- `litellm` - Multi-provider LLM client
- `weasyprint` - PDF generation
- `pydantic` - Data validation

**Project Structure:**
```
src/
├── streamlit_app.py            # Main application entry point
├── services/                   # Business logic layer
│   ├── analyse_service.py      # Job description analysis
│   ├── generation_service.py   # Document generation
│   ├── llm_service.py          # LLM interaction wrapper
│   └── pdf_service.py          # PDF export functionality
└── llm/                        # LLM integration
    ├── litellm_client.py       # Multi-provider API client
    └── prompt_templates.py     # Centralized prompt management
```

**Important Notes:**
- The application uses Streamlit session state for workflow persistence (sketch at the end of this file)
- Environment variables must be configured in `.env` before running
- Supports OpenAI, Azure OpenAI, and Google Gemini via LiteLLM
- PDF exports are saved to the `exports/` directory with timestamps
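
**Illustrative Sketch: LiteLLM Analysis Call:**

A minimal sketch of the LiteLLM call-plus-JSON-parsing pattern described under **LLM Integration**. The function name, the `LLM_MODEL` variable, and the JSON keys are assumptions for illustration; the actual implementation lives in `src/llm/litellm_client.py` and `src/services/analyse_service.py`.

```python
# Hypothetical sketch -- not the repository's actual code.
import json
import os

import litellm


def analyse_job_description(job_description: str, background: str) -> dict:
    """Ask the configured LLM to compare a JD against the user's background."""
    response = litellm.completion(
        # Model name is provider-prefixed in LiteLLM; env var name is an assumption
        model=os.getenv("LLM_MODEL", "gpt-4o-mini"),
        messages=[
            {
                "role": "system",
                "content": "Return a JSON object with keys "
                           "'key_requirements', 'matching_skills', 'gaps'.",
            },
            {
                "role": "user",
                "content": f"Job description:\n{job_description}\n\n"
                           f"Candidate background:\n{background}",
            },
        ],
        temperature=0.2,
    )
    # LiteLLM mirrors the OpenAI response shape: choices[0].message.content
    return json.loads(response.choices[0].message.content)
```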
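
**Illustrative Sketch: WeasyPrint Export:**

A minimal sketch of the Markdown → PDF export described under **PDF Generation**, assuming the `markdown` package for HTML conversion (the repository may use a different converter). The function name and prefix are hypothetical; the real logic lives in `src/services/pdf_service.py`.

```python
# Hypothetical sketch -- not the repository's actual code.
from datetime import datetime
from pathlib import Path

import markdown
from weasyprint import HTML


def export_pdf(markdown_text: str, prefix: str = "resume") -> Path:
    """Render Markdown to a timestamped PDF under exports/."""
    html_body = markdown.markdown(markdown_text)
    out_dir = Path("exports")
    out_dir.mkdir(exist_ok=True)
    out_path = out_dir / f"{prefix}_{datetime.now():%Y%m%d_%H%M%S}.pdf"
    HTML(string=f"<html><body>{html_body}</body></html>").write_pdf(str(out_path))
    return out_path
```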
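
**Illustrative Sketch: Session-State Persistence:**

A minimal sketch of how Streamlit session state can persist the workflow across reruns, as noted under **Important Notes**. The key names and button label are hypothetical; see `src/streamlit_app.py` for the real ones.

```python
# Hypothetical sketch -- not the repository's actual code.
import streamlit as st

# Initialise workflow slots once per session so reruns don't wipe results
for key in ("analysis", "resume_md", "cover_letter"):
    st.session_state.setdefault(key, None)

if st.sidebar.button("Analyze job description"):
    # In the app this would call the analysis service
    st.session_state["analysis"] = {"key_requirements": ["..."]}

if st.session_state["analysis"]:
    st.json(st.session_state["analysis"])
```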