A lightweight, local AI-powered assistant that combines Retrieval-Augmented Generation (RAG) with the Gemini API and GitHub repository integration. Upload reference documents, connect to your GitHub repositories, and get intelligent responses for process documentation, SOX compliance, MLOps workflows, DevOps pipelines, and more.
Clone the repository:

```
git clone <your-repo-url>
cd github-process-manager
```

Create and activate a virtual environment:

```
# Windows
python -m venv venv
venv\Scripts\activate

# macOS/Linux
python3 -m venv venv
source venv/bin/activate
```

Install the dependencies:

```
pip install -r requirements.txt
```
Copy the environment template and edit it with your credentials:

```
# Windows
copy .env.template .env

# macOS/Linux
cp .env.template .env
```

Edit the .env file:

```
# Required: Gemini API Key
GEMINI_API_KEY=your_gemini_api_key_here

# Optional: GitHub Integration
GITHUB_TOKEN=your_github_personal_access_token_here
GITHUB_REPO_URL=https://github.com/username/repository

# Flask Configuration
FLASK_SECRET_KEY=your_secret_key_here
FLASK_DEBUG=True
```
Getting Your API Keys:
- Gemini API Key: create an API key (for example, in Google AI Studio) and set GEMINI_API_KEY.
- GitHub Token: create a personal access token with the repo and workflow scopes (workflow is needed for triggering actions).

Run the application:

```
python app.py
```
The application will be available at: http://localhost:5000
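Once the server is up, you can exercise it from a script. The sketch below uses only the Python standard library and assumes the default port plus the /health and /api/chat endpoints listed under API Endpoints further down; the "message" field name in the chat payload is an assumption, so check app.py for the exact request schema.

```python
import json
import urllib.request

BASE_URL = "http://localhost:5000"  # default port used when running app.py

# Health check: confirms the app started and is reachable.
with urllib.request.urlopen(f"{BASE_URL}/health") as resp:
    print("health:", resp.status, resp.read().decode())

# Send a chat query. The "message" field name is an assumption; inspect app.py
# (or the browser's network tab) for the real schema.
req = urllib.request.Request(
    f"{BASE_URL}/api/chat",
    data=json.dumps({"message": "Summarize the uploaded process documentation"}).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print("chat:", json.loads(resp.read().decode()))
```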
For a consistent, isolated environment, use Docker:
```
# 1. Configure environment
cp .env.template .env
# Edit .env with your API keys

# 2. Start the application
docker-compose up -d

# 3. View logs
docker-compose logs -f app

# 4. Access at http://localhost:5000
```

You can also develop inside the container with VS Code: press F1 → “Remote-Containers: Reopen in Container”.

Additional commands:

```
# Stop the application
docker-compose down

# Rebuild after changes
docker-compose up -d --build

# Production mode
docker-compose -f docker-compose.prod.yml up -d

# View container shell
docker-compose exec app /bin/bash
```
For detailed Docker setup, see README.docker.md
The application supports customizable system prompts to tailor AI responses to your needs. For example:

```
You are a helpful assistant specializing in cloud infrastructure.
Focus on AWS best practices, security, and cost optimization.
Provide actionable recommendations with specific service names.
```
For persistent customization across server restarts, set one of the following in your .env file:

```
# Use a pre-defined template
SYSTEM_PROMPT_TEMPLATE=technical_expert

# Or set a custom prompt
CUSTOM_SYSTEM_PROMPT="Your custom system instruction here"
```
Available Templates: default, technical_expert, security_auditor, developer_assistant, data_analyst, technical_educator
Note: Session-based changes (via UI) take priority over .env settings until the server restarts.
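The same session-scoped override can be applied programmatically through the prompt endpoints listed under API Endpoints below. This is a minimal sketch using the standard library; the "prompt" field name is an assumption, so verify the expected payload in app.py.

```python
import json
import urllib.request
from http.cookiejar import CookieJar

BASE_URL = "http://localhost:5000"

# Session-based endpoints rely on the Flask session cookie, so reuse one
# cookie-aware opener across requests.
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(CookieJar()))

def post_json(path: str, body: dict) -> dict:
    """POST a JSON body to the app and return the decoded JSON response."""
    req = urllib.request.Request(
        f"{BASE_URL}{path}",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with opener.open(req) as resp:
        return json.loads(resp.read().decode())

# The "prompt" field name is an assumption -- check app.py for the real schema.
print(post_json("/api/prompts/update", {
    "prompt": "You are a helpful assistant specializing in cloud infrastructure."
}))
print(post_json("/api/prompts/reset", {}))  # revert to the default for this session
```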
The application supports configurable Word document templates with custom branding:
Edit your .env file to personalize generated documents:
```
# Project name for document headers
PROJECT_NAME=GitHub Process Manager

# Optional: Add company name to headers
COMPANY_NAME=Your Company Name

# Brand color (hex format #RRGGBB)
BRAND_COLOR=#4A90E2

# Optional: Add logo to document headers (.png, .jpg, .jpeg)
DOCUMENT_LOGO_PATH=/path/to/your/logo.png

# Default template type
DEFAULT_TEMPLATE_TYPE=generic
```
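To show how these settings could flow into a generated document, here is a rough sketch built on python-docx (which word_generator.py presumably uses); it is illustrative only, not the project's actual implementation, and the build_branded_title helper is hypothetical.

```python
import os
from docx import Document           # python-docx
from docx.shared import Pt, RGBColor

def build_branded_title(path: str = "example_report.docx") -> None:
    """Illustrative only: render a branded title from the .env settings above."""
    project = os.environ.get("PROJECT_NAME", "GitHub Process Manager")
    company = os.environ.get("COMPANY_NAME", "")
    color_hex = os.environ.get("BRAND_COLOR", "#4A90E2").lstrip("#")

    doc = Document()
    run = doc.add_paragraph().add_run(f"{company} {project}".strip())
    run.bold = True
    run.font.size = Pt(20)
    run.font.color.rgb = RGBColor(*bytes.fromhex(color_hex))  # "#RRGGBB" -> (r, g, b)

    logo = os.environ.get("DOCUMENT_LOGO_PATH")
    if logo and os.path.exists(logo):
        doc.add_picture(logo)  # width can be constrained with docx.shared.Inches

    doc.save(path)

build_branded_title()
```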
Modify document_templates.json to add new templates:
```
{
  "templates": {
    "your_template": {
      "name": "Your Template Name",
      "report_title": "Your Report Title",
      "sections": [
        {"number": 1, "title": "Section 1", "key": "Section 1"},
        {"number": 2, "title": "Section 2", "key": "Section 2"}
      ],
      "keywords": ["keyword1", "keyword2"]
    }
  }
}
```
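To sanity-check a new entry before using it, the sketch below loads document_templates.json and prints each template's sections; this is only an illustration of the file's structure, not the logic word_generator.py actually applies.

```python
import json
import os

# DOCUMENT_TEMPLATES_PATH default taken from the configuration table below.
templates_path = os.environ.get("DOCUMENT_TEMPLATES_PATH", "document_templates.json")

with open(templates_path, encoding="utf-8") as fh:
    config = json.load(fh)

for template_id, template in config["templates"].items():
    print(f"{template_id}: {template['name']} -> {template['report_title']}")
    for section in template["sections"]:
        print(f"  {section['number']}. {section['title']} (key={section['key']})")
    print(f"  keywords: {', '.join(template['keywords'])}")
```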
Template Features:
The application includes specialized MLOps templates and workflows for managing machine learning operations.
The guides in templates/mlops/ provide comprehensive MLOps best practices. The workflows in .github/workflows/mlops/ can be triggered for automated documentation:

- Model Validation Report (mlops-model-validation.yml): accepts model metrics as JSON input, e.g. {"accuracy": 0.95, "f1": 0.93, "precision": 0.94}
- Deployment Documentation (mlops-deployment-doc.yml)
Try these queries with MLOps templates uploaded:
- Model Training: "Document the training process for a fraud detection model with 95% accuracy"
- Deployment Planning: "Create a deployment checklist for deploying a recommendation model to production"
- Monitoring Setup: "What alerts should I configure for monitoring a prediction model in production?"
- Validation Reporting: "Generate a validation report for model version 2.1.0 with accuracy 94.2%, precision 93.8%, recall 94.5%"
The MLOps templates include guidance for integrating with popular ML platforms.
Export metrics from these tools and use the GitHub Actions workflows to generate documentation with your actual performance data.
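For example, a metrics JSON exported from your ML platform can be checked against the app's MLOps endpoints (listed under API Endpoints below) before it is fed to a workflow. The request shape used here, a body with a "metrics" field, is an assumption; confirm the schema in app.py, and note that MLOPS_FEATURES_ENABLED must be set for these routes to be useful.

```python
import json
import urllib.request

BASE_URL = "http://localhost:5000"
metrics = {"accuracy": 0.95, "f1": 0.93, "precision": 0.94}  # exported from your ML platform

def post_json(path: str, body: dict) -> dict:
    """POST a JSON body to the app and return the decoded JSON response."""
    req = urllib.request.Request(
        f"{BASE_URL}{path}",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode())

# Wrapping the metrics in a "metrics" field is an assumed schema.
print(post_json("/api/mlops/validate-metrics", {"metrics": metrics}))
print(post_json("/api/mlops/parse-metrics", {"metrics": metrics}))
```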
```
github-process-manager/
├── app.py                      # Main Flask application
├── config.py                   # Configuration management
├── logger.py                   # Logging setup
├── rag_engine.py               # RAG document processing
├── gemini_client.py            # Gemini API integration
├── github_client.py            # GitHub API integration
├── word_generator.py           # Word document generation
├── requirements.txt            # Python dependencies
├── .env.template               # Environment variable template
├── .gitignore                  # Git ignore rules
├── document_templates.json     # Document template configuration
├── templates/
│   ├── base.html               # Base template
│   ├── index.html              # Chat interface
│   └── settings.html           # Settings page
├── static/
│   └── css/
│       └── style.css           # Application styling
├── .github/
│   └── workflows/
│       ├── process-analysis-doc.yml  # Generic process workflow
│       └── sox-analysis-doc.yml      # SOX-specific workflow (legacy)
├── chroma_db/                  # ChromaDB storage (auto-created)
├── uploads/                    # Temporary upload folder (auto-created)
├── generated_reports/          # Generated Word documents (auto-created)
└── README.md                   # This file
```
Edit config.py or set environment variables:
| Variable | Description | Default |
|---|---|---|
| `GEMINI_API_KEY` | Google Gemini API key | Required |
| `GEMINI_TEMPERATURE` | AI response randomness (0.0-1.0) | 0.7 |
| `GEMINI_MAX_TOKENS` | Maximum response length | 2048 |
| `SYSTEM_PROMPT_TEMPLATE` | Pre-defined prompt template | default |
| `CUSTOM_SYSTEM_PROMPT` | Custom system instruction | None |
| `PROJECT_NAME` | Project name for documents | GitHub Process Manager |
| `COMPANY_NAME` | Company name for documents | None |
| `BRAND_COLOR` | Document brand color (hex) | #4A90E2 |
| `DOCUMENT_LOGO_PATH` | Path to logo for documents | None |
| `DEFAULT_TEMPLATE_TYPE` | Default document template | generic |
| `DOCUMENT_TEMPLATES_PATH` | Template config file path | document_templates.json |
| `GITHUB_TOKEN` | GitHub personal access token | Optional |
| `GITHUB_REPO_URL` | GitHub repository URL | Optional |
| `FLASK_SECRET_KEY` | Flask session secret | Auto-generated |
| `CHROMA_DB_PATH` | ChromaDB storage location | ./chroma_db |
| `CHUNK_SIZE` | Characters per document chunk | 800 |
| `CHUNK_OVERLAP` | Overlap between chunks | 200 |
| `TOP_K_RESULTS` | RAG chunks to retrieve | 3 |
| `MLOPS_FEATURES_ENABLED` | Enable MLOps features | false |
| `MLOPS_TEMPLATES_DIR` | MLOps templates directory | templates/mlops |
| `MLOPS_WORKFLOWS_DIR` | MLOps workflows directory | .github/workflows/mlops |
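To make the chunking settings concrete, here is a minimal sketch of character-based chunking with overlap using the defaults above; the actual rag_engine.py implementation may split text differently.

```python
import os

CHUNK_SIZE = int(os.environ.get("CHUNK_SIZE", 800))
CHUNK_OVERLAP = int(os.environ.get("CHUNK_OVERLAP", 200))
TOP_K_RESULTS = int(os.environ.get("TOP_K_RESULTS", 3))

def chunk_text(text: str, size: int = CHUNK_SIZE, overlap: int = CHUNK_OVERLAP) -> list[str]:
    """Split text into overlapping character windows (illustrative only)."""
    step = max(size - overlap, 1)
    return [text[start:start + size] for start in range(0, len(text), step)]

document = "example document text " * 500  # stand-in for an uploaded file's contents
chunks = chunk_text(document)
print(f"{len(chunks)} chunks of up to {CHUNK_SIZE} chars; "
      f"the top {TOP_K_RESULTS} matches are retrieved per query")
```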
API Endpoints:
- POST /api/chat - Send query and get AI response
- POST /api/upload - Upload document for RAG
- GET /api/rag/stats - Get RAG database statistics
- POST /api/rag/clear - Clear all documents
- POST /api/github/connect - Connect to repository
- GET /api/github/info - Get repository info
- GET /api/github/workflows - List workflows
- POST /api/github/workflow/trigger - Trigger workflow
- GET /api/github/pulls - Get pull requests
- GET /api/github/issues - Get issues
- GET /api/prompts/templates - Get available prompt templates
- GET /api/prompts/current - Get current active prompt
- POST /api/prompts/update - Update system prompt (session-based)
- POST /api/prompts/reset - Reset to default prompt
- GET /api/mlops/status - Check MLOps feature availability and configuration
- POST /api/mlops/parse-metrics - Parse and format ML metrics JSON
- POST /api/mlops/validate-metrics - Validate ML metrics against schema
- GET /api/mlops/templates - List available MLOps documentation templates
- GET /health - Health check endpoint

Troubleshooting:
- Make sure you created the .env file from .env.template and added your API keys to the .env file
- Check app.log for detailed error messages
- Verify your GitHub token has the required scopes (repo, workflow)
- To reset the RAG database, delete the chroma_db/ folder and restart the application

This is a personal project, but suggestions and improvements are welcome!
This project is provided as-is for educational and personal use.
For issues or questions, please check the logs in app.log or review the troubleshooting section above.
Built with ❤️ using Python, Flask, and ChromaDB