Epic Backend - Religious Guidance API

A modern, scalable backend API for providing religious guidance using OpenAI's GPT models and MongoDB for conversation storage.

🚀 Features

  • OpenAI Integration: Seamless integration with OpenAI's GPT models for religious guidance
  • MongoDB Storage: Persistent conversation storage with advanced querying capabilities
  • Actuator Monitoring: Comprehensive health checks, metrics, and observability
  • Serverless Ready: Optimized for Vercel deployment
  • Modern Architecture: Clean, maintainable code structure with best practices
  • Security: CORS protection, input validation, and environment variable masking
  • Error Handling: Robust error handling with proper logging and monitoring

📋 Prerequisites

  • Node.js >= 20.0.0
  • npm >= 10.0.0
  • MongoDB Atlas account (for database)
  • OpenAI API key

🛠️ Installation

  1. Clone the repository

    git clone <repository-url>
    cd epic-backend
  2. Install dependencies

    npm install
  3. Set up environment variables by creating a .env file in the root directory:

    # OpenAI Configuration
    OPENAI_API_KEY=your_openai_api_key_here
    OPENAI_MODEL=gpt-3.5-turbo
    OPENAI_TOKEN=1000
    OPENAI_TEMPERATURE=0.8
    
    # MongoDB Configuration
    mongodb_username=your_mongodb_username
    mongodb_password=your_mongodb_password
    MONGODB_CLUSTER=epic.kcjssht.mongodb.net
    MONGODB_DATABASE=religious-guide
    
    # Application Configuration
    NODE_ENV=development
    PORT=3000
  4. Run the application

    # Development mode
    npm run dev
    
    # Production mode
    npm start
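
Once the server is running, you can sanity-check it with a quick request to the health endpoint (a minimal sketch, assuming the default PORT=3000):

// Quick sanity check against a locally running server (assumes the default PORT=3000).
const response = await fetch('http://localhost:3000/health');
console.log(await response.json());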

πŸ—οΈ Project Structure

epic-backend/
├── src/
│   ├── config/           # Configuration files
│   │   ├── database.js   # MongoDB configuration
│   │   └── actuator.js   # Actuator configuration
│   ├── controllers/      # Request handlers
│   │   └── openai.controller.js
│   ├── middleware/       # Express middleware
│   │   └── cors.js
│   ├── models/           # Data models
│   │   └── conversation.js
│   ├── services/         # Business logic
│   │   └── openai.service.js
│   ├── utils/            # Utility functions
│   ├── tests/            # Test files
│   │   └── test-actuator.js
│   └── index.js          # Main application entry point
├── api/                  # Vercel serverless functions
│   ├── openai-proxy.js
│   └── actuator/
│       └── [...path].js
├── package.json
├── vercel.json
├── .eslintrc.json
├── .prettierrc
└── README.md

🔧 Available Scripts

Script                   Description
npm start                Start the production server
npm run dev              Start development server with Vercel
npm run build            Run linting and tests
npm run lint             Run ESLint for code quality
npm run lint:fix         Fix ESLint issues automatically
npm run format           Format code with Prettier
npm test                 Run Jest tests
npm run test:watch       Run tests in watch mode
npm run test:coverage    Run tests with coverage report
npm run test:actuator    Test actuator functionality

🌐 API Endpoints

Core Endpoints

Endpoint             Method   Description
/                    GET      API information and available endpoints
/health              GET      Application health status
/api/openai-proxy    POST     OpenAI chat completion proxy
/api/generic         POST     Custom prompt API
/api/stats           GET      Conversation statistics
/api/conversations   GET      Get conversations with pagination

Actuator Endpoints

Endpoint                   Method   Description
/api/actuator/health       GET      Detailed health checks
/api/actuator/metrics      GET      System and custom metrics
/api/actuator/prometheus   GET      Prometheus-formatted metrics
/api/actuator/info         GET      Application information
/api/actuator/env          GET      Environment variables (filtered)
/api/actuator/threaddump   GET      Event loop analysis
/api/actuator/heapdump     POST     Generate heap snapshot
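
The actuator routes are queried like any other endpoint. A minimal sketch, in the same style as the usage examples below (the exact response shape depends on the configured health indicators):

// Minimal sketch: query the actuator health endpoint and log the report.
const response = await fetch('/api/actuator/health');
const health = await response.json();
console.log('Health report:', health);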

📊 Usage Examples

Custom Prompt API

const response = await fetch('/api/generic', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    prompt: 'Explain quantum physics in simple terms',
    context: 'You are a science teacher explaining to a 10-year-old'
  })
});

const result = await response.json();
console.log(result.data.choices[0].message.content);

OpenAI Proxy Request (Legacy)

const response = await fetch('/api/openai-proxy', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    messages: [
      {
        role: 'system',
        content: 'You are a religious guidance assistant...'
      },
      {
        role: 'user',
        content: 'I need guidance on dealing with stress...'
      }
    ]
  })
});

const data = await response.json();

Get Conversation Statistics

const response = await fetch('/api/stats');
const stats = await response.json();
console.log('Total conversations:', stats.totalConversations);

Get Conversations with Pagination

const response = await fetch('/api/conversations?limit=10&skip=0');
const data = await response.json();
console.log('Conversations:', data.conversations);

🔍 Monitoring & Observability

Health Checks

The application provides comprehensive health checks for:

  • MongoDB: Database connectivity and statistics
  • OpenAI API: API availability and model information
  • Application: Environment variable validation

Metrics

Custom metrics are automatically collected:

  • epic_conversations_total: Total conversations processed
  • epic_response_time_seconds: OpenAI API response times
  • epic_errors_total: Total errors encountered

Prometheus Integration

Metrics are exposed in Prometheus text format at /api/actuator/prometheus, so they can be scraped by external monitoring systems.
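
As an illustration, both endpoints can be read directly. This sketch assumes /api/actuator/metrics returns JSON and /api/actuator/prometheus returns plain text, as the endpoint table above suggests:

// Sketch: read the custom metrics as JSON, then the same data in Prometheus text format.
const metrics = await (await fetch('/api/actuator/metrics')).json();
console.log('Metrics:', metrics);

const prometheusText = await (await fetch('/api/actuator/prometheus')).text();
console.log(prometheusText); // includes counters such as epic_conversations_total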

🚀 Deployment

Vercel Deployment

  1. Install Vercel CLI

    npm i -g vercel
  2. Deploy

    npm run deploy
  3. Set environment variables in Vercel dashboard

Environment Variables

Variable             Description                 Required
OPENAI_API_KEY       OpenAI API key              Yes
OPENAI_MODEL         GPT model to use            No (default: gpt-3.5-turbo)
OPENAI_TOKEN         Max tokens for responses    No (default: 1000)
OPENAI_TEMPERATURE   Response creativity         No (default: 0.8)
mongodb_username     MongoDB username            Yes
mongodb_password     MongoDB password            Yes
MONGODB_CLUSTER      MongoDB cluster URL         No
MONGODB_DATABASE     Database name               No (default: religious-guide)
NODE_ENV             Environment                 No (default: development)
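
To show how these variables fit together, here is a minimal sketch of assembling the MongoDB connection the way src/config/database.js might; the actual implementation may differ:

// Minimal sketch only; not the actual src/config/database.js implementation.
import { MongoClient } from 'mongodb';

const username = encodeURIComponent(process.env.mongodb_username);
const password = encodeURIComponent(process.env.mongodb_password);
const cluster = process.env.MONGODB_CLUSTER || 'epic.kcjssht.mongodb.net';
const dbName = process.env.MONGODB_DATABASE || 'religious-guide';

const uri = `mongodb+srv://${username}:${password}@${cluster}/?retryWrites=true&w=majority`;
const client = new MongoClient(uri);
await client.connect();
const db = client.db(dbName);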

🧪 Testing

Run All Tests

npm test

Run Actuator Tests

npm run test:actuator

Run Tests with Coverage

npm run test:coverage
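
New tests follow standard Jest conventions. A minimal sketch of what a test file might look like (the file name, endpoint, and assertion are illustrative, not taken from the existing suite):

// src/tests/health.test.js (illustrative only, not part of the existing suite)
describe('health endpoint', () => {
  it('responds with HTTP 200', async () => {
    // Assumes a dev server is running locally on the default PORT=3000.
    const response = await fetch('http://localhost:3000/health');
    expect(response.status).toBe(200);
  });
});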

🔒 Security

  • CORS Protection: Strict CORS policy; only https://beingmartinbmc.github.io is allowed (a sketch of this middleware follows this list)
  • Input Validation: Comprehensive request validation
  • Environment Variable Masking: Sensitive data is masked in logs and endpoints
  • Helmet.js: Security headers and protection
  • Rate Limiting: Built-in rate limiting support
  • No Localhost Access: API is locked down to production domain only
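
For illustration, a middleware along these lines could enforce that CORS policy. This is a hedged sketch, not the actual src/middleware/cors.js:

// Illustrative sketch only; the real src/middleware/cors.js may differ.
const ALLOWED_ORIGIN = 'https://beingmartinbmc.github.io';

export function corsMiddleware(req, res, next) {
  if (req.headers.origin === ALLOWED_ORIGIN) {
    res.setHeader('Access-Control-Allow-Origin', ALLOWED_ORIGIN);
    res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
    res.setHeader('Access-Control-Allow-Headers', 'Content-Type');
  }
  if (req.method === 'OPTIONS') {
    return res.status(204).end(); // answer preflight requests with an empty response
  }
  next();
}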

📈 Performance

  • Connection Pooling: Optimized MongoDB connections
  • Compression: Response compression for better performance
  • Caching: Built-in caching support
  • Error Recovery: Retry logic for external API calls (see the sketch after this list)
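
A rough illustration of the retry idea (not the actual openai.service.js logic):

// Rough sketch of retry-with-backoff for external API calls; not the actual service code.
async function withRetry(fn, retries = 3, delayMs = 500) {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (error) {
      if (attempt === retries) throw error; // out of attempts, surface the error
      await new Promise((resolve) => setTimeout(resolve, delayMs * attempt)); // linear backoff
    }
  }
}

// Example: retry a failing network call to the OpenAI proxy a few times before giving up.
const result = await withRetry(() => fetch('/api/openai-proxy', { method: 'POST' }));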

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Development Guidelines

  • Follow ESLint rules and Prettier formatting
  • Write tests for new features
  • Update documentation as needed
  • Use conventional commit messages

πŸ“ License

This project is licensed under the MIT License - see the LICENSE file for details.

🆘 Support

For support and questions:

  • Create an issue in the repository
  • Check the documentation
  • Review the health endpoints for system status

🔄 Changelog

v1.0.0 (Current)

  • Complete refactoring with modern architecture
  • Improved error handling and monitoring
  • Enhanced security features
  • Better code organization and maintainability
  • Comprehensive testing setup
  • Updated documentation
