
StudyBot

AI-Powered Chat Application


Project Overview

StudyBot is a full-stack web application that delivers an intelligent conversational AI experience using Azure OpenAI's Assistant API. The application provides a modern chat interface with secure user authentication, real-time message delivery, and persistent conversation history.

Real-Time Communication

Bidirectional WebSocket connections using Socket.io for instant messaging

AI-Powered Responses

Azure OpenAI Assistant API for intelligent, context-aware conversations

Secure Authentication

JWT-based authentication with bcrypt password hashing

Persistent Storage

MongoDB for maintaining conversation continuity across sessions

Technical Specifications

Frontend: React 19, Material-UI v7, React Router v7
Backend: Node.js, Express.js, JWT Authentication
Database: MongoDB with Mongoose ODM
AI Service: Azure OpenAI Assistant API
Real-Time: Socket.io WebSocket library
Hosting: Azure App Service
Security: bcryptjs, CORS, Custom auth middleware
Logging: Winston for production-grade monitoring

Key Features

Secure Authentication System

Comprehensive JWT-based authentication with bcrypt password hashing, providing secure user registration, login, and session management. All passwords are protected with industry-standard one-way hashing; a minimal login-flow sketch follows the list below.

  • JWT tokens with 24-hour expiration
  • bcrypt hashing with 10 salt rounds
  • Custom authentication middleware for route protection
  • Token storage in localStorage with Axios interceptors
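As an illustration of how these pieces could fit together, here is a minimal login handler sketch using bcryptjs and jsonwebtoken; the User model, route path, and JWT_SECRET variable are assumptions rather than the project's actual code.

```javascript
// routes/auth.js (illustrative sketch, not the project's actual source)
const express = require('express');
const bcrypt = require('bcryptjs');
const jwt = require('jsonwebtoken');
const User = require('../models/User'); // assumed Mongoose model

const router = express.Router();

router.post('/login', async (req, res) => {
  const { email, password } = req.body;
  const user = await User.findOne({ email });
  // Compare the submitted password against the stored bcrypt hash
  if (!user || !(await bcrypt.compare(password, user.password))) {
    return res.status(401).json({ error: 'Invalid credentials' });
  }
  // Issue a JWT with the 24-hour expiration described above
  const token = jwt.sign({ userId: user._id }, process.env.JWT_SECRET, {
    expiresIn: '24h',
  });
  res.json({ token });
});

module.exports = router;
```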

Real-Time WebSocket Communication

Bidirectional communication using Socket.io for instant message delivery. Messages are broadcast to all connected clients in real time, creating a seamless chat experience with connection state management; a server-side sketch appears after the list.

  • WebSocket connections with Socket.io
  • User-specific socket management
  • Real-time message broadcasting
  • Automatic reconnection handling
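A server-side Socket.io setup along these lines might look as follows; the event names, CLIENT_URL variable, and port are assumptions.

```javascript
// Illustrative Socket.io server setup
const { createServer } = require('http');
const { Server } = require('socket.io');
const express = require('express');

const app = express();
const httpServer = createServer(app);
const io = new Server(httpServer, { cors: { origin: process.env.CLIENT_URL } });

io.on('connection', (socket) => {
  console.log(`Client connected: ${socket.id}`);

  // Broadcast each incoming chat message to every connected client
  socket.on('message', (payload) => {
    io.emit('message', payload);
  });

  socket.on('disconnect', () => {
    console.log(`Client disconnected: ${socket.id}`);
  });
});

httpServer.listen(process.env.PORT || 5000);
```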

AI-Powered Conversations

Integration with the Azure OpenAI Assistant API for intelligent, context-aware responses. Thread management maintains conversation continuity, with run status polling, timeout handling, and rate limit detection; an illustrative call flow is sketched after the list.

  • Azure OpenAI Assistant API integration
  • Thread creation for conversation context
  • Message queueing and processing
  • Rate limit detection (HTTP 429 handling)
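The thread-based call flow could be sketched roughly like this using the openai Node SDK's AzureOpenAI client and its beta Assistants namespace; the environment variable names and API version are assumptions, and SDK method shapes may differ by version.

```javascript
// Illustrative Assistant API call flow (sketch, not the project's actual code)
const { AzureOpenAI } = require('openai');

const client = new AzureOpenAI({
  endpoint: process.env.AZURE_OPENAI_ENDPOINT,
  apiKey: process.env.AZURE_OPENAI_API_KEY,
  apiVersion: '2024-05-01-preview', // assumed version
});

async function askAssistant(userMessage) {
  // Create a thread to hold the conversation context
  const thread = await client.beta.threads.create();

  // Queue the user's message on the thread
  await client.beta.threads.messages.create(thread.id, {
    role: 'user',
    content: userMessage,
  });

  // Start an assistant run; its status is polled elsewhere (see the polling sketch later)
  const run = await client.beta.threads.runs.create(thread.id, {
    assistant_id: process.env.AZURE_OPENAI_ASSISTANT_ID,
  });

  return { threadId: thread.id, runId: run.id };
}
```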

Persistent Chat History

MongoDB storage for maintaining conversation continuity across sessions. All messages and chat sessions are persistently stored with timestamps, allowing users to access their conversation history at any time; an example schema sketch appears below.

  • MongoDB with Mongoose ODM
  • Chat and message schema validation
  • User-specific chat organization
  • Automatic timestamp tracking
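An embedded-message schema matching this description might look like the following Mongoose sketch; field names and defaults are assumptions.

```javascript
// Illustrative Chat model with an embedded messages array
const mongoose = require('mongoose');

const messageSchema = new mongoose.Schema(
  {
    role: { type: String, enum: ['user', 'assistant'], required: true },
    content: { type: String, required: true },
  },
  { timestamps: true } // automatic createdAt/updatedAt per message
);

const chatSchema = new mongoose.Schema(
  {
    userId: { type: mongoose.Schema.Types.ObjectId, ref: 'User', required: true },
    title: { type: String, default: 'New Chat' },
    messages: [messageSchema], // embedded messages for the whole conversation
  },
  { timestamps: true }
);

module.exports = mongoose.model('Chat', chatSchema);
```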

Modern UI/UX Design

Material Design components with responsive layout and markdown support. Built with Material-UI v7 and Framer Motion for smooth transitions, providing an intuitive and visually appealing user experience; a small component sketch follows the list.

  • Material-UI v7 component library
  • Responsive Grid system
  • react-markdown for formatted responses
  • Framer Motion animations
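A message bubble combining these libraries could be sketched as below; the component name, props, and styling are assumptions.

```javascript
// Illustrative chat message component (sketch)
import React from 'react';
import { Paper, Typography } from '@mui/material';
import ReactMarkdown from 'react-markdown';
import { motion } from 'framer-motion';

export default function MessageBubble({ role, content }) {
  return (
    <motion.div initial={{ opacity: 0, y: 8 }} animate={{ opacity: 1, y: 0 }}>
      <Paper
        elevation={2}
        sx={{ p: 2, mb: 1, bgcolor: role === 'user' ? 'primary.light' : 'grey.100' }}
      >
        <Typography variant="caption" color="text.secondary">
          {role === 'user' ? 'You' : 'StudyBot'}
        </Typography>
        {/* Render assistant responses as formatted markdown */}
        <ReactMarkdown>{content}</ReactMarkdown>
      </Paper>
    </motion.div>
  );
}
```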

Production-Ready Logging

Structured logging with Winston for comprehensive monitoring and debugging. Request logging with IP tracking, error logging with stack traces, and environment-aware configuration for production insights; an example logger configuration appears after the list.

  • Winston production-grade logging
  • Request logging with IP tracking
  • Error logging with stack traces
  • Structured log format for parsing
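A Winston configuration along these lines might look as follows; the transport and level choices are assumptions.

```javascript
// Illustrative Winston logger configuration
const winston = require('winston');

const logger = winston.createLogger({
  level: process.env.NODE_ENV === 'production' ? 'info' : 'debug',
  format: winston.format.combine(
    winston.format.timestamp(),
    winston.format.errors({ stack: true }), // include stack traces on errors
    winston.format.json()                   // structured output for log parsing
  ),
  transports: [new winston.transports.Console()],
});

module.exports = logger;
```

A request-logging middleware would then call logger.info with req.method, req.originalUrl, and req.ip to capture the IP tracking described above.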

Technical Architecture

Frontend Layer

React 19 · Material-UI v7 · React Router v7 · Socket.io Client · Axios · react-markdown

Backend Layer

Node.js · Express.js · Socket.io · JWT · bcryptjs · Winston

External Services

Azure OpenAI · MongoDB · Azure App Service

Architecture Highlights

Authentication Flow:

User credentials are hashed with bcrypt and stored in MongoDB. On login, JWT tokens are generated with 24-hour expiration. Auth middleware validates tokens on protected routes.
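The token-validation step could be implemented as Express middleware roughly like this; the header parsing and error messages are assumptions.

```javascript
// Illustrative auth middleware for protected routes
const jwt = require('jsonwebtoken');

function requireAuth(req, res, next) {
  const header = req.headers.authorization || '';
  const token = header.startsWith('Bearer ') ? header.slice(7) : null;
  if (!token) return res.status(401).json({ error: 'Missing token' });

  try {
    // Rejects expired or tampered tokens issued at login
    req.user = jwt.verify(token, process.env.JWT_SECRET);
    next();
  } catch (err) {
    res.status(401).json({ error: 'Invalid or expired token' });
  }
}

module.exports = requireAuth;
```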

AI Conversation Flow:

Messages are sent via REST API, Azure OpenAI threads are created with conversation history for context, assistant runs are monitored with polling, and responses are saved to MongoDB.

Real-Time Updates:

Socket.io connections are established on client mount, server broadcasts new messages to all connected clients, and frontend listens for message events to update the UI.
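On the client, a hook along these lines could establish the connection on mount and update state on each broadcast; the hook name and event name are assumptions.

```javascript
// Illustrative client-side hook for real-time updates
import { useEffect, useState } from 'react';
import { io } from 'socket.io-client';

export function useChatSocket(serverUrl) {
  const [messages, setMessages] = useState([]);

  useEffect(() => {
    // Connect on mount; Socket.io reconnects automatically by default
    const socket = io(serverUrl);

    // Append each broadcast message to local state so the UI re-renders
    socket.on('message', (msg) => setMessages((prev) => [...prev, msg]));

    return () => socket.disconnect(); // clean up on unmount
  }, [serverUrl]);

  return messages;
}
```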

Database Schema:

User model with username, email, and hashed password. Chat model with userId reference, title, and embedded messages array containing role, content, and timestamps.
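The User model might be sketched as follows; the pre-save hashing hook is an assumption about where hashing happens in the project.

```javascript
// Illustrative User model (sketch)
const mongoose = require('mongoose');
const bcrypt = require('bcryptjs');

const userSchema = new mongoose.Schema({
  username: { type: String, required: true, unique: true },
  email: { type: String, required: true, unique: true },
  password: { type: String, required: true }, // stores the bcrypt hash, never plaintext
});

// Hash the password with 10 salt rounds before saving
userSchema.pre('save', async function () {
  if (this.isModified('password')) {
    this.password = await bcrypt.hash(this.password, 10);
  }
});

module.exports = mongoose.model('User', userSchema);
```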

API Endpoints

Authentication

  • POST /api/auth/register – Create a new user account
  • POST /api/auth/login – Authenticate and receive a JWT token

Chat Management

  • GET /api/chat – Fetch all chats for the authenticated user
  • POST /api/chat – Create a new chat session
  • GET /api/chat/:chatId – Retrieve a specific chat with message history
  • POST /api/chat/:chatId/messages – Send a message and receive an AI response
  • PUT /api/chat/:chatId – Update the chat title

All chat endpoints require JWT authentication via the Authorization header.
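A client-side sketch of calling these endpoints with the token attached via an Axios interceptor might look like this; the base URL, storage key, and helper name are assumptions.

```javascript
// Illustrative Axios client with Authorization header
import axios from 'axios';

const api = axios.create({ baseURL: '/api' });

// Attach the stored JWT to every request to the protected chat endpoints
api.interceptors.request.use((config) => {
  const token = localStorage.getItem('token');
  if (token) config.headers.Authorization = `Bearer ${token}`;
  return config;
});

// Example: send a message to a chat and receive the AI response
export async function sendMessage(chatId, content) {
  const { data } = await api.post(`/chat/${chatId}/messages`, { content });
  return data;
}
```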

Development Challenges & Solutions

Error Handling & Resilience

Challenge: Ensuring robust error handling across all system components, including AI API failures and database connection issues.

Solution: Implemented rate limit detection with user-friendly error messages, retry logic with exponential backoff, comprehensive try-catch blocks with structured logging, and graceful MongoDB connection failure handling.
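A generic retry helper with exponential backoff, as described, might be sketched like this; the retry count, delays, and status-code checks are assumptions.

```javascript
// Illustrative retry helper with exponential backoff
async function withRetry(fn, { retries = 3, baseDelayMs = 500 } = {}) {
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      // Status field location depends on the client library (err.status or err.response.status)
      const status = err.status || (err.response && err.response.status);
      const retryable = status === 429 || status >= 500;
      const isLastAttempt = attempt === retries;
      if (isLastAttempt || !retryable) throw err;
      // Wait 500ms, 1s, 2s, ... before the next attempt
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}

module.exports = withRetry;
```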

Security Best Practices

Challenge: Protecting user data and preventing common security vulnerabilities in a real-time chat application.

Solution: Implemented password hashing with bcrypt, JWT token expiration, CORS configuration for specific origins, MongoDB injection prevention via Mongoose, and environment variable usage for sensitive data.
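The CORS and environment-variable setup could be sketched as follows; the CLIENT_URL variable is an assumption.

```javascript
// Illustrative security-related Express setup
const express = require('express');
const cors = require('cors');

const app = express();

// Restrict cross-origin requests to the deployed frontend only
app.use(
  cors({
    origin: process.env.CLIENT_URL, // e.g. the Azure App Service frontend URL
    credentials: true,
  })
);

app.use(express.json());
```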

Real-Time Synchronization

Challenge: Managing bidirectional real-time communication while maintaining data consistency across clients.

Solution: Implemented Socket.io connection pooling, user-specific socket management, and automatic reconnection logic to ensure reliable message delivery and state synchronization.
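User-specific socket management could be sketched with a simple map from user IDs to socket IDs; the 'register' event and helper names are assumptions.

```javascript
// Illustrative user-to-socket tracking; io is the Socket.io Server instance
const userSockets = new Map(); // userId -> socket.id

function registerSocketHandlers(io) {
  io.on('connection', (socket) => {
    // Assume the client announces its userId right after connecting
    socket.on('register', (userId) => userSockets.set(userId, socket.id));

    socket.on('disconnect', () => {
      for (const [userId, id] of userSockets) {
        if (id === socket.id) userSockets.delete(userId);
      }
    });
  });

  // Deliver an event only to a specific user's socket, if connected
  return function sendToUser(userId, event, payload) {
    const socketId = userSockets.get(userId);
    if (socketId) io.to(socketId).emit(event, payload);
  };
}

module.exports = registerSocketHandlers;
```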

AI Response Management

Challenge: Handling long-running AI requests with unpredictable response times and potential rate limits.

Solution: Implemented run status polling with 1-second intervals and 30-second timeouts, HTTP 429 rate limit detection, and structured error logging for monitoring and debugging.
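The polling loop with the 1-second interval and 30-second timeout might be sketched like this, assuming the AzureOpenAI client from the earlier sketch; a 429 response would surface as an error that the retry helper above could catch and translate into a friendly message.

```javascript
// Illustrative run-status polling with interval and timeout (sketch)
async function waitForRun(client, threadId, runId, { intervalMs = 1000, timeoutMs = 30000 } = {}) {
  const deadline = Date.now() + timeoutMs;

  while (Date.now() < deadline) {
    const run = await client.beta.threads.runs.retrieve(threadId, runId);
    if (run.status === 'completed') return run;
    if (['failed', 'cancelled', 'expired'].includes(run.status)) {
      throw new Error(`Assistant run ended with status: ${run.status}`);
    }
    // Wait one second before checking the run status again
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }

  throw new Error('Assistant run timed out after 30 seconds');
}

module.exports = waitForRun;
```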

Skills Demonstrated

Full-Stack Development

  • Full-stack JavaScript development (Node.js + React)
  • RESTful API design and implementation
  • Modern React patterns and hooks
  • Async/await and Promise management

Cloud & DevOps

  • Azure cloud services integration
  • Production logging and monitoring
  • DevOps and cloud deployment
  • Environment configuration management

Data Management

  • NoSQL database design and modeling
  • MongoDB with Mongoose ODM
  • Query optimization with indexing
  • Schema validation and data integrity

Real-Time Systems

  • Real-time communication with WebSockets
  • Socket.io implementation
  • Connection state management
  • Error handling and resilience patterns

AI Integration

  • Azure OpenAI API integration
  • Assistant API and thread management
  • Context-aware AI responses
  • Rate limiting and error handling

Security & Auth

  • Authentication and authorization
  • JWT token management
  • Password hashing with bcrypt
  • CORS and security headers