Add comprehensive database integration, authentication, and admin dashboard
This commit introduces major enhancements to Talk2Me: ## Database Integration - PostgreSQL support with SQLAlchemy ORM - Redis integration for caching and real-time analytics - Automated database initialization scripts - Migration support infrastructure ## User Authentication System - JWT-based API authentication - Session-based web authentication - API key authentication for programmatic access - User roles and permissions (admin/user) - Login history and session tracking - Rate limiting per user with customizable limits ## Admin Dashboard - Real-time analytics and monitoring - User management interface (create, edit, delete users) - System health monitoring - Request/error tracking - Language pair usage statistics - Performance metrics visualization ## Key Features - Dual authentication support (token + user accounts) - Graceful fallback for missing services - Non-blocking analytics middleware - Comprehensive error handling - Session management with security features ## Bug Fixes - Fixed rate limiting bypass for admin routes - Added missing email validation method - Improved error handling for missing database tables - Fixed session-based authentication for API endpoints 🤖 Generated with [Claude Code](https://claude.ai/code) Co-Authored-By: Claude <noreply@anthropic.com>
This commit is contained in:
parent
d818ec7d73
commit
fa951c3141
221
ADMIN_DASHBOARD.md
Normal file
221
ADMIN_DASHBOARD.md
Normal file
@ -0,0 +1,221 @@
|
||||
# Talk2Me Admin Analytics Dashboard
|
||||
|
||||
A comprehensive analytics dashboard for monitoring and managing the Talk2Me application.
|
||||
|
||||
## Features
|
||||
|
||||
### Real-time Monitoring
|
||||
- **Request Volume**: Track requests per minute, hour, and day
|
||||
- **Active Sessions**: Monitor current active user sessions
|
||||
- **Error Rates**: Real-time error tracking and analysis
|
||||
- **System Health**: Monitor Redis, PostgreSQL, and ML services status
|
||||
|
||||
### Analytics & Insights
|
||||
- **Translation & Transcription Metrics**: Usage statistics by operation type
|
||||
- **Language Pair Analysis**: Most popular translation combinations
|
||||
- **Response Time Monitoring**: Track performance across all operations
|
||||
- **Cache Performance**: Monitor cache hit rates for optimization
|
||||
|
||||
### Performance Metrics
|
||||
- **Response Time Percentiles**: P95 and P99 latency tracking
|
||||
- **Throughput Analysis**: Requests per minute visualization
|
||||
- **Slow Request Detection**: Identify and analyze performance bottlenecks
|
||||
- **Resource Usage**: Memory and GPU utilization tracking
|
||||
|
||||
### Data Management
|
||||
- **Export Capabilities**: Download analytics data in JSON format
|
||||
- **Historical Data**: View trends over time (daily, weekly, monthly)
|
||||
- **Error Logs**: Detailed error tracking with stack traces
|
||||
- **Session Management**: Track and manage user sessions
|
||||
|
||||
## Setup
|
||||
|
||||
### 1. Database Setup
|
||||
|
||||
Initialize the analytics database tables:
|
||||
|
||||
```bash
|
||||
python init_analytics_db.py
|
||||
```
|
||||
|
||||
This creates the following tables:
|
||||
- `error_logs`: Detailed error tracking
|
||||
- `request_logs`: Request-level analytics
|
||||
- `translation_logs`: Translation operation metrics
|
||||
- `transcription_logs`: Transcription operation metrics
|
||||
- `tts_logs`: Text-to-speech operation metrics
|
||||
- `daily_stats`: Aggregated daily statistics
|
||||
|
||||
### 2. Configuration
|
||||
|
||||
Set the following environment variables:
|
||||
|
||||
```bash
|
||||
# Admin access token (required)
|
||||
export ADMIN_TOKEN="your-secure-admin-token"
|
||||
|
||||
# Database configuration
|
||||
export DATABASE_URL="postgresql://user:password@localhost/talk2me"
|
||||
|
||||
# Redis configuration
|
||||
export REDIS_URL="redis://localhost:6379/0"
|
||||
```
|
||||
|
||||
### 3. Access the Dashboard
|
||||
|
||||
1. Navigate to: `http://your-domain/admin`
|
||||
2. Enter your admin token
|
||||
3. Access the analytics dashboard
|
||||
|
||||
## Dashboard Sections
|
||||
|
||||
### Overview Cards
|
||||
- Total Requests (all-time and today)
|
||||
- Active Sessions (real-time)
|
||||
- Error Rate (24-hour percentage)
|
||||
- Cache Hit Rate (performance metric)
|
||||
|
||||
### Charts & Visualizations
|
||||
|
||||
#### Request Volume Chart
|
||||
- Toggle between minute, hour, and day views
|
||||
- Real-time updates every 5 seconds
|
||||
- Historical data for trend analysis
|
||||
|
||||
#### Language Pairs Donut Chart
|
||||
- Top 6 most used language combinations
|
||||
- Visual breakdown of translation patterns
|
||||
|
||||
#### Operations Bar Chart
|
||||
- Daily translation and transcription counts
|
||||
- 7-day historical view
|
||||
|
||||
#### Response Time Line Chart
|
||||
- Average, P95, and P99 response times
|
||||
- Broken down by operation type
|
||||
|
||||
#### Error Analysis
|
||||
- Error type distribution pie chart
|
||||
- Recent errors list with details
|
||||
- Timeline of error occurrences
|
||||
|
||||
### Performance Table
|
||||
- Detailed metrics for each operation type
|
||||
- Average response times
|
||||
- 95th and 99th percentile latencies
|
||||
|
||||
## Real-time Updates
|
||||
|
||||
The dashboard uses Server-Sent Events (SSE) for real-time updates:
|
||||
- Automatic refresh every 5 seconds
|
||||
- Connection status indicator
|
||||
- Automatic reconnection on disconnect
|
||||
|
||||
## Data Export
|
||||
|
||||
Export analytics data for external analysis:
|
||||
|
||||
1. Click "Export Data" in the navigation
|
||||
2. Choose data type:
|
||||
- `requests`: Request and operation counts
|
||||
- `errors`: Error logs and details
|
||||
- `performance`: Response time metrics
|
||||
- `all`: Complete data export
|
||||
|
||||
## API Endpoints
|
||||
|
||||
The admin dashboard provides the following API endpoints:
|
||||
|
||||
### Authentication Required
|
||||
All endpoints require the `X-Admin-Token` header.
|
||||
|
||||
### Available Endpoints
|
||||
|
||||
#### Overview Stats
|
||||
```
|
||||
GET /admin/api/stats/overview
|
||||
```
|
||||
Returns overall system statistics
|
||||
|
||||
#### Request Statistics
|
||||
```
|
||||
GET /admin/api/stats/requests/{timeframe}
|
||||
```
|
||||
Timeframes: `minute`, `hour`, `day`
|
||||
|
||||
#### Operation Statistics
|
||||
```
|
||||
GET /admin/api/stats/operations
|
||||
```
|
||||
Translation and transcription metrics
|
||||
|
||||
#### Error Statistics
|
||||
```
|
||||
GET /admin/api/stats/errors
|
||||
```
|
||||
Error types, timeline, and recent errors
|
||||
|
||||
#### Performance Statistics
|
||||
```
|
||||
GET /admin/api/stats/performance
|
||||
```
|
||||
Response times and throughput metrics
|
||||
|
||||
#### Data Export
|
||||
```
|
||||
GET /admin/api/export/{data_type}
|
||||
```
|
||||
Data types: `requests`, `errors`, `performance`, `all`
|
||||
|
||||
#### Real-time Updates
|
||||
```
|
||||
GET /admin/api/stream/updates
|
||||
```
|
||||
Server-Sent Events stream for real-time updates
|
||||
|
||||
## Mobile Optimization
|
||||
|
||||
The dashboard is fully responsive and optimized for mobile devices:
|
||||
- Touch-friendly controls
|
||||
- Responsive charts that adapt to screen size
|
||||
- Collapsible navigation for small screens
|
||||
- Optimized data tables for mobile viewing
|
||||
|
||||
## Security
|
||||
|
||||
- Admin token authentication required
|
||||
- Session-based authentication after login
|
||||
- Separate CORS configuration for admin endpoints
|
||||
- All sensitive data masked in exports
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Dashboard Not Loading
|
||||
1. Check Redis and PostgreSQL connections
|
||||
2. Verify admin token is set correctly
|
||||
3. Check browser console for JavaScript errors
|
||||
|
||||
### Missing Data
|
||||
1. Ensure analytics middleware is initialized
|
||||
2. Check database tables are created
|
||||
3. Verify Redis is running and accessible
|
||||
|
||||
### Real-time Updates Not Working
|
||||
1. Check SSE support in your reverse proxy
|
||||
2. Ensure `X-Accel-Buffering: no` header is set
|
||||
3. Verify firewall allows SSE connections
|
||||
|
||||
## Performance Considerations
|
||||
|
||||
- Charts limited to reasonable data points for performance
|
||||
- Automatic data aggregation for historical views
|
||||
- Efficient database queries with proper indexing
|
||||
- Client-side caching for static data
|
||||
|
||||
## Future Enhancements
|
||||
|
||||
- WebSocket support for lower latency updates
|
||||
- Customizable dashboards and widgets
|
||||
- Alert configuration for thresholds
|
||||
- Integration with external monitoring tools
|
||||
- Machine learning for anomaly detection
|
315
AUTHENTICATION.md
Normal file
315
AUTHENTICATION.md
Normal file
@ -0,0 +1,315 @@
|
||||
# Talk2Me Authentication System
|
||||
|
||||
This document describes the comprehensive user authentication and authorization system implemented for the Talk2Me application.
|
||||
|
||||
## Overview
|
||||
|
||||
The authentication system provides:
|
||||
- User account management with roles (admin, user)
|
||||
- JWT-based API authentication
|
||||
- Session management for web interface
|
||||
- API key authentication for programmatic access
|
||||
- User-specific rate limiting
|
||||
- Admin dashboard for user management
|
||||
- Secure password hashing with bcrypt
|
||||
|
||||
## Features
|
||||
|
||||
### 1. User Management
|
||||
|
||||
#### User Account Model
|
||||
- **Email & Username**: Unique identifiers for each user
|
||||
- **Password**: Securely hashed using bcrypt
|
||||
- **API Key**: Unique key for each user (format: `tk_<random_string>`)
|
||||
- **Roles**: `admin` or `user`
|
||||
- **Account Status**: Active, verified, suspended
|
||||
- **Rate Limits**: Configurable per-user limits
|
||||
- **Usage Tracking**: Tracks requests, translations, transcriptions, and TTS usage
|
||||
|
||||
#### Admin Features
|
||||
- Create, update, delete users
|
||||
- Suspend/unsuspend accounts
|
||||
- Reset passwords
|
||||
- Manage user permissions
|
||||
- View login history
|
||||
- Monitor active sessions
|
||||
- Bulk operations
|
||||
|
||||
### 2. Authentication Methods
|
||||
|
||||
#### JWT Authentication
|
||||
- Access tokens (1 hour expiration)
|
||||
- Refresh tokens (30 days expiration)
|
||||
- Token blacklisting for revocation
|
||||
- Secure token storage
|
||||
|
||||
#### API Key Authentication
|
||||
- Bearer token in `X-API-Key` header
|
||||
- Query parameter fallback: `?api_key=tk_xxx`
|
||||
- Per-key rate limiting
|
||||
|
||||
#### Session Management
|
||||
- Track active sessions per user
|
||||
- Session expiration handling
|
||||
- Multi-device support
|
||||
- Session revocation
|
||||
|
||||
### 3. Security Features
|
||||
|
||||
#### Password Security
|
||||
- Bcrypt hashing with salt
|
||||
- Minimum 8 character requirement
|
||||
- Password change tracking
|
||||
- Failed login attempt tracking
|
||||
- Account lockout after 5 failed attempts (30 minutes)
|
||||
|
||||
#### Rate Limiting
|
||||
- User-specific limits (per minute/hour/day)
|
||||
- IP-based fallback for unauthenticated requests
|
||||
- Admin users get 10x higher limits
|
||||
- Endpoint-specific overrides
|
||||
|
||||
#### Audit Trail
|
||||
- Login history with IP and user agent
|
||||
- Success/failure tracking
|
||||
- Suspicious activity flagging
|
||||
- Security event logging
|
||||
|
||||
## Database Schema
|
||||
|
||||
### Users Table
|
||||
```sql
|
||||
- id (UUID, primary key)
|
||||
- email (unique)
|
||||
- username (unique)
|
||||
- password_hash
|
||||
- api_key (unique)
|
||||
- role (admin/user)
|
||||
- is_active, is_verified, is_suspended
|
||||
- rate limits (per_minute, per_hour, per_day)
|
||||
- usage stats (total_requests, translations, etc.)
|
||||
- timestamps (created_at, updated_at, last_login_at)
|
||||
```
|
||||
|
||||
### Login History Table
|
||||
```sql
|
||||
- id (UUID)
|
||||
- user_id (foreign key)
|
||||
- login_at, logout_at
|
||||
- login_method (password/api_key/jwt)
|
||||
- success (boolean)
|
||||
- ip_address, user_agent
|
||||
- session_id, jwt_jti
|
||||
```
|
||||
|
||||
### User Sessions Table
|
||||
```sql
|
||||
- id (UUID)
|
||||
- session_id (unique)
|
||||
- user_id (foreign key)
|
||||
- access_token_jti, refresh_token_jti
|
||||
- created_at, last_active_at, expires_at
|
||||
- ip_address, user_agent
|
||||
```
|
||||
|
||||
### Revoked Tokens Table
|
||||
```sql
|
||||
- id (UUID)
|
||||
- jti (unique, token ID)
|
||||
- token_type (access/refresh)
|
||||
- user_id
|
||||
- revoked_at, expires_at
|
||||
- reason
|
||||
```
|
||||
|
||||
## API Endpoints
|
||||
|
||||
### Authentication Endpoints
|
||||
|
||||
#### POST /api/auth/login
|
||||
Login with username/email and password.
|
||||
```json
|
||||
{
|
||||
"username": "user@example.com",
|
||||
"password": "password123"
|
||||
}
|
||||
```
|
||||
|
||||
Response:
|
||||
```json
|
||||
{
|
||||
"success": true,
|
||||
"user": { ... },
|
||||
"tokens": {
|
||||
"access_token": "eyJ...",
|
||||
"refresh_token": "eyJ...",
|
||||
"expires_in": 3600
|
||||
},
|
||||
"session_id": "uuid"
|
||||
}
|
||||
```
|
||||
|
||||
#### POST /api/auth/logout
|
||||
Logout and revoke current token.
|
||||
|
||||
#### POST /api/auth/refresh
|
||||
Refresh access token using refresh token.
|
||||
|
||||
#### GET /api/auth/profile
|
||||
Get current user profile.
|
||||
|
||||
#### PUT /api/auth/profile
|
||||
Update user profile (name, settings).
|
||||
|
||||
#### POST /api/auth/change-password
|
||||
Change user password.
|
||||
|
||||
#### POST /api/auth/regenerate-api-key
|
||||
Generate new API key.
|
||||
|
||||
### Admin User Management
|
||||
|
||||
#### GET /api/auth/admin/users
|
||||
List all users with filtering and pagination.
|
||||
|
||||
#### POST /api/auth/admin/users
|
||||
Create new user (admin only).
|
||||
|
||||
#### GET /api/auth/admin/users/:id
|
||||
Get user details with login history.
|
||||
|
||||
#### PUT /api/auth/admin/users/:id
|
||||
Update user details.
|
||||
|
||||
#### DELETE /api/auth/admin/users/:id
|
||||
Delete user account.
|
||||
|
||||
#### POST /api/auth/admin/users/:id/suspend
|
||||
Suspend user account.
|
||||
|
||||
#### POST /api/auth/admin/users/:id/reset-password
|
||||
Reset user password.
|
||||
|
||||
## Usage Examples
|
||||
|
||||
### Authenticating Requests
|
||||
|
||||
#### Using JWT Token
|
||||
```bash
|
||||
curl -H "Authorization: Bearer eyJ..." \
|
||||
https://api.talk2me.app/translate
|
||||
```
|
||||
|
||||
#### Using API Key
|
||||
```bash
|
||||
curl -H "X-API-Key: tk_your_api_key" \
|
||||
https://api.talk2me.app/translate
|
||||
```
|
||||
|
||||
### Python Client Example
|
||||
```python
|
||||
import requests
|
||||
|
||||
# Login and get token
|
||||
response = requests.post('https://api.talk2me.app/api/auth/login', json={
|
||||
'username': 'user@example.com',
|
||||
'password': 'password123'
|
||||
})
|
||||
tokens = response.json()['tokens']
|
||||
|
||||
# Use token for requests
|
||||
headers = {'Authorization': f"Bearer {tokens['access_token']}"}
|
||||
translation = requests.post(
|
||||
'https://api.talk2me.app/translate',
|
||||
headers=headers,
|
||||
json={'text': 'Hello', 'target_lang': 'Spanish'}
|
||||
)
|
||||
```
|
||||
|
||||
## Setup Instructions
|
||||
|
||||
### 1. Install Dependencies
|
||||
```bash
|
||||
pip install -r requirements.txt
|
||||
```
|
||||
|
||||
### 2. Initialize Database
|
||||
```bash
|
||||
python init_auth_db.py
|
||||
```
|
||||
|
||||
This will:
|
||||
- Create all database tables
|
||||
- Prompt you to create an admin user
|
||||
- Display the admin's API key
|
||||
|
||||
### 3. Configure Environment
|
||||
Add to your `.env` file:
|
||||
```env
|
||||
JWT_SECRET_KEY=your-secret-key-change-in-production
|
||||
DATABASE_URL=postgresql://user:pass@localhost/talk2me
|
||||
```
|
||||
|
||||
### 4. Run Migrations (if needed)
|
||||
```bash
|
||||
alembic upgrade head
|
||||
```
|
||||
|
||||
## Security Best Practices
|
||||
|
||||
1. **JWT Secret**: Use a strong, random secret key in production
|
||||
2. **HTTPS Only**: Always use HTTPS in production
|
||||
3. **Rate Limiting**: Configure appropriate limits per user role
|
||||
4. **Password Policy**: Enforce strong passwords
|
||||
5. **Session Timeout**: Configure appropriate session durations
|
||||
6. **Audit Logging**: Monitor login attempts and suspicious activity
|
||||
7. **API Key Rotation**: Encourage regular API key rotation
|
||||
8. **Database Security**: Use encrypted connections to database
|
||||
|
||||
## Admin Dashboard
|
||||
|
||||
Access the admin dashboard at `/admin/users` (requires admin login).
|
||||
|
||||
Features:
|
||||
- User list with search and filters
|
||||
- User details with usage statistics
|
||||
- Create/edit/delete users
|
||||
- Suspend/unsuspend accounts
|
||||
- View login history
|
||||
- Monitor active sessions
|
||||
- Bulk operations
|
||||
|
||||
## Rate Limiting
|
||||
|
||||
Default limits:
|
||||
- **Regular Users**: 30/min, 500/hour, 5000/day
|
||||
- **Admin Users**: 300/min, 5000/hour, 50000/day
|
||||
|
||||
Endpoint-specific limits are configured in `user_rate_limiter.py`.
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Common Issues
|
||||
|
||||
1. **"Token expired"**: Refresh token using `/api/auth/refresh`
|
||||
2. **"Account locked"**: Wait 30 minutes or contact admin
|
||||
3. **"Rate limit exceeded"**: Check your usage limits
|
||||
4. **"Invalid API key"**: Regenerate key in profile settings
|
||||
|
||||
### Debug Mode
|
||||
Enable debug logging:
|
||||
```python
|
||||
import logging
|
||||
logging.getLogger('auth').setLevel(logging.DEBUG)
|
||||
```
|
||||
|
||||
## Future Enhancements
|
||||
|
||||
- [ ] OAuth2 integration (Google, GitHub)
|
||||
- [ ] Two-factor authentication
|
||||
- [ ] Email verification workflow
|
||||
- [ ] Password reset via email
|
||||
- [ ] User groups and team management
|
||||
- [ ] Fine-grained permissions
|
||||
- [ ] API key scopes
|
||||
- [ ] Usage quotas and billing
|
302
DATABASE_INTEGRATION.md
Normal file
302
DATABASE_INTEGRATION.md
Normal file
@ -0,0 +1,302 @@
|
||||
# Database Integration Guide
|
||||
|
||||
This guide explains the Redis and PostgreSQL integration for the Talk2Me application.
|
||||
|
||||
## Overview
|
||||
|
||||
The Talk2Me application now uses:
|
||||
- **PostgreSQL**: For persistent storage of translations, transcriptions, user preferences, and analytics
|
||||
- **Redis**: For caching, session management, and rate limiting
|
||||
|
||||
## Architecture
|
||||
|
||||
### PostgreSQL Database Schema
|
||||
|
||||
1. **translations** - Stores translation history
|
||||
- Source and target text
|
||||
- Languages
|
||||
- Translation time and model used
|
||||
- Session and user tracking
|
||||
|
||||
2. **transcriptions** - Stores transcription history
|
||||
- Transcribed text
|
||||
- Detected language
|
||||
- Audio metadata
|
||||
- Performance metrics
|
||||
|
||||
3. **user_preferences** - Stores user settings
|
||||
- Preferred languages
|
||||
- Voice preferences
|
||||
- Usage statistics
|
||||
|
||||
4. **usage_analytics** - Aggregated analytics
|
||||
- Hourly and daily metrics
|
||||
- Service performance
|
||||
- Language pair statistics
|
||||
|
||||
5. **api_keys** - API key management
|
||||
- Rate limits
|
||||
- Permissions
|
||||
- Usage tracking
|
||||
|
||||
### Redis Usage
|
||||
|
||||
1. **Translation Cache**
|
||||
- Key: `translation:{source_lang}:{target_lang}:{text_hash}`
|
||||
- Expires: 24 hours
|
||||
- Reduces API calls to Ollama
|
||||
|
||||
2. **Session Management**
|
||||
- Key: `session:{session_id}`
|
||||
- Stores session data and resources
|
||||
- Expires: 1 hour (configurable)
|
||||
|
||||
3. **Rate Limiting**
|
||||
- Token bucket implementation
|
||||
- Per-client and global limits
|
||||
- Sliding window tracking
|
||||
|
||||
4. **Push Subscriptions**
|
||||
- Set: `push_subscriptions`
|
||||
- Individual subscriptions: `push_subscription:{id}`
|
||||
|
||||
## Setup Instructions
|
||||
|
||||
### Prerequisites
|
||||
|
||||
1. Install PostgreSQL:
|
||||
```bash
|
||||
# Ubuntu/Debian
|
||||
sudo apt-get install postgresql postgresql-contrib
|
||||
|
||||
# MacOS
|
||||
brew install postgresql
|
||||
```
|
||||
|
||||
2. Install Redis:
|
||||
```bash
|
||||
# Ubuntu/Debian
|
||||
sudo apt-get install redis-server
|
||||
|
||||
# MacOS
|
||||
brew install redis
|
||||
```
|
||||
|
||||
3. Install Python dependencies:
|
||||
```bash
|
||||
pip install -r requirements.txt
|
||||
```
|
||||
|
||||
### Quick Setup
|
||||
|
||||
Run the setup script:
|
||||
```bash
|
||||
./setup_databases.sh
|
||||
```
|
||||
|
||||
### Manual Setup
|
||||
|
||||
1. Create PostgreSQL database:
|
||||
```bash
|
||||
createdb talk2me
|
||||
```
|
||||
|
||||
2. Start Redis:
|
||||
```bash
|
||||
redis-server
|
||||
```
|
||||
|
||||
3. Create .env file with database URLs:
|
||||
```env
|
||||
DATABASE_URL=postgresql://username@localhost/talk2me
|
||||
REDIS_URL=redis://localhost:6379/0
|
||||
```
|
||||
|
||||
4. Initialize database:
|
||||
```bash
|
||||
python database_init.py
|
||||
```
|
||||
|
||||
5. Run migrations:
|
||||
```bash
|
||||
python migrations.py init
|
||||
python migrations.py create "Initial migration"
|
||||
python migrations.py run
|
||||
```
|
||||
|
||||
## Configuration
|
||||
|
||||
### Environment Variables
|
||||
|
||||
```env
|
||||
# PostgreSQL
|
||||
DATABASE_URL=postgresql://username:password@host:port/database
|
||||
SQLALCHEMY_DATABASE_URI=${DATABASE_URL}
|
||||
SQLALCHEMY_ENGINE_OPTIONS_POOL_SIZE=10
|
||||
SQLALCHEMY_ENGINE_OPTIONS_POOL_RECYCLE=3600
|
||||
|
||||
# Redis
|
||||
REDIS_URL=redis://localhost:6379/0
|
||||
REDIS_DECODE_RESPONSES=false
|
||||
REDIS_MAX_CONNECTIONS=50
|
||||
REDIS_SOCKET_TIMEOUT=5
|
||||
|
||||
# Session Management
|
||||
MAX_SESSION_DURATION=3600
|
||||
MAX_SESSION_IDLE_TIME=900
|
||||
MAX_RESOURCES_PER_SESSION=100
|
||||
MAX_BYTES_PER_SESSION=104857600
|
||||
```
|
||||
|
||||
## Migration from In-Memory to Database
|
||||
|
||||
### What Changed
|
||||
|
||||
1. **Rate Limiting**
|
||||
- Before: In-memory dictionaries
|
||||
- After: Redis sorted sets and hashes
|
||||
|
||||
2. **Session Management**
|
||||
- Before: In-memory session storage
|
||||
- After: Redis with automatic expiration
|
||||
|
||||
3. **Translation Cache**
|
||||
- Before: Client-side IndexedDB only
|
||||
- After: Server-side Redis cache + client cache
|
||||
|
||||
4. **Analytics**
|
||||
- Before: No persistent analytics
|
||||
- After: PostgreSQL aggregated metrics
|
||||
|
||||
### Migration Steps
|
||||
|
||||
1. Backup current app.py:
|
||||
```bash
|
||||
cp app.py app_backup.py
|
||||
```
|
||||
|
||||
2. Use the new app with database support:
|
||||
```bash
|
||||
cp app_with_db.py app.py
|
||||
```
|
||||
|
||||
3. Update any custom configurations in the new app.py
|
||||
|
||||
## API Changes
|
||||
|
||||
### New Endpoints
|
||||
|
||||
- `/api/history/translations` - Get translation history
|
||||
- `/api/history/transcriptions` - Get transcription history
|
||||
- `/api/preferences` - Get/update user preferences
|
||||
- `/api/analytics` - Get usage analytics
|
||||
|
||||
### Enhanced Features
|
||||
|
||||
1. **Translation Caching**
|
||||
- Automatic server-side caching
|
||||
- Reduced response time for repeated translations
|
||||
|
||||
2. **Session Persistence**
|
||||
- Sessions survive server restarts
|
||||
- Better resource tracking
|
||||
|
||||
3. **Improved Rate Limiting**
|
||||
- Distributed rate limiting across multiple servers
|
||||
- More accurate tracking
|
||||
|
||||
## Performance Considerations
|
||||
|
||||
1. **Database Indexes**
|
||||
- Indexes on session_id, user_id, languages
|
||||
- Composite indexes for common queries
|
||||
|
||||
2. **Redis Memory Usage**
|
||||
- Monitor with: `redis-cli info memory`
|
||||
- Configure maxmemory policy
|
||||
|
||||
3. **Connection Pooling**
|
||||
- PostgreSQL: 10 connections default
|
||||
- Redis: 50 connections default
|
||||
|
||||
## Monitoring
|
||||
|
||||
### PostgreSQL
|
||||
```sql
|
||||
-- Check database size
|
||||
SELECT pg_database_size('talk2me');
|
||||
|
||||
-- Active connections
|
||||
SELECT count(*) FROM pg_stat_activity;
|
||||
|
||||
-- Slow queries
|
||||
SELECT * FROM pg_stat_statements ORDER BY mean_time DESC LIMIT 10;
|
||||
```
|
||||
|
||||
### Redis
|
||||
```bash
|
||||
# Memory usage
|
||||
redis-cli info memory
|
||||
|
||||
# Connected clients
|
||||
redis-cli info clients
|
||||
|
||||
# Monitor commands
|
||||
redis-cli monitor
|
||||
```
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Common Issues
|
||||
|
||||
1. **PostgreSQL Connection Failed**
|
||||
- Check if PostgreSQL is running: `sudo systemctl status postgresql`
|
||||
- Verify DATABASE_URL in .env
|
||||
- Check pg_hba.conf for authentication
|
||||
|
||||
2. **Redis Connection Failed**
|
||||
- Check if Redis is running: `redis-cli ping`
|
||||
- Verify REDIS_URL in .env
|
||||
- Check Redis logs: `sudo journalctl -u redis`
|
||||
|
||||
3. **Migration Errors**
|
||||
- Drop and recreate database if needed
|
||||
- Check migration files in `migrations/`
|
||||
- Run `python migrations.py init` to reinitialize
|
||||
|
||||
## Backup and Restore
|
||||
|
||||
### PostgreSQL Backup
|
||||
```bash
|
||||
# Backup
|
||||
pg_dump talk2me > talk2me_backup.sql
|
||||
|
||||
# Restore
|
||||
psql talk2me < talk2me_backup.sql
|
||||
```
|
||||
|
||||
### Redis Backup
|
||||
```bash
|
||||
# Backup (if persistence enabled)
|
||||
redis-cli BGSAVE
|
||||
|
||||
# Copy dump.rdb file
|
||||
cp /var/lib/redis/dump.rdb redis_backup.rdb
|
||||
```
|
||||
|
||||
## Security Notes
|
||||
|
||||
1. **Database Credentials**
|
||||
- Never commit .env file
|
||||
- Use strong passwords
|
||||
- Limit database user permissions
|
||||
|
||||
2. **Redis Security**
|
||||
- Consider enabling Redis AUTH
|
||||
- Bind to localhost only
|
||||
- Use SSL for remote connections
|
||||
|
||||
3. **Data Privacy**
|
||||
- Translations/transcriptions contain user data
|
||||
- Implement data retention policies
|
||||
- Consider encryption at rest
|
42
README.md
42
README.md
@ -11,6 +11,7 @@ A production-ready, mobile-friendly web application that provides real-time tran
|
||||
- **Multi-Speaker Support**: Track and translate conversations with multiple participants
|
||||
- **Enterprise Security**: Comprehensive rate limiting, session management, and encrypted secrets
|
||||
- **Production Ready**: Docker support, load balancing, and extensive monitoring
|
||||
- **Admin Dashboard**: Real-time analytics, performance monitoring, and system health tracking
|
||||
|
||||
## Table of Contents
|
||||
|
||||
@ -538,6 +539,47 @@ All admin endpoints require `X-Admin-Token` header.
|
||||
- `POST /admin/block-ip` - Block IP address
|
||||
- `GET /admin/logs/security` - Security events
|
||||
|
||||
## Admin Dashboard
|
||||
|
||||
Talk2Me includes a comprehensive admin analytics dashboard for monitoring and managing the application.
|
||||
|
||||
### Features
|
||||
|
||||
- **Real-time Analytics**: Monitor requests, active sessions, and error rates
|
||||
- **Performance Metrics**: Track response times, throughput, and resource usage
|
||||
- **System Health**: Monitor Redis, PostgreSQL, and ML services status
|
||||
- **Language Analytics**: View popular language pairs and usage patterns
|
||||
- **Error Analysis**: Detailed error tracking with types and trends
|
||||
- **Data Export**: Download analytics data in JSON format
|
||||
|
||||
### Setup
|
||||
|
||||
1. **Initialize Database**:
|
||||
```bash
|
||||
python init_analytics_db.py
|
||||
```
|
||||
|
||||
2. **Configure Admin Token**:
|
||||
```bash
|
||||
export ADMIN_TOKEN="your-secure-admin-token"
|
||||
```
|
||||
|
||||
3. **Access Dashboard**:
|
||||
- Navigate to `https://yourdomain.com/admin`
|
||||
- Enter your admin token
|
||||
- View real-time analytics
|
||||
|
||||
### Dashboard Sections
|
||||
|
||||
- **Overview Cards**: Key metrics at a glance
|
||||
- **Request Volume**: Visualize traffic patterns
|
||||
- **Operations**: Translation and transcription statistics
|
||||
- **Performance**: Response time percentiles (P95, P99)
|
||||
- **Error Tracking**: Error types and recent issues
|
||||
- **System Health**: Component status monitoring
|
||||
|
||||
For detailed admin documentation, see [ADMIN_DASHBOARD.md](ADMIN_DASHBOARD.md).
|
||||
|
||||
## Development
|
||||
|
||||
### TypeScript Development
|
||||
|
534
admin/__init__.py
Normal file
534
admin/__init__.py
Normal file
@ -0,0 +1,534 @@
|
||||
from flask import Blueprint, request, jsonify, render_template, redirect, url_for, session
|
||||
from functools import wraps
|
||||
import os
|
||||
import logging
|
||||
import json
|
||||
from datetime import datetime, timedelta
|
||||
import redis
|
||||
import psycopg2
|
||||
from psycopg2.extras import RealDictCursor
|
||||
import time
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Create admin blueprint
|
||||
admin_bp = Blueprint('admin', __name__,
|
||||
template_folder='templates',
|
||||
static_folder='static',
|
||||
static_url_path='/admin/static')
|
||||
|
||||
# Initialize Redis and PostgreSQL connections
|
||||
redis_client = None
|
||||
pg_conn = None
|
||||
|
||||
def init_admin(app):
|
||||
"""Initialize admin module with app configuration"""
|
||||
global redis_client, pg_conn
|
||||
|
||||
try:
|
||||
# Initialize Redis
|
||||
redis_client = redis.from_url(
|
||||
app.config.get('REDIS_URL', 'redis://localhost:6379/0'),
|
||||
decode_responses=True
|
||||
)
|
||||
redis_client.ping()
|
||||
logger.info("Redis connection established for admin dashboard")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to connect to Redis: {e}")
|
||||
redis_client = None
|
||||
|
||||
try:
|
||||
# Initialize PostgreSQL
|
||||
pg_conn = psycopg2.connect(
|
||||
app.config.get('DATABASE_URL', 'postgresql://localhost/talk2me')
|
||||
)
|
||||
logger.info("PostgreSQL connection established for admin dashboard")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to connect to PostgreSQL: {e}")
|
||||
pg_conn = None
|
||||
|
||||
def admin_required(f):
|
||||
"""Decorator to require admin authentication"""
|
||||
@wraps(f)
|
||||
def decorated_function(*args, **kwargs):
|
||||
# Check if user is logged in with admin role (from unified login)
|
||||
user_role = session.get('user_role')
|
||||
if user_role == 'admin':
|
||||
return f(*args, **kwargs)
|
||||
|
||||
# Also support the old admin token for backward compatibility
|
||||
auth_token = request.headers.get('X-Admin-Token')
|
||||
session_token = session.get('admin_token')
|
||||
expected_token = os.environ.get('ADMIN_TOKEN', 'default-admin-token')
|
||||
|
||||
if auth_token == expected_token or session_token == expected_token:
|
||||
if auth_token == expected_token:
|
||||
session['admin_token'] = expected_token
|
||||
return f(*args, **kwargs)
|
||||
|
||||
# For API endpoints, return JSON error
|
||||
if request.path.startswith('/admin/api/'):
|
||||
return jsonify({'error': 'Unauthorized'}), 401
|
||||
|
||||
# For web pages, redirect to unified login
|
||||
return redirect(url_for('login', next=request.url))
|
||||
|
||||
return decorated_function
|
||||
|
||||
@admin_bp.route('/login', methods=['GET', 'POST'])
|
||||
def login():
|
||||
"""Admin login - redirect to main login page"""
|
||||
# Redirect to the unified login page
|
||||
next_url = request.args.get('next', url_for('admin.dashboard'))
|
||||
return redirect(url_for('login', next=next_url))
|
||||
|
||||
@admin_bp.route('/logout')
|
||||
def logout():
|
||||
"""Admin logout - redirect to main logout"""
|
||||
# Clear all session data
|
||||
session.clear()
|
||||
return redirect(url_for('index'))
|
||||
|
||||
@admin_bp.route('/')
|
||||
@admin_bp.route('/dashboard')
|
||||
@admin_required
|
||||
def dashboard():
|
||||
"""Main admin dashboard"""
|
||||
return render_template('dashboard.html')
|
||||
|
||||
@admin_bp.route('/users')
|
||||
@admin_required
|
||||
def users():
|
||||
"""User management page"""
|
||||
# The template is in the main templates folder, not admin/templates
|
||||
return render_template('admin_users.html')
|
||||
|
||||
# Analytics API endpoints
|
||||
@admin_bp.route('/api/stats/overview')
|
||||
@admin_required
|
||||
def get_overview_stats():
|
||||
"""Get overview statistics"""
|
||||
try:
|
||||
stats = {
|
||||
'requests': {'total': 0, 'today': 0, 'hour': 0},
|
||||
'translations': {'total': 0, 'today': 0},
|
||||
'transcriptions': {'total': 0, 'today': 0},
|
||||
'active_sessions': 0,
|
||||
'error_rate': 0,
|
||||
'cache_hit_rate': 0,
|
||||
'system_health': check_system_health()
|
||||
}
|
||||
|
||||
# Get data from Redis
|
||||
if redis_client:
|
||||
try:
|
||||
# Request counts
|
||||
stats['requests']['total'] = int(redis_client.get('stats:requests:total') or 0)
|
||||
stats['requests']['today'] = int(redis_client.get(f'stats:requests:daily:{datetime.now().strftime("%Y-%m-%d")}') or 0)
|
||||
stats['requests']['hour'] = int(redis_client.get(f'stats:requests:hourly:{datetime.now().strftime("%Y-%m-%d-%H")}') or 0)
|
||||
|
||||
# Operation counts
|
||||
stats['translations']['total'] = int(redis_client.get('stats:translations:total') or 0)
|
||||
stats['translations']['today'] = int(redis_client.get(f'stats:translations:daily:{datetime.now().strftime("%Y-%m-%d")}') or 0)
|
||||
stats['transcriptions']['total'] = int(redis_client.get('stats:transcriptions:total') or 0)
|
||||
stats['transcriptions']['today'] = int(redis_client.get(f'stats:transcriptions:daily:{datetime.now().strftime("%Y-%m-%d")}') or 0)
|
||||
|
||||
# Active sessions
|
||||
stats['active_sessions'] = len(redis_client.keys('session:*'))
|
||||
|
||||
# Cache stats
|
||||
cache_hits = int(redis_client.get('stats:cache:hits') or 0)
|
||||
cache_misses = int(redis_client.get('stats:cache:misses') or 0)
|
||||
if cache_hits + cache_misses > 0:
|
||||
stats['cache_hit_rate'] = round((cache_hits / (cache_hits + cache_misses)) * 100, 2)
|
||||
|
||||
# Error rate
|
||||
total_requests = stats['requests']['today']
|
||||
errors_today = int(redis_client.get(f'stats:errors:daily:{datetime.now().strftime("%Y-%m-%d")}') or 0)
|
||||
if total_requests > 0:
|
||||
stats['error_rate'] = round((errors_today / total_requests) * 100, 2)
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error fetching Redis stats: {e}")
|
||||
|
||||
return jsonify(stats)
|
||||
except Exception as e:
|
||||
logger.error(f"Error in get_overview_stats: {e}")
|
||||
return jsonify({'error': str(e)}), 500
|
||||
|
||||
@admin_bp.route('/api/stats/requests/<timeframe>')
|
||||
@admin_required
|
||||
def get_request_stats(timeframe):
|
||||
"""Get request statistics for different timeframes"""
|
||||
try:
|
||||
if timeframe not in ['minute', 'hour', 'day']:
|
||||
return jsonify({'error': 'Invalid timeframe'}), 400
|
||||
|
||||
data = []
|
||||
labels = []
|
||||
|
||||
if redis_client:
|
||||
now = datetime.now()
|
||||
|
||||
if timeframe == 'minute':
|
||||
# Last 60 minutes
|
||||
for i in range(59, -1, -1):
|
||||
time_key = (now - timedelta(minutes=i)).strftime('%Y-%m-%d-%H-%M')
|
||||
count = int(redis_client.get(f'stats:requests:minute:{time_key}') or 0)
|
||||
data.append(count)
|
||||
labels.append((now - timedelta(minutes=i)).strftime('%H:%M'))
|
||||
|
||||
elif timeframe == 'hour':
|
||||
# Last 24 hours
|
||||
for i in range(23, -1, -1):
|
||||
time_key = (now - timedelta(hours=i)).strftime('%Y-%m-%d-%H')
|
||||
count = int(redis_client.get(f'stats:requests:hourly:{time_key}') or 0)
|
||||
data.append(count)
|
||||
labels.append((now - timedelta(hours=i)).strftime('%H:00'))
|
||||
|
||||
elif timeframe == 'day':
|
||||
# Last 30 days
|
||||
for i in range(29, -1, -1):
|
||||
time_key = (now - timedelta(days=i)).strftime('%Y-%m-%d')
|
||||
count = int(redis_client.get(f'stats:requests:daily:{time_key}') or 0)
|
||||
data.append(count)
|
||||
labels.append((now - timedelta(days=i)).strftime('%m/%d'))
|
||||
|
||||
return jsonify({
|
||||
'labels': labels,
|
||||
'data': data,
|
||||
'timeframe': timeframe
|
||||
})
|
||||
except Exception as e:
|
||||
logger.error(f"Error in get_request_stats: {e}")
|
||||
return jsonify({'error': str(e)}), 500
|
||||
|
||||
@admin_bp.route('/api/stats/operations')
|
||||
@admin_required
|
||||
def get_operation_stats():
|
||||
"""Get translation and transcription statistics"""
|
||||
try:
|
||||
stats = {
|
||||
'translations': {'data': [], 'labels': []},
|
||||
'transcriptions': {'data': [], 'labels': []},
|
||||
'language_pairs': {},
|
||||
'response_times': {'translation': [], 'transcription': []}
|
||||
}
|
||||
|
||||
if redis_client:
|
||||
now = datetime.now()
|
||||
|
||||
# Get daily stats for last 7 days
|
||||
for i in range(6, -1, -1):
|
||||
date_key = (now - timedelta(days=i)).strftime('%Y-%m-%d')
|
||||
date_label = (now - timedelta(days=i)).strftime('%m/%d')
|
||||
|
||||
# Translation counts
|
||||
trans_count = int(redis_client.get(f'stats:translations:daily:{date_key}') or 0)
|
||||
stats['translations']['data'].append(trans_count)
|
||||
stats['translations']['labels'].append(date_label)
|
||||
|
||||
# Transcription counts
|
||||
transcr_count = int(redis_client.get(f'stats:transcriptions:daily:{date_key}') or 0)
|
||||
stats['transcriptions']['data'].append(transcr_count)
|
||||
stats['transcriptions']['labels'].append(date_label)
|
||||
|
||||
# Get language pair statistics
|
||||
lang_pairs = redis_client.hgetall('stats:language_pairs') or {}
|
||||
stats['language_pairs'] = {k: int(v) for k, v in lang_pairs.items()}
|
||||
|
||||
# Get response times (last 100 operations)
|
||||
trans_times = redis_client.lrange('stats:response_times:translation', 0, 99)
|
||||
transcr_times = redis_client.lrange('stats:response_times:transcription', 0, 99)
|
||||
|
||||
stats['response_times']['translation'] = [float(t) for t in trans_times[:20]]
|
||||
stats['response_times']['transcription'] = [float(t) for t in transcr_times[:20]]
|
||||
|
||||
return jsonify(stats)
|
||||
except Exception as e:
|
||||
logger.error(f"Error in get_operation_stats: {e}")
|
||||
return jsonify({'error': str(e)}), 500
|
||||
|
||||
@admin_bp.route('/api/stats/errors')
|
||||
@admin_required
|
||||
def get_error_stats():
|
||||
"""Get error statistics"""
|
||||
try:
|
||||
stats = {
|
||||
'error_types': {},
|
||||
'error_timeline': {'data': [], 'labels': []},
|
||||
'recent_errors': []
|
||||
}
|
||||
|
||||
if pg_conn:
|
||||
try:
|
||||
with pg_conn.cursor(cursor_factory=RealDictCursor) as cursor:
|
||||
# Get error types distribution
|
||||
cursor.execute("""
|
||||
SELECT error_type, COUNT(*) as count
|
||||
FROM error_logs
|
||||
WHERE created_at > NOW() - INTERVAL '24 hours'
|
||||
GROUP BY error_type
|
||||
ORDER BY count DESC
|
||||
LIMIT 10
|
||||
""")
|
||||
error_types = cursor.fetchall()
|
||||
stats['error_types'] = {row['error_type']: row['count'] for row in error_types}
|
||||
|
||||
# Get error timeline (hourly for last 24 hours)
|
||||
cursor.execute("""
|
||||
SELECT
|
||||
DATE_TRUNC('hour', created_at) as hour,
|
||||
COUNT(*) as count
|
||||
FROM error_logs
|
||||
WHERE created_at > NOW() - INTERVAL '24 hours'
|
||||
GROUP BY hour
|
||||
ORDER BY hour
|
||||
""")
|
||||
timeline = cursor.fetchall()
|
||||
|
||||
for row in timeline:
|
||||
stats['error_timeline']['labels'].append(row['hour'].strftime('%H:00'))
|
||||
stats['error_timeline']['data'].append(row['count'])
|
||||
|
||||
# Get recent errors
|
||||
cursor.execute("""
|
||||
SELECT
|
||||
error_type,
|
||||
error_message,
|
||||
endpoint,
|
||||
created_at
|
||||
FROM error_logs
|
||||
ORDER BY created_at DESC
|
||||
LIMIT 10
|
||||
""")
|
||||
recent = cursor.fetchall()
|
||||
stats['recent_errors'] = [
|
||||
{
|
||||
'type': row['error_type'],
|
||||
'message': row['error_message'][:100],
|
||||
'endpoint': row['endpoint'],
|
||||
'time': row['created_at'].isoformat()
|
||||
}
|
||||
for row in recent
|
||||
]
|
||||
except Exception as e:
|
||||
logger.error(f"Error querying PostgreSQL: {e}")
|
||||
|
||||
# Fallback to Redis if PostgreSQL fails
|
||||
if not stats['error_types'] and redis_client:
|
||||
error_types = redis_client.hgetall('stats:error_types') or {}
|
||||
stats['error_types'] = {k: int(v) for k, v in error_types.items()}
|
||||
|
||||
# Get hourly error counts
|
||||
now = datetime.now()
|
||||
for i in range(23, -1, -1):
|
||||
hour_key = (now - timedelta(hours=i)).strftime('%Y-%m-%d-%H')
|
||||
count = int(redis_client.get(f'stats:errors:hourly:{hour_key}') or 0)
|
||||
stats['error_timeline']['data'].append(count)
|
||||
stats['error_timeline']['labels'].append((now - timedelta(hours=i)).strftime('%H:00'))
|
||||
|
||||
return jsonify(stats)
|
||||
except Exception as e:
|
||||
logger.error(f"Error in get_error_stats: {e}")
|
||||
return jsonify({'error': str(e)}), 500
|
||||
|
||||
@admin_bp.route('/api/stats/performance')
|
||||
@admin_required
|
||||
def get_performance_stats():
|
||||
"""Get performance metrics"""
|
||||
try:
|
||||
stats = {
|
||||
'response_times': {
|
||||
'translation': {'avg': 0, 'p95': 0, 'p99': 0},
|
||||
'transcription': {'avg': 0, 'p95': 0, 'p99': 0},
|
||||
'tts': {'avg': 0, 'p95': 0, 'p99': 0}
|
||||
},
|
||||
'throughput': {'data': [], 'labels': []},
|
||||
'slow_requests': []
|
||||
}
|
||||
|
||||
if redis_client:
|
||||
# Calculate response time percentiles
|
||||
for operation in ['translation', 'transcription', 'tts']:
|
||||
times = redis_client.lrange(f'stats:response_times:{operation}', 0, -1)
|
||||
if times:
|
||||
times = sorted([float(t) for t in times])
|
||||
stats['response_times'][operation]['avg'] = round(sum(times) / len(times), 2)
|
||||
stats['response_times'][operation]['p95'] = round(times[int(len(times) * 0.95)], 2)
|
||||
stats['response_times'][operation]['p99'] = round(times[int(len(times) * 0.99)], 2)
|
||||
|
||||
# Get throughput (requests per minute for last hour)
|
||||
now = datetime.now()
|
||||
for i in range(59, -1, -1):
|
||||
time_key = (now - timedelta(minutes=i)).strftime('%Y-%m-%d-%H-%M')
|
||||
count = int(redis_client.get(f'stats:requests:minute:{time_key}') or 0)
|
||||
stats['throughput']['data'].append(count)
|
||||
stats['throughput']['labels'].append((now - timedelta(minutes=i)).strftime('%H:%M'))
|
||||
|
||||
# Get slow requests
|
||||
slow_requests = redis_client.lrange('stats:slow_requests', 0, 9)
|
||||
stats['slow_requests'] = [json.loads(req) for req in slow_requests if req]
|
||||
|
||||
return jsonify(stats)
|
||||
except Exception as e:
|
||||
logger.error(f"Error in get_performance_stats: {e}")
|
||||
return jsonify({'error': str(e)}), 500
|
||||
|
||||
@admin_bp.route('/api/export/<data_type>')
|
||||
@admin_required
|
||||
def export_data(data_type):
|
||||
"""Export analytics data"""
|
||||
try:
|
||||
if data_type not in ['requests', 'errors', 'performance', 'all']:
|
||||
return jsonify({'error': 'Invalid data type'}), 400
|
||||
|
||||
export_data = {
|
||||
'export_time': datetime.now().isoformat(),
|
||||
'data_type': data_type
|
||||
}
|
||||
|
||||
if data_type in ['requests', 'all']:
|
||||
# Export request data
|
||||
request_data = []
|
||||
if redis_client:
|
||||
# Get daily stats for last 30 days
|
||||
now = datetime.now()
|
||||
for i in range(29, -1, -1):
|
||||
date_key = (now - timedelta(days=i)).strftime('%Y-%m-%d')
|
||||
request_data.append({
|
||||
'date': date_key,
|
||||
'requests': int(redis_client.get(f'stats:requests:daily:{date_key}') or 0),
|
||||
'translations': int(redis_client.get(f'stats:translations:daily:{date_key}') or 0),
|
||||
'transcriptions': int(redis_client.get(f'stats:transcriptions:daily:{date_key}') or 0),
|
||||
'errors': int(redis_client.get(f'stats:errors:daily:{date_key}') or 0)
|
||||
})
|
||||
export_data['requests'] = request_data
|
||||
|
||||
if data_type in ['errors', 'all']:
|
||||
# Export error data from PostgreSQL
|
||||
error_data = []
|
||||
if pg_conn:
|
||||
try:
|
||||
with pg_conn.cursor(cursor_factory=RealDictCursor) as cursor:
|
||||
cursor.execute("""
|
||||
SELECT * FROM error_logs
|
||||
WHERE created_at > NOW() - INTERVAL '7 days'
|
||||
ORDER BY created_at DESC
|
||||
""")
|
||||
errors = cursor.fetchall()
|
||||
error_data = [dict(row) for row in errors]
|
||||
except Exception as e:
|
||||
logger.error(f"Error exporting from PostgreSQL: {e}")
|
||||
export_data['errors'] = error_data
|
||||
|
||||
if data_type in ['performance', 'all']:
|
||||
# Export performance data
|
||||
perf_data = {
|
||||
'response_times': {},
|
||||
'slow_requests': []
|
||||
}
|
||||
if redis_client:
|
||||
for op in ['translation', 'transcription', 'tts']:
|
||||
times = redis_client.lrange(f'stats:response_times:{op}', 0, -1)
|
||||
perf_data['response_times'][op] = [float(t) for t in times]
|
||||
|
||||
slow_reqs = redis_client.lrange('stats:slow_requests', 0, -1)
|
||||
perf_data['slow_requests'] = [json.loads(req) for req in slow_reqs if req]
|
||||
|
||||
export_data['performance'] = perf_data
|
||||
|
||||
# Return as downloadable JSON
|
||||
response = jsonify(export_data)
|
||||
response.headers['Content-Disposition'] = f'attachment; filename=talk2me_analytics_{data_type}_{datetime.now().strftime("%Y%m%d_%H%M%S")}.json'
|
||||
return response
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error in export_data: {e}")
|
||||
return jsonify({'error': str(e)}), 500
|
||||
|
||||
def check_system_health():
|
||||
"""Check health of system components"""
|
||||
health = {
|
||||
'redis': 'unknown',
|
||||
'postgresql': 'unknown',
|
||||
'overall': 'healthy'
|
||||
}
|
||||
|
||||
# Check Redis
|
||||
if redis_client:
|
||||
try:
|
||||
redis_client.ping()
|
||||
health['redis'] = 'healthy'
|
||||
except:
|
||||
health['redis'] = 'unhealthy'
|
||||
health['overall'] = 'degraded'
|
||||
else:
|
||||
health['redis'] = 'not_configured'
|
||||
health['overall'] = 'degraded'
|
||||
|
||||
# Check PostgreSQL
|
||||
if pg_conn:
|
||||
try:
|
||||
with pg_conn.cursor() as cursor:
|
||||
cursor.execute("SELECT 1")
|
||||
cursor.fetchone()
|
||||
health['postgresql'] = 'healthy'
|
||||
except:
|
||||
health['postgresql'] = 'unhealthy'
|
||||
health['overall'] = 'degraded'
|
||||
else:
|
||||
health['postgresql'] = 'not_configured'
|
||||
health['overall'] = 'degraded'
|
||||
|
||||
return health
|
||||
|
||||
# WebSocket support for real-time updates (using Server-Sent Events as fallback)
|
||||
@admin_bp.route('/api/stream/updates')
|
||||
@admin_required
|
||||
def stream_updates():
|
||||
"""Stream real-time updates using Server-Sent Events"""
|
||||
def generate():
|
||||
last_update = time.time()
|
||||
|
||||
while True:
|
||||
# Send update every 5 seconds
|
||||
if time.time() - last_update > 5:
|
||||
try:
|
||||
# Get current stats
|
||||
stats = {
|
||||
'timestamp': datetime.now().isoformat(),
|
||||
'requests_per_minute': 0,
|
||||
'active_sessions': 0,
|
||||
'recent_errors': 0
|
||||
}
|
||||
|
||||
if redis_client:
|
||||
# Current requests per minute
|
||||
current_minute = datetime.now().strftime('%Y-%m-%d-%H-%M')
|
||||
stats['requests_per_minute'] = int(redis_client.get(f'stats:requests:minute:{current_minute}') or 0)
|
||||
|
||||
# Active sessions
|
||||
stats['active_sessions'] = len(redis_client.keys('session:*'))
|
||||
|
||||
# Recent errors
|
||||
current_hour = datetime.now().strftime('%Y-%m-%d-%H')
|
||||
stats['recent_errors'] = int(redis_client.get(f'stats:errors:hourly:{current_hour}') or 0)
|
||||
|
||||
yield f"data: {json.dumps(stats)}\n\n"
|
||||
last_update = time.time()
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error in stream_updates: {e}")
|
||||
yield f"data: {json.dumps({'error': str(e)})}\n\n"
|
||||
|
||||
time.sleep(1)
|
||||
|
||||
return app.response_class(
|
||||
generate(),
|
||||
mimetype='text/event-stream',
|
||||
headers={
|
||||
'Cache-Control': 'no-cache',
|
||||
'X-Accel-Buffering': 'no'
|
||||
}
|
||||
)
|
192
admin/static/css/admin.css
Normal file
192
admin/static/css/admin.css
Normal file
@ -0,0 +1,192 @@
|
||||
/* Admin Dashboard Styles */
|
||||
|
||||
body {
|
||||
background-color: #f8f9fa;
|
||||
padding-top: 56px; /* For fixed navbar */
|
||||
}
|
||||
|
||||
/* Cards */
|
||||
.card {
|
||||
border: none;
|
||||
box-shadow: 0 0.125rem 0.25rem rgba(0, 0, 0, 0.075);
|
||||
transition: transform 0.2s;
|
||||
}
|
||||
|
||||
.card:hover {
|
||||
transform: translateY(-2px);
|
||||
box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15);
|
||||
}
|
||||
|
||||
.card-header {
|
||||
background-color: #fff;
|
||||
border-bottom: 1px solid #e3e6f0;
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
/* Status Badges */
|
||||
.badge {
|
||||
padding: 0.375rem 0.75rem;
|
||||
font-weight: normal;
|
||||
}
|
||||
|
||||
.badge.bg-success {
|
||||
background-color: #1cc88a !important;
|
||||
}
|
||||
|
||||
.badge.bg-warning {
|
||||
background-color: #f6c23e !important;
|
||||
color: #000;
|
||||
}
|
||||
|
||||
.badge.bg-danger {
|
||||
background-color: #e74a3b !important;
|
||||
}
|
||||
|
||||
/* Charts */
|
||||
canvas {
|
||||
max-width: 100%;
|
||||
}
|
||||
|
||||
/* Tables */
|
||||
.table {
|
||||
font-size: 0.875rem;
|
||||
}
|
||||
|
||||
.table th {
|
||||
font-weight: 600;
|
||||
text-transform: uppercase;
|
||||
font-size: 0.75rem;
|
||||
color: #6c757d;
|
||||
}
|
||||
|
||||
/* Login Page */
|
||||
.login-container {
|
||||
min-height: 100vh;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
}
|
||||
|
||||
/* Responsive adjustments */
|
||||
@media (max-width: 768px) {
|
||||
.card-body h2 {
|
||||
font-size: 1.5rem;
|
||||
}
|
||||
|
||||
.btn-group {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
}
|
||||
|
||||
.btn-group .btn {
|
||||
border-radius: 0.25rem !important;
|
||||
margin: 2px 0;
|
||||
}
|
||||
}
|
||||
|
||||
/* Loading spinners */
|
||||
.spinner-border-sm {
|
||||
width: 1rem;
|
||||
height: 1rem;
|
||||
}
|
||||
|
||||
/* Error list */
|
||||
.error-item {
|
||||
padding: 0.5rem;
|
||||
border-bottom: 1px solid #dee2e6;
|
||||
}
|
||||
|
||||
.error-item:last-child {
|
||||
border-bottom: none;
|
||||
}
|
||||
|
||||
.error-type {
|
||||
font-weight: 600;
|
||||
color: #e74a3b;
|
||||
}
|
||||
|
||||
.error-time {
|
||||
font-size: 0.75rem;
|
||||
color: #6c757d;
|
||||
}
|
||||
|
||||
/* Toast notifications */
|
||||
.toast {
|
||||
background-color: white;
|
||||
box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15);
|
||||
}
|
||||
|
||||
/* Animations */
|
||||
@keyframes pulse {
|
||||
0% {
|
||||
opacity: 1;
|
||||
}
|
||||
50% {
|
||||
opacity: 0.5;
|
||||
}
|
||||
100% {
|
||||
opacity: 1;
|
||||
}
|
||||
}
|
||||
|
||||
.updating {
|
||||
animation: pulse 1s infinite;
|
||||
}
|
||||
|
||||
/* Dark mode support */
|
||||
@media (prefers-color-scheme: dark) {
|
||||
body {
|
||||
background-color: #1a1a1a;
|
||||
color: #e0e0e0;
|
||||
}
|
||||
|
||||
.card {
|
||||
background-color: #2a2a2a;
|
||||
color: #e0e0e0;
|
||||
}
|
||||
|
||||
.card-header {
|
||||
background-color: #2a2a2a;
|
||||
border-bottom-color: #3a3a3a;
|
||||
}
|
||||
|
||||
.table {
|
||||
color: #e0e0e0;
|
||||
}
|
||||
|
||||
.table-striped tbody tr:nth-of-type(odd) {
|
||||
background-color: rgba(255, 255, 255, 0.05);
|
||||
}
|
||||
|
||||
.form-control {
|
||||
background-color: #3a3a3a;
|
||||
border-color: #4a4a4a;
|
||||
color: #e0e0e0;
|
||||
}
|
||||
}
|
||||
|
||||
/* Performance optimization */
|
||||
.chart-container {
|
||||
position: relative;
|
||||
height: 40vh;
|
||||
width: 100%;
|
||||
}
|
||||
|
||||
/* Scrollbar styling */
|
||||
::-webkit-scrollbar {
|
||||
width: 8px;
|
||||
height: 8px;
|
||||
}
|
||||
|
||||
::-webkit-scrollbar-track {
|
||||
background: #f1f1f1;
|
||||
}
|
||||
|
||||
::-webkit-scrollbar-thumb {
|
||||
background: #888;
|
||||
border-radius: 4px;
|
||||
}
|
||||
|
||||
::-webkit-scrollbar-thumb:hover {
|
||||
background: #555;
|
||||
}
|
519
admin/static/js/admin.js
Normal file
519
admin/static/js/admin.js
Normal file
@ -0,0 +1,519 @@
|
||||
// Admin Dashboard JavaScript
|
||||
|
||||
// Global variables
|
||||
let charts = {};
|
||||
let currentTimeframe = 'minute';
|
||||
let eventSource = null;
|
||||
|
||||
// Chart.js default configuration
|
||||
Chart.defaults.responsive = true;
|
||||
Chart.defaults.maintainAspectRatio = false;
|
||||
|
||||
// Initialize dashboard
|
||||
function initializeDashboard() {
|
||||
// Initialize all charts
|
||||
initializeCharts();
|
||||
|
||||
// Set up event handlers
|
||||
setupEventHandlers();
|
||||
}
|
||||
|
||||
// Initialize all charts
|
||||
function initializeCharts() {
|
||||
// Request Volume Chart
|
||||
const requestCtx = document.getElementById('requestChart').getContext('2d');
|
||||
charts.request = new Chart(requestCtx, {
|
||||
type: 'line',
|
||||
data: {
|
||||
labels: [],
|
||||
datasets: [{
|
||||
label: 'Requests',
|
||||
data: [],
|
||||
borderColor: 'rgb(75, 192, 192)',
|
||||
backgroundColor: 'rgba(75, 192, 192, 0.1)',
|
||||
tension: 0.1
|
||||
}]
|
||||
},
|
||||
options: {
|
||||
scales: {
|
||||
y: {
|
||||
beginAtZero: true
|
||||
}
|
||||
},
|
||||
plugins: {
|
||||
legend: {
|
||||
display: false
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
// Language Pairs Chart
|
||||
const languageCtx = document.getElementById('languageChart').getContext('2d');
|
||||
charts.language = new Chart(languageCtx, {
|
||||
type: 'doughnut',
|
||||
data: {
|
||||
labels: [],
|
||||
datasets: [{
|
||||
data: [],
|
||||
backgroundColor: [
|
||||
'#FF6384',
|
||||
'#36A2EB',
|
||||
'#FFCE56',
|
||||
'#4BC0C0',
|
||||
'#9966FF',
|
||||
'#FF9F40'
|
||||
]
|
||||
}]
|
||||
},
|
||||
options: {
|
||||
plugins: {
|
||||
legend: {
|
||||
position: 'bottom'
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
// Operations Chart
|
||||
const operationsCtx = document.getElementById('operationsChart').getContext('2d');
|
||||
charts.operations = new Chart(operationsCtx, {
|
||||
type: 'bar',
|
||||
data: {
|
||||
labels: [],
|
||||
datasets: [
|
||||
{
|
||||
label: 'Translations',
|
||||
data: [],
|
||||
backgroundColor: 'rgba(54, 162, 235, 0.8)'
|
||||
},
|
||||
{
|
||||
label: 'Transcriptions',
|
||||
data: [],
|
||||
backgroundColor: 'rgba(255, 159, 64, 0.8)'
|
||||
}
|
||||
]
|
||||
},
|
||||
options: {
|
||||
scales: {
|
||||
y: {
|
||||
beginAtZero: true
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
// Response Time Chart
|
||||
const responseCtx = document.getElementById('responseTimeChart').getContext('2d');
|
||||
charts.responseTime = new Chart(responseCtx, {
|
||||
type: 'line',
|
||||
data: {
|
||||
labels: ['Translation', 'Transcription', 'TTS'],
|
||||
datasets: [
|
||||
{
|
||||
label: 'Average',
|
||||
data: [],
|
||||
borderColor: 'rgb(75, 192, 192)',
|
||||
backgroundColor: 'rgba(75, 192, 192, 0.2)'
|
||||
},
|
||||
{
|
||||
label: 'P95',
|
||||
data: [],
|
||||
borderColor: 'rgb(255, 206, 86)',
|
||||
backgroundColor: 'rgba(255, 206, 86, 0.2)'
|
||||
},
|
||||
{
|
||||
label: 'P99',
|
||||
data: [],
|
||||
borderColor: 'rgb(255, 99, 132)',
|
||||
backgroundColor: 'rgba(255, 99, 132, 0.2)'
|
||||
}
|
||||
]
|
||||
},
|
||||
options: {
|
||||
scales: {
|
||||
y: {
|
||||
beginAtZero: true,
|
||||
title: {
|
||||
display: true,
|
||||
text: 'Response Time (ms)'
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
// Error Type Chart
|
||||
const errorCtx = document.getElementById('errorTypeChart').getContext('2d');
|
||||
charts.errorType = new Chart(errorCtx, {
|
||||
type: 'pie',
|
||||
data: {
|
||||
labels: [],
|
||||
datasets: [{
|
||||
data: [],
|
||||
backgroundColor: [
|
||||
'#e74a3b',
|
||||
'#f6c23e',
|
||||
'#4e73df',
|
||||
'#1cc88a',
|
||||
'#36b9cc',
|
||||
'#858796'
|
||||
]
|
||||
}]
|
||||
},
|
||||
options: {
|
||||
plugins: {
|
||||
legend: {
|
||||
position: 'right'
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// Load overview statistics
|
||||
function loadOverviewStats() {
|
||||
$.ajax({
|
||||
url: '/admin/api/stats/overview',
|
||||
method: 'GET',
|
||||
success: function(data) {
|
||||
// Update cards
|
||||
$('#total-requests').text(data.requests.total.toLocaleString());
|
||||
$('#today-requests').text(data.requests.today.toLocaleString());
|
||||
$('#active-sessions').text(data.active_sessions);
|
||||
$('#error-rate').text(data.error_rate + '%');
|
||||
$('#cache-hit-rate').text(data.cache_hit_rate + '%');
|
||||
|
||||
// Update system health
|
||||
updateSystemHealth(data.system_health);
|
||||
},
|
||||
error: function(xhr, status, error) {
|
||||
console.error('Failed to load overview stats:', error);
|
||||
showToast('Failed to load overview statistics', 'error');
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// Update system health indicators
|
||||
function updateSystemHealth(health) {
|
||||
// Redis status
|
||||
const redisStatus = $('#redis-status');
|
||||
redisStatus.removeClass('bg-success bg-warning bg-danger');
|
||||
if (health.redis === 'healthy') {
|
||||
redisStatus.addClass('bg-success').text('Healthy');
|
||||
} else if (health.redis === 'not_configured') {
|
||||
redisStatus.addClass('bg-warning').text('Not Configured');
|
||||
} else {
|
||||
redisStatus.addClass('bg-danger').text('Unhealthy');
|
||||
}
|
||||
|
||||
// PostgreSQL status
|
||||
const pgStatus = $('#postgresql-status');
|
||||
pgStatus.removeClass('bg-success bg-warning bg-danger');
|
||||
if (health.postgresql === 'healthy') {
|
||||
pgStatus.addClass('bg-success').text('Healthy');
|
||||
} else if (health.postgresql === 'not_configured') {
|
||||
pgStatus.addClass('bg-warning').text('Not Configured');
|
||||
} else {
|
||||
pgStatus.addClass('bg-danger').text('Unhealthy');
|
||||
}
|
||||
|
||||
// ML services status (check via main app health endpoint)
|
||||
$.ajax({
|
||||
url: '/health/detailed',
|
||||
method: 'GET',
|
||||
success: function(data) {
|
||||
const mlStatus = $('#ml-status');
|
||||
mlStatus.removeClass('bg-success bg-warning bg-danger');
|
||||
|
||||
if (data.components.whisper.status === 'healthy' &&
|
||||
data.components.tts.status === 'healthy') {
|
||||
mlStatus.addClass('bg-success').text('Healthy');
|
||||
} else if (data.status === 'degraded') {
|
||||
mlStatus.addClass('bg-warning').text('Degraded');
|
||||
} else {
|
||||
mlStatus.addClass('bg-danger').text('Unhealthy');
|
||||
}
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// Load request chart data
|
||||
function loadRequestChart(timeframe) {
|
||||
currentTimeframe = timeframe;
|
||||
|
||||
// Update button states
|
||||
$('.btn-group button').removeClass('active');
|
||||
$(`button[onclick="updateRequestChart('${timeframe}')"]`).addClass('active');
|
||||
|
||||
$.ajax({
|
||||
url: `/admin/api/stats/requests/${timeframe}`,
|
||||
method: 'GET',
|
||||
success: function(data) {
|
||||
charts.request.data.labels = data.labels;
|
||||
charts.request.data.datasets[0].data = data.data;
|
||||
charts.request.update();
|
||||
},
|
||||
error: function(xhr, status, error) {
|
||||
console.error('Failed to load request chart:', error);
|
||||
showToast('Failed to load request data', 'error');
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// Update request chart
|
||||
function updateRequestChart(timeframe) {
|
||||
loadRequestChart(timeframe);
|
||||
}
|
||||
|
||||
// Load operation statistics
|
||||
function loadOperationStats() {
|
||||
$.ajax({
|
||||
url: '/admin/api/stats/operations',
|
||||
method: 'GET',
|
||||
success: function(data) {
|
||||
// Update operations chart
|
||||
charts.operations.data.labels = data.translations.labels;
|
||||
charts.operations.data.datasets[0].data = data.translations.data;
|
||||
charts.operations.data.datasets[1].data = data.transcriptions.data;
|
||||
charts.operations.update();
|
||||
|
||||
// Update language pairs chart
|
||||
const langPairs = Object.entries(data.language_pairs)
|
||||
.sort((a, b) => b[1] - a[1])
|
||||
.slice(0, 6); // Top 6 language pairs
|
||||
|
||||
charts.language.data.labels = langPairs.map(pair => pair[0]);
|
||||
charts.language.data.datasets[0].data = langPairs.map(pair => pair[1]);
|
||||
charts.language.update();
|
||||
},
|
||||
error: function(xhr, status, error) {
|
||||
console.error('Failed to load operation stats:', error);
|
||||
showToast('Failed to load operation data', 'error');
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// Load error statistics
|
||||
function loadErrorStats() {
|
||||
$.ajax({
|
||||
url: '/admin/api/stats/errors',
|
||||
method: 'GET',
|
||||
success: function(data) {
|
||||
// Update error type chart
|
||||
const errorTypes = Object.entries(data.error_types)
|
||||
.sort((a, b) => b[1] - a[1])
|
||||
.slice(0, 6);
|
||||
|
||||
charts.errorType.data.labels = errorTypes.map(type => type[0]);
|
||||
charts.errorType.data.datasets[0].data = errorTypes.map(type => type[1]);
|
||||
charts.errorType.update();
|
||||
|
||||
// Update recent errors list
|
||||
updateRecentErrors(data.recent_errors);
|
||||
},
|
||||
error: function(xhr, status, error) {
|
||||
console.error('Failed to load error stats:', error);
|
||||
showToast('Failed to load error data', 'error');
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// Update recent errors list
|
||||
function updateRecentErrors(errors) {
|
||||
const errorsList = $('#recent-errors-list');
|
||||
|
||||
if (errors.length === 0) {
|
||||
errorsList.html('<p class="text-muted text-center">No recent errors</p>');
|
||||
return;
|
||||
}
|
||||
|
||||
let html = '';
|
||||
errors.forEach(error => {
|
||||
const time = new Date(error.time).toLocaleString();
|
||||
html += `
|
||||
<div class="error-item">
|
||||
<div class="error-type">${error.type}</div>
|
||||
<div class="text-muted small">${error.endpoint}</div>
|
||||
<div>${error.message}</div>
|
||||
<div class="error-time">${time}</div>
|
||||
</div>
|
||||
`;
|
||||
});
|
||||
|
||||
errorsList.html(html);
|
||||
}
|
||||
|
||||
// Load performance statistics
|
||||
function loadPerformanceStats() {
|
||||
$.ajax({
|
||||
url: '/admin/api/stats/performance',
|
||||
method: 'GET',
|
||||
success: function(data) {
|
||||
// Update response time chart
|
||||
const operations = ['translation', 'transcription', 'tts'];
|
||||
const avgData = operations.map(op => data.response_times[op].avg);
|
||||
const p95Data = operations.map(op => data.response_times[op].p95);
|
||||
const p99Data = operations.map(op => data.response_times[op].p99);
|
||||
|
||||
charts.responseTime.data.datasets[0].data = avgData;
|
||||
charts.responseTime.data.datasets[1].data = p95Data;
|
||||
charts.responseTime.data.datasets[2].data = p99Data;
|
||||
charts.responseTime.update();
|
||||
|
||||
// Update performance table
|
||||
updatePerformanceTable(data.response_times);
|
||||
},
|
||||
error: function(xhr, status, error) {
|
||||
console.error('Failed to load performance stats:', error);
|
||||
showToast('Failed to load performance data', 'error');
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// Update performance table
|
||||
function updatePerformanceTable(responseData) {
|
||||
const tbody = $('#performance-table');
|
||||
let html = '';
|
||||
|
||||
const operations = {
|
||||
'translation': 'Translation',
|
||||
'transcription': 'Transcription',
|
||||
'tts': 'Text-to-Speech'
|
||||
};
|
||||
|
||||
for (const [key, label] of Object.entries(operations)) {
|
||||
const data = responseData[key];
|
||||
html += `
|
||||
<tr>
|
||||
<td>${label}</td>
|
||||
<td>${data.avg || '-'}</td>
|
||||
<td>${data.p95 || '-'}</td>
|
||||
<td>${data.p99 || '-'}</td>
|
||||
</tr>
|
||||
`;
|
||||
}
|
||||
|
||||
tbody.html(html);
|
||||
}
|
||||
|
||||
// Start real-time updates
|
||||
function startRealtimeUpdates() {
|
||||
if (eventSource) {
|
||||
eventSource.close();
|
||||
}
|
||||
|
||||
eventSource = new EventSource('/admin/api/stream/updates');
|
||||
|
||||
eventSource.onmessage = function(event) {
|
||||
const data = JSON.parse(event.data);
|
||||
|
||||
// Update real-time metrics
|
||||
if (data.requests_per_minute !== undefined) {
|
||||
$('#requests-per-minute').text(data.requests_per_minute);
|
||||
}
|
||||
|
||||
if (data.active_sessions !== undefined) {
|
||||
$('#active-sessions').text(data.active_sessions);
|
||||
}
|
||||
|
||||
// Update last update time
|
||||
$('#last-update').text('Just now');
|
||||
|
||||
// Show update indicator
|
||||
$('#update-status').text('Connected').removeClass('text-danger').addClass('text-success');
|
||||
};
|
||||
|
||||
eventSource.onerror = function(error) {
|
||||
console.error('EventSource error:', error);
|
||||
$('#update-status').text('Disconnected').removeClass('text-success').addClass('text-danger');
|
||||
|
||||
// Reconnect after 5 seconds
|
||||
setTimeout(startRealtimeUpdates, 5000);
|
||||
};
|
||||
}
|
||||
|
||||
// Export data function
|
||||
function exportData(dataType) {
|
||||
window.location.href = `/admin/api/export/${dataType}`;
|
||||
}
|
||||
|
||||
// Show toast notification
|
||||
function showToast(message, type = 'info') {
|
||||
const toast = $('#update-toast');
|
||||
const toastBody = toast.find('.toast-body');
|
||||
|
||||
toastBody.removeClass('text-success text-danger text-warning');
|
||||
|
||||
if (type === 'success') {
|
||||
toastBody.addClass('text-success');
|
||||
} else if (type === 'error') {
|
||||
toastBody.addClass('text-danger');
|
||||
} else if (type === 'warning') {
|
||||
toastBody.addClass('text-warning');
|
||||
}
|
||||
|
||||
toastBody.text(message);
|
||||
|
||||
const bsToast = new bootstrap.Toast(toast[0]);
|
||||
bsToast.show();
|
||||
}
|
||||
|
||||
// Setup event handlers
|
||||
function setupEventHandlers() {
|
||||
// Auto-refresh toggle
|
||||
$('#auto-refresh').on('change', function() {
|
||||
if ($(this).prop('checked')) {
|
||||
startAutoRefresh();
|
||||
} else {
|
||||
stopAutoRefresh();
|
||||
}
|
||||
});
|
||||
|
||||
// Export buttons
|
||||
$('.export-btn').on('click', function() {
|
||||
const dataType = $(this).data('type');
|
||||
exportData(dataType);
|
||||
});
|
||||
}
|
||||
|
||||
// Auto-refresh functionality
|
||||
let refreshIntervals = {};
|
||||
|
||||
function startAutoRefresh() {
|
||||
refreshIntervals.overview = setInterval(loadOverviewStats, 10000);
|
||||
refreshIntervals.operations = setInterval(loadOperationStats, 30000);
|
||||
refreshIntervals.errors = setInterval(loadErrorStats, 60000);
|
||||
refreshIntervals.performance = setInterval(loadPerformanceStats, 30000);
|
||||
}
|
||||
|
||||
function stopAutoRefresh() {
|
||||
Object.values(refreshIntervals).forEach(interval => clearInterval(interval));
|
||||
refreshIntervals = {};
|
||||
}
|
||||
|
||||
// Utility functions
|
||||
function formatBytes(bytes, decimals = 2) {
|
||||
if (bytes === 0) return '0 Bytes';
|
||||
|
||||
const k = 1024;
|
||||
const dm = decimals < 0 ? 0 : decimals;
|
||||
const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB'];
|
||||
|
||||
const i = Math.floor(Math.log(bytes) / Math.log(k));
|
||||
|
||||
return parseFloat((bytes / Math.pow(k, i)).toFixed(dm)) + ' ' + sizes[i];
|
||||
}
|
||||
|
||||
function formatDuration(ms) {
|
||||
if (ms < 1000) return ms + 'ms';
|
||||
if (ms < 60000) return (ms / 1000).toFixed(1) + 's';
|
||||
return (ms / 60000).toFixed(1) + 'm';
|
||||
}
|
||||
|
||||
// Initialize on page load
|
||||
$(document).ready(function() {
|
||||
if ($('#requestChart').length > 0) {
|
||||
initializeDashboard();
|
||||
}
|
||||
});
|
75
admin/templates/base.html
Normal file
75
admin/templates/base.html
Normal file
@ -0,0 +1,75 @@
|
||||
<!DOCTYPE html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="UTF-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
||||
<title>{% block title %}Talk2Me Admin Dashboard{% endblock %}</title>
|
||||
|
||||
<!-- Bootstrap CSS -->
|
||||
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
|
||||
|
||||
<!-- Font Awesome -->
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.4.0/css/all.min.css">
|
||||
|
||||
<!-- Chart.js -->
|
||||
<script src="https://cdn.jsdelivr.net/npm/chart.js@4.4.0/dist/chart.umd.js"></script>
|
||||
|
||||
<!-- Custom CSS -->
|
||||
<link rel="stylesheet" href="{{ url_for('admin.static', filename='css/admin.css') }}">
|
||||
|
||||
{% block extra_css %}{% endblock %}
|
||||
</head>
|
||||
<body>
|
||||
<!-- Navigation -->
|
||||
<nav class="navbar navbar-expand-lg navbar-dark bg-dark fixed-top">
|
||||
<div class="container-fluid">
|
||||
<a class="navbar-brand" href="{{ url_for('admin.dashboard') }}">
|
||||
<i class="fas fa-language"></i> Talk2Me Admin
|
||||
</a>
|
||||
<button class="navbar-toggler" type="button" data-bs-toggle="collapse" data-bs-target="#navbarNav">
|
||||
<span class="navbar-toggler-icon"></span>
|
||||
</button>
|
||||
<div class="collapse navbar-collapse" id="navbarNav">
|
||||
<ul class="navbar-nav ms-auto">
|
||||
<li class="nav-item">
|
||||
<a class="nav-link" href="{{ url_for('admin.dashboard') }}">
|
||||
<i class="fas fa-tachometer-alt"></i> Dashboard
|
||||
</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link" href="{{ url_for('admin.users') }}">
|
||||
<i class="fas fa-users"></i> Users
|
||||
</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link" href="#" onclick="exportData('all')">
|
||||
<i class="fas fa-download"></i> Export Data
|
||||
</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link" href="{{ url_for('admin.logout') }}">
|
||||
<i class="fas fa-sign-out-alt"></i> Logout
|
||||
</a>
|
||||
</li>
|
||||
</ul>
|
||||
</div>
|
||||
</div>
|
||||
</nav>
|
||||
|
||||
<!-- Main Content -->
|
||||
<main class="container-fluid mt-5 pt-3">
|
||||
{% block content %}{% endblock %}
|
||||
</main>
|
||||
|
||||
<!-- Bootstrap JS -->
|
||||
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/js/bootstrap.bundle.min.js"></script>
|
||||
|
||||
<!-- jQuery (for AJAX) -->
|
||||
<script src="https://code.jquery.com/jquery-3.7.0.min.js"></script>
|
||||
|
||||
<!-- Custom JS -->
|
||||
<script src="{{ url_for('admin.static', filename='js/admin.js') }}"></script>
|
||||
|
||||
{% block extra_js %}{% endblock %}
|
||||
</body>
|
||||
</html>
|
277
admin/templates/dashboard.html
Normal file
277
admin/templates/dashboard.html
Normal file
@ -0,0 +1,277 @@
|
||||
{% extends "base.html" %}
|
||||
|
||||
{% block title %}Dashboard - Talk2Me Admin{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<!-- Quick Actions -->
|
||||
<div class="row mb-4">
|
||||
<div class="col-12">
|
||||
<div class="card bg-light">
|
||||
<div class="card-body">
|
||||
<h5 class="card-title">Quick Actions</h5>
|
||||
<div class="btn-group" role="group">
|
||||
<a href="{{ url_for('admin.users') }}" class="btn btn-primary">
|
||||
<i class="fas fa-users"></i> Manage Users
|
||||
</a>
|
||||
<button onclick="exportData('all')" class="btn btn-secondary">
|
||||
<i class="fas fa-download"></i> Export Data
|
||||
</button>
|
||||
<button onclick="clearCache()" class="btn btn-warning">
|
||||
<i class="fas fa-trash"></i> Clear Cache
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Overview Cards -->
|
||||
<div class="row mb-4">
|
||||
<div class="col-md-3 mb-3">
|
||||
<div class="card text-white bg-primary">
|
||||
<div class="card-body">
|
||||
<h5 class="card-title">Total Requests</h5>
|
||||
<h2 class="card-text" id="total-requests">
|
||||
<div class="spinner-border spinner-border-sm" role="status"></div>
|
||||
</h2>
|
||||
<p class="card-text"><small>Today: <span id="today-requests">-</span></small></p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="col-md-3 mb-3">
|
||||
<div class="card text-white bg-success">
|
||||
<div class="card-body">
|
||||
<h5 class="card-title">Active Sessions</h5>
|
||||
<h2 class="card-text" id="active-sessions">
|
||||
<div class="spinner-border spinner-border-sm" role="status"></div>
|
||||
</h2>
|
||||
<p class="card-text"><small>Live users</small></p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="col-md-3 mb-3">
|
||||
<div class="card text-white bg-warning">
|
||||
<div class="card-body">
|
||||
<h5 class="card-title">Error Rate</h5>
|
||||
<h2 class="card-text" id="error-rate">
|
||||
<div class="spinner-border spinner-border-sm" role="status"></div>
|
||||
</h2>
|
||||
<p class="card-text"><small>Last 24 hours</small></p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="col-md-3 mb-3">
|
||||
<div class="card text-white bg-info">
|
||||
<div class="card-body">
|
||||
<h5 class="card-title">Cache Hit Rate</h5>
|
||||
<h2 class="card-text" id="cache-hit-rate">
|
||||
<div class="spinner-border spinner-border-sm" role="status"></div>
|
||||
</h2>
|
||||
<p class="card-text"><small>Performance metric</small></p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- System Health Status -->
|
||||
<div class="row mb-4">
|
||||
<div class="col-12">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h5 class="mb-0"><i class="fas fa-heartbeat"></i> System Health</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<div class="row">
|
||||
<div class="col-md-4">
|
||||
<div class="d-flex align-items-center">
|
||||
<i class="fas fa-database fa-2x me-3"></i>
|
||||
<div>
|
||||
<h6 class="mb-0">Redis</h6>
|
||||
<span class="badge" id="redis-status">Checking...</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="col-md-4">
|
||||
<div class="d-flex align-items-center">
|
||||
<i class="fas fa-server fa-2x me-3"></i>
|
||||
<div>
|
||||
<h6 class="mb-0">PostgreSQL</h6>
|
||||
<span class="badge" id="postgresql-status">Checking...</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="col-md-4">
|
||||
<div class="d-flex align-items-center">
|
||||
<i class="fas fa-microphone fa-2x me-3"></i>
|
||||
<div>
|
||||
<h6 class="mb-0">Whisper/TTS</h6>
|
||||
<span class="badge" id="ml-status">Checking...</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Charts Row 1 -->
|
||||
<div class="row mb-4">
|
||||
<div class="col-md-8">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h5 class="mb-0">Request Volume</h5>
|
||||
<div class="btn-group btn-group-sm float-end" role="group">
|
||||
<button type="button" class="btn btn-outline-primary active" onclick="updateRequestChart('minute')">Minute</button>
|
||||
<button type="button" class="btn btn-outline-primary" onclick="updateRequestChart('hour')">Hour</button>
|
||||
<button type="button" class="btn btn-outline-primary" onclick="updateRequestChart('day')">Day</button>
|
||||
</div>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<canvas id="requestChart" height="100"></canvas>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="col-md-4">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h5 class="mb-0">Language Pairs</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<canvas id="languageChart" height="200"></canvas>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Charts Row 2 -->
|
||||
<div class="row mb-4">
|
||||
<div class="col-md-6">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h5 class="mb-0">Operations</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<canvas id="operationsChart" height="120"></canvas>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="col-md-6">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h5 class="mb-0">Response Times (ms)</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<canvas id="responseTimeChart" height="120"></canvas>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Error Analysis -->
|
||||
<div class="row mb-4">
|
||||
<div class="col-md-6">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h5 class="mb-0">Error Types</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<canvas id="errorTypeChart" height="150"></canvas>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="col-md-6">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h5 class="mb-0">Recent Errors</h5>
|
||||
</div>
|
||||
<div class="card-body" style="max-height: 300px; overflow-y: auto;">
|
||||
<div id="recent-errors-list">
|
||||
<div class="text-center">
|
||||
<div class="spinner-border" role="status"></div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Performance Metrics -->
|
||||
<div class="row mb-4">
|
||||
<div class="col-12">
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h5 class="mb-0">Performance Metrics</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<div class="table-responsive">
|
||||
<table class="table table-striped">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Operation</th>
|
||||
<th>Average (ms)</th>
|
||||
<th>95th Percentile (ms)</th>
|
||||
<th>99th Percentile (ms)</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="performance-table">
|
||||
<tr>
|
||||
<td colspan="4" class="text-center">
|
||||
<div class="spinner-border" role="status"></div>
|
||||
</td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Real-time Updates Status -->
|
||||
<div class="position-fixed bottom-0 end-0 p-3" style="z-index: 1000">
|
||||
<div class="toast" id="update-toast" role="alert">
|
||||
<div class="toast-header">
|
||||
<i class="fas fa-sync-alt me-2"></i>
|
||||
<strong class="me-auto">Real-time Updates</strong>
|
||||
<small id="last-update">Just now</small>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="toast"></button>
|
||||
</div>
|
||||
<div class="toast-body">
|
||||
<span id="update-status">Connected</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{% endblock %}
|
||||
|
||||
{% block extra_js %}
|
||||
<script>
|
||||
// Initialize dashboard
|
||||
$(document).ready(function() {
|
||||
initializeDashboard();
|
||||
|
||||
// Start real-time updates
|
||||
startRealtimeUpdates();
|
||||
|
||||
// Load initial data
|
||||
loadOverviewStats();
|
||||
loadRequestChart('minute');
|
||||
loadOperationStats();
|
||||
loadErrorStats();
|
||||
loadPerformanceStats();
|
||||
|
||||
// Refresh data periodically
|
||||
setInterval(loadOverviewStats, 10000); // Every 10 seconds
|
||||
setInterval(function() {
|
||||
loadRequestChart(currentTimeframe);
|
||||
}, 30000); // Every 30 seconds
|
||||
});
|
||||
</script>
|
||||
{% endblock %}
|
120
admin/templates/dashboard_simple.html
Normal file
120
admin/templates/dashboard_simple.html
Normal file
@ -0,0 +1,120 @@
|
||||
{% extends "base.html" %}
|
||||
|
||||
{% block title %}Dashboard - Talk2Me Admin{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<!-- Simple Mode Notice -->
|
||||
<div class="alert alert-warning" role="alert">
|
||||
<h4 class="alert-heading">Simple Mode Active</h4>
|
||||
<p>The admin dashboard is running in simple mode because Redis and PostgreSQL services are not available.</p>
|
||||
<hr>
|
||||
<p class="mb-0">To enable full analytics and monitoring features, please ensure Redis and PostgreSQL are running.</p>
|
||||
</div>
|
||||
|
||||
<!-- Basic Info Cards -->
|
||||
<div class="row mb-4">
|
||||
<div class="col-md-4 mb-3">
|
||||
<div class="card">
|
||||
<div class="card-body">
|
||||
<h5 class="card-title">System Status</h5>
|
||||
<p class="card-text">
|
||||
<span class="badge badge-success">Online</span>
|
||||
</p>
|
||||
<small class="text-muted">Talk2Me API is running</small>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="col-md-4 mb-3">
|
||||
<div class="card">
|
||||
<div class="card-body">
|
||||
<h5 class="card-title">Admin Access</h5>
|
||||
<p class="card-text">
|
||||
<span class="badge badge-primary">Authenticated</span>
|
||||
</p>
|
||||
<small class="text-muted">You are logged in as admin</small>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="col-md-4 mb-3">
|
||||
<div class="card">
|
||||
<div class="card-body">
|
||||
<h5 class="card-title">Services</h5>
|
||||
<p class="card-text">
|
||||
Redis: <span class="badge badge-secondary">Not configured</span><br>
|
||||
PostgreSQL: <span class="badge badge-secondary">Not configured</span>
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Available Actions -->
|
||||
<div class="card">
|
||||
<div class="card-header">
|
||||
<h5 class="mb-0">Available Actions</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<p>In simple mode, you can:</p>
|
||||
<ul>
|
||||
<li>Access the Talk2Me API with admin privileges</li>
|
||||
<li>View system health status</li>
|
||||
<li>Logout from the admin session</li>
|
||||
</ul>
|
||||
|
||||
<p class="mt-3">To enable full features, set up the following services:</p>
|
||||
<ol>
|
||||
<li><strong>Redis</strong>: For caching, rate limiting, and session management</li>
|
||||
<li><strong>PostgreSQL</strong>: For persistent storage of analytics and user data</li>
|
||||
</ol>
|
||||
|
||||
<div class="mt-4">
|
||||
<a href="/admin/logout" class="btn btn-secondary">Logout</a>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Setup Instructions -->
|
||||
<div class="card mt-4">
|
||||
<div class="card-header">
|
||||
<h5 class="mb-0">Quick Setup Guide</h5>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<h6>1. Install Redis:</h6>
|
||||
<pre class="bg-light p-2"><code># Ubuntu/Debian
|
||||
sudo apt-get install redis-server
|
||||
sudo systemctl start redis
|
||||
|
||||
# macOS
|
||||
brew install redis
|
||||
brew services start redis</code></pre>
|
||||
|
||||
<h6>2. Install PostgreSQL:</h6>
|
||||
<pre class="bg-light p-2"><code># Ubuntu/Debian
|
||||
sudo apt-get install postgresql
|
||||
sudo systemctl start postgresql
|
||||
|
||||
# macOS
|
||||
brew install postgresql
|
||||
brew services start postgresql</code></pre>
|
||||
|
||||
<h6>3. Configure Environment:</h6>
|
||||
<pre class="bg-light p-2"><code># Add to .env file
|
||||
REDIS_URL=redis://localhost:6379/0
|
||||
DATABASE_URL=postgresql://user:pass@localhost/talk2me</code></pre>
|
||||
|
||||
<h6>4. Initialize Database:</h6>
|
||||
<pre class="bg-light p-2"><code>python init_auth_db.py</code></pre>
|
||||
|
||||
<p class="mt-3">After completing these steps, restart the Talk2Me server to enable full admin features.</p>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script>
|
||||
// Simple mode - no need for real-time updates or API calls
|
||||
console.log('Admin dashboard loaded in simple mode');
|
||||
</script>
|
||||
{% endblock %}
|
35
admin/templates/login.html
Normal file
35
admin/templates/login.html
Normal file
@ -0,0 +1,35 @@
|
||||
{% extends "base.html" %}
|
||||
|
||||
{% block title %}Admin Login - Talk2Me{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="row justify-content-center mt-5">
|
||||
<div class="col-md-6 col-lg-4">
|
||||
<div class="card shadow">
|
||||
<div class="card-body">
|
||||
<h3 class="card-title text-center mb-4">
|
||||
<i class="fas fa-lock"></i> Admin Login
|
||||
</h3>
|
||||
|
||||
{% if error %}
|
||||
<div class="alert alert-danger" role="alert">
|
||||
{{ error }}
|
||||
</div>
|
||||
{% endif %}
|
||||
|
||||
<form method="POST">
|
||||
<div class="mb-3">
|
||||
<label for="token" class="form-label">Admin Token</label>
|
||||
<input type="password" class="form-control" id="token" name="token" required autofocus>
|
||||
<div class="form-text">Enter your admin access token</div>
|
||||
</div>
|
||||
|
||||
<button type="submit" class="btn btn-primary w-100">
|
||||
<i class="fas fa-sign-in-alt"></i> Login
|
||||
</button>
|
||||
</form>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
47
admin_loader.py
Normal file
47
admin_loader.py
Normal file
@ -0,0 +1,47 @@
|
||||
"""
|
||||
Dynamic admin module loader that chooses between full and simple admin based on service availability
|
||||
"""
|
||||
import os
|
||||
import logging
|
||||
from dotenv import load_dotenv
|
||||
|
||||
# Load environment variables
|
||||
load_dotenv()
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
def load_admin_module():
|
||||
"""
|
||||
Dynamically load admin module based on service availability
|
||||
Returns (admin_bp, init_admin) tuple
|
||||
"""
|
||||
# Check if we should force simple mode
|
||||
if os.environ.get('ADMIN_SIMPLE_MODE', '').lower() in ('1', 'true', 'yes'):
|
||||
logger.info("Simple admin mode forced by environment variable")
|
||||
from admin_simple import admin_bp, init_admin
|
||||
return admin_bp, init_admin
|
||||
|
||||
# Try to import full admin module
|
||||
try:
|
||||
# Quick check for Redis
|
||||
import redis
|
||||
r = redis.Redis.from_url(os.environ.get('REDIS_URL', 'redis://localhost:6379/0'))
|
||||
r.ping()
|
||||
|
||||
# Quick check for PostgreSQL
|
||||
from sqlalchemy import create_engine, text
|
||||
db_url = os.environ.get('DATABASE_URL', 'postgresql://localhost/talk2me')
|
||||
engine = create_engine(db_url, pool_pre_ping=True)
|
||||
with engine.connect() as conn:
|
||||
conn.execute(text("SELECT 1"))
|
||||
|
||||
# If we get here, both services are available
|
||||
from admin import admin_bp, init_admin
|
||||
logger.info("Using full admin module with Redis and PostgreSQL support")
|
||||
return admin_bp, init_admin
|
||||
|
||||
except Exception as e:
|
||||
logger.warning(f"Cannot use full admin module: {e}")
|
||||
logger.info("Falling back to simple admin module")
|
||||
from admin_simple import admin_bp, init_admin
|
||||
return admin_bp, init_admin
|
77
admin_simple.py
Normal file
77
admin_simple.py
Normal file
@ -0,0 +1,77 @@
|
||||
"""
|
||||
Simple admin blueprint that works without Redis/PostgreSQL
|
||||
"""
|
||||
from flask import Blueprint, request, jsonify, render_template, redirect, url_for, session
|
||||
from functools import wraps
|
||||
import os
|
||||
import logging
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Create admin blueprint
|
||||
admin_bp = Blueprint('admin', __name__,
|
||||
template_folder='admin/templates',
|
||||
static_folder='admin/static',
|
||||
static_url_path='/admin/static')
|
||||
|
||||
def init_admin(app):
|
||||
"""Initialize admin module with app configuration"""
|
||||
logger.info("Admin dashboard initialized (simple mode)")
|
||||
|
||||
def admin_required(f):
|
||||
"""Decorator to require admin authentication"""
|
||||
@wraps(f)
|
||||
def decorated_function(*args, **kwargs):
|
||||
# Check if user is logged in as admin
|
||||
if not session.get('admin_logged_in'):
|
||||
# Check for admin token in headers (for API access)
|
||||
auth_header = request.headers.get('Authorization', '')
|
||||
if auth_header.startswith('Bearer '):
|
||||
token = auth_header[7:]
|
||||
expected_token = os.environ.get('ADMIN_TOKEN', 'default-admin-token')
|
||||
if token == expected_token:
|
||||
return f(*args, **kwargs)
|
||||
|
||||
# Redirect to login for web access
|
||||
return redirect(url_for('admin.login', next=request.url))
|
||||
return f(*args, **kwargs)
|
||||
return decorated_function
|
||||
|
||||
@admin_bp.route('/')
|
||||
@admin_required
|
||||
def dashboard():
|
||||
"""Main admin dashboard"""
|
||||
# Use simple dashboard template
|
||||
return render_template('dashboard_simple.html')
|
||||
|
||||
@admin_bp.route('/login', methods=['GET', 'POST'])
|
||||
def login():
|
||||
"""Admin login page"""
|
||||
if request.method == 'POST':
|
||||
token = request.form.get('token', '')
|
||||
expected_token = os.environ.get('ADMIN_TOKEN', 'default-admin-token')
|
||||
|
||||
if token == expected_token:
|
||||
session['admin_logged_in'] = True
|
||||
next_page = request.args.get('next', url_for('admin.dashboard'))
|
||||
return redirect(next_page)
|
||||
else:
|
||||
return render_template('login.html', error='Invalid admin token')
|
||||
|
||||
return render_template('login.html')
|
||||
|
||||
@admin_bp.route('/logout')
|
||||
def logout():
|
||||
"""Admin logout"""
|
||||
session.pop('admin_logged_in', None)
|
||||
return redirect(url_for('admin.login'))
|
||||
|
||||
@admin_bp.route('/health')
|
||||
def health():
|
||||
"""Check admin dashboard health"""
|
||||
return jsonify({
|
||||
'status': 'ok',
|
||||
'mode': 'simple',
|
||||
'redis': 'not configured',
|
||||
'postgresql': 'not configured'
|
||||
})
|
426
analytics_middleware.py
Normal file
426
analytics_middleware.py
Normal file
@ -0,0 +1,426 @@
|
||||
"""Analytics middleware for tracking requests and operations"""
|
||||
|
||||
import time
|
||||
import json
|
||||
import logging
|
||||
from datetime import datetime
|
||||
from flask import request, g
|
||||
import redis
|
||||
import psycopg2
|
||||
from psycopg2.extras import RealDictCursor
|
||||
import threading
|
||||
from queue import Queue
|
||||
from functools import wraps
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
class AnalyticsTracker:
|
||||
"""Track and store analytics data"""
|
||||
|
||||
def __init__(self, app=None):
|
||||
self.app = app
|
||||
self.redis_client = None
|
||||
self.pg_conn = None
|
||||
self.write_queue = Queue()
|
||||
self.writer_thread = None
|
||||
|
||||
if app:
|
||||
self.init_app(app)
|
||||
|
||||
def init_app(self, app):
|
||||
"""Initialize analytics with Flask app"""
|
||||
self.app = app
|
||||
|
||||
# Initialize Redis connection
|
||||
try:
|
||||
self.redis_client = redis.from_url(
|
||||
app.config.get('REDIS_URL', 'redis://localhost:6379/0'),
|
||||
decode_responses=True
|
||||
)
|
||||
self.redis_client.ping()
|
||||
logger.info("Analytics Redis connection established")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to connect to Redis for analytics: {e}")
|
||||
self.redis_client = None
|
||||
|
||||
# Initialize PostgreSQL connection
|
||||
try:
|
||||
self.pg_conn = psycopg2.connect(
|
||||
app.config.get('DATABASE_URL', 'postgresql://localhost/talk2me')
|
||||
)
|
||||
self.pg_conn.autocommit = True
|
||||
logger.info("Analytics PostgreSQL connection established")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to connect to PostgreSQL for analytics: {e}")
|
||||
self.pg_conn = None
|
||||
|
||||
# Start background writer thread
|
||||
self.writer_thread = threading.Thread(target=self._write_worker, daemon=True)
|
||||
self.writer_thread.start()
|
||||
|
||||
# Register before/after request handlers
|
||||
app.before_request(self.before_request)
|
||||
app.after_request(self.after_request)
|
||||
|
||||
def before_request(self):
|
||||
"""Track request start time"""
|
||||
g.start_time = time.time()
|
||||
g.request_size = request.content_length or 0
|
||||
|
||||
def after_request(self, response):
|
||||
"""Track request completion and metrics"""
|
||||
try:
|
||||
# Skip if analytics is disabled
|
||||
if not self.enabled:
|
||||
return response
|
||||
|
||||
# Calculate response time
|
||||
response_time = int((time.time() - g.start_time) * 1000) # in ms
|
||||
|
||||
# Track in Redis for real-time stats
|
||||
if self.redis_client:
|
||||
self._track_redis_stats(request, response, response_time)
|
||||
|
||||
# Queue for PostgreSQL logging
|
||||
if self.pg_conn and request.endpoint not in ['static', 'admin.static']:
|
||||
self._queue_request_log(request, response, response_time)
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error in analytics after_request: {e}")
|
||||
|
||||
return response
|
||||
|
||||
def _track_redis_stats(self, request, response, response_time):
|
||||
"""Track statistics in Redis"""
|
||||
try:
|
||||
now = datetime.now()
|
||||
|
||||
# Increment request counters
|
||||
pipe = self.redis_client.pipeline()
|
||||
|
||||
# Total requests
|
||||
pipe.incr('stats:requests:total')
|
||||
|
||||
# Time-based counters
|
||||
pipe.incr(f'stats:requests:minute:{now.strftime("%Y-%m-%d-%H-%M")}')
|
||||
pipe.expire(f'stats:requests:minute:{now.strftime("%Y-%m-%d-%H-%M")}', 3600) # 1 hour
|
||||
|
||||
pipe.incr(f'stats:requests:hourly:{now.strftime("%Y-%m-%d-%H")}')
|
||||
pipe.expire(f'stats:requests:hourly:{now.strftime("%Y-%m-%d-%H")}', 86400) # 24 hours
|
||||
|
||||
pipe.incr(f'stats:requests:daily:{now.strftime("%Y-%m-%d")}')
|
||||
pipe.expire(f'stats:requests:daily:{now.strftime("%Y-%m-%d")}', 604800) # 7 days
|
||||
|
||||
# Track errors
|
||||
if response.status_code >= 400:
|
||||
pipe.incr(f'stats:errors:daily:{now.strftime("%Y-%m-%d")}')
|
||||
pipe.incr(f'stats:errors:hourly:{now.strftime("%Y-%m-%d-%H")}')
|
||||
pipe.expire(f'stats:errors:hourly:{now.strftime("%Y-%m-%d-%H")}', 86400)
|
||||
|
||||
# Track response times
|
||||
endpoint_key = request.endpoint or 'unknown'
|
||||
pipe.lpush(f'stats:response_times:{endpoint_key}', response_time)
|
||||
pipe.ltrim(f'stats:response_times:{endpoint_key}', 0, 999) # Keep last 1000
|
||||
|
||||
# Track slow requests
|
||||
if response_time > 1000: # Over 1 second
|
||||
slow_request = {
|
||||
'endpoint': request.endpoint,
|
||||
'method': request.method,
|
||||
'response_time': response_time,
|
||||
'timestamp': now.isoformat()
|
||||
}
|
||||
pipe.lpush('stats:slow_requests', json.dumps(slow_request))
|
||||
pipe.ltrim('stats:slow_requests', 0, 99) # Keep last 100
|
||||
|
||||
pipe.execute()
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error tracking Redis stats: {e}")
|
||||
|
||||
def _queue_request_log(self, request, response, response_time):
|
||||
"""Queue request log for PostgreSQL"""
|
||||
try:
|
||||
log_entry = {
|
||||
'endpoint': request.endpoint,
|
||||
'method': request.method,
|
||||
'status_code': response.status_code,
|
||||
'response_time_ms': response_time,
|
||||
'ip_address': request.remote_addr,
|
||||
'user_agent': request.headers.get('User-Agent', '')[:500],
|
||||
'request_size_bytes': g.get('request_size', 0),
|
||||
'response_size_bytes': len(response.get_data()),
|
||||
'session_id': g.get('session_id'),
|
||||
'created_at': datetime.now()
|
||||
}
|
||||
|
||||
self.write_queue.put(('request_log', log_entry))
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error queuing request log: {e}")
|
||||
|
||||
def track_operation(self, operation_type, **kwargs):
|
||||
"""Track specific operations (translation, transcription, etc.)"""
|
||||
def decorator(f):
|
||||
@wraps(f)
|
||||
def wrapped(*args, **inner_kwargs):
|
||||
start_time = time.time()
|
||||
success = True
|
||||
error_message = None
|
||||
result = None
|
||||
|
||||
try:
|
||||
result = f(*args, **inner_kwargs)
|
||||
return result
|
||||
except Exception as e:
|
||||
success = False
|
||||
error_message = str(e)
|
||||
raise
|
||||
finally:
|
||||
# Track operation
|
||||
response_time = int((time.time() - start_time) * 1000)
|
||||
self._track_operation_complete(
|
||||
operation_type, response_time, success,
|
||||
error_message, kwargs, result
|
||||
)
|
||||
|
||||
return wrapped
|
||||
return decorator
|
||||
|
||||
def _track_operation_complete(self, operation_type, response_time, success,
|
||||
error_message, metadata, result):
|
||||
"""Track operation completion"""
|
||||
try:
|
||||
now = datetime.now()
|
||||
|
||||
# Update Redis counters
|
||||
if self.redis_client:
|
||||
pipe = self.redis_client.pipeline()
|
||||
|
||||
# Operation counters
|
||||
pipe.incr(f'stats:{operation_type}:total')
|
||||
pipe.incr(f'stats:{operation_type}:daily:{now.strftime("%Y-%m-%d")}')
|
||||
pipe.expire(f'stats:{operation_type}:daily:{now.strftime("%Y-%m-%d")}', 604800)
|
||||
|
||||
# Response times
|
||||
pipe.lpush(f'stats:response_times:{operation_type}', response_time)
|
||||
pipe.ltrim(f'stats:response_times:{operation_type}', 0, 999)
|
||||
|
||||
# Language pairs for translations
|
||||
if operation_type == 'translations' and 'source_lang' in metadata:
|
||||
lang_pair = f"{metadata.get('source_lang')} -> {metadata.get('target_lang')}"
|
||||
pipe.hincrby('stats:language_pairs', lang_pair, 1)
|
||||
|
||||
# Error tracking
|
||||
if not success:
|
||||
pipe.hincrby('stats:error_types', error_message[:100], 1)
|
||||
|
||||
pipe.execute()
|
||||
|
||||
# Queue for PostgreSQL
|
||||
if self.pg_conn:
|
||||
log_entry = {
|
||||
'operation_type': operation_type,
|
||||
'response_time_ms': response_time,
|
||||
'success': success,
|
||||
'error_message': error_message,
|
||||
'metadata': metadata,
|
||||
'result': result,
|
||||
'session_id': g.get('session_id'),
|
||||
'created_at': now
|
||||
}
|
||||
|
||||
self.write_queue.put((operation_type, log_entry))
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error tracking operation: {e}")
|
||||
|
||||
def _write_worker(self):
|
||||
"""Background worker to write logs to PostgreSQL"""
|
||||
while True:
|
||||
try:
|
||||
# Get items from queue (blocking)
|
||||
operation_type, log_entry = self.write_queue.get()
|
||||
|
||||
if operation_type == 'request_log':
|
||||
self._write_request_log(log_entry)
|
||||
elif operation_type == 'translations':
|
||||
self._write_translation_log(log_entry)
|
||||
elif operation_type == 'transcriptions':
|
||||
self._write_transcription_log(log_entry)
|
||||
elif operation_type == 'tts':
|
||||
self._write_tts_log(log_entry)
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error in analytics write worker: {e}")
|
||||
|
||||
def _write_request_log(self, log_entry):
|
||||
"""Write request log to PostgreSQL"""
|
||||
try:
|
||||
with self.pg_conn.cursor() as cursor:
|
||||
cursor.execute("""
|
||||
INSERT INTO request_logs
|
||||
(endpoint, method, status_code, response_time_ms,
|
||||
ip_address, user_agent, request_size_bytes,
|
||||
response_size_bytes, session_id, created_at)
|
||||
VALUES (%(endpoint)s, %(method)s, %(status_code)s,
|
||||
%(response_time_ms)s, %(ip_address)s, %(user_agent)s,
|
||||
%(request_size_bytes)s, %(response_size_bytes)s,
|
||||
%(session_id)s, %(created_at)s)
|
||||
""", log_entry)
|
||||
except Exception as e:
|
||||
error_msg = str(e)
|
||||
if 'relation "request_logs" does not exist' in error_msg:
|
||||
logger.warning("Analytics tables not found. Run init_analytics_db.py to create them.")
|
||||
# Disable analytics to prevent repeated errors
|
||||
self.enabled = False
|
||||
else:
|
||||
logger.error(f"Error writing request log: {e}")
|
||||
|
||||
def _write_translation_log(self, log_entry):
|
||||
"""Write translation log to PostgreSQL"""
|
||||
try:
|
||||
metadata = log_entry.get('metadata', {})
|
||||
|
||||
with self.pg_conn.cursor() as cursor:
|
||||
cursor.execute("""
|
||||
INSERT INTO translation_logs
|
||||
(source_language, target_language, text_length,
|
||||
response_time_ms, success, error_message,
|
||||
session_id, created_at)
|
||||
VALUES (%(source_language)s, %(target_language)s,
|
||||
%(text_length)s, %(response_time_ms)s,
|
||||
%(success)s, %(error_message)s,
|
||||
%(session_id)s, %(created_at)s)
|
||||
""", {
|
||||
'source_language': metadata.get('source_lang'),
|
||||
'target_language': metadata.get('target_lang'),
|
||||
'text_length': metadata.get('text_length', 0),
|
||||
'response_time_ms': log_entry['response_time_ms'],
|
||||
'success': log_entry['success'],
|
||||
'error_message': log_entry['error_message'],
|
||||
'session_id': log_entry['session_id'],
|
||||
'created_at': log_entry['created_at']
|
||||
})
|
||||
except Exception as e:
|
||||
logger.error(f"Error writing translation log: {e}")
|
||||
|
||||
def _write_transcription_log(self, log_entry):
|
||||
"""Write transcription log to PostgreSQL"""
|
||||
try:
|
||||
metadata = log_entry.get('metadata', {})
|
||||
result = log_entry.get('result', {})
|
||||
|
||||
with self.pg_conn.cursor() as cursor:
|
||||
cursor.execute("""
|
||||
INSERT INTO transcription_logs
|
||||
(detected_language, audio_duration_seconds,
|
||||
file_size_bytes, response_time_ms, success,
|
||||
error_message, session_id, created_at)
|
||||
VALUES (%(detected_language)s, %(audio_duration_seconds)s,
|
||||
%(file_size_bytes)s, %(response_time_ms)s,
|
||||
%(success)s, %(error_message)s,
|
||||
%(session_id)s, %(created_at)s)
|
||||
""", {
|
||||
'detected_language': result.get('detected_language') if isinstance(result, dict) else None,
|
||||
'audio_duration_seconds': metadata.get('audio_duration', 0),
|
||||
'file_size_bytes': metadata.get('file_size', 0),
|
||||
'response_time_ms': log_entry['response_time_ms'],
|
||||
'success': log_entry['success'],
|
||||
'error_message': log_entry['error_message'],
|
||||
'session_id': log_entry['session_id'],
|
||||
'created_at': log_entry['created_at']
|
||||
})
|
||||
except Exception as e:
|
||||
logger.error(f"Error writing transcription log: {e}")
|
||||
|
||||
def _write_tts_log(self, log_entry):
|
||||
"""Write TTS log to PostgreSQL"""
|
||||
try:
|
||||
metadata = log_entry.get('metadata', {})
|
||||
|
||||
with self.pg_conn.cursor() as cursor:
|
||||
cursor.execute("""
|
||||
INSERT INTO tts_logs
|
||||
(language, text_length, voice, response_time_ms,
|
||||
success, error_message, session_id, created_at)
|
||||
VALUES (%(language)s, %(text_length)s, %(voice)s,
|
||||
%(response_time_ms)s, %(success)s,
|
||||
%(error_message)s, %(session_id)s, %(created_at)s)
|
||||
""", {
|
||||
'language': metadata.get('language'),
|
||||
'text_length': metadata.get('text_length', 0),
|
||||
'voice': metadata.get('voice'),
|
||||
'response_time_ms': log_entry['response_time_ms'],
|
||||
'success': log_entry['success'],
|
||||
'error_message': log_entry['error_message'],
|
||||
'session_id': log_entry['session_id'],
|
||||
'created_at': log_entry['created_at']
|
||||
})
|
||||
except Exception as e:
|
||||
logger.error(f"Error writing TTS log: {e}")
|
||||
|
||||
def log_error(self, error_type, error_message, **kwargs):
|
||||
"""Log error to analytics"""
|
||||
try:
|
||||
# Track in Redis
|
||||
if self.redis_client:
|
||||
pipe = self.redis_client.pipeline()
|
||||
pipe.hincrby('stats:error_types', error_type, 1)
|
||||
pipe.incr(f'stats:errors:daily:{datetime.now().strftime("%Y-%m-%d")}')
|
||||
pipe.execute()
|
||||
|
||||
# Log to PostgreSQL
|
||||
if self.pg_conn:
|
||||
with self.pg_conn.cursor() as cursor:
|
||||
cursor.execute("""
|
||||
INSERT INTO error_logs
|
||||
(error_type, error_message, endpoint, method,
|
||||
status_code, ip_address, user_agent, request_id,
|
||||
stack_trace, created_at)
|
||||
VALUES (%(error_type)s, %(error_message)s,
|
||||
%(endpoint)s, %(method)s, %(status_code)s,
|
||||
%(ip_address)s, %(user_agent)s,
|
||||
%(request_id)s, %(stack_trace)s,
|
||||
%(created_at)s)
|
||||
""", {
|
||||
'error_type': error_type,
|
||||
'error_message': error_message[:1000],
|
||||
'endpoint': kwargs.get('endpoint'),
|
||||
'method': kwargs.get('method'),
|
||||
'status_code': kwargs.get('status_code'),
|
||||
'ip_address': kwargs.get('ip_address'),
|
||||
'user_agent': kwargs.get('user_agent', '')[:500],
|
||||
'request_id': kwargs.get('request_id'),
|
||||
'stack_trace': kwargs.get('stack_trace', '')[:5000],
|
||||
'created_at': datetime.now()
|
||||
})
|
||||
except Exception as e:
|
||||
logger.error(f"Error logging analytics error: {e}")
|
||||
|
||||
def update_cache_stats(self, hit=True):
|
||||
"""Update cache hit/miss statistics"""
|
||||
try:
|
||||
if self.redis_client:
|
||||
if hit:
|
||||
self.redis_client.incr('stats:cache:hits')
|
||||
else:
|
||||
self.redis_client.incr('stats:cache:misses')
|
||||
except Exception as e:
|
||||
logger.error(f"Error updating cache stats: {e}")
|
||||
|
||||
# Create global instance
|
||||
analytics_tracker = AnalyticsTracker()
|
||||
|
||||
# Convenience decorators
|
||||
def track_translation(**kwargs):
|
||||
"""Decorator to track translation operations"""
|
||||
return analytics_tracker.track_operation('translations', **kwargs)
|
||||
|
||||
def track_transcription(**kwargs):
|
||||
"""Decorator to track transcription operations"""
|
||||
return analytics_tracker.track_operation('transcriptions', **kwargs)
|
||||
|
||||
def track_tts(**kwargs):
|
||||
"""Decorator to track TTS operations"""
|
||||
return analytics_tracker.track_operation('tts', **kwargs)
|
284
app.py
284
app.py
@ -5,7 +5,7 @@ import requests
|
||||
import json
|
||||
import logging
|
||||
from dotenv import load_dotenv
|
||||
from flask import Flask, render_template, request, jsonify, Response, send_file, send_from_directory, stream_with_context, g
|
||||
from flask import Flask, render_template, request, jsonify, Response, send_file, send_from_directory, stream_with_context, g, session, redirect, url_for
|
||||
from flask_cors import CORS, cross_origin
|
||||
import whisper
|
||||
import torch
|
||||
@ -42,6 +42,13 @@ from session_manager import init_app as init_session_manager, track_resource
|
||||
from request_size_limiter import RequestSizeLimiter, limit_request_size
|
||||
from error_logger import ErrorLogger, log_errors, log_performance, log_exception, get_logger
|
||||
from memory_manager import MemoryManager, AudioProcessingContext, with_memory_management
|
||||
from analytics_middleware import analytics_tracker, track_translation, track_transcription, track_tts
|
||||
# Admin module will be loaded dynamically based on service availability
|
||||
from admin_loader import load_admin_module
|
||||
from auth import init_auth, require_auth, get_current_user, update_user_usage_stats
|
||||
from auth_routes import auth_bp
|
||||
from auth_models import User
|
||||
from user_rate_limiter import user_aware_rate_limit, get_user_rate_limit_status
|
||||
|
||||
# Error boundary decorator for Flask routes
|
||||
def with_error_boundary(func):
|
||||
@ -166,6 +173,97 @@ error_logger = ErrorLogger(app, {
|
||||
# Update logger to use the new system
|
||||
logger = get_logger(__name__)
|
||||
|
||||
# Initialize analytics tracking
|
||||
analytics_tracker.init_app(app)
|
||||
|
||||
# Initialize database
|
||||
from database import db, init_db
|
||||
init_db(app)
|
||||
|
||||
# Initialize authentication system
|
||||
init_auth(app)
|
||||
|
||||
# Initialize admin dashboard dynamically
|
||||
admin_bp, init_admin = load_admin_module()
|
||||
init_admin(app)
|
||||
app.register_blueprint(admin_bp, url_prefix='/admin')
|
||||
|
||||
# Register authentication routes
|
||||
app.register_blueprint(auth_bp, url_prefix='/api/auth')
|
||||
|
||||
# Test route for session auth
|
||||
@app.route('/api/test-auth')
|
||||
def test_auth():
|
||||
"""Test authentication methods"""
|
||||
from flask import session as flask_session
|
||||
user = get_current_user()
|
||||
|
||||
# Also check admin count
|
||||
admin_count = User.query.filter_by(role='admin').count() if User else 0
|
||||
|
||||
return jsonify({
|
||||
'session_data': {
|
||||
'logged_in': flask_session.get('logged_in'),
|
||||
'user_id': flask_session.get('user_id'),
|
||||
'username': flask_session.get('username'),
|
||||
'user_role': flask_session.get('user_role'),
|
||||
'admin_token': bool(flask_session.get('admin_token'))
|
||||
},
|
||||
'current_user': {
|
||||
'found': user is not None,
|
||||
'username': user.username if user else None,
|
||||
'role': user.role if user else None,
|
||||
'is_admin': user.is_admin if user else None
|
||||
} if user else None,
|
||||
'admin_users_in_db': admin_count
|
||||
})
|
||||
|
||||
# Initialize admin user if none exists
|
||||
@app.route('/api/init-admin-user', methods=['POST'])
|
||||
def init_admin_user():
|
||||
"""Create initial admin user if none exists"""
|
||||
try:
|
||||
# Check if any admin users exist
|
||||
admin_exists = User.query.filter_by(role='admin').first()
|
||||
if admin_exists:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Admin user already exists'
|
||||
}), 400
|
||||
|
||||
# Create default admin user
|
||||
from auth import create_user
|
||||
user, error = create_user(
|
||||
email='admin@talk2me.local',
|
||||
username='admin',
|
||||
password='admin123', # Change this in production!
|
||||
full_name='Administrator',
|
||||
role='admin',
|
||||
is_verified=True
|
||||
)
|
||||
|
||||
if error:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': error
|
||||
}), 400
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'message': 'Admin user created successfully',
|
||||
'credentials': {
|
||||
'username': 'admin',
|
||||
'password': 'admin123',
|
||||
'note': 'Please change the password immediately!'
|
||||
}
|
||||
})
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to create admin user: {str(e)}")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to create admin user'
|
||||
}), 500
|
||||
|
||||
# Initialize memory management
|
||||
memory_manager = MemoryManager(app, {
|
||||
'memory_threshold_mb': app.config.get('MEMORY_THRESHOLD_MB', 4096),
|
||||
@ -673,14 +771,130 @@ LANGUAGE_TO_VOICE = {
|
||||
def index():
|
||||
return render_template('index.html', languages=sorted(SUPPORTED_LANGUAGES.values()))
|
||||
|
||||
@app.route('/login', methods=['GET', 'POST'])
|
||||
def login():
|
||||
"""User login page"""
|
||||
if request.method == 'POST':
|
||||
# Handle form-based login (for users without JavaScript)
|
||||
username = request.form.get('username')
|
||||
password = request.form.get('password')
|
||||
|
||||
# Special case: Check if it's the admin token being used as password
|
||||
admin_token = app.config.get('ADMIN_TOKEN', os.environ.get('ADMIN_TOKEN', 'default-admin-token'))
|
||||
if username == 'admin' and password == admin_token:
|
||||
# Direct admin login with token
|
||||
session['user_id'] = 'admin-token-user'
|
||||
session['username'] = 'admin'
|
||||
session['user_role'] = 'admin'
|
||||
session['logged_in'] = True
|
||||
session['admin_token'] = admin_token
|
||||
|
||||
next_url = request.args.get('next', url_for('admin.dashboard'))
|
||||
return redirect(next_url)
|
||||
|
||||
if username and password:
|
||||
# Try regular database authentication
|
||||
try:
|
||||
# Import here to avoid circular imports
|
||||
from auth import authenticate_user
|
||||
|
||||
user, error = authenticate_user(username, password)
|
||||
if not error and user:
|
||||
# Store user info in session
|
||||
session['user_id'] = str(user.id)
|
||||
session['username'] = user.username
|
||||
session['user_role'] = user.role
|
||||
session['logged_in'] = True
|
||||
|
||||
# Redirect based on role
|
||||
next_url = request.args.get('next')
|
||||
if next_url:
|
||||
return redirect(next_url)
|
||||
elif user.role == 'admin':
|
||||
return redirect(url_for('admin.dashboard'))
|
||||
else:
|
||||
return redirect(url_for('index'))
|
||||
else:
|
||||
return render_template('login.html', error=error or 'Login failed')
|
||||
except Exception as e:
|
||||
logger.error(f"Database login error: {e}")
|
||||
# If database login fails, still show error
|
||||
return render_template('login.html', error='Login failed - database error')
|
||||
|
||||
return render_template('login.html')
|
||||
|
||||
@app.route('/logout')
|
||||
def logout():
|
||||
"""Logout user"""
|
||||
session.clear()
|
||||
return redirect(url_for('index'))
|
||||
|
||||
@app.route('/admin-token-login', methods=['GET', 'POST'])
|
||||
def admin_token_login():
|
||||
"""Simple admin login with token only"""
|
||||
if request.method == 'POST':
|
||||
token = request.form.get('token', request.form.get('password', ''))
|
||||
admin_token = app.config.get('ADMIN_TOKEN', os.environ.get('ADMIN_TOKEN', 'default-admin-token'))
|
||||
|
||||
if token == admin_token:
|
||||
# Set admin session
|
||||
session['user_id'] = 'admin-token-user'
|
||||
session['username'] = 'admin'
|
||||
session['user_role'] = 'admin'
|
||||
session['logged_in'] = True
|
||||
session['admin_token'] = admin_token
|
||||
|
||||
next_url = request.args.get('next', url_for('admin.dashboard'))
|
||||
return redirect(next_url)
|
||||
else:
|
||||
error = 'Invalid admin token'
|
||||
else:
|
||||
error = None
|
||||
|
||||
# Simple form template
|
||||
return f'''
|
||||
<!DOCTYPE html>
|
||||
<html>
|
||||
<head>
|
||||
<title>Admin Token Login</title>
|
||||
<style>
|
||||
body {{ font-family: Arial; padding: 50px; background: #f0f0f0; }}
|
||||
.container {{ max-width: 400px; margin: auto; background: white; padding: 30px; border-radius: 10px; box-shadow: 0 2px 10px rgba(0,0,0,0.1); }}
|
||||
input {{ width: 100%; padding: 10px; margin: 10px 0; border: 1px solid #ddd; border-radius: 5px; }}
|
||||
button {{ width: 100%; padding: 10px; background: #667eea; color: white; border: none; border-radius: 5px; cursor: pointer; }}
|
||||
button:hover {{ background: #5a67d8; }}
|
||||
.error {{ color: red; margin: 10px 0; }}
|
||||
.info {{ color: #666; font-size: 14px; margin-top: 20px; }}
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
<div class="container">
|
||||
<h2>Admin Token Login</h2>
|
||||
{f'<div class="error">{error}</div>' if error else ''}
|
||||
<form method="POST">
|
||||
<input type="password" name="token" placeholder="Enter admin token" required autofocus>
|
||||
<button type="submit">Login as Admin</button>
|
||||
</form>
|
||||
<div class="info">
|
||||
<p>Use the ADMIN_TOKEN from your .env file</p>
|
||||
<p>Current token: {admin_token if app.debug else '[hidden in production]'}</p>
|
||||
</div>
|
||||
</div>
|
||||
</body>
|
||||
</html>
|
||||
'''
|
||||
|
||||
@app.route('/transcribe', methods=['POST'])
|
||||
@rate_limit(requests_per_minute=10, requests_per_hour=100, check_size=True)
|
||||
@user_aware_rate_limit(requests_per_minute=10, requests_per_hour=100, check_size=True)
|
||||
@limit_request_size(max_audio_size=25 * 1024 * 1024) # 25MB limit for audio
|
||||
@with_error_boundary
|
||||
@track_resource('audio_file')
|
||||
@log_performance('transcribe_audio')
|
||||
@with_memory_management
|
||||
def transcribe():
|
||||
# Get current user if authenticated
|
||||
user = get_current_user()
|
||||
|
||||
# Use memory management context
|
||||
with AudioProcessingContext(app.memory_manager, name='transcribe') as ctx:
|
||||
if 'audio' not in request.files:
|
||||
@ -767,6 +981,24 @@ def transcribe():
|
||||
# Log detected language
|
||||
logger.info(f"Auto-detected language: {detected_language} ({detected_code})")
|
||||
|
||||
# Update user usage stats if authenticated
|
||||
if user:
|
||||
update_user_usage_stats(user, 'transcription')
|
||||
|
||||
# Track transcription analytics
|
||||
analytics_tracker._track_operation_complete(
|
||||
'transcriptions',
|
||||
int((time.time() - g.start_time) * 1000),
|
||||
True,
|
||||
None,
|
||||
{
|
||||
'detected_language': detected_language or source_lang,
|
||||
'audio_duration': len(transcribed_text.split()) / 3, # Rough estimate
|
||||
'file_size': os.path.getsize(temp_path)
|
||||
},
|
||||
{'detected_language': detected_language, 'text': transcribed_text}
|
||||
)
|
||||
|
||||
# Send notification if push is enabled
|
||||
if len(push_subscriptions) > 0:
|
||||
send_push_notification(
|
||||
@ -804,12 +1036,15 @@ def transcribe():
|
||||
gc.collect()
|
||||
|
||||
@app.route('/translate', methods=['POST'])
|
||||
@rate_limit(requests_per_minute=20, requests_per_hour=300, check_size=True)
|
||||
@user_aware_rate_limit(requests_per_minute=20, requests_per_hour=300, check_size=True)
|
||||
@limit_request_size(max_size=1 * 1024 * 1024) # 1MB limit for JSON
|
||||
@with_error_boundary
|
||||
@log_performance('translate_text')
|
||||
def translate():
|
||||
try:
|
||||
# Get current user if authenticated
|
||||
user = get_current_user()
|
||||
|
||||
# Validate request size
|
||||
if not Validators.validate_json_size(request.json, max_size_kb=100):
|
||||
return jsonify({'error': 'Request too large'}), 413
|
||||
@ -856,6 +1091,24 @@ def translate():
|
||||
|
||||
translated_text = response['message']['content'].strip()
|
||||
|
||||
# Update user usage stats if authenticated
|
||||
if user:
|
||||
update_user_usage_stats(user, 'translation')
|
||||
|
||||
# Track translation analytics
|
||||
analytics_tracker._track_operation_complete(
|
||||
'translations',
|
||||
int((time.time() - g.start_time) * 1000),
|
||||
True,
|
||||
None,
|
||||
{
|
||||
'source_lang': source_lang,
|
||||
'target_lang': target_lang,
|
||||
'text_length': len(text)
|
||||
},
|
||||
{'translation': translated_text}
|
||||
)
|
||||
|
||||
# Send notification if push is enabled
|
||||
if len(push_subscriptions) > 0:
|
||||
send_push_notification(
|
||||
@ -874,7 +1127,7 @@ def translate():
|
||||
return jsonify({'error': f'Translation failed: {str(e)}'}), 500
|
||||
|
||||
@app.route('/translate/stream', methods=['POST'])
|
||||
@rate_limit(requests_per_minute=10, requests_per_hour=150, check_size=True)
|
||||
@user_aware_rate_limit(requests_per_minute=10, requests_per_hour=150, check_size=True)
|
||||
@limit_request_size(max_size=1 * 1024 * 1024) # 1MB limit for JSON
|
||||
@with_error_boundary
|
||||
def translate_stream():
|
||||
@ -972,12 +1225,15 @@ def translate_stream():
|
||||
return jsonify({'error': f'Translation failed: {str(e)}'}), 500
|
||||
|
||||
@app.route('/speak', methods=['POST'])
|
||||
@rate_limit(requests_per_minute=15, requests_per_hour=200, check_size=True)
|
||||
@user_aware_rate_limit(requests_per_minute=15, requests_per_hour=200, check_size=True)
|
||||
@limit_request_size(max_size=1 * 1024 * 1024) # 1MB limit for JSON
|
||||
@with_error_boundary
|
||||
@track_resource('audio_file')
|
||||
def speak():
|
||||
try:
|
||||
# Get current user if authenticated
|
||||
user = get_current_user()
|
||||
|
||||
# Validate request size
|
||||
if not Validators.validate_json_size(request.json, max_size_kb=100):
|
||||
return jsonify({'error': 'Request too large'}), 413
|
||||
@ -1066,6 +1322,24 @@ def speak():
|
||||
# Register for cleanup
|
||||
register_temp_file(temp_audio_path)
|
||||
|
||||
# Update user usage stats if authenticated
|
||||
if user:
|
||||
update_user_usage_stats(user, 'tts')
|
||||
|
||||
# Track TTS analytics
|
||||
analytics_tracker._track_operation_complete(
|
||||
'tts',
|
||||
int((time.time() - g.start_time) * 1000),
|
||||
True,
|
||||
None,
|
||||
{
|
||||
'language': language,
|
||||
'text_length': len(text),
|
||||
'voice': voice
|
||||
},
|
||||
{'audio_file': temp_audio_filename}
|
||||
)
|
||||
|
||||
# Add to session resources
|
||||
if hasattr(g, 'session_manager') and hasattr(g, 'user_session'):
|
||||
file_size = os.path.getsize(temp_audio_path)
|
||||
|
746
app_with_db.py
Normal file
746
app_with_db.py
Normal file
@ -0,0 +1,746 @@
|
||||
# This is the updated app.py with Redis and PostgreSQL integration
|
||||
# To use this, rename it to app.py after backing up the original
|
||||
|
||||
import os
|
||||
import time
|
||||
import tempfile
|
||||
import requests
|
||||
import json
|
||||
import logging
|
||||
from dotenv import load_dotenv
|
||||
from flask import Flask, render_template, request, jsonify, Response, send_file, send_from_directory, stream_with_context, g
|
||||
from flask_cors import CORS, cross_origin
|
||||
import whisper
|
||||
import torch
|
||||
import ollama
|
||||
from whisper_config import MODEL_SIZE, GPU_OPTIMIZATIONS, TRANSCRIBE_OPTIONS
|
||||
from pywebpush import webpush, WebPushException
|
||||
import base64
|
||||
from cryptography.hazmat.primitives.asymmetric import ec
|
||||
from cryptography.hazmat.primitives import serialization
|
||||
from cryptography.hazmat.backends import default_backend
|
||||
import gc
|
||||
from functools import wraps
|
||||
import traceback
|
||||
from validators import Validators
|
||||
import atexit
|
||||
import threading
|
||||
from datetime import datetime, timedelta
|
||||
|
||||
# Import new database and Redis components
|
||||
from database import db, init_db, Translation, Transcription, UserPreferences, UsageAnalytics
|
||||
from redis_manager import RedisManager, redis_cache
|
||||
from redis_rate_limiter import RedisRateLimiter, rate_limit
|
||||
from redis_session_manager import RedisSessionManager, init_app as init_redis_sessions
|
||||
|
||||
# Load environment variables
|
||||
load_dotenv()
|
||||
|
||||
# Initialize logging
|
||||
logging.basicConfig(level=logging.INFO)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Import other components
|
||||
from werkzeug.middleware.proxy_fix import ProxyFix
|
||||
from config import init_app as init_config
|
||||
from secrets_manager import init_app as init_secrets
|
||||
from request_size_limiter import RequestSizeLimiter, limit_request_size
|
||||
from error_logger import ErrorLogger, log_errors, log_performance, log_exception, get_logger
|
||||
from memory_manager import MemoryManager, AudioProcessingContext, with_memory_management
|
||||
|
||||
# Error boundary decorator
|
||||
def with_error_boundary(func):
|
||||
@wraps(func)
|
||||
def wrapper(*args, **kwargs):
|
||||
try:
|
||||
return func(*args, **kwargs)
|
||||
except Exception as e:
|
||||
log_exception(
|
||||
e,
|
||||
message=f"Error in {func.__name__}",
|
||||
endpoint=request.endpoint,
|
||||
method=request.method,
|
||||
path=request.path,
|
||||
ip=request.remote_addr,
|
||||
function=func.__name__,
|
||||
module=func.__module__
|
||||
)
|
||||
|
||||
if any(keyword in str(e).lower() for keyword in ['inject', 'attack', 'malicious', 'unauthorized']):
|
||||
app.error_logger.log_security(
|
||||
'suspicious_error',
|
||||
severity='warning',
|
||||
error_type=type(e).__name__,
|
||||
error_message=str(e),
|
||||
endpoint=request.endpoint,
|
||||
ip=request.remote_addr
|
||||
)
|
||||
|
||||
error_message = str(e) if app.debug else "An internal error occurred"
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': error_message,
|
||||
'component': func.__name__,
|
||||
'request_id': getattr(g, 'request_id', None)
|
||||
}), 500
|
||||
return wrapper
|
||||
|
||||
app = Flask(__name__)
|
||||
|
||||
# Apply ProxyFix middleware
|
||||
app.wsgi_app = ProxyFix(
|
||||
app.wsgi_app,
|
||||
x_for=1,
|
||||
x_proto=1,
|
||||
x_host=1,
|
||||
x_prefix=1
|
||||
)
|
||||
|
||||
# Initialize configuration and secrets
|
||||
init_config(app)
|
||||
init_secrets(app)
|
||||
|
||||
# Initialize database
|
||||
init_db(app)
|
||||
|
||||
# Initialize Redis
|
||||
redis_manager = RedisManager(app)
|
||||
app.redis = redis_manager
|
||||
|
||||
# Initialize Redis-based rate limiter
|
||||
redis_rate_limiter = RedisRateLimiter(redis_manager)
|
||||
app.redis_rate_limiter = redis_rate_limiter
|
||||
|
||||
# Initialize Redis-based session management
|
||||
init_redis_sessions(app)
|
||||
|
||||
# Configure CORS
|
||||
cors_config = {
|
||||
"origins": app.config.get('CORS_ORIGINS', ['*']),
|
||||
"methods": ["GET", "POST", "OPTIONS"],
|
||||
"allow_headers": ["Content-Type", "Authorization", "X-Requested-With", "X-Admin-Token"],
|
||||
"expose_headers": ["Content-Range", "X-Content-Range"],
|
||||
"supports_credentials": True,
|
||||
"max_age": 3600
|
||||
}
|
||||
|
||||
CORS(app, resources={
|
||||
r"/api/*": cors_config,
|
||||
r"/transcribe": cors_config,
|
||||
r"/translate": cors_config,
|
||||
r"/translate/stream": cors_config,
|
||||
r"/speak": cors_config,
|
||||
r"/get_audio/*": cors_config,
|
||||
r"/check_tts_server": cors_config,
|
||||
r"/update_tts_config": cors_config,
|
||||
r"/health/*": cors_config,
|
||||
r"/admin/*": {
|
||||
**cors_config,
|
||||
"origins": app.config.get('ADMIN_CORS_ORIGINS', ['http://localhost:*'])
|
||||
}
|
||||
})
|
||||
|
||||
# Configure upload folder
|
||||
upload_folder = app.config.get('UPLOAD_FOLDER')
|
||||
if not upload_folder:
|
||||
upload_folder = os.path.join(tempfile.gettempdir(), 'talk2me_uploads')
|
||||
|
||||
try:
|
||||
os.makedirs(upload_folder, mode=0o755, exist_ok=True)
|
||||
logger.info(f"Using upload folder: {upload_folder}")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to create upload folder {upload_folder}: {str(e)}")
|
||||
upload_folder = tempfile.mkdtemp(prefix='talk2me_')
|
||||
logger.warning(f"Falling back to temporary folder: {upload_folder}")
|
||||
|
||||
app.config['UPLOAD_FOLDER'] = upload_folder
|
||||
|
||||
# Initialize request size limiter
|
||||
request_size_limiter = RequestSizeLimiter(app, {
|
||||
'max_content_length': app.config.get('MAX_CONTENT_LENGTH', 50 * 1024 * 1024),
|
||||
'max_audio_size': app.config.get('MAX_AUDIO_SIZE', 25 * 1024 * 1024),
|
||||
'max_json_size': app.config.get('MAX_JSON_SIZE', 1 * 1024 * 1024),
|
||||
'max_image_size': app.config.get('MAX_IMAGE_SIZE', 10 * 1024 * 1024),
|
||||
})
|
||||
|
||||
# Initialize error logging
|
||||
error_logger = ErrorLogger(app, {
|
||||
'log_level': app.config.get('LOG_LEVEL', 'INFO'),
|
||||
'log_file': app.config.get('LOG_FILE', 'logs/talk2me.log'),
|
||||
'error_log_file': app.config.get('ERROR_LOG_FILE', 'logs/errors.log'),
|
||||
'max_bytes': app.config.get('LOG_MAX_BYTES', 50 * 1024 * 1024),
|
||||
'backup_count': app.config.get('LOG_BACKUP_COUNT', 10)
|
||||
})
|
||||
|
||||
logger = get_logger(__name__)
|
||||
|
||||
# Initialize memory management
|
||||
memory_manager = MemoryManager(app, {
|
||||
'memory_threshold_mb': app.config.get('MEMORY_THRESHOLD_MB', 4096),
|
||||
'gpu_memory_threshold_mb': app.config.get('GPU_MEMORY_THRESHOLD_MB', 2048),
|
||||
'cleanup_interval': app.config.get('MEMORY_CLEANUP_INTERVAL', 30)
|
||||
})
|
||||
|
||||
# Initialize Whisper model
|
||||
logger.info("Initializing Whisper model with GPU optimization...")
|
||||
|
||||
if torch.cuda.is_available():
|
||||
device = torch.device("cuda")
|
||||
try:
|
||||
gpu_name = torch.cuda.get_device_name(0)
|
||||
if 'AMD' in gpu_name or 'Radeon' in gpu_name:
|
||||
logger.info(f"AMD GPU detected via ROCm: {gpu_name}")
|
||||
else:
|
||||
logger.info(f"NVIDIA GPU detected: {gpu_name}")
|
||||
except:
|
||||
logger.info("GPU detected - using CUDA/ROCm acceleration")
|
||||
elif hasattr(torch.backends, 'mps') and torch.backends.mps.is_available():
|
||||
device = torch.device("mps")
|
||||
logger.info("Apple Silicon detected - using Metal Performance Shaders")
|
||||
else:
|
||||
device = torch.device("cpu")
|
||||
logger.info("No GPU acceleration available - using CPU")
|
||||
|
||||
logger.info(f"Using device: {device}")
|
||||
|
||||
whisper_model = whisper.load_model(MODEL_SIZE, device=device)
|
||||
|
||||
# Enable GPU optimizations
|
||||
if device.type == 'cuda':
|
||||
try:
|
||||
torch.backends.cuda.matmul.allow_tf32 = True
|
||||
torch.backends.cudnn.allow_tf32 = True
|
||||
torch.backends.cudnn.benchmark = True
|
||||
whisper_model.eval()
|
||||
whisper_model = whisper_model.half()
|
||||
torch.cuda.empty_cache()
|
||||
|
||||
logger.info("Warming up GPU with dummy inference...")
|
||||
with torch.no_grad():
|
||||
dummy_audio = torch.randn(1, 16000 * 30).to(device).half()
|
||||
_ = whisper_model.encode(whisper.pad_or_trim(dummy_audio))
|
||||
|
||||
logger.info(f"GPU memory allocated: {torch.cuda.memory_allocated() / 1024**2:.2f} MB")
|
||||
logger.info("Whisper model loaded and optimized for GPU")
|
||||
except Exception as e:
|
||||
logger.warning(f"Some GPU optimizations failed: {e}")
|
||||
elif device.type == 'mps':
|
||||
whisper_model.eval()
|
||||
logger.info("Whisper model loaded and optimized for Apple Silicon")
|
||||
else:
|
||||
whisper_model.eval()
|
||||
logger.info("Whisper model loaded (CPU mode)")
|
||||
|
||||
memory_manager.set_whisper_model(whisper_model)
|
||||
app.whisper_model = whisper_model
|
||||
|
||||
# Supported languages
|
||||
SUPPORTED_LANGUAGES = {
|
||||
"ar": "Arabic",
|
||||
"hy": "Armenian",
|
||||
"az": "Azerbaijani",
|
||||
"en": "English",
|
||||
"fr": "French",
|
||||
"ka": "Georgian",
|
||||
"kk": "Kazakh",
|
||||
"zh": "Mandarin",
|
||||
"fa": "Farsi",
|
||||
"pt": "Portuguese",
|
||||
"ru": "Russian",
|
||||
"es": "Spanish",
|
||||
"tr": "Turkish",
|
||||
"uz": "Uzbek"
|
||||
}
|
||||
|
||||
LANGUAGE_TO_CODE = {v: k for k, v in SUPPORTED_LANGUAGES.items()}
|
||||
|
||||
LANGUAGE_TO_VOICE = {
|
||||
"Arabic": "ar-EG-ShakirNeural",
|
||||
"Armenian": "echo",
|
||||
"Azerbaijani": "az-AZ-BanuNeural",
|
||||
"English": "en-GB-RyanNeural",
|
||||
"French": "fr-FR-DeniseNeural",
|
||||
"Georgian": "ka-GE-GiorgiNeural",
|
||||
"Kazakh": "kk-KZ-DauletNeural",
|
||||
"Mandarin": "zh-CN-YunjianNeural",
|
||||
"Farsi": "fa-IR-FaridNeural",
|
||||
"Portuguese": "pt-BR-ThalitaNeural",
|
||||
"Russian": "ru-RU-SvetlanaNeural",
|
||||
"Spanish": "es-CR-MariaNeural",
|
||||
"Turkish": "tr-TR-EmelNeural",
|
||||
"Uzbek": "uz-UZ-SardorNeural"
|
||||
}
|
||||
|
||||
# Generate VAPID keys for push notifications
|
||||
if not os.path.exists('vapid_private.pem'):
|
||||
private_key = ec.generate_private_key(ec.SECP256R1(), default_backend())
|
||||
public_key = private_key.public_key()
|
||||
|
||||
with open('vapid_private.pem', 'wb') as f:
|
||||
f.write(private_key.private_bytes(
|
||||
encoding=serialization.Encoding.PEM,
|
||||
format=serialization.PrivateFormat.PKCS8,
|
||||
encryption_algorithm=serialization.NoEncryption()
|
||||
))
|
||||
|
||||
with open('vapid_public.pem', 'wb') as f:
|
||||
f.write(public_key.public_bytes(
|
||||
encoding=serialization.Encoding.PEM,
|
||||
format=serialization.PublicFormat.SubjectPublicKeyInfo
|
||||
))
|
||||
|
||||
with open('vapid_private.pem', 'rb') as f:
|
||||
vapid_private_key = f.read()
|
||||
with open('vapid_public.pem', 'rb') as f:
|
||||
vapid_public_pem = f.read()
|
||||
vapid_public_key = serialization.load_pem_public_key(
|
||||
vapid_public_pem,
|
||||
backend=default_backend()
|
||||
)
|
||||
|
||||
public_numbers = vapid_public_key.public_numbers()
|
||||
x = public_numbers.x.to_bytes(32, byteorder='big')
|
||||
y = public_numbers.y.to_bytes(32, byteorder='big')
|
||||
vapid_public_key_base64 = base64.urlsafe_b64encode(b'\x04' + x + y).decode('utf-8').rstrip('=')
|
||||
|
||||
# Store push subscriptions in Redis instead of memory
|
||||
# push_subscriptions = [] # Removed - now using Redis
|
||||
|
||||
# Temporary file cleanup
|
||||
TEMP_FILE_MAX_AGE = 300
|
||||
CLEANUP_INTERVAL = 60
|
||||
|
||||
def cleanup_temp_files():
|
||||
"""Clean up old temporary files"""
|
||||
try:
|
||||
current_time = datetime.now()
|
||||
|
||||
# Clean files from upload folder
|
||||
if os.path.exists(app.config['UPLOAD_FOLDER']):
|
||||
for filename in os.listdir(app.config['UPLOAD_FOLDER']):
|
||||
filepath = os.path.join(app.config['UPLOAD_FOLDER'], filename)
|
||||
if os.path.isfile(filepath):
|
||||
file_age = current_time - datetime.fromtimestamp(os.path.getmtime(filepath))
|
||||
if file_age > timedelta(seconds=TEMP_FILE_MAX_AGE):
|
||||
try:
|
||||
os.remove(filepath)
|
||||
logger.info(f"Cleaned up file: {filepath}")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to remove file {filepath}: {str(e)}")
|
||||
|
||||
logger.debug("Cleanup completed")
|
||||
except Exception as e:
|
||||
logger.error(f"Error during temp file cleanup: {str(e)}")
|
||||
|
||||
def run_cleanup_loop():
|
||||
"""Run cleanup in a separate thread"""
|
||||
while True:
|
||||
time.sleep(CLEANUP_INTERVAL)
|
||||
cleanup_temp_files()
|
||||
|
||||
# Start cleanup thread
|
||||
cleanup_thread = threading.Thread(target=run_cleanup_loop, daemon=True)
|
||||
cleanup_thread.start()
|
||||
|
||||
# Analytics collection helper
|
||||
def collect_analytics(service: str, duration_ms: int = None, metadata: dict = None):
|
||||
"""Collect usage analytics to database"""
|
||||
try:
|
||||
from sqlalchemy import func
|
||||
|
||||
today = datetime.utcnow().date()
|
||||
hour = datetime.utcnow().hour
|
||||
|
||||
# Get or create analytics record
|
||||
analytics = UsageAnalytics.query.filter_by(date=today, hour=hour).first()
|
||||
if not analytics:
|
||||
analytics = UsageAnalytics(date=today, hour=hour)
|
||||
db.session.add(analytics)
|
||||
|
||||
# Update counters
|
||||
analytics.total_requests += 1
|
||||
|
||||
if service == 'transcription':
|
||||
analytics.transcriptions += 1
|
||||
if duration_ms:
|
||||
if analytics.avg_transcription_time_ms:
|
||||
analytics.avg_transcription_time_ms = (
|
||||
(analytics.avg_transcription_time_ms * (analytics.transcriptions - 1) + duration_ms)
|
||||
/ analytics.transcriptions
|
||||
)
|
||||
else:
|
||||
analytics.avg_transcription_time_ms = duration_ms
|
||||
|
||||
elif service == 'translation':
|
||||
analytics.translations += 1
|
||||
if duration_ms:
|
||||
if analytics.avg_translation_time_ms:
|
||||
analytics.avg_translation_time_ms = (
|
||||
(analytics.avg_translation_time_ms * (analytics.translations - 1) + duration_ms)
|
||||
/ analytics.translations
|
||||
)
|
||||
else:
|
||||
analytics.avg_translation_time_ms = duration_ms
|
||||
|
||||
elif service == 'tts':
|
||||
analytics.tts_requests += 1
|
||||
if duration_ms:
|
||||
if analytics.avg_tts_time_ms:
|
||||
analytics.avg_tts_time_ms = (
|
||||
(analytics.avg_tts_time_ms * (analytics.tts_requests - 1) + duration_ms)
|
||||
/ analytics.tts_requests
|
||||
)
|
||||
else:
|
||||
analytics.avg_tts_time_ms = duration_ms
|
||||
|
||||
db.session.commit()
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to collect analytics: {e}")
|
||||
db.session.rollback()
|
||||
|
||||
# Routes
|
||||
@app.route('/')
|
||||
def index():
|
||||
return render_template('index.html', languages=sorted(SUPPORTED_LANGUAGES.values()))
|
||||
|
||||
@app.route('/transcribe', methods=['POST'])
|
||||
@rate_limit(requests_per_minute=10, requests_per_hour=100, check_size=True)
|
||||
@limit_request_size(max_audio_size=25 * 1024 * 1024)
|
||||
@with_error_boundary
|
||||
@log_performance('transcribe_audio')
|
||||
@with_memory_management
|
||||
def transcribe():
|
||||
start_time = time.time()
|
||||
|
||||
with AudioProcessingContext(app.memory_manager, name='transcribe') as ctx:
|
||||
if 'audio' not in request.files:
|
||||
return jsonify({'error': 'No audio file provided'}), 400
|
||||
|
||||
audio_file = request.files['audio']
|
||||
|
||||
valid, error_msg = Validators.validate_audio_file(audio_file)
|
||||
if not valid:
|
||||
return jsonify({'error': error_msg}), 400
|
||||
|
||||
source_lang = request.form.get('source_lang', '')
|
||||
allowed_languages = set(SUPPORTED_LANGUAGES.values())
|
||||
source_lang = Validators.validate_language_code(source_lang, allowed_languages) or ''
|
||||
|
||||
temp_filename = f'input_audio_{int(time.time() * 1000)}.wav'
|
||||
temp_path = os.path.join(app.config['UPLOAD_FOLDER'], temp_filename)
|
||||
|
||||
with open(temp_path, 'wb') as f:
|
||||
audio_file.save(f)
|
||||
|
||||
ctx.add_temp_file(temp_path)
|
||||
|
||||
# Add to Redis session
|
||||
if hasattr(g, 'session_manager') and hasattr(g, 'user_session'):
|
||||
file_size = os.path.getsize(temp_path)
|
||||
g.session_manager.add_resource(
|
||||
session_id=g.user_session.session_id,
|
||||
resource_type='audio_file',
|
||||
resource_id=temp_filename,
|
||||
path=temp_path,
|
||||
size_bytes=file_size,
|
||||
metadata={'filename': temp_filename, 'purpose': 'transcription'}
|
||||
)
|
||||
|
||||
try:
|
||||
auto_detect = source_lang == 'auto' or source_lang == ''
|
||||
|
||||
transcribe_options = {
|
||||
"task": "transcribe",
|
||||
"temperature": 0,
|
||||
"best_of": 1,
|
||||
"beam_size": 1,
|
||||
"fp16": device.type == 'cuda',
|
||||
"condition_on_previous_text": False,
|
||||
"compression_ratio_threshold": 2.4,
|
||||
"logprob_threshold": -1.0,
|
||||
"no_speech_threshold": 0.6
|
||||
}
|
||||
|
||||
if not auto_detect:
|
||||
transcribe_options["language"] = LANGUAGE_TO_CODE.get(source_lang, None)
|
||||
|
||||
if device.type == 'cuda':
|
||||
torch.cuda.empty_cache()
|
||||
|
||||
with torch.no_grad():
|
||||
result = whisper_model.transcribe(
|
||||
temp_path,
|
||||
**transcribe_options
|
||||
)
|
||||
|
||||
transcribed_text = result["text"]
|
||||
|
||||
detected_language = None
|
||||
if auto_detect and 'language' in result:
|
||||
detected_code = result['language']
|
||||
for lang_name, lang_code in LANGUAGE_TO_CODE.items():
|
||||
if lang_code == detected_code:
|
||||
detected_language = lang_name
|
||||
break
|
||||
|
||||
logger.info(f"Auto-detected language: {detected_language} ({detected_code})")
|
||||
|
||||
# Calculate duration
|
||||
duration_ms = int((time.time() - start_time) * 1000)
|
||||
|
||||
# Save to database
|
||||
try:
|
||||
transcription = Transcription(
|
||||
session_id=g.user_session.session_id if hasattr(g, 'user_session') else None,
|
||||
user_id=g.user_session.user_id if hasattr(g, 'user_session') else None,
|
||||
transcribed_text=transcribed_text,
|
||||
detected_language=detected_language or source_lang,
|
||||
transcription_time_ms=duration_ms,
|
||||
model_used=MODEL_SIZE,
|
||||
audio_file_size=os.path.getsize(temp_path),
|
||||
ip_address=request.remote_addr,
|
||||
user_agent=request.headers.get('User-Agent')
|
||||
)
|
||||
db.session.add(transcription)
|
||||
db.session.commit()
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to save transcription to database: {e}")
|
||||
db.session.rollback()
|
||||
|
||||
# Collect analytics
|
||||
collect_analytics('transcription', duration_ms)
|
||||
|
||||
# Send notification if push is enabled
|
||||
push_count = redis_manager.scard('push_subscriptions')
|
||||
if push_count > 0:
|
||||
send_push_notification(
|
||||
title="Transcription Complete",
|
||||
body=f"Successfully transcribed: {transcribed_text[:50]}...",
|
||||
tag="transcription-complete"
|
||||
)
|
||||
|
||||
response = {
|
||||
'success': True,
|
||||
'text': transcribed_text
|
||||
}
|
||||
|
||||
if detected_language:
|
||||
response['detected_language'] = detected_language
|
||||
|
||||
return jsonify(response)
|
||||
except Exception as e:
|
||||
logger.error(f"Transcription error: {str(e)}")
|
||||
return jsonify({'error': f'Transcription failed: {str(e)}'}), 500
|
||||
finally:
|
||||
try:
|
||||
if 'temp_path' in locals() and os.path.exists(temp_path):
|
||||
os.remove(temp_path)
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to clean up temp file: {e}")
|
||||
|
||||
if device.type == 'cuda':
|
||||
torch.cuda.empty_cache()
|
||||
torch.cuda.synchronize()
|
||||
gc.collect()
|
||||
|
||||
@app.route('/translate', methods=['POST'])
|
||||
@rate_limit(requests_per_minute=20, requests_per_hour=300, check_size=True)
|
||||
@limit_request_size(max_size=1 * 1024 * 1024)
|
||||
@with_error_boundary
|
||||
@log_performance('translate_text')
|
||||
def translate():
|
||||
start_time = time.time()
|
||||
|
||||
try:
|
||||
if not Validators.validate_json_size(request.json, max_size_kb=100):
|
||||
return jsonify({'error': 'Request too large'}), 413
|
||||
|
||||
data = request.json
|
||||
|
||||
text = data.get('text', '')
|
||||
text = Validators.sanitize_text(text)
|
||||
if not text:
|
||||
return jsonify({'error': 'No text provided'}), 400
|
||||
|
||||
allowed_languages = set(SUPPORTED_LANGUAGES.values())
|
||||
source_lang = Validators.validate_language_code(
|
||||
data.get('source_lang', ''), allowed_languages
|
||||
) or 'auto'
|
||||
target_lang = Validators.validate_language_code(
|
||||
data.get('target_lang', ''), allowed_languages
|
||||
)
|
||||
|
||||
if not target_lang:
|
||||
return jsonify({'error': 'Invalid target language'}), 400
|
||||
|
||||
# Check cache first
|
||||
cached_translation = redis_manager.get_cached_translation(
|
||||
text, source_lang, target_lang
|
||||
)
|
||||
if cached_translation:
|
||||
logger.info("Translation served from cache")
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'translation': cached_translation,
|
||||
'cached': True
|
||||
})
|
||||
|
||||
# Create prompt for translation
|
||||
prompt = f"""
|
||||
Translate the following text from {source_lang} to {target_lang}:
|
||||
|
||||
"{text}"
|
||||
|
||||
Provide only the translation without any additional text.
|
||||
"""
|
||||
|
||||
response = ollama.chat(
|
||||
model="gemma3:27b",
|
||||
messages=[
|
||||
{
|
||||
"role": "user",
|
||||
"content": prompt
|
||||
}
|
||||
]
|
||||
)
|
||||
|
||||
translated_text = response['message']['content'].strip()
|
||||
|
||||
# Calculate duration
|
||||
duration_ms = int((time.time() - start_time) * 1000)
|
||||
|
||||
# Cache the translation
|
||||
redis_manager.cache_translation(
|
||||
text, source_lang, target_lang, translated_text
|
||||
)
|
||||
|
||||
# Save to database
|
||||
try:
|
||||
translation = Translation(
|
||||
session_id=g.user_session.session_id if hasattr(g, 'user_session') else None,
|
||||
user_id=g.user_session.user_id if hasattr(g, 'user_session') else None,
|
||||
source_text=text,
|
||||
source_language=source_lang,
|
||||
target_text=translated_text,
|
||||
target_language=target_lang,
|
||||
translation_time_ms=duration_ms,
|
||||
model_used="gemma3:27b",
|
||||
ip_address=request.remote_addr,
|
||||
user_agent=request.headers.get('User-Agent')
|
||||
)
|
||||
db.session.add(translation)
|
||||
db.session.commit()
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to save translation to database: {e}")
|
||||
db.session.rollback()
|
||||
|
||||
# Collect analytics
|
||||
collect_analytics('translation', duration_ms)
|
||||
|
||||
# Send notification
|
||||
push_count = redis_manager.scard('push_subscriptions')
|
||||
if push_count > 0:
|
||||
send_push_notification(
|
||||
title="Translation Complete",
|
||||
body=f"Translated from {source_lang} to {target_lang}",
|
||||
tag="translation-complete",
|
||||
data={'translation': translated_text[:100]}
|
||||
)
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'translation': translated_text
|
||||
})
|
||||
except Exception as e:
|
||||
logger.error(f"Translation error: {str(e)}")
|
||||
return jsonify({'error': f'Translation failed: {str(e)}'}), 500
|
||||
|
||||
@app.route('/api/push-subscribe', methods=['POST'])
|
||||
@rate_limit(requests_per_minute=10, requests_per_hour=50)
|
||||
def push_subscribe():
|
||||
try:
|
||||
subscription = request.json
|
||||
# Store subscription in Redis
|
||||
subscription_id = f"sub_{int(time.time() * 1000)}"
|
||||
redis_manager.set(f"push_subscription:{subscription_id}", subscription, expire=86400 * 30) # 30 days
|
||||
redis_manager.sadd('push_subscriptions', subscription_id)
|
||||
|
||||
logger.info(f"New push subscription registered: {subscription_id}")
|
||||
return jsonify({'success': True})
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to register push subscription: {str(e)}")
|
||||
return jsonify({'success': False, 'error': str(e)}), 500
|
||||
|
||||
def send_push_notification(title, body, icon='/static/icons/icon-192x192.png',
|
||||
badge='/static/icons/icon-192x192.png', tag=None, data=None):
|
||||
"""Send push notification to all subscribed clients"""
|
||||
claims = {
|
||||
"sub": "mailto:admin@talk2me.app",
|
||||
"exp": int(time.time()) + 86400
|
||||
}
|
||||
|
||||
notification_sent = 0
|
||||
|
||||
# Get all subscription IDs from Redis
|
||||
subscription_ids = redis_manager.smembers('push_subscriptions')
|
||||
|
||||
for sub_id in subscription_ids:
|
||||
subscription = redis_manager.get(f"push_subscription:{sub_id}")
|
||||
if not subscription:
|
||||
continue
|
||||
|
||||
try:
|
||||
webpush(
|
||||
subscription_info=subscription,
|
||||
data=json.dumps({
|
||||
'title': title,
|
||||
'body': body,
|
||||
'icon': icon,
|
||||
'badge': badge,
|
||||
'tag': tag or 'talk2me-notification',
|
||||
'data': data or {}
|
||||
}),
|
||||
vapid_private_key=vapid_private_key,
|
||||
vapid_claims=claims
|
||||
)
|
||||
notification_sent += 1
|
||||
except WebPushException as e:
|
||||
logger.error(f"Failed to send push notification: {str(e)}")
|
||||
if e.response and e.response.status_code == 410:
|
||||
# Remove invalid subscription
|
||||
redis_manager.delete(f"push_subscription:{sub_id}")
|
||||
redis_manager.srem('push_subscriptions', sub_id)
|
||||
|
||||
logger.info(f"Sent {notification_sent} push notifications")
|
||||
return notification_sent
|
||||
|
||||
# Initialize app
|
||||
app.start_time = time.time()
|
||||
app.request_count = 0
|
||||
|
||||
@app.before_request
|
||||
def before_request():
|
||||
app.request_count = getattr(app, 'request_count', 0) + 1
|
||||
|
||||
# Error handlers
|
||||
@app.errorhandler(404)
|
||||
def not_found_error(error):
|
||||
logger.warning(f"404 error: {request.url}")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Resource not found',
|
||||
'status': 404
|
||||
}), 404
|
||||
|
||||
@app.errorhandler(500)
|
||||
def internal_error(error):
|
||||
logger.error(f"500 error: {str(error)}")
|
||||
logger.error(traceback.format_exc())
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Internal server error',
|
||||
'status': 500
|
||||
}), 500
|
||||
|
||||
if __name__ == '__main__':
|
||||
app.run(host='0.0.0.0', port=5005, debug=True)
|
476
auth.py
Normal file
476
auth.py
Normal file
@ -0,0 +1,476 @@
|
||||
"""Authentication and authorization utilities for Talk2Me"""
|
||||
import os
|
||||
import uuid
|
||||
import functools
|
||||
from datetime import datetime, timedelta, timezone
|
||||
from typing import Optional, Dict, Any, Callable, Union, List
|
||||
from flask import request, jsonify, g, current_app
|
||||
from flask_jwt_extended import (
|
||||
JWTManager, create_access_token, create_refresh_token,
|
||||
get_jwt_identity, jwt_required, get_jwt, verify_jwt_in_request
|
||||
)
|
||||
from werkzeug.exceptions import Unauthorized
|
||||
from sqlalchemy.exc import IntegrityError
|
||||
|
||||
from database import db
|
||||
from auth_models import User, LoginHistory, UserSession, RevokedToken, bcrypt
|
||||
from error_logger import log_exception
|
||||
|
||||
# Initialize JWT Manager
|
||||
jwt = JWTManager()
|
||||
|
||||
|
||||
def init_auth(app):
|
||||
"""Initialize authentication system with app"""
|
||||
# Configure JWT
|
||||
app.config['JWT_SECRET_KEY'] = app.config.get('JWT_SECRET_KEY', os.environ.get('JWT_SECRET_KEY', 'your-secret-key-change-in-production'))
|
||||
app.config['JWT_ACCESS_TOKEN_EXPIRES'] = timedelta(hours=1)
|
||||
app.config['JWT_REFRESH_TOKEN_EXPIRES'] = timedelta(days=30)
|
||||
app.config['JWT_ALGORITHM'] = 'HS256'
|
||||
app.config['JWT_BLACKLIST_ENABLED'] = True
|
||||
app.config['JWT_BLACKLIST_TOKEN_CHECKS'] = ['access', 'refresh']
|
||||
|
||||
# Initialize JWT manager
|
||||
jwt.init_app(app)
|
||||
|
||||
# Initialize bcrypt
|
||||
bcrypt.init_app(app)
|
||||
|
||||
# Register JWT callbacks
|
||||
@jwt.token_in_blocklist_loader
|
||||
def check_if_token_revoked(jwt_header, jwt_payload):
|
||||
jti = jwt_payload["jti"]
|
||||
return RevokedToken.is_token_revoked(jti)
|
||||
|
||||
@jwt.expired_token_loader
|
||||
def expired_token_callback(jwt_header, jwt_payload):
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Token has expired',
|
||||
'code': 'token_expired'
|
||||
}), 401
|
||||
|
||||
@jwt.invalid_token_loader
|
||||
def invalid_token_callback(error):
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Invalid token',
|
||||
'code': 'invalid_token'
|
||||
}), 401
|
||||
|
||||
@jwt.unauthorized_loader
|
||||
def missing_token_callback(error):
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Authorization required',
|
||||
'code': 'authorization_required'
|
||||
}), 401
|
||||
|
||||
@jwt.revoked_token_loader
|
||||
def revoked_token_callback(jwt_header, jwt_payload):
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Token has been revoked',
|
||||
'code': 'token_revoked'
|
||||
}), 401
|
||||
|
||||
|
||||
def create_user(email: str, username: str, password: str, full_name: Optional[str] = None,
|
||||
role: str = 'user', is_verified: bool = False) -> tuple[Optional[User], Optional[str]]:
|
||||
"""Create a new user account"""
|
||||
try:
|
||||
# Check if user already exists
|
||||
if User.query.filter((User.email == email) | (User.username == username)).first():
|
||||
return None, "User with this email or username already exists"
|
||||
|
||||
# Create user
|
||||
user = User(
|
||||
email=email,
|
||||
username=username,
|
||||
full_name=full_name,
|
||||
role=role,
|
||||
is_verified=is_verified
|
||||
)
|
||||
user.set_password(password)
|
||||
|
||||
db.session.add(user)
|
||||
db.session.commit()
|
||||
|
||||
return user, None
|
||||
except IntegrityError:
|
||||
db.session.rollback()
|
||||
return None, "User with this email or username already exists"
|
||||
except Exception as e:
|
||||
db.session.rollback()
|
||||
log_exception(e, "Failed to create user")
|
||||
return None, "Failed to create user account"
|
||||
|
||||
|
||||
def authenticate_user(username_or_email: str, password: str) -> tuple[Optional[User], Optional[str]]:
|
||||
"""Authenticate user with username/email and password"""
|
||||
# Find user by username or email
|
||||
user = User.query.filter(
|
||||
(User.username == username_or_email) | (User.email == username_or_email)
|
||||
).first()
|
||||
|
||||
if not user:
|
||||
return None, "Invalid credentials"
|
||||
|
||||
# Check if user can login
|
||||
can_login, reason = user.can_login()
|
||||
if not can_login:
|
||||
user.record_login_attempt(False)
|
||||
db.session.commit()
|
||||
return None, reason
|
||||
|
||||
# Verify password
|
||||
if not user.check_password(password):
|
||||
user.record_login_attempt(False)
|
||||
db.session.commit()
|
||||
return None, "Invalid credentials"
|
||||
|
||||
# Success
|
||||
user.record_login_attempt(True)
|
||||
db.session.commit()
|
||||
|
||||
return user, None
|
||||
|
||||
|
||||
def authenticate_api_key(api_key: str) -> tuple[Optional[User], Optional[str]]:
|
||||
"""Authenticate user with API key"""
|
||||
user = User.query.filter_by(api_key=api_key).first()
|
||||
|
||||
if not user:
|
||||
return None, "Invalid API key"
|
||||
|
||||
# Check if user can login
|
||||
can_login, reason = user.can_login()
|
||||
if not can_login:
|
||||
return None, reason
|
||||
|
||||
# Update last active
|
||||
user.last_active_at = datetime.utcnow()
|
||||
db.session.commit()
|
||||
|
||||
return user, None
|
||||
|
||||
|
||||
def create_tokens(user: User, session_id: Optional[str] = None) -> Dict[str, Any]:
|
||||
"""Create JWT tokens for user"""
|
||||
# Generate JTIs
|
||||
access_jti = str(uuid.uuid4())
|
||||
refresh_jti = str(uuid.uuid4())
|
||||
|
||||
# Create tokens with custom claims
|
||||
identity = str(user.id)
|
||||
additional_claims = {
|
||||
'username': user.username,
|
||||
'role': user.role,
|
||||
'permissions': user.permissions or [],
|
||||
'session_id': session_id
|
||||
}
|
||||
|
||||
access_token = create_access_token(
|
||||
identity=identity,
|
||||
additional_claims=additional_claims,
|
||||
fresh=True
|
||||
)
|
||||
|
||||
refresh_token = create_refresh_token(
|
||||
identity=identity,
|
||||
additional_claims={'session_id': session_id}
|
||||
)
|
||||
|
||||
return {
|
||||
'access_token': access_token,
|
||||
'refresh_token': refresh_token,
|
||||
'token_type': 'Bearer',
|
||||
'expires_in': current_app.config['JWT_ACCESS_TOKEN_EXPIRES'].total_seconds()
|
||||
}
|
||||
|
||||
|
||||
def create_user_session(user: User, request_info: Dict[str, Any]) -> UserSession:
|
||||
"""Create a new user session"""
|
||||
session = UserSession(
|
||||
session_id=str(uuid.uuid4()),
|
||||
user_id=user.id,
|
||||
ip_address=request_info.get('ip_address'),
|
||||
user_agent=request_info.get('user_agent'),
|
||||
expires_at=datetime.utcnow() + timedelta(days=30)
|
||||
)
|
||||
|
||||
db.session.add(session)
|
||||
db.session.commit()
|
||||
|
||||
return session
|
||||
|
||||
|
||||
def log_login_attempt(user_id: Optional[uuid.UUID], success: bool, method: str,
|
||||
failure_reason: Optional[str] = None, session_id: Optional[str] = None,
|
||||
jwt_jti: Optional[str] = None) -> LoginHistory:
|
||||
"""Log a login attempt"""
|
||||
login_record = LoginHistory(
|
||||
user_id=user_id,
|
||||
login_method=method,
|
||||
success=success,
|
||||
failure_reason=failure_reason,
|
||||
session_id=session_id,
|
||||
jwt_jti=jwt_jti,
|
||||
ip_address=request.remote_addr,
|
||||
user_agent=request.headers.get('User-Agent')
|
||||
)
|
||||
|
||||
db.session.add(login_record)
|
||||
db.session.commit()
|
||||
|
||||
return login_record
|
||||
|
||||
|
||||
def revoke_token(jti: str, token_type: str, user_id: Optional[uuid.UUID] = None,
|
||||
reason: Optional[str] = None, expires_at: Optional[datetime] = None):
|
||||
"""Revoke a JWT token"""
|
||||
if not expires_at:
|
||||
# Default expiration based on token type
|
||||
if token_type == 'access':
|
||||
expires_at = datetime.utcnow() + current_app.config['JWT_ACCESS_TOKEN_EXPIRES']
|
||||
else:
|
||||
expires_at = datetime.utcnow() + current_app.config['JWT_REFRESH_TOKEN_EXPIRES']
|
||||
|
||||
revoked = RevokedToken(
|
||||
jti=jti,
|
||||
token_type=token_type,
|
||||
user_id=user_id,
|
||||
reason=reason,
|
||||
expires_at=expires_at
|
||||
)
|
||||
|
||||
db.session.add(revoked)
|
||||
db.session.commit()
|
||||
|
||||
|
||||
def get_current_user() -> Optional[User]:
|
||||
"""Get current authenticated user from JWT, API key, or session"""
|
||||
# Try JWT first
|
||||
try:
|
||||
verify_jwt_in_request(optional=True)
|
||||
user_id = get_jwt_identity()
|
||||
if user_id:
|
||||
user = User.query.get(user_id)
|
||||
if user and user.is_active and not user.is_suspended_now:
|
||||
# Update last active
|
||||
user.last_active_at = datetime.utcnow()
|
||||
db.session.commit()
|
||||
return user
|
||||
except:
|
||||
pass
|
||||
|
||||
# Try API key from header
|
||||
api_key = request.headers.get('X-API-Key')
|
||||
if api_key:
|
||||
user, _ = authenticate_api_key(api_key)
|
||||
if user:
|
||||
return user
|
||||
|
||||
# Try API key from query parameter
|
||||
api_key = request.args.get('api_key')
|
||||
if api_key:
|
||||
user, _ = authenticate_api_key(api_key)
|
||||
if user:
|
||||
return user
|
||||
|
||||
# Try session-based authentication (for admin panel)
|
||||
from flask import session
|
||||
if session.get('logged_in') and session.get('user_id'):
|
||||
# Check if it's the admin token user
|
||||
if session.get('user_id') == 'admin-token-user' and session.get('user_role') == 'admin':
|
||||
# Create a pseudo-admin user for session-based admin access
|
||||
admin_user = User.query.filter_by(role='admin').first()
|
||||
if admin_user:
|
||||
return admin_user
|
||||
else:
|
||||
# Create a temporary admin user object (not saved to DB)
|
||||
admin_user = User(
|
||||
id=uuid.uuid4(),
|
||||
username='admin',
|
||||
email='admin@talk2me.local',
|
||||
role='admin',
|
||||
is_active=True,
|
||||
is_verified=True,
|
||||
is_suspended=False,
|
||||
total_requests=0,
|
||||
total_translations=0,
|
||||
total_transcriptions=0,
|
||||
total_tts_requests=0
|
||||
)
|
||||
# Don't add to session, just return for authorization
|
||||
return admin_user
|
||||
else:
|
||||
# Regular user session
|
||||
user = User.query.get(session.get('user_id'))
|
||||
if user and user.is_active and not user.is_suspended_now:
|
||||
# Update last active
|
||||
user.last_active_at = datetime.utcnow()
|
||||
db.session.commit()
|
||||
return user
|
||||
|
||||
return None
|
||||
|
||||
|
||||
def require_auth(f: Callable) -> Callable:
|
||||
"""Decorator to require authentication (JWT, API key, or session)"""
|
||||
@functools.wraps(f)
|
||||
def decorated_function(*args, **kwargs):
|
||||
user = get_current_user()
|
||||
if not user:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Authentication required',
|
||||
'code': 'auth_required'
|
||||
}), 401
|
||||
|
||||
# Store user in g for access in route
|
||||
g.current_user = user
|
||||
|
||||
# Track usage only for database-backed users
|
||||
try:
|
||||
if hasattr(user, 'id') and db.session.query(User).filter_by(id=user.id).first():
|
||||
user.total_requests += 1
|
||||
db.session.commit()
|
||||
except Exception as e:
|
||||
# Ignore tracking errors for temporary users
|
||||
pass
|
||||
|
||||
return f(*args, **kwargs)
|
||||
|
||||
return decorated_function
|
||||
|
||||
|
||||
def require_admin(f: Callable) -> Callable:
|
||||
"""Decorator to require admin role"""
|
||||
@functools.wraps(f)
|
||||
@require_auth
|
||||
def decorated_function(*args, **kwargs):
|
||||
if not g.current_user.is_admin:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Admin access required',
|
||||
'code': 'admin_required'
|
||||
}), 403
|
||||
|
||||
return f(*args, **kwargs)
|
||||
|
||||
return decorated_function
|
||||
|
||||
|
||||
def require_permission(permission: str) -> Callable:
|
||||
"""Decorator to require specific permission"""
|
||||
def decorator(f: Callable) -> Callable:
|
||||
@functools.wraps(f)
|
||||
@require_auth
|
||||
def decorated_function(*args, **kwargs):
|
||||
if not g.current_user.has_permission(permission):
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': f'Permission required: {permission}',
|
||||
'code': 'permission_denied'
|
||||
}), 403
|
||||
|
||||
return f(*args, **kwargs)
|
||||
|
||||
return decorated_function
|
||||
|
||||
return decorator
|
||||
|
||||
|
||||
def require_verified(f: Callable) -> Callable:
|
||||
"""Decorator to require verified email"""
|
||||
@functools.wraps(f)
|
||||
@require_auth
|
||||
def decorated_function(*args, **kwargs):
|
||||
if not g.current_user.is_verified:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Email verification required',
|
||||
'code': 'verification_required'
|
||||
}), 403
|
||||
|
||||
return f(*args, **kwargs)
|
||||
|
||||
return decorated_function
|
||||
|
||||
|
||||
def get_user_rate_limits(user: User) -> Dict[str, int]:
|
||||
"""Get user-specific rate limits"""
|
||||
return {
|
||||
'per_minute': user.rate_limit_per_minute,
|
||||
'per_hour': user.rate_limit_per_hour,
|
||||
'per_day': user.rate_limit_per_day
|
||||
}
|
||||
|
||||
|
||||
def check_user_rate_limit(user: User, endpoint: str) -> tuple[bool, Optional[str]]:
|
||||
"""Check if user has exceeded rate limits"""
|
||||
# This would integrate with the existing rate limiter
|
||||
# For now, return True to allow requests
|
||||
return True, None
|
||||
|
||||
|
||||
def update_user_usage_stats(user: User, operation: str) -> None:
|
||||
"""Update user usage statistics"""
|
||||
user.total_requests += 1
|
||||
|
||||
if operation == 'translation':
|
||||
user.total_translations += 1
|
||||
elif operation == 'transcription':
|
||||
user.total_transcriptions += 1
|
||||
elif operation == 'tts':
|
||||
user.total_tts_requests += 1
|
||||
|
||||
user.last_active_at = datetime.utcnow()
|
||||
db.session.commit()
|
||||
|
||||
|
||||
def cleanup_expired_sessions() -> int:
|
||||
"""Clean up expired user sessions"""
|
||||
deleted = UserSession.query.filter(
|
||||
UserSession.expires_at < datetime.utcnow()
|
||||
).delete()
|
||||
db.session.commit()
|
||||
return deleted
|
||||
|
||||
|
||||
def cleanup_expired_tokens() -> int:
|
||||
"""Clean up expired revoked tokens"""
|
||||
return RevokedToken.cleanup_expired()
|
||||
|
||||
|
||||
def get_user_sessions(user_id: Union[str, uuid.UUID]) -> List[UserSession]:
|
||||
"""Get all active sessions for a user"""
|
||||
return UserSession.query.filter_by(
|
||||
user_id=user_id
|
||||
).filter(
|
||||
UserSession.expires_at > datetime.utcnow()
|
||||
).order_by(UserSession.last_active_at.desc()).all()
|
||||
|
||||
|
||||
def revoke_user_sessions(user_id: Union[str, uuid.UUID], except_session: Optional[str] = None) -> int:
|
||||
"""Revoke all sessions for a user"""
|
||||
sessions = UserSession.query.filter_by(user_id=user_id)
|
||||
|
||||
if except_session:
|
||||
sessions = sessions.filter(UserSession.session_id != except_session)
|
||||
|
||||
count = 0
|
||||
for session in sessions:
|
||||
# Revoke associated tokens
|
||||
if session.access_token_jti:
|
||||
revoke_token(session.access_token_jti, 'access', user_id, 'Session revoked')
|
||||
if session.refresh_token_jti:
|
||||
revoke_token(session.refresh_token_jti, 'refresh', user_id, 'Session revoked')
|
||||
count += 1
|
||||
|
||||
# Delete sessions
|
||||
sessions.delete()
|
||||
db.session.commit()
|
||||
|
||||
return count
|
366
auth_models.py
Normal file
366
auth_models.py
Normal file
@ -0,0 +1,366 @@
|
||||
"""Authentication models for Talk2Me application"""
|
||||
import uuid
|
||||
import secrets
|
||||
from datetime import datetime, timedelta
|
||||
from typing import Optional, Dict, Any, List
|
||||
from flask_sqlalchemy import SQLAlchemy
|
||||
from flask_bcrypt import Bcrypt
|
||||
from sqlalchemy import Index, text, func
|
||||
from sqlalchemy.dialects.postgresql import UUID, JSONB, ENUM
|
||||
from sqlalchemy.ext.hybrid import hybrid_property
|
||||
from sqlalchemy.orm import relationship
|
||||
|
||||
from database import db
|
||||
|
||||
bcrypt = Bcrypt()
|
||||
|
||||
|
||||
class User(db.Model):
|
||||
"""User account model with authentication and authorization"""
|
||||
__tablename__ = 'users'
|
||||
|
||||
id = db.Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
|
||||
email = db.Column(db.String(255), unique=True, nullable=False, index=True)
|
||||
username = db.Column(db.String(100), unique=True, nullable=False, index=True)
|
||||
password_hash = db.Column(db.String(255), nullable=False)
|
||||
|
||||
# User profile
|
||||
full_name = db.Column(db.String(255), nullable=True)
|
||||
avatar_url = db.Column(db.String(500), nullable=True)
|
||||
|
||||
# API Key - unique per user
|
||||
api_key = db.Column(db.String(64), unique=True, nullable=False, index=True)
|
||||
api_key_created_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
|
||||
# Account status
|
||||
is_active = db.Column(db.Boolean, default=True, nullable=False)
|
||||
is_verified = db.Column(db.Boolean, default=False, nullable=False)
|
||||
is_suspended = db.Column(db.Boolean, default=False, nullable=False)
|
||||
suspension_reason = db.Column(db.Text, nullable=True)
|
||||
suspended_at = db.Column(db.DateTime, nullable=True)
|
||||
suspended_until = db.Column(db.DateTime, nullable=True)
|
||||
|
||||
# Role and permissions
|
||||
role = db.Column(db.String(20), nullable=False, default='user') # admin, user
|
||||
permissions = db.Column(JSONB, default=[], nullable=False) # Additional granular permissions
|
||||
|
||||
# Usage limits (per user)
|
||||
rate_limit_per_minute = db.Column(db.Integer, default=30, nullable=False)
|
||||
rate_limit_per_hour = db.Column(db.Integer, default=500, nullable=False)
|
||||
rate_limit_per_day = db.Column(db.Integer, default=5000, nullable=False)
|
||||
|
||||
# Usage tracking
|
||||
total_requests = db.Column(db.Integer, default=0, nullable=False)
|
||||
total_translations = db.Column(db.Integer, default=0, nullable=False)
|
||||
total_transcriptions = db.Column(db.Integer, default=0, nullable=False)
|
||||
total_tts_requests = db.Column(db.Integer, default=0, nullable=False)
|
||||
|
||||
# Timestamps
|
||||
created_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
updated_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow, onupdate=datetime.utcnow)
|
||||
last_login_at = db.Column(db.DateTime, nullable=True)
|
||||
last_active_at = db.Column(db.DateTime, nullable=True)
|
||||
|
||||
# Security
|
||||
password_changed_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
failed_login_attempts = db.Column(db.Integer, default=0, nullable=False)
|
||||
locked_until = db.Column(db.DateTime, nullable=True)
|
||||
|
||||
# Settings
|
||||
settings = db.Column(JSONB, default={}, nullable=False)
|
||||
|
||||
# Relationships
|
||||
login_history = relationship('LoginHistory', back_populates='user', cascade='all, delete-orphan')
|
||||
sessions = relationship('UserSession', back_populates='user', cascade='all, delete-orphan')
|
||||
|
||||
__table_args__ = (
|
||||
Index('idx_users_email_active', 'email', 'is_active'),
|
||||
Index('idx_users_role_active', 'role', 'is_active'),
|
||||
Index('idx_users_created_at', 'created_at'),
|
||||
)
|
||||
|
||||
def __init__(self, **kwargs):
|
||||
super(User, self).__init__(**kwargs)
|
||||
if not self.api_key:
|
||||
self.api_key = self.generate_api_key()
|
||||
|
||||
@staticmethod
|
||||
def generate_api_key() -> str:
|
||||
"""Generate a secure API key"""
|
||||
return f"tk_{secrets.token_urlsafe(32)}"
|
||||
|
||||
def regenerate_api_key(self) -> str:
|
||||
"""Regenerate user's API key"""
|
||||
self.api_key = self.generate_api_key()
|
||||
self.api_key_created_at = datetime.utcnow()
|
||||
return self.api_key
|
||||
|
||||
def set_password(self, password: str) -> None:
|
||||
"""Hash and set user password"""
|
||||
self.password_hash = bcrypt.generate_password_hash(password).decode('utf-8')
|
||||
self.password_changed_at = datetime.utcnow()
|
||||
|
||||
def check_password(self, password: str) -> bool:
|
||||
"""Check if provided password matches hash"""
|
||||
return bcrypt.check_password_hash(self.password_hash, password)
|
||||
|
||||
@hybrid_property
|
||||
def is_admin(self) -> bool:
|
||||
"""Check if user has admin role"""
|
||||
return self.role == 'admin'
|
||||
|
||||
@hybrid_property
|
||||
def is_locked(self) -> bool:
|
||||
"""Check if account is locked due to failed login attempts"""
|
||||
if self.locked_until is None:
|
||||
return False
|
||||
return datetime.utcnow() < self.locked_until
|
||||
|
||||
@hybrid_property
|
||||
def is_suspended_now(self) -> bool:
|
||||
"""Check if account is currently suspended"""
|
||||
if not self.is_suspended:
|
||||
return False
|
||||
if self.suspended_until is None:
|
||||
return True # Indefinite suspension
|
||||
return datetime.utcnow() < self.suspended_until
|
||||
|
||||
def can_login(self) -> tuple[bool, Optional[str]]:
|
||||
"""Check if user can login"""
|
||||
if not self.is_active:
|
||||
return False, "Account is deactivated"
|
||||
if self.is_locked:
|
||||
return False, "Account is locked due to failed login attempts"
|
||||
if self.is_suspended_now:
|
||||
return False, f"Account is suspended: {self.suspension_reason or 'Policy violation'}"
|
||||
return True, None
|
||||
|
||||
def record_login_attempt(self, success: bool) -> None:
|
||||
"""Record login attempt and handle lockout"""
|
||||
if success:
|
||||
self.failed_login_attempts = 0
|
||||
self.locked_until = None
|
||||
self.last_login_at = datetime.utcnow()
|
||||
else:
|
||||
self.failed_login_attempts += 1
|
||||
# Lock account after 5 failed attempts
|
||||
if self.failed_login_attempts >= 5:
|
||||
self.locked_until = datetime.utcnow() + timedelta(minutes=30)
|
||||
|
||||
def has_permission(self, permission: str) -> bool:
|
||||
"""Check if user has specific permission"""
|
||||
if self.is_admin:
|
||||
return True # Admins have all permissions
|
||||
return permission in (self.permissions or [])
|
||||
|
||||
def add_permission(self, permission: str) -> None:
|
||||
"""Add permission to user"""
|
||||
if self.permissions is None:
|
||||
self.permissions = []
|
||||
if permission not in self.permissions:
|
||||
self.permissions = self.permissions + [permission]
|
||||
|
||||
def remove_permission(self, permission: str) -> None:
|
||||
"""Remove permission from user"""
|
||||
if self.permissions and permission in self.permissions:
|
||||
self.permissions = [p for p in self.permissions if p != permission]
|
||||
|
||||
def suspend(self, reason: str, until: Optional[datetime] = None) -> None:
|
||||
"""Suspend user account"""
|
||||
self.is_suspended = True
|
||||
self.suspension_reason = reason
|
||||
self.suspended_at = datetime.utcnow()
|
||||
self.suspended_until = until
|
||||
|
||||
def unsuspend(self) -> None:
|
||||
"""Unsuspend user account"""
|
||||
self.is_suspended = False
|
||||
self.suspension_reason = None
|
||||
self.suspended_at = None
|
||||
self.suspended_until = None
|
||||
|
||||
def to_dict(self, include_sensitive: bool = False) -> Dict[str, Any]:
|
||||
"""Convert user to dictionary"""
|
||||
data = {
|
||||
'id': str(self.id),
|
||||
'email': self.email,
|
||||
'username': self.username,
|
||||
'full_name': self.full_name,
|
||||
'avatar_url': self.avatar_url,
|
||||
'role': self.role,
|
||||
'is_active': self.is_active,
|
||||
'is_verified': self.is_verified,
|
||||
'is_suspended': self.is_suspended_now,
|
||||
'created_at': self.created_at.isoformat(),
|
||||
'last_login_at': self.last_login_at.isoformat() if self.last_login_at else None,
|
||||
'last_active_at': self.last_active_at.isoformat() if self.last_active_at else None,
|
||||
'total_requests': self.total_requests,
|
||||
'total_translations': self.total_translations,
|
||||
'total_transcriptions': self.total_transcriptions,
|
||||
'total_tts_requests': self.total_tts_requests,
|
||||
'settings': self.settings or {}
|
||||
}
|
||||
|
||||
if include_sensitive:
|
||||
data.update({
|
||||
'api_key': self.api_key,
|
||||
'api_key_created_at': self.api_key_created_at.isoformat(),
|
||||
'permissions': self.permissions or [],
|
||||
'rate_limit_per_minute': self.rate_limit_per_minute,
|
||||
'rate_limit_per_hour': self.rate_limit_per_hour,
|
||||
'rate_limit_per_day': self.rate_limit_per_day,
|
||||
'suspension_reason': self.suspension_reason,
|
||||
'suspended_until': self.suspended_until.isoformat() if self.suspended_until else None,
|
||||
'failed_login_attempts': self.failed_login_attempts,
|
||||
'is_locked': self.is_locked
|
||||
})
|
||||
|
||||
return data
|
||||
|
||||
|
||||
class LoginHistory(db.Model):
|
||||
"""Track user login history for security auditing"""
|
||||
__tablename__ = 'login_history'
|
||||
|
||||
id = db.Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
|
||||
user_id = db.Column(UUID(as_uuid=True), db.ForeignKey('users.id'), nullable=False, index=True)
|
||||
|
||||
# Login details
|
||||
login_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
logout_at = db.Column(db.DateTime, nullable=True)
|
||||
login_method = db.Column(db.String(20), nullable=False) # password, api_key, jwt
|
||||
success = db.Column(db.Boolean, nullable=False)
|
||||
failure_reason = db.Column(db.String(255), nullable=True)
|
||||
|
||||
# Session info
|
||||
session_id = db.Column(db.String(255), nullable=True, index=True)
|
||||
jwt_jti = db.Column(db.String(255), nullable=True, index=True) # JWT ID for revocation
|
||||
|
||||
# Client info
|
||||
ip_address = db.Column(db.String(45), nullable=False)
|
||||
user_agent = db.Column(db.String(500), nullable=True)
|
||||
device_info = db.Column(JSONB, nullable=True) # Parsed user agent info
|
||||
|
||||
# Location info (if available)
|
||||
country = db.Column(db.String(2), nullable=True)
|
||||
city = db.Column(db.String(100), nullable=True)
|
||||
|
||||
# Security flags
|
||||
is_suspicious = db.Column(db.Boolean, default=False, nullable=False)
|
||||
security_notes = db.Column(db.Text, nullable=True)
|
||||
|
||||
# Relationship
|
||||
user = relationship('User', back_populates='login_history')
|
||||
|
||||
__table_args__ = (
|
||||
Index('idx_login_history_user_time', 'user_id', 'login_at'),
|
||||
Index('idx_login_history_session', 'session_id'),
|
||||
Index('idx_login_history_ip', 'ip_address'),
|
||||
)
|
||||
|
||||
def to_dict(self) -> Dict[str, Any]:
|
||||
"""Convert login history to dictionary"""
|
||||
return {
|
||||
'id': str(self.id),
|
||||
'user_id': str(self.user_id),
|
||||
'login_at': self.login_at.isoformat(),
|
||||
'logout_at': self.logout_at.isoformat() if self.logout_at else None,
|
||||
'login_method': self.login_method,
|
||||
'success': self.success,
|
||||
'failure_reason': self.failure_reason,
|
||||
'session_id': self.session_id,
|
||||
'ip_address': self.ip_address,
|
||||
'user_agent': self.user_agent,
|
||||
'device_info': self.device_info,
|
||||
'country': self.country,
|
||||
'city': self.city,
|
||||
'is_suspicious': self.is_suspicious,
|
||||
'security_notes': self.security_notes
|
||||
}
|
||||
|
||||
|
||||
class UserSession(db.Model):
|
||||
"""Active user sessions for session management"""
|
||||
__tablename__ = 'user_sessions'
|
||||
|
||||
id = db.Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
|
||||
session_id = db.Column(db.String(255), unique=True, nullable=False, index=True)
|
||||
user_id = db.Column(UUID(as_uuid=True), db.ForeignKey('users.id'), nullable=False, index=True)
|
||||
|
||||
# JWT tokens
|
||||
access_token_jti = db.Column(db.String(255), nullable=True, index=True)
|
||||
refresh_token_jti = db.Column(db.String(255), nullable=True, index=True)
|
||||
|
||||
# Session info
|
||||
created_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
last_active_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
expires_at = db.Column(db.DateTime, nullable=False)
|
||||
|
||||
# Client info
|
||||
ip_address = db.Column(db.String(45), nullable=False)
|
||||
user_agent = db.Column(db.String(500), nullable=True)
|
||||
|
||||
# Session data
|
||||
data = db.Column(JSONB, default={}, nullable=False)
|
||||
|
||||
# Relationship
|
||||
user = relationship('User', back_populates='sessions')
|
||||
|
||||
__table_args__ = (
|
||||
Index('idx_user_sessions_user_active', 'user_id', 'expires_at'),
|
||||
Index('idx_user_sessions_token', 'access_token_jti'),
|
||||
)
|
||||
|
||||
@hybrid_property
|
||||
def is_expired(self) -> bool:
|
||||
"""Check if session is expired"""
|
||||
return datetime.utcnow() > self.expires_at
|
||||
|
||||
def refresh(self, duration_hours: int = 24) -> None:
|
||||
"""Refresh session expiration"""
|
||||
self.last_active_at = datetime.utcnow()
|
||||
self.expires_at = datetime.utcnow() + timedelta(hours=duration_hours)
|
||||
|
||||
def to_dict(self) -> Dict[str, Any]:
|
||||
"""Convert session to dictionary"""
|
||||
return {
|
||||
'id': str(self.id),
|
||||
'session_id': self.session_id,
|
||||
'user_id': str(self.user_id),
|
||||
'created_at': self.created_at.isoformat(),
|
||||
'last_active_at': self.last_active_at.isoformat(),
|
||||
'expires_at': self.expires_at.isoformat(),
|
||||
'is_expired': self.is_expired,
|
||||
'ip_address': self.ip_address,
|
||||
'user_agent': self.user_agent,
|
||||
'data': self.data or {}
|
||||
}
|
||||
|
||||
|
||||
class RevokedToken(db.Model):
|
||||
"""Store revoked JWT tokens for blacklisting"""
|
||||
__tablename__ = 'revoked_tokens'
|
||||
|
||||
id = db.Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
|
||||
jti = db.Column(db.String(255), unique=True, nullable=False, index=True)
|
||||
token_type = db.Column(db.String(20), nullable=False) # access, refresh
|
||||
user_id = db.Column(UUID(as_uuid=True), nullable=True, index=True)
|
||||
revoked_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
expires_at = db.Column(db.DateTime, nullable=False) # When token would have expired
|
||||
reason = db.Column(db.String(255), nullable=True)
|
||||
|
||||
__table_args__ = (
|
||||
Index('idx_revoked_tokens_expires', 'expires_at'),
|
||||
)
|
||||
|
||||
@classmethod
|
||||
def is_token_revoked(cls, jti: str) -> bool:
|
||||
"""Check if a token JTI is revoked"""
|
||||
return cls.query.filter_by(jti=jti).first() is not None
|
||||
|
||||
@classmethod
|
||||
def cleanup_expired(cls) -> int:
|
||||
"""Remove revoked tokens that have expired anyway"""
|
||||
deleted = cls.query.filter(cls.expires_at < datetime.utcnow()).delete()
|
||||
db.session.commit()
|
||||
return deleted
|
899
auth_routes.py
Normal file
899
auth_routes.py
Normal file
@ -0,0 +1,899 @@
|
||||
"""Authentication and user management routes"""
|
||||
import os
|
||||
from datetime import datetime, timedelta
|
||||
from flask import Blueprint, request, jsonify, g
|
||||
from flask_jwt_extended import jwt_required, get_jwt_identity, get_jwt
|
||||
from sqlalchemy import or_, func
|
||||
from werkzeug.exceptions import BadRequest
|
||||
|
||||
from database import db
|
||||
from auth_models import User, LoginHistory, UserSession
|
||||
from auth import (
|
||||
create_user, authenticate_user, create_tokens, create_user_session,
|
||||
revoke_token, get_current_user, require_admin,
|
||||
require_auth, revoke_user_sessions, update_user_usage_stats
|
||||
)
|
||||
from rate_limiter import rate_limit
|
||||
from validators import Validators
|
||||
from error_logger import log_exception
|
||||
|
||||
auth_bp = Blueprint('auth', __name__)
|
||||
|
||||
|
||||
@auth_bp.route('/login', methods=['POST'])
|
||||
@rate_limit(requests_per_minute=5, requests_per_hour=30)
|
||||
def login():
|
||||
"""User login endpoint"""
|
||||
try:
|
||||
data = request.get_json()
|
||||
|
||||
# Validate input
|
||||
username_or_email = data.get('username') or data.get('email')
|
||||
password = data.get('password')
|
||||
|
||||
if not username_or_email or not password:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Username/email and password required'
|
||||
}), 400
|
||||
|
||||
# Authenticate user
|
||||
user, error = authenticate_user(username_or_email, password)
|
||||
if error:
|
||||
# Log failed attempt
|
||||
login_record = LoginHistory(
|
||||
user_id=None,
|
||||
login_method='password',
|
||||
success=False,
|
||||
failure_reason=error,
|
||||
ip_address=request.remote_addr,
|
||||
user_agent=request.headers.get('User-Agent')
|
||||
)
|
||||
db.session.add(login_record)
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': error
|
||||
}), 401
|
||||
|
||||
# Create session
|
||||
session = create_user_session(user, {
|
||||
'ip_address': request.remote_addr,
|
||||
'user_agent': request.headers.get('User-Agent')
|
||||
})
|
||||
|
||||
# Create tokens
|
||||
tokens = create_tokens(user, session.session_id)
|
||||
|
||||
# Note: We can't get JWT payload here since we haven't set the JWT context yet
|
||||
# The session JTI will be updated on the next authenticated request
|
||||
db.session.commit()
|
||||
|
||||
# Log successful login with request info
|
||||
login_record = LoginHistory(
|
||||
user_id=user.id,
|
||||
login_method='password',
|
||||
success=True,
|
||||
session_id=session.session_id,
|
||||
ip_address=request.remote_addr,
|
||||
user_agent=request.headers.get('User-Agent')
|
||||
)
|
||||
db.session.add(login_record)
|
||||
db.session.commit()
|
||||
|
||||
# Store user info in Flask session for web access
|
||||
from flask import session as flask_session
|
||||
flask_session['user_id'] = str(user.id)
|
||||
flask_session['username'] = user.username
|
||||
flask_session['user_role'] = user.role
|
||||
flask_session['logged_in'] = True
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'user': user.to_dict(),
|
||||
'tokens': tokens,
|
||||
'session_id': session.session_id
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Login error")
|
||||
# In development, show the actual error
|
||||
import os
|
||||
if os.environ.get('FLASK_ENV') == 'development':
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': f'Login failed: {str(e)}'
|
||||
}), 500
|
||||
else:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Login failed'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/logout', methods=['POST'])
|
||||
@jwt_required()
|
||||
def logout():
|
||||
"""User logout endpoint"""
|
||||
try:
|
||||
jti = get_jwt()["jti"]
|
||||
user_id = get_jwt_identity()
|
||||
|
||||
# Revoke the access token
|
||||
revoke_token(jti, 'access', user_id, 'User logout')
|
||||
|
||||
# Update login history
|
||||
session_id = get_jwt().get('session_id')
|
||||
if session_id:
|
||||
login_record = LoginHistory.query.filter_by(
|
||||
session_id=session_id,
|
||||
logout_at=None
|
||||
).first()
|
||||
if login_record:
|
||||
login_record.logout_at = datetime.utcnow()
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'message': 'Successfully logged out'
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Logout error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Logout failed'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/refresh', methods=['POST'])
|
||||
@jwt_required(refresh=True)
|
||||
def refresh_token():
|
||||
"""Refresh access token"""
|
||||
try:
|
||||
user_id = get_jwt_identity()
|
||||
user = User.query.get(user_id)
|
||||
|
||||
if not user or not user.is_active:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Invalid user'
|
||||
}), 401
|
||||
|
||||
# Check if user can login
|
||||
can_login, reason = user.can_login()
|
||||
if not can_login:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': reason
|
||||
}), 401
|
||||
|
||||
# Create new access token
|
||||
session_id = get_jwt().get('session_id')
|
||||
tokens = create_tokens(user, session_id)
|
||||
|
||||
# Update session if exists
|
||||
if session_id:
|
||||
session = UserSession.query.filter_by(session_id=session_id).first()
|
||||
if session:
|
||||
session.refresh()
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'access_token': tokens['access_token'],
|
||||
'expires_in': tokens['expires_in']
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Token refresh error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Token refresh failed'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/profile', methods=['GET'])
|
||||
@require_auth
|
||||
def get_profile():
|
||||
"""Get current user profile"""
|
||||
try:
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'user': g.current_user.to_dict(include_sensitive=True)
|
||||
})
|
||||
except Exception as e:
|
||||
log_exception(e, "Profile fetch error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to fetch profile'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/profile', methods=['PUT'])
|
||||
@require_auth
|
||||
def update_profile():
|
||||
"""Update user profile"""
|
||||
try:
|
||||
data = request.get_json()
|
||||
user = g.current_user
|
||||
|
||||
# Update allowed fields
|
||||
if 'full_name' in data:
|
||||
user.full_name = Validators.sanitize_text(data['full_name'], max_length=255)
|
||||
|
||||
if 'avatar_url' in data:
|
||||
validated_url = Validators.validate_url(data['avatar_url'])
|
||||
if validated_url:
|
||||
user.avatar_url = validated_url
|
||||
|
||||
if 'settings' in data and isinstance(data['settings'], dict):
|
||||
user.settings = {**user.settings, **data['settings']}
|
||||
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'user': user.to_dict(include_sensitive=True)
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Profile update error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to update profile'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/change-password', methods=['POST'])
|
||||
@require_auth
|
||||
def change_password():
|
||||
"""Change user password"""
|
||||
try:
|
||||
data = request.get_json()
|
||||
user = g.current_user
|
||||
|
||||
current_password = data.get('current_password')
|
||||
new_password = data.get('new_password')
|
||||
|
||||
if not current_password or not new_password:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Current and new passwords required'
|
||||
}), 400
|
||||
|
||||
# Verify current password
|
||||
if not user.check_password(current_password):
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Invalid current password'
|
||||
}), 401
|
||||
|
||||
# Validate new password
|
||||
if len(new_password) < 8:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Password must be at least 8 characters'
|
||||
}), 400
|
||||
|
||||
# Update password
|
||||
user.set_password(new_password)
|
||||
db.session.commit()
|
||||
|
||||
# Revoke all sessions except current
|
||||
session_id = get_jwt().get('session_id') if hasattr(g, 'jwt_payload') else None
|
||||
revoked_count = revoke_user_sessions(user.id, except_session=session_id)
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'message': 'Password changed successfully',
|
||||
'revoked_sessions': revoked_count
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Password change error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to change password'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/regenerate-api-key', methods=['POST'])
|
||||
@require_auth
|
||||
def regenerate_api_key():
|
||||
"""Regenerate user's API key"""
|
||||
try:
|
||||
user = g.current_user
|
||||
new_key = user.regenerate_api_key()
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'api_key': new_key,
|
||||
'created_at': user.api_key_created_at.isoformat()
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "API key regeneration error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to regenerate API key'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/sessions', methods=['GET'])
|
||||
@require_auth
|
||||
def get_user_sessions():
|
||||
"""Get user's active sessions"""
|
||||
try:
|
||||
sessions = UserSession.query.filter_by(
|
||||
user_id=g.current_user.id
|
||||
).filter(
|
||||
UserSession.expires_at > datetime.utcnow()
|
||||
).order_by(UserSession.last_active_at.desc()).all()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'sessions': [s.to_dict() for s in sessions]
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Sessions fetch error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to fetch sessions'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/sessions/<session_id>', methods=['DELETE'])
|
||||
@require_auth
|
||||
def revoke_session(session_id):
|
||||
"""Revoke a specific session"""
|
||||
try:
|
||||
session = UserSession.query.filter_by(
|
||||
session_id=session_id,
|
||||
user_id=g.current_user.id
|
||||
).first()
|
||||
|
||||
if not session:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Session not found'
|
||||
}), 404
|
||||
|
||||
# Revoke tokens
|
||||
if session.access_token_jti:
|
||||
revoke_token(session.access_token_jti, 'access', g.current_user.id, 'Session revoked by user')
|
||||
if session.refresh_token_jti:
|
||||
revoke_token(session.refresh_token_jti, 'refresh', g.current_user.id, 'Session revoked by user')
|
||||
|
||||
# Delete session
|
||||
db.session.delete(session)
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'message': 'Session revoked successfully'
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Session revocation error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to revoke session'
|
||||
}), 500
|
||||
|
||||
|
||||
# Admin endpoints for user management
|
||||
|
||||
@auth_bp.route('/admin/users', methods=['GET'])
|
||||
@require_admin
|
||||
def admin_list_users():
|
||||
"""List all users (admin only)"""
|
||||
try:
|
||||
# Get query parameters
|
||||
page = request.args.get('page', 1, type=int)
|
||||
per_page = request.args.get('per_page', 20, type=int)
|
||||
search = request.args.get('search', '')
|
||||
role = request.args.get('role')
|
||||
status = request.args.get('status')
|
||||
sort_by = request.args.get('sort_by', 'created_at')
|
||||
sort_order = request.args.get('sort_order', 'desc')
|
||||
|
||||
# Build query
|
||||
query = User.query
|
||||
|
||||
# Search filter
|
||||
if search:
|
||||
search_term = f'%{search}%'
|
||||
query = query.filter(or_(
|
||||
User.email.ilike(search_term),
|
||||
User.username.ilike(search_term),
|
||||
User.full_name.ilike(search_term)
|
||||
))
|
||||
|
||||
# Role filter
|
||||
if role:
|
||||
query = query.filter(User.role == role)
|
||||
|
||||
# Status filter
|
||||
if status == 'active':
|
||||
query = query.filter(User.is_active == True, User.is_suspended == False)
|
||||
elif status == 'suspended':
|
||||
query = query.filter(User.is_suspended == True)
|
||||
elif status == 'inactive':
|
||||
query = query.filter(User.is_active == False)
|
||||
|
||||
# Sorting
|
||||
order_column = getattr(User, sort_by, User.created_at)
|
||||
if sort_order == 'desc':
|
||||
query = query.order_by(order_column.desc())
|
||||
else:
|
||||
query = query.order_by(order_column.asc())
|
||||
|
||||
# Paginate
|
||||
pagination = query.paginate(page=page, per_page=per_page, error_out=False)
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'users': [u.to_dict(include_sensitive=True) for u in pagination.items],
|
||||
'pagination': {
|
||||
'page': pagination.page,
|
||||
'per_page': pagination.per_page,
|
||||
'total': pagination.total,
|
||||
'pages': pagination.pages
|
||||
}
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Admin user list error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to fetch users'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/admin/users', methods=['POST'])
|
||||
@require_admin
|
||||
def admin_create_user():
|
||||
"""Create a new user (admin only)"""
|
||||
try:
|
||||
data = request.get_json()
|
||||
|
||||
# Validate required fields
|
||||
email = data.get('email')
|
||||
username = data.get('username')
|
||||
password = data.get('password')
|
||||
|
||||
if not email or not username or not password:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Email, username, and password are required'
|
||||
}), 400
|
||||
|
||||
# Validate email
|
||||
if not Validators.validate_email(email):
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Invalid email address'
|
||||
}), 400
|
||||
|
||||
# Create user
|
||||
user, error = create_user(
|
||||
email=email,
|
||||
username=username,
|
||||
password=password,
|
||||
full_name=data.get('full_name'),
|
||||
role=data.get('role', 'user'),
|
||||
is_verified=data.get('is_verified', False)
|
||||
)
|
||||
|
||||
if error:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': error
|
||||
}), 400
|
||||
|
||||
# Set additional properties
|
||||
if 'rate_limit_per_minute' in data:
|
||||
user.rate_limit_per_minute = data['rate_limit_per_minute']
|
||||
if 'rate_limit_per_hour' in data:
|
||||
user.rate_limit_per_hour = data['rate_limit_per_hour']
|
||||
if 'rate_limit_per_day' in data:
|
||||
user.rate_limit_per_day = data['rate_limit_per_day']
|
||||
if 'permissions' in data:
|
||||
user.permissions = data['permissions']
|
||||
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'user': user.to_dict(include_sensitive=True)
|
||||
}), 201
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Admin user creation error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to create user'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/admin/users/<user_id>', methods=['GET'])
|
||||
@require_admin
|
||||
def admin_get_user(user_id):
|
||||
"""Get user details (admin only)"""
|
||||
try:
|
||||
user = User.query.get(user_id)
|
||||
if not user:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'User not found'
|
||||
}), 404
|
||||
|
||||
# Get additional info
|
||||
login_history = LoginHistory.query.filter_by(
|
||||
user_id=user.id
|
||||
).order_by(LoginHistory.login_at.desc()).limit(10).all()
|
||||
|
||||
active_sessions = UserSession.query.filter_by(
|
||||
user_id=user.id
|
||||
).filter(
|
||||
UserSession.expires_at > datetime.utcnow()
|
||||
).all()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'user': user.to_dict(include_sensitive=True),
|
||||
'login_history': [l.to_dict() for l in login_history],
|
||||
'active_sessions': [s.to_dict() for s in active_sessions]
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Admin user fetch error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to fetch user'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/admin/users/<user_id>', methods=['PUT'])
|
||||
@require_admin
|
||||
def admin_update_user(user_id):
|
||||
"""Update user (admin only)"""
|
||||
try:
|
||||
user = User.query.get(user_id)
|
||||
if not user:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'User not found'
|
||||
}), 404
|
||||
|
||||
data = request.get_json()
|
||||
|
||||
# Update allowed fields
|
||||
if 'email' in data:
|
||||
if Validators.validate_email(data['email']):
|
||||
user.email = data['email']
|
||||
|
||||
if 'username' in data:
|
||||
user.username = data['username']
|
||||
|
||||
if 'full_name' in data:
|
||||
user.full_name = data['full_name']
|
||||
|
||||
if 'role' in data and data['role'] in ['admin', 'user']:
|
||||
user.role = data['role']
|
||||
|
||||
if 'is_active' in data:
|
||||
user.is_active = data['is_active']
|
||||
|
||||
if 'is_verified' in data:
|
||||
user.is_verified = data['is_verified']
|
||||
|
||||
if 'permissions' in data:
|
||||
user.permissions = data['permissions']
|
||||
|
||||
if 'rate_limit_per_minute' in data:
|
||||
user.rate_limit_per_minute = data['rate_limit_per_minute']
|
||||
|
||||
if 'rate_limit_per_hour' in data:
|
||||
user.rate_limit_per_hour = data['rate_limit_per_hour']
|
||||
|
||||
if 'rate_limit_per_day' in data:
|
||||
user.rate_limit_per_day = data['rate_limit_per_day']
|
||||
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'user': user.to_dict(include_sensitive=True)
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Admin user update error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to update user'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/admin/users/<user_id>', methods=['DELETE'])
|
||||
@require_admin
|
||||
def admin_delete_user(user_id):
|
||||
"""Delete user (admin only)"""
|
||||
try:
|
||||
user = User.query.get(user_id)
|
||||
if not user:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'User not found'
|
||||
}), 404
|
||||
|
||||
# Don't allow deleting admin users
|
||||
if user.is_admin:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Cannot delete admin users'
|
||||
}), 403
|
||||
|
||||
# Revoke all sessions
|
||||
revoke_user_sessions(user.id)
|
||||
|
||||
# Delete user (cascades to related records)
|
||||
db.session.delete(user)
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'message': 'User deleted successfully'
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Admin user deletion error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to delete user'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/admin/users/<user_id>/suspend', methods=['POST'])
|
||||
@require_admin
|
||||
def admin_suspend_user(user_id):
|
||||
"""Suspend user account (admin only)"""
|
||||
try:
|
||||
user = User.query.get(user_id)
|
||||
if not user:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'User not found'
|
||||
}), 404
|
||||
|
||||
data = request.get_json()
|
||||
reason = data.get('reason', 'Policy violation')
|
||||
until = data.get('until') # ISO datetime string or None for indefinite
|
||||
|
||||
# Parse until date if provided
|
||||
suspend_until = None
|
||||
if until:
|
||||
try:
|
||||
suspend_until = datetime.fromisoformat(until.replace('Z', '+00:00'))
|
||||
except:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Invalid date format for until'
|
||||
}), 400
|
||||
|
||||
# Suspend user
|
||||
user.suspend(reason, suspend_until)
|
||||
|
||||
# Revoke all sessions
|
||||
revoked_count = revoke_user_sessions(user.id)
|
||||
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'message': 'User suspended successfully',
|
||||
'revoked_sessions': revoked_count,
|
||||
'suspended_until': suspend_until.isoformat() if suspend_until else None
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Admin user suspension error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to suspend user'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/admin/users/<user_id>/unsuspend', methods=['POST'])
|
||||
@require_admin
|
||||
def admin_unsuspend_user(user_id):
|
||||
"""Unsuspend user account (admin only)"""
|
||||
try:
|
||||
user = User.query.get(user_id)
|
||||
if not user:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'User not found'
|
||||
}), 404
|
||||
|
||||
user.unsuspend()
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'message': 'User unsuspended successfully'
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Admin user unsuspension error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to unsuspend user'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/admin/users/<user_id>/reset-password', methods=['POST'])
|
||||
@require_admin
|
||||
def admin_reset_password(user_id):
|
||||
"""Reset user password (admin only)"""
|
||||
try:
|
||||
user = User.query.get(user_id)
|
||||
if not user:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'User not found'
|
||||
}), 404
|
||||
|
||||
data = request.get_json()
|
||||
new_password = data.get('password')
|
||||
|
||||
if not new_password or len(new_password) < 8:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Password must be at least 8 characters'
|
||||
}), 400
|
||||
|
||||
# Reset password
|
||||
user.set_password(new_password)
|
||||
user.failed_login_attempts = 0
|
||||
user.locked_until = None
|
||||
|
||||
# Revoke all sessions
|
||||
revoked_count = revoke_user_sessions(user.id)
|
||||
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'message': 'Password reset successfully',
|
||||
'revoked_sessions': revoked_count
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Admin password reset error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to reset password'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/admin/users/bulk', methods=['POST'])
|
||||
@require_admin
|
||||
def admin_bulk_operation():
|
||||
"""Perform bulk operations on users (admin only)"""
|
||||
try:
|
||||
data = request.get_json()
|
||||
user_ids = data.get('user_ids', [])
|
||||
operation = data.get('operation')
|
||||
|
||||
if not user_ids or not operation:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'User IDs and operation required'
|
||||
}), 400
|
||||
|
||||
# Get users
|
||||
users = User.query.filter(User.id.in_(user_ids)).all()
|
||||
|
||||
if not users:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'No users found'
|
||||
}), 404
|
||||
|
||||
results = {
|
||||
'success': 0,
|
||||
'failed': 0,
|
||||
'errors': []
|
||||
}
|
||||
|
||||
for user in users:
|
||||
try:
|
||||
if operation == 'suspend':
|
||||
user.suspend(data.get('reason', 'Bulk suspension'))
|
||||
revoke_user_sessions(user.id)
|
||||
elif operation == 'unsuspend':
|
||||
user.unsuspend()
|
||||
elif operation == 'activate':
|
||||
user.is_active = True
|
||||
elif operation == 'deactivate':
|
||||
user.is_active = False
|
||||
revoke_user_sessions(user.id)
|
||||
elif operation == 'verify':
|
||||
user.is_verified = True
|
||||
elif operation == 'unverify':
|
||||
user.is_verified = False
|
||||
elif operation == 'delete':
|
||||
if not user.is_admin:
|
||||
revoke_user_sessions(user.id)
|
||||
db.session.delete(user)
|
||||
else:
|
||||
results['errors'].append(f"Cannot delete admin user {user.username}")
|
||||
results['failed'] += 1
|
||||
continue
|
||||
else:
|
||||
results['errors'].append(f"Unknown operation for user {user.username}")
|
||||
results['failed'] += 1
|
||||
continue
|
||||
|
||||
results['success'] += 1
|
||||
except Exception as e:
|
||||
results['errors'].append(f"Failed for user {user.username}: {str(e)}")
|
||||
results['failed'] += 1
|
||||
|
||||
db.session.commit()
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'results': results
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Admin bulk operation error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to perform bulk operation'
|
||||
}), 500
|
||||
|
||||
|
||||
@auth_bp.route('/admin/stats/users', methods=['GET'])
|
||||
@require_admin
|
||||
def admin_user_stats():
|
||||
"""Get user statistics (admin only)"""
|
||||
try:
|
||||
stats = {
|
||||
'total_users': User.query.count(),
|
||||
'active_users': User.query.filter(
|
||||
User.is_active == True,
|
||||
User.is_suspended == False
|
||||
).count(),
|
||||
'suspended_users': User.query.filter(User.is_suspended == True).count(),
|
||||
'verified_users': User.query.filter(User.is_verified == True).count(),
|
||||
'admin_users': User.query.filter(User.role == 'admin').count(),
|
||||
'users_by_role': dict(
|
||||
db.session.query(User.role, func.count(User.id))
|
||||
.group_by(User.role).all()
|
||||
),
|
||||
'recent_registrations': User.query.filter(
|
||||
User.created_at >= datetime.utcnow() - timedelta(days=7)
|
||||
).count(),
|
||||
'active_sessions': UserSession.query.filter(
|
||||
UserSession.expires_at > datetime.utcnow()
|
||||
).count()
|
||||
}
|
||||
|
||||
return jsonify({
|
||||
'success': True,
|
||||
'stats': stats
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
log_exception(e, "Admin stats error")
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Failed to fetch statistics'
|
||||
}), 500
|
48
check_services.py
Normal file
48
check_services.py
Normal file
@ -0,0 +1,48 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Check if Redis and PostgreSQL are available
|
||||
"""
|
||||
import os
|
||||
import sys
|
||||
from dotenv import load_dotenv
|
||||
|
||||
# Load environment variables
|
||||
load_dotenv()
|
||||
|
||||
def check_redis():
|
||||
"""Check if Redis is available"""
|
||||
try:
|
||||
import redis
|
||||
r = redis.Redis.from_url(os.environ.get('REDIS_URL', 'redis://localhost:6379/0'))
|
||||
r.ping()
|
||||
return True
|
||||
except:
|
||||
return False
|
||||
|
||||
def check_postgresql():
|
||||
"""Check if PostgreSQL is available"""
|
||||
try:
|
||||
from sqlalchemy import create_engine, text
|
||||
db_url = os.environ.get('DATABASE_URL', 'postgresql://localhost/talk2me')
|
||||
engine = create_engine(db_url, pool_pre_ping=True)
|
||||
with engine.connect() as conn:
|
||||
result = conn.execute(text("SELECT 1"))
|
||||
result.fetchone()
|
||||
return True
|
||||
except Exception as e:
|
||||
print(f"PostgreSQL connection error: {e}")
|
||||
return False
|
||||
|
||||
if __name__ == '__main__':
|
||||
redis_ok = check_redis()
|
||||
postgres_ok = check_postgresql()
|
||||
|
||||
print(f"Redis: {'✓ Available' if redis_ok else '✗ Not available'}")
|
||||
print(f"PostgreSQL: {'✓ Available' if postgres_ok else '✗ Not available'}")
|
||||
|
||||
if redis_ok and postgres_ok:
|
||||
print("\nAll services available - use full admin module")
|
||||
sys.exit(0)
|
||||
else:
|
||||
print("\nSome services missing - use simple admin module")
|
||||
sys.exit(1)
|
16
config.py
16
config.py
@ -46,13 +46,23 @@ class Config:
|
||||
self.ADMIN_TOKEN = self._get_secret('ADMIN_TOKEN',
|
||||
os.environ.get('ADMIN_TOKEN', 'default-admin-token'))
|
||||
|
||||
# Database configuration (for future use)
|
||||
# Database configuration
|
||||
self.DATABASE_URL = self._get_secret('DATABASE_URL',
|
||||
os.environ.get('DATABASE_URL', 'sqlite:///talk2me.db'))
|
||||
os.environ.get('DATABASE_URL', 'postgresql://localhost/talk2me'))
|
||||
self.SQLALCHEMY_DATABASE_URI = self.DATABASE_URL
|
||||
self.SQLALCHEMY_TRACK_MODIFICATIONS = False
|
||||
self.SQLALCHEMY_ENGINE_OPTIONS = {
|
||||
'pool_size': 10,
|
||||
'pool_recycle': 3600,
|
||||
'pool_pre_ping': True
|
||||
}
|
||||
|
||||
# Redis configuration (for future use)
|
||||
# Redis configuration
|
||||
self.REDIS_URL = self._get_secret('REDIS_URL',
|
||||
os.environ.get('REDIS_URL', 'redis://localhost:6379/0'))
|
||||
self.REDIS_DECODE_RESPONSES = False
|
||||
self.REDIS_MAX_CONNECTIONS = int(os.environ.get('REDIS_MAX_CONNECTIONS', 50))
|
||||
self.REDIS_SOCKET_TIMEOUT = int(os.environ.get('REDIS_SOCKET_TIMEOUT', 5))
|
||||
|
||||
# Whisper configuration
|
||||
self.WHISPER_MODEL_SIZE = os.environ.get('WHISPER_MODEL_SIZE', 'base')
|
||||
|
268
database.py
Normal file
268
database.py
Normal file
@ -0,0 +1,268 @@
|
||||
# Database models and configuration for Talk2Me application
|
||||
import os
|
||||
from datetime import datetime
|
||||
from typing import Optional, Dict, Any
|
||||
from flask_sqlalchemy import SQLAlchemy
|
||||
from sqlalchemy import Index, text
|
||||
from sqlalchemy.dialects.postgresql import UUID, JSONB
|
||||
from sqlalchemy.ext.hybrid import hybrid_property
|
||||
import uuid
|
||||
|
||||
db = SQLAlchemy()
|
||||
|
||||
class Translation(db.Model):
|
||||
"""Store translation history for analytics and caching"""
|
||||
__tablename__ = 'translations'
|
||||
|
||||
id = db.Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
|
||||
session_id = db.Column(db.String(255), nullable=False, index=True)
|
||||
user_id = db.Column(db.String(255), nullable=True, index=True)
|
||||
|
||||
# Translation data
|
||||
source_text = db.Column(db.Text, nullable=False)
|
||||
source_language = db.Column(db.String(10), nullable=False)
|
||||
target_text = db.Column(db.Text, nullable=False)
|
||||
target_language = db.Column(db.String(10), nullable=False)
|
||||
|
||||
# Metadata
|
||||
translation_time_ms = db.Column(db.Integer, nullable=True)
|
||||
model_used = db.Column(db.String(50), default='gemma3:27b')
|
||||
confidence_score = db.Column(db.Float, nullable=True)
|
||||
|
||||
# Timestamps
|
||||
created_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
accessed_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
access_count = db.Column(db.Integer, default=1)
|
||||
|
||||
# Client info
|
||||
ip_address = db.Column(db.String(45), nullable=True)
|
||||
user_agent = db.Column(db.String(500), nullable=True)
|
||||
|
||||
# Create indexes for better query performance
|
||||
__table_args__ = (
|
||||
Index('idx_translations_languages', 'source_language', 'target_language'),
|
||||
Index('idx_translations_created_at', 'created_at'),
|
||||
Index('idx_translations_session_user', 'session_id', 'user_id'),
|
||||
)
|
||||
|
||||
def to_dict(self) -> Dict[str, Any]:
|
||||
"""Convert translation to dictionary"""
|
||||
return {
|
||||
'id': str(self.id),
|
||||
'session_id': self.session_id,
|
||||
'user_id': self.user_id,
|
||||
'source_text': self.source_text,
|
||||
'source_language': self.source_language,
|
||||
'target_text': self.target_text,
|
||||
'target_language': self.target_language,
|
||||
'translation_time_ms': self.translation_time_ms,
|
||||
'model_used': self.model_used,
|
||||
'confidence_score': self.confidence_score,
|
||||
'created_at': self.created_at.isoformat(),
|
||||
'accessed_at': self.accessed_at.isoformat(),
|
||||
'access_count': self.access_count
|
||||
}
|
||||
|
||||
|
||||
class Transcription(db.Model):
|
||||
"""Store transcription history"""
|
||||
__tablename__ = 'transcriptions'
|
||||
|
||||
id = db.Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
|
||||
session_id = db.Column(db.String(255), nullable=False, index=True)
|
||||
user_id = db.Column(db.String(255), nullable=True, index=True)
|
||||
|
||||
# Transcription data
|
||||
transcribed_text = db.Column(db.Text, nullable=False)
|
||||
detected_language = db.Column(db.String(10), nullable=True)
|
||||
audio_duration_seconds = db.Column(db.Float, nullable=True)
|
||||
|
||||
# Metadata
|
||||
transcription_time_ms = db.Column(db.Integer, nullable=True)
|
||||
model_used = db.Column(db.String(50), default='whisper-base')
|
||||
confidence_score = db.Column(db.Float, nullable=True)
|
||||
|
||||
# File info
|
||||
audio_file_size = db.Column(db.Integer, nullable=True)
|
||||
audio_format = db.Column(db.String(10), nullable=True)
|
||||
|
||||
# Timestamps
|
||||
created_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
|
||||
# Client info
|
||||
ip_address = db.Column(db.String(45), nullable=True)
|
||||
user_agent = db.Column(db.String(500), nullable=True)
|
||||
|
||||
__table_args__ = (
|
||||
Index('idx_transcriptions_created_at', 'created_at'),
|
||||
Index('idx_transcriptions_session_user', 'session_id', 'user_id'),
|
||||
)
|
||||
|
||||
def to_dict(self) -> Dict[str, Any]:
|
||||
"""Convert transcription to dictionary"""
|
||||
return {
|
||||
'id': str(self.id),
|
||||
'session_id': self.session_id,
|
||||
'user_id': self.user_id,
|
||||
'transcribed_text': self.transcribed_text,
|
||||
'detected_language': self.detected_language,
|
||||
'audio_duration_seconds': self.audio_duration_seconds,
|
||||
'transcription_time_ms': self.transcription_time_ms,
|
||||
'model_used': self.model_used,
|
||||
'confidence_score': self.confidence_score,
|
||||
'audio_file_size': self.audio_file_size,
|
||||
'audio_format': self.audio_format,
|
||||
'created_at': self.created_at.isoformat()
|
||||
}
|
||||
|
||||
|
||||
class UserPreferences(db.Model):
|
||||
"""Store user preferences and settings"""
|
||||
__tablename__ = 'user_preferences'
|
||||
|
||||
id = db.Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
|
||||
user_id = db.Column(db.String(255), nullable=False, unique=True, index=True)
|
||||
session_id = db.Column(db.String(255), nullable=True)
|
||||
|
||||
# Preferences
|
||||
preferred_source_language = db.Column(db.String(10), nullable=True)
|
||||
preferred_target_language = db.Column(db.String(10), nullable=True)
|
||||
preferred_voice = db.Column(db.String(50), nullable=True)
|
||||
speech_speed = db.Column(db.Float, default=1.0)
|
||||
|
||||
# Settings stored as JSONB for flexibility
|
||||
settings = db.Column(JSONB, default={})
|
||||
|
||||
# Usage stats
|
||||
total_translations = db.Column(db.Integer, default=0)
|
||||
total_transcriptions = db.Column(db.Integer, default=0)
|
||||
total_tts_requests = db.Column(db.Integer, default=0)
|
||||
|
||||
# Timestamps
|
||||
created_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
updated_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow, onupdate=datetime.utcnow)
|
||||
last_active_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
|
||||
def to_dict(self) -> Dict[str, Any]:
|
||||
"""Convert preferences to dictionary"""
|
||||
return {
|
||||
'id': str(self.id),
|
||||
'user_id': self.user_id,
|
||||
'preferred_source_language': self.preferred_source_language,
|
||||
'preferred_target_language': self.preferred_target_language,
|
||||
'preferred_voice': self.preferred_voice,
|
||||
'speech_speed': self.speech_speed,
|
||||
'settings': self.settings or {},
|
||||
'total_translations': self.total_translations,
|
||||
'total_transcriptions': self.total_transcriptions,
|
||||
'total_tts_requests': self.total_tts_requests,
|
||||
'created_at': self.created_at.isoformat(),
|
||||
'updated_at': self.updated_at.isoformat(),
|
||||
'last_active_at': self.last_active_at.isoformat()
|
||||
}
|
||||
|
||||
|
||||
class UsageAnalytics(db.Model):
|
||||
"""Store aggregated usage analytics"""
|
||||
__tablename__ = 'usage_analytics'
|
||||
|
||||
id = db.Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
|
||||
|
||||
# Time period
|
||||
date = db.Column(db.Date, nullable=False, index=True)
|
||||
hour = db.Column(db.Integer, nullable=True) # 0-23, null for daily aggregates
|
||||
|
||||
# Metrics
|
||||
total_requests = db.Column(db.Integer, default=0)
|
||||
unique_sessions = db.Column(db.Integer, default=0)
|
||||
unique_users = db.Column(db.Integer, default=0)
|
||||
|
||||
# Service breakdown
|
||||
transcriptions = db.Column(db.Integer, default=0)
|
||||
translations = db.Column(db.Integer, default=0)
|
||||
tts_requests = db.Column(db.Integer, default=0)
|
||||
|
||||
# Performance metrics
|
||||
avg_transcription_time_ms = db.Column(db.Float, nullable=True)
|
||||
avg_translation_time_ms = db.Column(db.Float, nullable=True)
|
||||
avg_tts_time_ms = db.Column(db.Float, nullable=True)
|
||||
|
||||
# Language stats (stored as JSONB)
|
||||
language_pairs = db.Column(JSONB, default={}) # {"en-es": 100, "fr-en": 50}
|
||||
detected_languages = db.Column(JSONB, default={}) # {"en": 150, "es": 100}
|
||||
|
||||
# Error stats
|
||||
error_count = db.Column(db.Integer, default=0)
|
||||
error_details = db.Column(JSONB, default={})
|
||||
|
||||
__table_args__ = (
|
||||
Index('idx_analytics_date_hour', 'date', 'hour'),
|
||||
db.UniqueConstraint('date', 'hour', name='uq_analytics_date_hour'),
|
||||
)
|
||||
|
||||
|
||||
class ApiKey(db.Model):
|
||||
"""Store API keys for authenticated access"""
|
||||
__tablename__ = 'api_keys'
|
||||
|
||||
id = db.Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
|
||||
key_hash = db.Column(db.String(255), nullable=False, unique=True, index=True)
|
||||
name = db.Column(db.String(100), nullable=False)
|
||||
user_id = db.Column(db.String(255), nullable=True)
|
||||
|
||||
# Permissions and limits
|
||||
is_active = db.Column(db.Boolean, default=True)
|
||||
rate_limit_per_minute = db.Column(db.Integer, default=60)
|
||||
rate_limit_per_hour = db.Column(db.Integer, default=1000)
|
||||
allowed_endpoints = db.Column(JSONB, default=[]) # Empty = all endpoints
|
||||
|
||||
# Usage tracking
|
||||
total_requests = db.Column(db.Integer, default=0)
|
||||
last_used_at = db.Column(db.DateTime, nullable=True)
|
||||
|
||||
# Timestamps
|
||||
created_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
|
||||
expires_at = db.Column(db.DateTime, nullable=True)
|
||||
|
||||
@hybrid_property
|
||||
def is_expired(self):
|
||||
"""Check if API key is expired"""
|
||||
if self.expires_at is None:
|
||||
return False
|
||||
return datetime.utcnow() > self.expires_at
|
||||
|
||||
|
||||
def init_db(app):
|
||||
"""Initialize database with app"""
|
||||
db.init_app(app)
|
||||
|
||||
with app.app_context():
|
||||
# Create tables if they don't exist
|
||||
db.create_all()
|
||||
|
||||
# Create any custom indexes or functions
|
||||
try:
|
||||
# Create a function for updating updated_at timestamp
|
||||
db.session.execute(text("""
|
||||
CREATE OR REPLACE FUNCTION update_updated_at_column()
|
||||
RETURNS TRIGGER AS $$
|
||||
BEGIN
|
||||
NEW.updated_at = NOW();
|
||||
RETURN NEW;
|
||||
END;
|
||||
$$ language 'plpgsql';
|
||||
"""))
|
||||
|
||||
# Create trigger for user_preferences
|
||||
db.session.execute(text("""
|
||||
CREATE TRIGGER update_user_preferences_updated_at
|
||||
BEFORE UPDATE ON user_preferences
|
||||
FOR EACH ROW
|
||||
EXECUTE FUNCTION update_updated_at_column();
|
||||
"""))
|
||||
|
||||
db.session.commit()
|
||||
except Exception as e:
|
||||
# Triggers might already exist
|
||||
db.session.rollback()
|
||||
app.logger.debug(f"Database initialization note: {e}")
|
135
database_init.py
Normal file
135
database_init.py
Normal file
@ -0,0 +1,135 @@
|
||||
#!/usr/bin/env python3
|
||||
# Database initialization script
|
||||
import os
|
||||
import sys
|
||||
import logging
|
||||
from sqlalchemy import create_engine, text
|
||||
from config import get_config
|
||||
|
||||
logging.basicConfig(level=logging.INFO)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
def create_database():
|
||||
"""Create the database if it doesn't exist"""
|
||||
config = get_config()
|
||||
db_url = config.DATABASE_URL
|
||||
|
||||
if db_url.startswith('postgresql'):
|
||||
# Parse database name from URL
|
||||
parts = db_url.split('/')
|
||||
db_name = parts[-1].split('?')[0]
|
||||
base_url = '/'.join(parts[:-1])
|
||||
|
||||
# Connect to postgres database to create our database
|
||||
engine = create_engine(f"{base_url}/postgres")
|
||||
|
||||
try:
|
||||
with engine.connect() as conn:
|
||||
# Check if database exists
|
||||
result = conn.execute(
|
||||
text("SELECT 1 FROM pg_database WHERE datname = :dbname"),
|
||||
{"dbname": db_name}
|
||||
)
|
||||
exists = result.fetchone() is not None
|
||||
|
||||
if not exists:
|
||||
# Create database
|
||||
conn.execute(text(f"CREATE DATABASE {db_name}"))
|
||||
logger.info(f"Database '{db_name}' created successfully")
|
||||
else:
|
||||
logger.info(f"Database '{db_name}' already exists")
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error creating database: {e}")
|
||||
return False
|
||||
finally:
|
||||
engine.dispose()
|
||||
|
||||
return True
|
||||
|
||||
def check_redis():
|
||||
"""Check Redis connectivity"""
|
||||
config = get_config()
|
||||
|
||||
try:
|
||||
import redis
|
||||
r = redis.from_url(config.REDIS_URL)
|
||||
r.ping()
|
||||
logger.info("Redis connection successful")
|
||||
return True
|
||||
except Exception as e:
|
||||
logger.error(f"Redis connection failed: {e}")
|
||||
return False
|
||||
|
||||
def init_database_extensions():
|
||||
"""Initialize PostgreSQL extensions"""
|
||||
config = get_config()
|
||||
|
||||
if not config.DATABASE_URL.startswith('postgresql'):
|
||||
return True
|
||||
|
||||
engine = create_engine(config.DATABASE_URL)
|
||||
|
||||
try:
|
||||
with engine.connect() as conn:
|
||||
# Enable UUID extension
|
||||
conn.execute(text("CREATE EXTENSION IF NOT EXISTS \"uuid-ossp\""))
|
||||
logger.info("PostgreSQL extensions initialized")
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error initializing extensions: {e}")
|
||||
return False
|
||||
finally:
|
||||
engine.dispose()
|
||||
|
||||
return True
|
||||
|
||||
def main():
|
||||
"""Main initialization function"""
|
||||
logger.info("Starting database initialization...")
|
||||
|
||||
# Create database
|
||||
if not create_database():
|
||||
logger.error("Failed to create database")
|
||||
sys.exit(1)
|
||||
|
||||
# Initialize extensions
|
||||
if not init_database_extensions():
|
||||
logger.error("Failed to initialize database extensions")
|
||||
sys.exit(1)
|
||||
|
||||
# Check Redis
|
||||
if not check_redis():
|
||||
logger.warning("Redis not available - caching will be disabled")
|
||||
|
||||
logger.info("Database initialization completed successfully")
|
||||
|
||||
# Create all tables using SQLAlchemy models
|
||||
logger.info("Creating database tables...")
|
||||
try:
|
||||
from flask import Flask
|
||||
from database import db, init_db
|
||||
from config import get_config
|
||||
|
||||
# Import all models to ensure they're registered
|
||||
from auth_models import User, LoginHistory, UserSession, RevokedToken
|
||||
|
||||
# Create Flask app context
|
||||
app = Flask(__name__)
|
||||
config = get_config()
|
||||
app.config.from_mapping(config.__dict__)
|
||||
|
||||
# Initialize database
|
||||
init_db(app)
|
||||
|
||||
with app.app_context():
|
||||
# Create all tables
|
||||
db.create_all()
|
||||
logger.info("Database tables created successfully")
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to create database tables: {e}")
|
||||
sys.exit(1)
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
75
init_all_databases.py
Executable file
75
init_all_databases.py
Executable file
@ -0,0 +1,75 @@
|
||||
#!/usr/bin/env python3
|
||||
"""Initialize all database tables for Talk2Me"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import subprocess
|
||||
import logging
|
||||
from dotenv import load_dotenv
|
||||
|
||||
# Load environment variables
|
||||
load_dotenv()
|
||||
|
||||
logging.basicConfig(level=logging.INFO)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
def run_script(script_name):
|
||||
"""Run a Python script and return success status"""
|
||||
try:
|
||||
logger.info(f"Running {script_name}...")
|
||||
result = subprocess.run([sys.executable, script_name], capture_output=True, text=True)
|
||||
|
||||
if result.returncode == 0:
|
||||
logger.info(f"✓ {script_name} completed successfully")
|
||||
return True
|
||||
else:
|
||||
logger.error(f"✗ {script_name} failed with return code {result.returncode}")
|
||||
if result.stderr:
|
||||
logger.error(f"Error output: {result.stderr}")
|
||||
return False
|
||||
except Exception as e:
|
||||
logger.error(f"✗ Failed to run {script_name}: {e}")
|
||||
return False
|
||||
|
||||
def main():
|
||||
"""Initialize all databases"""
|
||||
logger.info("=== Talk2Me Database Initialization ===")
|
||||
|
||||
# Check if DATABASE_URL is set
|
||||
if not os.environ.get('DATABASE_URL'):
|
||||
logger.error("DATABASE_URL environment variable not set!")
|
||||
logger.info("Please set DATABASE_URL in your .env file")
|
||||
logger.info("Example: DATABASE_URL=postgresql://postgres:password@localhost:5432/talk2me")
|
||||
return False
|
||||
|
||||
logger.info(f"Using database: {os.environ.get('DATABASE_URL')}")
|
||||
|
||||
scripts = [
|
||||
"database_init.py", # Initialize SQLAlchemy models
|
||||
"init_auth_db.py", # Initialize authentication tables
|
||||
"init_analytics_db.py" # Initialize analytics tables
|
||||
]
|
||||
|
||||
success = True
|
||||
for script in scripts:
|
||||
if os.path.exists(script):
|
||||
if not run_script(script):
|
||||
success = False
|
||||
else:
|
||||
logger.warning(f"Script {script} not found, skipping...")
|
||||
|
||||
if success:
|
||||
logger.info("\n✅ All database initialization completed successfully!")
|
||||
logger.info("\nYou can now:")
|
||||
logger.info("1. Create an admin user by calling POST /api/init-admin-user")
|
||||
logger.info("2. Or use the admin token to log in and create users")
|
||||
logger.info("3. Check /api/test-auth to verify authentication is working")
|
||||
else:
|
||||
logger.error("\n❌ Some database initialization steps failed!")
|
||||
logger.info("Please check the errors above and try again")
|
||||
|
||||
return success
|
||||
|
||||
if __name__ == "__main__":
|
||||
success = main()
|
||||
sys.exit(0 if success else 1)
|
72
init_analytics_db.py
Executable file
72
init_analytics_db.py
Executable file
@ -0,0 +1,72 @@
|
||||
#!/usr/bin/env python3
|
||||
"""Initialize analytics database tables"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import psycopg2
|
||||
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT
|
||||
import logging
|
||||
from dotenv import load_dotenv
|
||||
|
||||
# Load environment variables
|
||||
load_dotenv()
|
||||
|
||||
logging.basicConfig(level=logging.INFO)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
def init_analytics_db():
|
||||
"""Initialize analytics database tables"""
|
||||
|
||||
# Get database URL from environment
|
||||
database_url = os.environ.get('DATABASE_URL', 'postgresql://localhost/talk2me')
|
||||
|
||||
try:
|
||||
# Connect to PostgreSQL
|
||||
conn = psycopg2.connect(database_url)
|
||||
conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)
|
||||
cursor = conn.cursor()
|
||||
|
||||
logger.info("Connected to PostgreSQL database")
|
||||
|
||||
# Read SQL file
|
||||
sql_file = os.path.join(os.path.dirname(__file__), 'migrations', 'create_analytics_tables.sql')
|
||||
|
||||
if not os.path.exists(sql_file):
|
||||
logger.error(f"SQL file not found: {sql_file}")
|
||||
return False
|
||||
|
||||
with open(sql_file, 'r') as f:
|
||||
sql_content = f.read()
|
||||
|
||||
# Execute SQL commands
|
||||
logger.info("Creating analytics tables...")
|
||||
cursor.execute(sql_content)
|
||||
|
||||
logger.info("Analytics tables created successfully!")
|
||||
|
||||
# Verify tables were created
|
||||
cursor.execute("""
|
||||
SELECT table_name
|
||||
FROM information_schema.tables
|
||||
WHERE table_schema = 'public'
|
||||
AND table_name IN (
|
||||
'error_logs', 'request_logs', 'translation_logs',
|
||||
'transcription_logs', 'tts_logs', 'daily_stats'
|
||||
)
|
||||
""")
|
||||
|
||||
created_tables = [row[0] for row in cursor.fetchall()]
|
||||
logger.info(f"Created tables: {', '.join(created_tables)}")
|
||||
|
||||
cursor.close()
|
||||
conn.close()
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to initialize analytics database: {e}")
|
||||
return False
|
||||
|
||||
if __name__ == "__main__":
|
||||
success = init_analytics_db()
|
||||
sys.exit(0 if success else 1)
|
149
init_auth_db.py
Normal file
149
init_auth_db.py
Normal file
@ -0,0 +1,149 @@
|
||||
#!/usr/bin/env python3
|
||||
"""Initialize authentication database and create default admin user"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import getpass
|
||||
from flask import Flask
|
||||
from flask_sqlalchemy import SQLAlchemy
|
||||
from dotenv import load_dotenv
|
||||
|
||||
# Load environment variables
|
||||
load_dotenv()
|
||||
|
||||
# Add parent directory to path
|
||||
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))
|
||||
|
||||
from config import init_app as init_config
|
||||
from database import db, init_db
|
||||
from auth_models import User, bcrypt
|
||||
from auth import create_user
|
||||
|
||||
def create_admin_user():
|
||||
"""Create the default admin user"""
|
||||
# Skip if running non-interactively
|
||||
if not sys.stdin.isatty():
|
||||
print("Running non-interactively, skipping interactive admin creation.")
|
||||
return True
|
||||
|
||||
print("\n=== Talk2Me Admin User Setup ===\n")
|
||||
|
||||
# Get admin credentials
|
||||
while True:
|
||||
email = input("Admin email: ").strip()
|
||||
if '@' in email and '.' in email:
|
||||
break
|
||||
print("Please enter a valid email address.")
|
||||
|
||||
while True:
|
||||
username = input("Admin username: ").strip()
|
||||
if len(username) >= 3:
|
||||
break
|
||||
print("Username must be at least 3 characters.")
|
||||
|
||||
while True:
|
||||
password = getpass.getpass("Admin password (min 8 chars): ")
|
||||
if len(password) >= 8:
|
||||
password_confirm = getpass.getpass("Confirm password: ")
|
||||
if password == password_confirm:
|
||||
break
|
||||
print("Passwords don't match. Try again.")
|
||||
else:
|
||||
print("Password must be at least 8 characters.")
|
||||
|
||||
full_name = input("Full name (optional): ").strip() or None
|
||||
|
||||
# Create admin user
|
||||
print("\nCreating admin user...")
|
||||
user, error = create_user(
|
||||
email=email,
|
||||
username=username,
|
||||
password=password,
|
||||
full_name=full_name,
|
||||
role='admin',
|
||||
is_verified=True
|
||||
)
|
||||
|
||||
if error:
|
||||
print(f"Error creating admin: {error}")
|
||||
return False
|
||||
|
||||
# Set higher rate limits for admin
|
||||
user.rate_limit_per_minute = 300
|
||||
user.rate_limit_per_hour = 5000
|
||||
user.rate_limit_per_day = 50000
|
||||
|
||||
# Add all permissions
|
||||
user.permissions = ['all']
|
||||
|
||||
db.session.commit()
|
||||
|
||||
print(f"\n✅ Admin user created successfully!")
|
||||
print(f" Email: {user.email}")
|
||||
print(f" Username: {user.username}")
|
||||
print(f" API Key: {user.api_key}")
|
||||
print(f"\n📝 Save your API key securely. You can use it to authenticate API requests.")
|
||||
print(f"\n🔐 Login at: http://localhost:5005/login")
|
||||
print(f"📊 Admin dashboard: http://localhost:5005/admin/users")
|
||||
|
||||
return True
|
||||
|
||||
|
||||
def init_database():
|
||||
"""Initialize the database with all tables"""
|
||||
# Create Flask app
|
||||
app = Flask(__name__)
|
||||
|
||||
# Initialize configuration
|
||||
init_config(app)
|
||||
|
||||
# Initialize bcrypt
|
||||
bcrypt.init_app(app)
|
||||
|
||||
# Initialize database
|
||||
init_db(app)
|
||||
|
||||
with app.app_context():
|
||||
print("Creating database tables...")
|
||||
|
||||
# Import all models to ensure they're registered
|
||||
from auth_models import User, LoginHistory, UserSession, RevokedToken
|
||||
from database import Translation, Transcription, UserPreferences, UsageAnalytics, ApiKey
|
||||
|
||||
# Create all tables
|
||||
db.create_all()
|
||||
print("✅ Database tables created successfully!")
|
||||
|
||||
# Check if admin user already exists
|
||||
admin_exists = User.query.filter_by(role='admin').first()
|
||||
|
||||
if admin_exists:
|
||||
print(f"\n⚠️ Admin user already exists: {admin_exists.username}")
|
||||
# Skip creating new admin if running non-interactively
|
||||
if not sys.stdin.isatty():
|
||||
print("Running non-interactively, skipping admin user creation.")
|
||||
return
|
||||
create_new = input("Create another admin user? (y/n): ").lower().strip()
|
||||
if create_new != 'y':
|
||||
print("\nExiting without creating new admin.")
|
||||
return
|
||||
|
||||
# Create admin user
|
||||
if not create_admin_user():
|
||||
print("\n❌ Failed to create admin user.")
|
||||
sys.exit(1)
|
||||
|
||||
print("\n✨ Authentication system initialized successfully!")
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
try:
|
||||
init_database()
|
||||
except KeyboardInterrupt:
|
||||
print("\n\nSetup cancelled by user.")
|
||||
sys.exit(0)
|
||||
except Exception as e:
|
||||
print(f"\n❌ Error during setup: {str(e)}")
|
||||
import traceback
|
||||
traceback.print_exc()
|
||||
sys.exit(1)
|
@ -1,403 +1,238 @@
|
||||
# Memory management system to prevent leaks and monitor usage
|
||||
"""Memory management for Talk2Me application"""
|
||||
|
||||
import gc
|
||||
import os
|
||||
import logging
|
||||
import psutil
|
||||
import torch
|
||||
import logging
|
||||
import threading
|
||||
import os
|
||||
import time
|
||||
from typing import Dict, Optional, Callable
|
||||
from dataclasses import dataclass, field
|
||||
from datetime import datetime
|
||||
import weakref
|
||||
import tempfile
|
||||
import shutil
|
||||
from contextlib import contextmanager
|
||||
from functools import wraps
|
||||
from dataclasses import dataclass
|
||||
from typing import Optional, Dict, Any
|
||||
import threading
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@dataclass
|
||||
class MemoryStats:
|
||||
"""Current memory statistics"""
|
||||
timestamp: float = field(default_factory=time.time)
|
||||
process_memory_mb: float = 0.0
|
||||
system_memory_percent: float = 0.0
|
||||
"""Memory statistics"""
|
||||
process_memory_mb: float
|
||||
available_memory_mb: float
|
||||
memory_percent: float
|
||||
gpu_memory_mb: float = 0.0
|
||||
gpu_memory_percent: float = 0.0
|
||||
temp_files_count: int = 0
|
||||
temp_files_size_mb: float = 0.0
|
||||
active_sessions: int = 0
|
||||
gc_collections: Dict[int, int] = field(default_factory=dict)
|
||||
|
||||
class MemoryManager:
|
||||
"""
|
||||
Comprehensive memory management system to prevent leaks
|
||||
"""
|
||||
def __init__(self, app=None, config=None):
|
||||
self.config = config or {}
|
||||
"""Manage memory usage for the application"""
|
||||
|
||||
def __init__(self, app=None, config: Optional[Dict[str, Any]] = None):
|
||||
self.app = app
|
||||
self._cleanup_callbacks = []
|
||||
self._resource_registry = weakref.WeakValueDictionary()
|
||||
self._monitoring_thread = None
|
||||
self._shutdown = False
|
||||
|
||||
# Memory thresholds
|
||||
self.memory_threshold_mb = self.config.get('memory_threshold_mb', 4096) # 4GB
|
||||
self.gpu_memory_threshold_mb = self.config.get('gpu_memory_threshold_mb', 2048) # 2GB
|
||||
self.cleanup_interval = self.config.get('cleanup_interval', 30) # 30 seconds
|
||||
|
||||
# Whisper model reference
|
||||
self.config = config or {}
|
||||
self.memory_threshold_mb = self.config.get('memory_threshold_mb', 4096)
|
||||
self.gpu_memory_threshold_mb = self.config.get('gpu_memory_threshold_mb', 2048)
|
||||
self.cleanup_interval = self.config.get('cleanup_interval', 30)
|
||||
self.whisper_model = None
|
||||
self.model_reload_count = 0
|
||||
self.last_model_reload = time.time()
|
||||
self._cleanup_thread = None
|
||||
self._stop_cleanup = threading.Event()
|
||||
|
||||
if app:
|
||||
self.init_app(app)
|
||||
|
||||
def init_app(self, app):
|
||||
"""Initialize memory management for Flask app"""
|
||||
"""Initialize with Flask app"""
|
||||
self.app = app
|
||||
app.memory_manager = self
|
||||
|
||||
# Start monitoring thread
|
||||
self._start_monitoring()
|
||||
# Start cleanup thread
|
||||
self._start_cleanup_thread()
|
||||
|
||||
# Register cleanup on shutdown
|
||||
import atexit
|
||||
atexit.register(self.shutdown)
|
||||
|
||||
logger.info("Memory manager initialized")
|
||||
logger.info(f"Memory manager initialized with thresholds: "
|
||||
f"Process={self.memory_threshold_mb}MB, "
|
||||
f"GPU={self.gpu_memory_threshold_mb}MB")
|
||||
|
||||
def set_whisper_model(self, model):
|
||||
"""Register the Whisper model for management"""
|
||||
"""Set reference to Whisper model for memory management"""
|
||||
self.whisper_model = model
|
||||
logger.info("Whisper model registered with memory manager")
|
||||
|
||||
def _start_monitoring(self):
|
||||
"""Start background memory monitoring"""
|
||||
self._monitoring_thread = threading.Thread(
|
||||
target=self._monitor_memory,
|
||||
daemon=True
|
||||
)
|
||||
self._monitoring_thread.start()
|
||||
|
||||
def _monitor_memory(self):
|
||||
"""Background thread to monitor and manage memory"""
|
||||
logger.info("Memory monitoring thread started")
|
||||
def get_memory_stats(self) -> MemoryStats:
|
||||
"""Get current memory statistics"""
|
||||
process = psutil.Process()
|
||||
memory_info = process.memory_info()
|
||||
|
||||
while not self._shutdown:
|
||||
stats = MemoryStats(
|
||||
process_memory_mb=memory_info.rss / 1024 / 1024,
|
||||
available_memory_mb=psutil.virtual_memory().available / 1024 / 1024,
|
||||
memory_percent=process.memory_percent()
|
||||
)
|
||||
|
||||
# Check GPU memory if available
|
||||
if torch.cuda.is_available():
|
||||
try:
|
||||
# Collect memory statistics
|
||||
stats = self.get_memory_stats()
|
||||
|
||||
# Check if we need to free memory
|
||||
if self._should_cleanup(stats):
|
||||
logger.warning(f"Memory threshold exceeded - Process: {stats.process_memory_mb:.1f}MB, "
|
||||
f"GPU: {stats.gpu_memory_mb:.1f}MB")
|
||||
self.cleanup_memory(aggressive=True)
|
||||
|
||||
# Log stats periodically
|
||||
if int(time.time()) % 300 == 0: # Every 5 minutes
|
||||
logger.info(f"Memory stats - Process: {stats.process_memory_mb:.1f}MB, "
|
||||
f"System: {stats.system_memory_percent:.1f}%, "
|
||||
f"GPU: {stats.gpu_memory_mb:.1f}MB")
|
||||
|
||||
stats.gpu_memory_mb = torch.cuda.memory_allocated() / 1024 / 1024
|
||||
stats.gpu_memory_percent = (torch.cuda.memory_allocated() /
|
||||
torch.cuda.get_device_properties(0).total_memory * 100)
|
||||
except Exception as e:
|
||||
logger.error(f"Error in memory monitoring: {e}")
|
||||
|
||||
time.sleep(self.cleanup_interval)
|
||||
logger.error(f"Error getting GPU memory stats: {e}")
|
||||
|
||||
return stats
|
||||
|
||||
def _should_cleanup(self, stats: MemoryStats) -> bool:
|
||||
"""Determine if memory cleanup is needed"""
|
||||
def check_memory_pressure(self) -> bool:
|
||||
"""Check if system is under memory pressure"""
|
||||
stats = self.get_memory_stats()
|
||||
|
||||
# Check process memory
|
||||
if stats.process_memory_mb > self.memory_threshold_mb:
|
||||
logger.warning(f"High process memory usage: {stats.process_memory_mb:.1f}MB")
|
||||
return True
|
||||
|
||||
# Check system memory
|
||||
if stats.system_memory_percent > 85:
|
||||
if stats.memory_percent > 80:
|
||||
logger.warning(f"High system memory usage: {stats.memory_percent:.1f}%")
|
||||
return True
|
||||
|
||||
# Check GPU memory
|
||||
if stats.gpu_memory_mb > self.gpu_memory_threshold_mb:
|
||||
logger.warning(f"High GPU memory usage: {stats.gpu_memory_mb:.1f}MB")
|
||||
return True
|
||||
|
||||
return False
|
||||
|
||||
def get_memory_stats(self) -> MemoryStats:
|
||||
"""Get current memory statistics"""
|
||||
stats = MemoryStats()
|
||||
def cleanup_memory(self, aggressive: bool = False):
|
||||
"""Clean up memory"""
|
||||
logger.info("Starting memory cleanup...")
|
||||
|
||||
try:
|
||||
# Process memory
|
||||
process = psutil.Process()
|
||||
memory_info = process.memory_info()
|
||||
stats.process_memory_mb = memory_info.rss / 1024 / 1024
|
||||
|
||||
# System memory
|
||||
system_memory = psutil.virtual_memory()
|
||||
stats.system_memory_percent = system_memory.percent
|
||||
|
||||
# GPU memory if available
|
||||
if torch.cuda.is_available():
|
||||
stats.gpu_memory_mb = torch.cuda.memory_allocated() / 1024 / 1024
|
||||
stats.gpu_memory_percent = (torch.cuda.memory_allocated() /
|
||||
torch.cuda.get_device_properties(0).total_memory * 100)
|
||||
|
||||
# Temp files
|
||||
temp_dir = self.app.config.get('UPLOAD_FOLDER', tempfile.gettempdir())
|
||||
if os.path.exists(temp_dir):
|
||||
temp_files = list(os.listdir(temp_dir))
|
||||
stats.temp_files_count = len(temp_files)
|
||||
stats.temp_files_size_mb = sum(
|
||||
os.path.getsize(os.path.join(temp_dir, f))
|
||||
for f in temp_files if os.path.isfile(os.path.join(temp_dir, f))
|
||||
) / 1024 / 1024
|
||||
|
||||
# Session count
|
||||
if hasattr(self.app, 'session_manager'):
|
||||
stats.active_sessions = len(self.app.session_manager.sessions)
|
||||
|
||||
# GC stats
|
||||
gc_stats = gc.get_stats()
|
||||
for i, stat in enumerate(gc_stats):
|
||||
if isinstance(stat, dict):
|
||||
stats.gc_collections[i] = stat.get('collections', 0)
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error collecting memory stats: {e}")
|
||||
# Run garbage collection
|
||||
collected = gc.collect()
|
||||
logger.info(f"Garbage collector: collected {collected} objects")
|
||||
|
||||
return stats
|
||||
|
||||
def cleanup_memory(self, aggressive=False):
|
||||
"""Perform memory cleanup"""
|
||||
logger.info(f"Starting memory cleanup (aggressive={aggressive})")
|
||||
freed_mb = 0
|
||||
|
||||
try:
|
||||
# 1. Force garbage collection
|
||||
gc.collect()
|
||||
if aggressive:
|
||||
gc.collect(2) # Full collection
|
||||
|
||||
# 2. Clear GPU memory cache
|
||||
if torch.cuda.is_available():
|
||||
before_gpu = torch.cuda.memory_allocated() / 1024 / 1024
|
||||
torch.cuda.empty_cache()
|
||||
torch.cuda.synchronize()
|
||||
after_gpu = torch.cuda.memory_allocated() / 1024 / 1024
|
||||
freed_mb += (before_gpu - after_gpu)
|
||||
logger.info(f"Freed {before_gpu - after_gpu:.1f}MB GPU memory")
|
||||
|
||||
# 3. Clean old temporary files
|
||||
if hasattr(self.app, 'config'):
|
||||
temp_dir = self.app.config.get('UPLOAD_FOLDER')
|
||||
if temp_dir and os.path.exists(temp_dir):
|
||||
freed_mb += self._cleanup_temp_files(temp_dir, aggressive)
|
||||
|
||||
# 4. Trigger session cleanup
|
||||
if hasattr(self.app, 'session_manager'):
|
||||
self.app.session_manager.cleanup_expired_sessions()
|
||||
if aggressive:
|
||||
self.app.session_manager.cleanup_idle_sessions()
|
||||
|
||||
# 5. Run registered cleanup callbacks
|
||||
for callback in self._cleanup_callbacks:
|
||||
try:
|
||||
callback()
|
||||
except Exception as e:
|
||||
logger.error(f"Cleanup callback error: {e}")
|
||||
|
||||
# 6. Reload Whisper model if needed (aggressive mode only)
|
||||
if aggressive and self.whisper_model and torch.cuda.is_available():
|
||||
current_gpu_mb = torch.cuda.memory_allocated() / 1024 / 1024
|
||||
if current_gpu_mb > self.gpu_memory_threshold_mb * 0.8:
|
||||
self._reload_whisper_model()
|
||||
|
||||
logger.info(f"Memory cleanup completed - freed approximately {freed_mb:.1f}MB")
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error during memory cleanup: {e}")
|
||||
|
||||
def _cleanup_temp_files(self, temp_dir: str, aggressive: bool) -> float:
|
||||
"""Clean up temporary files"""
|
||||
freed_mb = 0
|
||||
current_time = time.time()
|
||||
max_age = 300 if not aggressive else 60 # 5 minutes or 1 minute
|
||||
|
||||
try:
|
||||
for filename in os.listdir(temp_dir):
|
||||
filepath = os.path.join(temp_dir, filename)
|
||||
if os.path.isfile(filepath):
|
||||
file_age = current_time - os.path.getmtime(filepath)
|
||||
if file_age > max_age:
|
||||
file_size = os.path.getsize(filepath) / 1024 / 1024
|
||||
try:
|
||||
os.remove(filepath)
|
||||
freed_mb += file_size
|
||||
logger.debug(f"Removed old temp file: {filename}")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to remove {filepath}: {e}")
|
||||
except Exception as e:
|
||||
logger.error(f"Error cleaning temp files: {e}")
|
||||
|
||||
return freed_mb
|
||||
|
||||
def _reload_whisper_model(self):
|
||||
"""Reload Whisper model to clear GPU memory fragmentation"""
|
||||
if not self.whisper_model:
|
||||
return
|
||||
|
||||
# Don't reload too frequently
|
||||
if time.time() - self.last_model_reload < 300: # 5 minutes
|
||||
return
|
||||
|
||||
try:
|
||||
logger.info("Reloading Whisper model to clear GPU memory")
|
||||
|
||||
# Get model info
|
||||
import whisper
|
||||
model_size = getattr(self.whisper_model, 'model_size', 'base')
|
||||
device = next(self.whisper_model.parameters()).device
|
||||
|
||||
# Clear the old model
|
||||
del self.whisper_model
|
||||
# Clear GPU cache if available
|
||||
if torch.cuda.is_available():
|
||||
torch.cuda.empty_cache()
|
||||
gc.collect()
|
||||
torch.cuda.synchronize()
|
||||
logger.info("Cleared GPU cache")
|
||||
|
||||
if aggressive:
|
||||
# Force garbage collection of all generations
|
||||
for i in range(3):
|
||||
gc.collect(i)
|
||||
|
||||
# Reload model
|
||||
self.whisper_model = whisper.load_model(model_size, device=device)
|
||||
self.model_reload_count += 1
|
||||
self.last_model_reload = time.time()
|
||||
|
||||
# Update app reference
|
||||
if hasattr(self.app, 'whisper_model'):
|
||||
self.app.whisper_model = self.whisper_model
|
||||
|
||||
logger.info(f"Whisper model reloaded successfully (reload #{self.model_reload_count})")
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to reload Whisper model: {e}")
|
||||
# Clear Whisper model cache if needed
|
||||
if self.whisper_model and hasattr(self.whisper_model, 'clear_cache'):
|
||||
self.whisper_model.clear_cache()
|
||||
logger.info("Cleared Whisper model cache")
|
||||
|
||||
def register_cleanup_callback(self, callback: Callable):
|
||||
"""Register a callback to be called during cleanup"""
|
||||
self._cleanup_callbacks.append(callback)
|
||||
def _cleanup_worker(self):
|
||||
"""Background cleanup worker"""
|
||||
while not self._stop_cleanup.wait(self.cleanup_interval):
|
||||
try:
|
||||
if self.check_memory_pressure():
|
||||
self.cleanup_memory(aggressive=True)
|
||||
else:
|
||||
# Light cleanup
|
||||
gc.collect(0)
|
||||
|
||||
if torch.cuda.is_available():
|
||||
torch.cuda.empty_cache()
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error in memory cleanup worker: {e}")
|
||||
|
||||
def register_resource(self, resource, name: str = None):
|
||||
"""Register a resource for tracking"""
|
||||
if name:
|
||||
self._resource_registry[name] = resource
|
||||
def _start_cleanup_thread(self):
|
||||
"""Start background cleanup thread"""
|
||||
if self._cleanup_thread and self._cleanup_thread.is_alive():
|
||||
return
|
||||
|
||||
self._stop_cleanup.clear()
|
||||
self._cleanup_thread = threading.Thread(target=self._cleanup_worker, daemon=True)
|
||||
self._cleanup_thread.start()
|
||||
logger.info("Started memory cleanup thread")
|
||||
|
||||
def release_resource(self, name: str):
|
||||
"""Release a tracked resource"""
|
||||
if name in self._resource_registry:
|
||||
del self._resource_registry[name]
|
||||
def stop(self):
|
||||
"""Stop memory manager"""
|
||||
self._stop_cleanup.set()
|
||||
if self._cleanup_thread:
|
||||
self._cleanup_thread.join(timeout=5)
|
||||
|
||||
def get_metrics(self) -> Dict:
|
||||
"""Get memory management metrics"""
|
||||
def get_metrics(self) -> Dict[str, Any]:
|
||||
"""Get memory metrics for monitoring"""
|
||||
stats = self.get_memory_stats()
|
||||
|
||||
return {
|
||||
'memory': {
|
||||
'process_mb': round(stats.process_memory_mb, 1),
|
||||
'system_percent': round(stats.system_memory_percent, 1),
|
||||
'gpu_mb': round(stats.gpu_memory_mb, 1),
|
||||
'gpu_percent': round(stats.gpu_memory_percent, 1)
|
||||
},
|
||||
'temp_files': {
|
||||
'count': stats.temp_files_count,
|
||||
'size_mb': round(stats.temp_files_size_mb, 1)
|
||||
},
|
||||
'sessions': {
|
||||
'active': stats.active_sessions
|
||||
},
|
||||
'model': {
|
||||
'reload_count': self.model_reload_count,
|
||||
'last_reload': datetime.fromtimestamp(self.last_model_reload).isoformat()
|
||||
},
|
||||
'process_memory_mb': round(stats.process_memory_mb, 2),
|
||||
'available_memory_mb': round(stats.available_memory_mb, 2),
|
||||
'memory_percent': round(stats.memory_percent, 2),
|
||||
'gpu_memory_mb': round(stats.gpu_memory_mb, 2),
|
||||
'gpu_memory_percent': round(stats.gpu_memory_percent, 2),
|
||||
'thresholds': {
|
||||
'memory_mb': self.memory_threshold_mb,
|
||||
'process_mb': self.memory_threshold_mb,
|
||||
'gpu_mb': self.gpu_memory_threshold_mb
|
||||
}
|
||||
},
|
||||
'under_pressure': self.check_memory_pressure()
|
||||
}
|
||||
|
||||
def shutdown(self):
|
||||
"""Shutdown memory manager"""
|
||||
logger.info("Shutting down memory manager")
|
||||
self._shutdown = True
|
||||
|
||||
# Final cleanup
|
||||
self.cleanup_memory(aggressive=True)
|
||||
|
||||
# Wait for monitoring thread
|
||||
if self._monitoring_thread:
|
||||
self._monitoring_thread.join(timeout=5)
|
||||
|
||||
# Context manager for audio processing
|
||||
class AudioProcessingContext:
|
||||
"""Context manager to ensure audio resources are cleaned up"""
|
||||
def __init__(self, memory_manager: MemoryManager, name: str = None):
|
||||
"""Context manager for audio processing with memory management"""
|
||||
|
||||
def __init__(self, memory_manager: MemoryManager, name: str = "audio_processing"):
|
||||
self.memory_manager = memory_manager
|
||||
self.name = name or f"audio_{int(time.time() * 1000)}"
|
||||
self.name = name
|
||||
self.temp_files = []
|
||||
self.start_time = None
|
||||
self.start_memory = None
|
||||
|
||||
def __enter__(self):
|
||||
self.start_time = time.time()
|
||||
if torch.cuda.is_available():
|
||||
self.start_memory = torch.cuda.memory_allocated()
|
||||
|
||||
# Check memory before processing
|
||||
if self.memory_manager and self.memory_manager.check_memory_pressure():
|
||||
logger.warning(f"Memory pressure detected before {self.name}")
|
||||
self.memory_manager.cleanup_memory()
|
||||
|
||||
return self
|
||||
|
||||
def __exit__(self, exc_type, exc_val, exc_tb):
|
||||
# Clean up temp files
|
||||
for filepath in self.temp_files:
|
||||
# Clean up temporary files
|
||||
for temp_file in self.temp_files:
|
||||
try:
|
||||
if os.path.exists(filepath):
|
||||
os.remove(filepath)
|
||||
if os.path.exists(temp_file):
|
||||
os.remove(temp_file)
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to remove temp file {filepath}: {e}")
|
||||
logger.error(f"Failed to remove temp file {temp_file}: {e}")
|
||||
|
||||
# Clear GPU cache if used
|
||||
if torch.cuda.is_available():
|
||||
torch.cuda.empty_cache()
|
||||
|
||||
# Log memory usage
|
||||
if self.start_memory is not None:
|
||||
memory_used = torch.cuda.memory_allocated() - self.start_memory
|
||||
duration = time.time() - self.start_time
|
||||
logger.debug(f"Audio processing '{self.name}' - Duration: {duration:.2f}s, "
|
||||
f"GPU memory: {memory_used / 1024 / 1024:.1f}MB")
|
||||
|
||||
# Force garbage collection if there was an error
|
||||
if exc_type is not None:
|
||||
# Clean up memory after processing
|
||||
if self.memory_manager:
|
||||
gc.collect()
|
||||
if torch.cuda.is_available():
|
||||
torch.cuda.empty_cache()
|
||||
|
||||
duration = time.time() - self.start_time
|
||||
logger.info(f"{self.name} completed in {duration:.2f}s")
|
||||
|
||||
def add_temp_file(self, filepath: str):
|
||||
"""Register a temporary file for cleanup"""
|
||||
"""Add a temporary file to be cleaned up"""
|
||||
self.temp_files.append(filepath)
|
||||
|
||||
# Utility functions
|
||||
def with_memory_management(func):
|
||||
"""Decorator to add memory management to functions"""
|
||||
@wraps(func)
|
||||
def wrapper(*args, **kwargs):
|
||||
# Get memory manager from app context
|
||||
from flask import current_app
|
||||
memory_manager = getattr(current_app, 'memory_manager', None)
|
||||
|
||||
if memory_manager:
|
||||
with AudioProcessingContext(memory_manager, name=func.__name__):
|
||||
return func(*args, **kwargs)
|
||||
else:
|
||||
return func(*args, **kwargs)
|
||||
# Check memory before
|
||||
if memory_manager.check_memory_pressure():
|
||||
logger.warning(f"Memory pressure before {func.__name__}")
|
||||
memory_manager.cleanup_memory()
|
||||
|
||||
try:
|
||||
result = func(*args, **kwargs)
|
||||
return result
|
||||
finally:
|
||||
# Light cleanup after
|
||||
gc.collect(0)
|
||||
if torch.cuda.is_available():
|
||||
torch.cuda.empty_cache()
|
||||
|
||||
return wrapper
|
||||
|
||||
def init_memory_management(app, **kwargs):
|
||||
"""Initialize memory management for the application"""
|
||||
config = {
|
||||
'memory_threshold_mb': kwargs.get('memory_threshold_mb', 4096),
|
||||
'gpu_memory_threshold_mb': kwargs.get('gpu_memory_threshold_mb', 2048),
|
||||
'cleanup_interval': kwargs.get('cleanup_interval', 30)
|
||||
}
|
||||
|
||||
memory_manager = MemoryManager(app, config)
|
||||
return memory_manager
|
||||
return wrapper
|
135
migrations.py
Normal file
135
migrations.py
Normal file
@ -0,0 +1,135 @@
|
||||
# Database migration scripts
|
||||
import os
|
||||
import sys
|
||||
import logging
|
||||
from flask import Flask
|
||||
from flask_migrate import Migrate, init, migrate, upgrade
|
||||
from database import db, init_db
|
||||
from config import Config
|
||||
|
||||
logging.basicConfig(level=logging.INFO)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
def create_app():
|
||||
"""Create Flask app for migrations"""
|
||||
app = Flask(__name__)
|
||||
|
||||
# Load configuration
|
||||
config = Config()
|
||||
app.config.from_object(config)
|
||||
|
||||
# Initialize database
|
||||
init_db(app)
|
||||
|
||||
return app
|
||||
|
||||
def init_migrations():
|
||||
"""Initialize migration repository"""
|
||||
app = create_app()
|
||||
|
||||
with app.app_context():
|
||||
# Initialize Flask-Migrate
|
||||
migrate_instance = Migrate(app, db)
|
||||
|
||||
# Initialize migration repository
|
||||
try:
|
||||
init()
|
||||
logger.info("Migration repository initialized")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to initialize migrations: {e}")
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
def create_migration(message="Auto migration"):
|
||||
"""Create a new migration"""
|
||||
app = create_app()
|
||||
|
||||
with app.app_context():
|
||||
# Initialize Flask-Migrate
|
||||
migrate_instance = Migrate(app, db)
|
||||
|
||||
try:
|
||||
migrate(message=message)
|
||||
logger.info(f"Migration created: {message}")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to create migration: {e}")
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
def run_migrations():
|
||||
"""Run pending migrations"""
|
||||
app = create_app()
|
||||
|
||||
with app.app_context():
|
||||
# Initialize Flask-Migrate
|
||||
migrate_instance = Migrate(app, db)
|
||||
|
||||
try:
|
||||
upgrade()
|
||||
logger.info("Migrations completed successfully")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to run migrations: {e}")
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
def create_initial_data():
|
||||
"""Create initial data if needed"""
|
||||
app = create_app()
|
||||
|
||||
with app.app_context():
|
||||
try:
|
||||
# Add any initial data here
|
||||
# For example, creating default API keys, admin users, etc.
|
||||
|
||||
db.session.commit()
|
||||
logger.info("Initial data created")
|
||||
except Exception as e:
|
||||
db.session.rollback()
|
||||
logger.error(f"Failed to create initial data: {e}")
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
if __name__ == "__main__":
|
||||
if len(sys.argv) < 2:
|
||||
print("Usage: python migrations.py [init|create|run|seed]")
|
||||
sys.exit(1)
|
||||
|
||||
command = sys.argv[1]
|
||||
|
||||
if command == "init":
|
||||
if init_migrations():
|
||||
print("Migration repository initialized successfully")
|
||||
else:
|
||||
print("Failed to initialize migrations")
|
||||
sys.exit(1)
|
||||
|
||||
elif command == "create":
|
||||
message = sys.argv[2] if len(sys.argv) > 2 else "Auto migration"
|
||||
if create_migration(message):
|
||||
print(f"Migration created: {message}")
|
||||
else:
|
||||
print("Failed to create migration")
|
||||
sys.exit(1)
|
||||
|
||||
elif command == "run":
|
||||
if run_migrations():
|
||||
print("Migrations completed successfully")
|
||||
else:
|
||||
print("Failed to run migrations")
|
||||
sys.exit(1)
|
||||
|
||||
elif command == "seed":
|
||||
if create_initial_data():
|
||||
print("Initial data created successfully")
|
||||
else:
|
||||
print("Failed to create initial data")
|
||||
sys.exit(1)
|
||||
|
||||
else:
|
||||
print(f"Unknown command: {command}")
|
||||
print("Available commands: init, create, run, seed")
|
||||
sys.exit(1)
|
216
migrations/add_user_authentication.py
Normal file
216
migrations/add_user_authentication.py
Normal file
@ -0,0 +1,216 @@
|
||||
"""Add user authentication tables and update existing models
|
||||
|
||||
This migration:
|
||||
1. Creates user authentication tables (users, login_history, user_sessions, revoked_tokens)
|
||||
2. Updates translation and transcription tables to link to users
|
||||
3. Adds proper foreign key constraints and indexes
|
||||
"""
|
||||
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
from sqlalchemy.dialects import postgresql
|
||||
import uuid
|
||||
|
||||
# revision identifiers
|
||||
revision = 'add_user_authentication'
|
||||
down_revision = None
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
|
||||
def upgrade():
|
||||
# Create users table
|
||||
op.create_table('users',
|
||||
sa.Column('id', postgresql.UUID(as_uuid=True), nullable=False, default=uuid.uuid4),
|
||||
sa.Column('email', sa.String(255), nullable=False),
|
||||
sa.Column('username', sa.String(100), nullable=False),
|
||||
sa.Column('password_hash', sa.String(255), nullable=False),
|
||||
sa.Column('full_name', sa.String(255), nullable=True),
|
||||
sa.Column('avatar_url', sa.String(500), nullable=True),
|
||||
sa.Column('api_key', sa.String(64), nullable=False),
|
||||
sa.Column('api_key_created_at', sa.DateTime(), nullable=False),
|
||||
sa.Column('is_active', sa.Boolean(), nullable=False, default=True),
|
||||
sa.Column('is_verified', sa.Boolean(), nullable=False, default=False),
|
||||
sa.Column('is_suspended', sa.Boolean(), nullable=False, default=False),
|
||||
sa.Column('suspension_reason', sa.Text(), nullable=True),
|
||||
sa.Column('suspended_at', sa.DateTime(), nullable=True),
|
||||
sa.Column('suspended_until', sa.DateTime(), nullable=True),
|
||||
sa.Column('role', sa.String(20), nullable=False, default='user'),
|
||||
sa.Column('permissions', postgresql.JSONB(astext_type=sa.Text()), nullable=False, default=[]),
|
||||
sa.Column('rate_limit_per_minute', sa.Integer(), nullable=False, default=30),
|
||||
sa.Column('rate_limit_per_hour', sa.Integer(), nullable=False, default=500),
|
||||
sa.Column('rate_limit_per_day', sa.Integer(), nullable=False, default=5000),
|
||||
sa.Column('total_requests', sa.Integer(), nullable=False, default=0),
|
||||
sa.Column('total_translations', sa.Integer(), nullable=False, default=0),
|
||||
sa.Column('total_transcriptions', sa.Integer(), nullable=False, default=0),
|
||||
sa.Column('total_tts_requests', sa.Integer(), nullable=False, default=0),
|
||||
sa.Column('created_at', sa.DateTime(), nullable=False),
|
||||
sa.Column('updated_at', sa.DateTime(), nullable=False),
|
||||
sa.Column('last_login_at', sa.DateTime(), nullable=True),
|
||||
sa.Column('last_active_at', sa.DateTime(), nullable=True),
|
||||
sa.Column('password_changed_at', sa.DateTime(), nullable=False),
|
||||
sa.Column('failed_login_attempts', sa.Integer(), nullable=False, default=0),
|
||||
sa.Column('locked_until', sa.DateTime(), nullable=True),
|
||||
sa.Column('settings', postgresql.JSONB(astext_type=sa.Text()), nullable=False, default={}),
|
||||
sa.PrimaryKeyConstraint('id'),
|
||||
sa.UniqueConstraint('email'),
|
||||
sa.UniqueConstraint('username'),
|
||||
sa.UniqueConstraint('api_key')
|
||||
)
|
||||
|
||||
# Create indexes on users table
|
||||
op.create_index('idx_users_email', 'users', ['email'])
|
||||
op.create_index('idx_users_username', 'users', ['username'])
|
||||
op.create_index('idx_users_api_key', 'users', ['api_key'])
|
||||
op.create_index('idx_users_email_active', 'users', ['email', 'is_active'])
|
||||
op.create_index('idx_users_role_active', 'users', ['role', 'is_active'])
|
||||
op.create_index('idx_users_created_at', 'users', ['created_at'])
|
||||
|
||||
# Create login_history table
|
||||
op.create_table('login_history',
|
||||
sa.Column('id', postgresql.UUID(as_uuid=True), nullable=False, default=uuid.uuid4),
|
||||
sa.Column('user_id', postgresql.UUID(as_uuid=True), nullable=False),
|
||||
sa.Column('login_at', sa.DateTime(), nullable=False),
|
||||
sa.Column('logout_at', sa.DateTime(), nullable=True),
|
||||
sa.Column('login_method', sa.String(20), nullable=False),
|
||||
sa.Column('success', sa.Boolean(), nullable=False),
|
||||
sa.Column('failure_reason', sa.String(255), nullable=True),
|
||||
sa.Column('session_id', sa.String(255), nullable=True),
|
||||
sa.Column('jwt_jti', sa.String(255), nullable=True),
|
||||
sa.Column('ip_address', sa.String(45), nullable=False),
|
||||
sa.Column('user_agent', sa.String(500), nullable=True),
|
||||
sa.Column('device_info', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
|
||||
sa.Column('country', sa.String(2), nullable=True),
|
||||
sa.Column('city', sa.String(100), nullable=True),
|
||||
sa.Column('is_suspicious', sa.Boolean(), nullable=False, default=False),
|
||||
sa.Column('security_notes', sa.Text(), nullable=True),
|
||||
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ondelete='CASCADE'),
|
||||
sa.PrimaryKeyConstraint('id')
|
||||
)
|
||||
|
||||
# Create indexes on login_history
|
||||
op.create_index('idx_login_history_user_id', 'login_history', ['user_id'])
|
||||
op.create_index('idx_login_history_user_time', 'login_history', ['user_id', 'login_at'])
|
||||
op.create_index('idx_login_history_session', 'login_history', ['session_id'])
|
||||
op.create_index('idx_login_history_jwt_jti', 'login_history', ['jwt_jti'])
|
||||
op.create_index('idx_login_history_ip', 'login_history', ['ip_address'])
|
||||
|
||||
# Create user_sessions table
|
||||
op.create_table('user_sessions',
|
||||
sa.Column('id', postgresql.UUID(as_uuid=True), nullable=False, default=uuid.uuid4),
|
||||
sa.Column('session_id', sa.String(255), nullable=False),
|
||||
sa.Column('user_id', postgresql.UUID(as_uuid=True), nullable=False),
|
||||
sa.Column('access_token_jti', sa.String(255), nullable=True),
|
||||
sa.Column('refresh_token_jti', sa.String(255), nullable=True),
|
||||
sa.Column('created_at', sa.DateTime(), nullable=False),
|
||||
sa.Column('last_active_at', sa.DateTime(), nullable=False),
|
||||
sa.Column('expires_at', sa.DateTime(), nullable=False),
|
||||
sa.Column('ip_address', sa.String(45), nullable=False),
|
||||
sa.Column('user_agent', sa.String(500), nullable=True),
|
||||
sa.Column('data', postgresql.JSONB(astext_type=sa.Text()), nullable=False, default={}),
|
||||
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ondelete='CASCADE'),
|
||||
sa.PrimaryKeyConstraint('id'),
|
||||
sa.UniqueConstraint('session_id')
|
||||
)
|
||||
|
||||
# Create indexes on user_sessions
|
||||
op.create_index('idx_user_sessions_session_id', 'user_sessions', ['session_id'])
|
||||
op.create_index('idx_user_sessions_user_id', 'user_sessions', ['user_id'])
|
||||
op.create_index('idx_user_sessions_user_active', 'user_sessions', ['user_id', 'expires_at'])
|
||||
op.create_index('idx_user_sessions_token', 'user_sessions', ['access_token_jti'])
|
||||
|
||||
# Create revoked_tokens table
|
||||
op.create_table('revoked_tokens',
|
||||
sa.Column('id', postgresql.UUID(as_uuid=True), nullable=False, default=uuid.uuid4),
|
||||
sa.Column('jti', sa.String(255), nullable=False),
|
||||
sa.Column('token_type', sa.String(20), nullable=False),
|
||||
sa.Column('user_id', postgresql.UUID(as_uuid=True), nullable=True),
|
||||
sa.Column('revoked_at', sa.DateTime(), nullable=False),
|
||||
sa.Column('expires_at', sa.DateTime(), nullable=False),
|
||||
sa.Column('reason', sa.String(255), nullable=True),
|
||||
sa.PrimaryKeyConstraint('id'),
|
||||
sa.UniqueConstraint('jti')
|
||||
)
|
||||
|
||||
# Create indexes on revoked_tokens
|
||||
op.create_index('idx_revoked_tokens_jti', 'revoked_tokens', ['jti'])
|
||||
op.create_index('idx_revoked_tokens_user_id', 'revoked_tokens', ['user_id'])
|
||||
op.create_index('idx_revoked_tokens_expires', 'revoked_tokens', ['expires_at'])
|
||||
|
||||
# Update translations table to add user_id with proper foreign key
|
||||
# First, check if user_id column exists
|
||||
try:
|
||||
op.add_column('translations', sa.Column('user_id', postgresql.UUID(as_uuid=True), nullable=True))
|
||||
op.create_foreign_key('fk_translations_user_id', 'translations', 'users', ['user_id'], ['id'], ondelete='SET NULL')
|
||||
op.create_index('idx_translations_user_id', 'translations', ['user_id'])
|
||||
except:
|
||||
pass # Column might already exist
|
||||
|
||||
# Update transcriptions table to add user_id with proper foreign key
|
||||
try:
|
||||
op.add_column('transcriptions', sa.Column('user_id', postgresql.UUID(as_uuid=True), nullable=True))
|
||||
op.create_foreign_key('fk_transcriptions_user_id', 'transcriptions', 'users', ['user_id'], ['id'], ondelete='SET NULL')
|
||||
op.create_index('idx_transcriptions_user_id', 'transcriptions', ['user_id'])
|
||||
except:
|
||||
pass # Column might already exist
|
||||
|
||||
# Update user_preferences table to add proper foreign key if not exists
|
||||
try:
|
||||
op.create_foreign_key('fk_user_preferences_user_id', 'user_preferences', 'users', ['user_id'], ['id'], ondelete='CASCADE')
|
||||
except:
|
||||
pass # Foreign key might already exist
|
||||
|
||||
# Update api_keys table to add proper foreign key if not exists
|
||||
try:
|
||||
op.add_column('api_keys', sa.Column('user_id_new', postgresql.UUID(as_uuid=True), nullable=True))
|
||||
op.create_foreign_key('fk_api_keys_user_id', 'api_keys', 'users', ['user_id_new'], ['id'], ondelete='CASCADE')
|
||||
except:
|
||||
pass # Column/FK might already exist
|
||||
|
||||
# Create function for updating updated_at timestamp
|
||||
op.execute("""
|
||||
CREATE OR REPLACE FUNCTION update_updated_at_column()
|
||||
RETURNS TRIGGER AS $$
|
||||
BEGIN
|
||||
NEW.updated_at = NOW();
|
||||
RETURN NEW;
|
||||
END;
|
||||
$$ language 'plpgsql';
|
||||
""")
|
||||
|
||||
# Create trigger for users table
|
||||
op.execute("""
|
||||
CREATE TRIGGER update_users_updated_at
|
||||
BEFORE UPDATE ON users
|
||||
FOR EACH ROW
|
||||
EXECUTE FUNCTION update_updated_at_column();
|
||||
""")
|
||||
|
||||
|
||||
def downgrade():
|
||||
# Drop triggers
|
||||
op.execute("DROP TRIGGER IF EXISTS update_users_updated_at ON users")
|
||||
op.execute("DROP FUNCTION IF EXISTS update_updated_at_column()")
|
||||
|
||||
# Drop foreign keys
|
||||
try:
|
||||
op.drop_constraint('fk_translations_user_id', 'translations', type_='foreignkey')
|
||||
op.drop_constraint('fk_transcriptions_user_id', 'transcriptions', type_='foreignkey')
|
||||
op.drop_constraint('fk_user_preferences_user_id', 'user_preferences', type_='foreignkey')
|
||||
op.drop_constraint('fk_api_keys_user_id', 'api_keys', type_='foreignkey')
|
||||
except:
|
||||
pass
|
||||
|
||||
# Drop columns
|
||||
try:
|
||||
op.drop_column('translations', 'user_id')
|
||||
op.drop_column('transcriptions', 'user_id')
|
||||
op.drop_column('api_keys', 'user_id_new')
|
||||
except:
|
||||
pass
|
||||
|
||||
# Drop tables
|
||||
op.drop_table('revoked_tokens')
|
||||
op.drop_table('user_sessions')
|
||||
op.drop_table('login_history')
|
||||
op.drop_table('users')
|
135
migrations/create_analytics_tables.sql
Normal file
135
migrations/create_analytics_tables.sql
Normal file
@ -0,0 +1,135 @@
|
||||
-- Create analytics tables for Talk2Me admin dashboard
|
||||
|
||||
-- Error logs table
|
||||
CREATE TABLE IF NOT EXISTS error_logs (
|
||||
id SERIAL PRIMARY KEY,
|
||||
error_type VARCHAR(100) NOT NULL,
|
||||
error_message TEXT,
|
||||
endpoint VARCHAR(255),
|
||||
method VARCHAR(10),
|
||||
status_code INTEGER,
|
||||
ip_address INET,
|
||||
user_agent TEXT,
|
||||
request_id VARCHAR(100),
|
||||
stack_trace TEXT,
|
||||
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
|
||||
);
|
||||
|
||||
-- Create indexes for error_logs
|
||||
CREATE INDEX IF NOT EXISTS idx_error_logs_created_at ON error_logs(created_at DESC);
|
||||
CREATE INDEX IF NOT EXISTS idx_error_logs_error_type ON error_logs(error_type);
|
||||
CREATE INDEX IF NOT EXISTS idx_error_logs_endpoint ON error_logs(endpoint);
|
||||
|
||||
-- Request logs table for detailed analytics
|
||||
CREATE TABLE IF NOT EXISTS request_logs (
|
||||
id SERIAL PRIMARY KEY,
|
||||
endpoint VARCHAR(255) NOT NULL,
|
||||
method VARCHAR(10) NOT NULL,
|
||||
status_code INTEGER,
|
||||
response_time_ms INTEGER,
|
||||
ip_address INET,
|
||||
user_agent TEXT,
|
||||
request_size_bytes INTEGER,
|
||||
response_size_bytes INTEGER,
|
||||
session_id VARCHAR(100),
|
||||
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
|
||||
);
|
||||
|
||||
-- Create indexes for request_logs
|
||||
CREATE INDEX IF NOT EXISTS idx_request_logs_created_at ON request_logs(created_at DESC);
|
||||
CREATE INDEX IF NOT EXISTS idx_request_logs_endpoint ON request_logs(endpoint);
|
||||
CREATE INDEX IF NOT EXISTS idx_request_logs_session_id ON request_logs(session_id);
|
||||
CREATE INDEX IF NOT EXISTS idx_request_logs_response_time ON request_logs(response_time_ms);
|
||||
|
||||
-- Translation logs table
|
||||
CREATE TABLE IF NOT EXISTS translation_logs (
|
||||
id SERIAL PRIMARY KEY,
|
||||
source_language VARCHAR(10),
|
||||
target_language VARCHAR(10),
|
||||
text_length INTEGER,
|
||||
response_time_ms INTEGER,
|
||||
success BOOLEAN DEFAULT TRUE,
|
||||
error_message TEXT,
|
||||
session_id VARCHAR(100),
|
||||
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
|
||||
);
|
||||
|
||||
-- Create indexes for translation_logs
|
||||
CREATE INDEX IF NOT EXISTS idx_translation_logs_created_at ON translation_logs(created_at DESC);
|
||||
CREATE INDEX IF NOT EXISTS idx_translation_logs_languages ON translation_logs(source_language, target_language);
|
||||
|
||||
-- Transcription logs table
|
||||
CREATE TABLE IF NOT EXISTS transcription_logs (
|
||||
id SERIAL PRIMARY KEY,
|
||||
detected_language VARCHAR(10),
|
||||
audio_duration_seconds FLOAT,
|
||||
file_size_bytes INTEGER,
|
||||
response_time_ms INTEGER,
|
||||
success BOOLEAN DEFAULT TRUE,
|
||||
error_message TEXT,
|
||||
session_id VARCHAR(100),
|
||||
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
|
||||
);
|
||||
|
||||
-- Create indexes for transcription_logs
|
||||
CREATE INDEX IF NOT EXISTS idx_transcription_logs_created_at ON transcription_logs(created_at DESC);
|
||||
CREATE INDEX IF NOT EXISTS idx_transcription_logs_language ON transcription_logs(detected_language);
|
||||
|
||||
-- TTS logs table
|
||||
CREATE TABLE IF NOT EXISTS tts_logs (
|
||||
id SERIAL PRIMARY KEY,
|
||||
language VARCHAR(10),
|
||||
text_length INTEGER,
|
||||
voice VARCHAR(50),
|
||||
response_time_ms INTEGER,
|
||||
success BOOLEAN DEFAULT TRUE,
|
||||
error_message TEXT,
|
||||
session_id VARCHAR(100),
|
||||
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
|
||||
);
|
||||
|
||||
-- Create indexes for tts_logs
|
||||
CREATE INDEX IF NOT EXISTS idx_tts_logs_created_at ON tts_logs(created_at DESC);
|
||||
CREATE INDEX IF NOT EXISTS idx_tts_logs_language ON tts_logs(language);
|
||||
|
||||
-- Daily aggregated stats table for faster queries
|
||||
CREATE TABLE IF NOT EXISTS daily_stats (
|
||||
date DATE PRIMARY KEY,
|
||||
total_requests INTEGER DEFAULT 0,
|
||||
total_translations INTEGER DEFAULT 0,
|
||||
total_transcriptions INTEGER DEFAULT 0,
|
||||
total_tts INTEGER DEFAULT 0,
|
||||
total_errors INTEGER DEFAULT 0,
|
||||
unique_sessions INTEGER DEFAULT 0,
|
||||
avg_response_time_ms FLOAT,
|
||||
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
|
||||
);
|
||||
|
||||
-- Create function to update updated_at timestamp
|
||||
CREATE OR REPLACE FUNCTION update_updated_at_column()
|
||||
RETURNS TRIGGER AS $$
|
||||
BEGIN
|
||||
NEW.updated_at = CURRENT_TIMESTAMP;
|
||||
RETURN NEW;
|
||||
END;
|
||||
$$ language 'plpgsql';
|
||||
|
||||
-- Create trigger for daily_stats
|
||||
DROP TRIGGER IF EXISTS update_daily_stats_updated_at ON daily_stats;
|
||||
CREATE TRIGGER update_daily_stats_updated_at
|
||||
BEFORE UPDATE ON daily_stats
|
||||
FOR EACH ROW
|
||||
EXECUTE FUNCTION update_updated_at_column();
|
||||
|
||||
-- Create view for language pair statistics
|
||||
CREATE OR REPLACE VIEW language_pair_stats AS
|
||||
SELECT
|
||||
source_language || ' -> ' || target_language as language_pair,
|
||||
COUNT(*) as usage_count,
|
||||
AVG(response_time_ms) as avg_response_time,
|
||||
MAX(created_at) as last_used
|
||||
FROM translation_logs
|
||||
WHERE success = TRUE
|
||||
GROUP BY source_language, target_language
|
||||
ORDER BY usage_count DESC;
|
@ -17,7 +17,7 @@ class RateLimiter:
|
||||
"""
|
||||
def __init__(self):
|
||||
self.buckets = defaultdict(lambda: {
|
||||
'tokens': 0,
|
||||
'tokens': 5, # Start with some tokens to avoid immediate burst errors
|
||||
'last_update': time.time(),
|
||||
'requests': deque(maxlen=1000) # Track last 1000 requests
|
||||
})
|
||||
@ -145,8 +145,25 @@ class RateLimiter:
|
||||
|
||||
return True, None
|
||||
|
||||
def is_exempt_path(self, path):
|
||||
"""Check if path is exempt from rate limiting"""
|
||||
# Handle both path strings and endpoint names
|
||||
if path.startswith('admin.'):
|
||||
return True
|
||||
|
||||
exempt_paths = ['/admin', '/health', '/static']
|
||||
return any(path.startswith(p) for p in exempt_paths)
|
||||
|
||||
def check_rate_limit(self, client_id, endpoint, request_size=0):
|
||||
"""Check if request should be allowed"""
|
||||
# Log what endpoint we're checking
|
||||
logger.debug(f"Checking rate limit for endpoint: {endpoint}")
|
||||
|
||||
# Skip rate limiting for exempt paths before any processing
|
||||
if self.is_exempt_path(endpoint):
|
||||
logger.debug(f"Endpoint {endpoint} is exempt from rate limiting")
|
||||
return True, None, None
|
||||
|
||||
with self.lock:
|
||||
# Check global limits first
|
||||
global_ok, global_msg = self.check_global_limits()
|
||||
@ -295,6 +312,14 @@ def rate_limit(endpoint=None,
|
||||
def decorator(f):
|
||||
@wraps(f)
|
||||
def decorated_function(*args, **kwargs):
|
||||
# Skip rate limiting for admin routes - check both path and endpoint
|
||||
if request.path.startswith('/admin'):
|
||||
return f(*args, **kwargs)
|
||||
|
||||
# Also check endpoint name
|
||||
if request.endpoint and request.endpoint.startswith('admin.'):
|
||||
return f(*args, **kwargs)
|
||||
|
||||
# Get client ID
|
||||
client_id = rate_limiter.get_client_id(request)
|
||||
ip = request.remote_addr
|
||||
@ -403,6 +428,10 @@ ip_filter = IPFilter()
|
||||
|
||||
def ip_filter_check():
|
||||
"""Middleware to check IP filtering"""
|
||||
# Skip IP filtering for admin routes
|
||||
if request.path.startswith('/admin'):
|
||||
return None
|
||||
|
||||
ip = request.remote_addr
|
||||
if not ip_filter.is_allowed(ip):
|
||||
return jsonify({'error': 'Access denied'}), 403
|
446
redis_manager.py
Normal file
446
redis_manager.py
Normal file
@ -0,0 +1,446 @@
|
||||
# Redis connection and caching management
|
||||
import redis
|
||||
import json
|
||||
import pickle
|
||||
import logging
|
||||
from typing import Optional, Any, Dict, List, Union
|
||||
from datetime import timedelta
|
||||
from functools import wraps
|
||||
import hashlib
|
||||
import time
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
class RedisManager:
|
||||
"""Manage Redis connections and operations"""
|
||||
|
||||
def __init__(self, app=None, config=None):
|
||||
self.redis_client = None
|
||||
self.config = config or {}
|
||||
self.key_prefix = self.config.get('key_prefix', 'talk2me:')
|
||||
|
||||
if app:
|
||||
self.init_app(app)
|
||||
|
||||
def init_app(self, app):
|
||||
"""Initialize Redis with Flask app"""
|
||||
# Get Redis configuration
|
||||
redis_url = app.config.get('REDIS_URL', 'redis://localhost:6379/0')
|
||||
|
||||
# Parse connection options
|
||||
decode_responses = app.config.get('REDIS_DECODE_RESPONSES', False)
|
||||
max_connections = app.config.get('REDIS_MAX_CONNECTIONS', 50)
|
||||
socket_timeout = app.config.get('REDIS_SOCKET_TIMEOUT', 5)
|
||||
|
||||
# Create connection pool
|
||||
pool = redis.ConnectionPool.from_url(
|
||||
redis_url,
|
||||
max_connections=max_connections,
|
||||
socket_timeout=socket_timeout,
|
||||
decode_responses=decode_responses
|
||||
)
|
||||
|
||||
self.redis_client = redis.Redis(connection_pool=pool)
|
||||
|
||||
# Test connection
|
||||
try:
|
||||
self.redis_client.ping()
|
||||
logger.info(f"Redis connected successfully to {redis_url}")
|
||||
except redis.ConnectionError as e:
|
||||
logger.error(f"Failed to connect to Redis: {e}")
|
||||
raise
|
||||
|
||||
# Store reference in app
|
||||
app.redis = self
|
||||
|
||||
def _make_key(self, key: str) -> str:
|
||||
"""Create a prefixed key"""
|
||||
return f"{self.key_prefix}{key}"
|
||||
|
||||
# Basic operations
|
||||
def get(self, key: str, default=None) -> Any:
|
||||
"""Get value from Redis"""
|
||||
try:
|
||||
value = self.redis_client.get(self._make_key(key))
|
||||
if value is None:
|
||||
return default
|
||||
|
||||
# Try to deserialize JSON first
|
||||
try:
|
||||
return json.loads(value)
|
||||
except (json.JSONDecodeError, TypeError):
|
||||
# Try pickle for complex objects
|
||||
try:
|
||||
return pickle.loads(value)
|
||||
except:
|
||||
# Return as string
|
||||
return value.decode('utf-8') if isinstance(value, bytes) else value
|
||||
except Exception as e:
|
||||
logger.error(f"Redis get error for key {key}: {e}")
|
||||
return default
|
||||
|
||||
def set(self, key: str, value: Any, expire: Optional[int] = None) -> bool:
|
||||
"""Set value in Redis with optional expiration"""
|
||||
try:
|
||||
# Serialize value
|
||||
if isinstance(value, (str, int, float)):
|
||||
serialized = str(value)
|
||||
elif isinstance(value, (dict, list)):
|
||||
serialized = json.dumps(value)
|
||||
else:
|
||||
serialized = pickle.dumps(value)
|
||||
|
||||
return self.redis_client.set(
|
||||
self._make_key(key),
|
||||
serialized,
|
||||
ex=expire
|
||||
)
|
||||
except Exception as e:
|
||||
logger.error(f"Redis set error for key {key}: {e}")
|
||||
return False
|
||||
|
||||
def delete(self, *keys) -> int:
|
||||
"""Delete one or more keys"""
|
||||
try:
|
||||
prefixed_keys = [self._make_key(k) for k in keys]
|
||||
return self.redis_client.delete(*prefixed_keys)
|
||||
except Exception as e:
|
||||
logger.error(f"Redis delete error: {e}")
|
||||
return 0
|
||||
|
||||
def exists(self, key: str) -> bool:
|
||||
"""Check if key exists"""
|
||||
try:
|
||||
return bool(self.redis_client.exists(self._make_key(key)))
|
||||
except Exception as e:
|
||||
logger.error(f"Redis exists error for key {key}: {e}")
|
||||
return False
|
||||
|
||||
def expire(self, key: str, seconds: int) -> bool:
|
||||
"""Set expiration on a key"""
|
||||
try:
|
||||
return bool(self.redis_client.expire(self._make_key(key), seconds))
|
||||
except Exception as e:
|
||||
logger.error(f"Redis expire error for key {key}: {e}")
|
||||
return False
|
||||
|
||||
# Hash operations for session/rate limiting
|
||||
def hget(self, name: str, key: str, default=None) -> Any:
|
||||
"""Get value from hash"""
|
||||
try:
|
||||
value = self.redis_client.hget(self._make_key(name), key)
|
||||
if value is None:
|
||||
return default
|
||||
|
||||
try:
|
||||
return json.loads(value)
|
||||
except (json.JSONDecodeError, TypeError):
|
||||
return value.decode('utf-8') if isinstance(value, bytes) else value
|
||||
except Exception as e:
|
||||
logger.error(f"Redis hget error for {name}:{key}: {e}")
|
||||
return default
|
||||
|
||||
def hset(self, name: str, key: str, value: Any) -> bool:
|
||||
"""Set value in hash"""
|
||||
try:
|
||||
if isinstance(value, (dict, list)):
|
||||
value = json.dumps(value)
|
||||
return bool(self.redis_client.hset(self._make_key(name), key, value))
|
||||
except Exception as e:
|
||||
logger.error(f"Redis hset error for {name}:{key}: {e}")
|
||||
return False
|
||||
|
||||
def hgetall(self, name: str) -> Dict[str, Any]:
|
||||
"""Get all values from hash"""
|
||||
try:
|
||||
data = self.redis_client.hgetall(self._make_key(name))
|
||||
result = {}
|
||||
for k, v in data.items():
|
||||
key = k.decode('utf-8') if isinstance(k, bytes) else k
|
||||
try:
|
||||
result[key] = json.loads(v)
|
||||
except:
|
||||
result[key] = v.decode('utf-8') if isinstance(v, bytes) else v
|
||||
return result
|
||||
except Exception as e:
|
||||
logger.error(f"Redis hgetall error for {name}: {e}")
|
||||
return {}
|
||||
|
||||
def hdel(self, name: str, *keys) -> int:
|
||||
"""Delete fields from hash"""
|
||||
try:
|
||||
return self.redis_client.hdel(self._make_key(name), *keys)
|
||||
except Exception as e:
|
||||
logger.error(f"Redis hdel error for {name}: {e}")
|
||||
return 0
|
||||
|
||||
# List operations for queues
|
||||
def lpush(self, key: str, *values) -> int:
|
||||
"""Push values to the left of list"""
|
||||
try:
|
||||
serialized = []
|
||||
for v in values:
|
||||
if isinstance(v, (dict, list)):
|
||||
serialized.append(json.dumps(v))
|
||||
else:
|
||||
serialized.append(v)
|
||||
return self.redis_client.lpush(self._make_key(key), *serialized)
|
||||
except Exception as e:
|
||||
logger.error(f"Redis lpush error for {key}: {e}")
|
||||
return 0
|
||||
|
||||
def rpop(self, key: str, default=None) -> Any:
|
||||
"""Pop value from the right of list"""
|
||||
try:
|
||||
value = self.redis_client.rpop(self._make_key(key))
|
||||
if value is None:
|
||||
return default
|
||||
|
||||
try:
|
||||
return json.loads(value)
|
||||
except:
|
||||
return value.decode('utf-8') if isinstance(value, bytes) else value
|
||||
except Exception as e:
|
||||
logger.error(f"Redis rpop error for {key}: {e}")
|
||||
return default
|
||||
|
||||
def llen(self, key: str) -> int:
|
||||
"""Get length of list"""
|
||||
try:
|
||||
return self.redis_client.llen(self._make_key(key))
|
||||
except Exception as e:
|
||||
logger.error(f"Redis llen error for {key}: {e}")
|
||||
return 0
|
||||
|
||||
# Set operations for unique tracking
|
||||
def sadd(self, key: str, *values) -> int:
|
||||
"""Add members to set"""
|
||||
try:
|
||||
return self.redis_client.sadd(self._make_key(key), *values)
|
||||
except Exception as e:
|
||||
logger.error(f"Redis sadd error for {key}: {e}")
|
||||
return 0
|
||||
|
||||
def srem(self, key: str, *values) -> int:
|
||||
"""Remove members from set"""
|
||||
try:
|
||||
return self.redis_client.srem(self._make_key(key), *values)
|
||||
except Exception as e:
|
||||
logger.error(f"Redis srem error for {key}: {e}")
|
||||
return 0
|
||||
|
||||
def sismember(self, key: str, value: Any) -> bool:
|
||||
"""Check if value is member of set"""
|
||||
try:
|
||||
return bool(self.redis_client.sismember(self._make_key(key), value))
|
||||
except Exception as e:
|
||||
logger.error(f"Redis sismember error for {key}: {e}")
|
||||
return False
|
||||
|
||||
def scard(self, key: str) -> int:
|
||||
"""Get number of members in set"""
|
||||
try:
|
||||
return self.redis_client.scard(self._make_key(key))
|
||||
except Exception as e:
|
||||
logger.error(f"Redis scard error for {key}: {e}")
|
||||
return 0
|
||||
|
||||
def smembers(self, key: str) -> set:
|
||||
"""Get all members of set"""
|
||||
try:
|
||||
members = self.redis_client.smembers(self._make_key(key))
|
||||
return {m.decode('utf-8') if isinstance(m, bytes) else m for m in members}
|
||||
except Exception as e:
|
||||
logger.error(f"Redis smembers error for {key}: {e}")
|
||||
return set()
|
||||
|
||||
# Atomic counters
|
||||
def incr(self, key: str, amount: int = 1) -> int:
|
||||
"""Increment counter"""
|
||||
try:
|
||||
return self.redis_client.incr(self._make_key(key), amount)
|
||||
except Exception as e:
|
||||
logger.error(f"Redis incr error for {key}: {e}")
|
||||
return 0
|
||||
|
||||
def decr(self, key: str, amount: int = 1) -> int:
|
||||
"""Decrement counter"""
|
||||
try:
|
||||
return self.redis_client.decr(self._make_key(key), amount)
|
||||
except Exception as e:
|
||||
logger.error(f"Redis decr error for {key}: {e}")
|
||||
return 0
|
||||
|
||||
# Transaction support
|
||||
def pipeline(self):
|
||||
"""Create a pipeline for atomic operations"""
|
||||
return self.redis_client.pipeline()
|
||||
|
||||
# Pub/Sub support
|
||||
def publish(self, channel: str, message: Any) -> int:
|
||||
"""Publish message to channel"""
|
||||
try:
|
||||
if isinstance(message, (dict, list)):
|
||||
message = json.dumps(message)
|
||||
return self.redis_client.publish(self._make_key(channel), message)
|
||||
except Exception as e:
|
||||
logger.error(f"Redis publish error for {channel}: {e}")
|
||||
return 0
|
||||
|
||||
def subscribe(self, *channels):
|
||||
"""Subscribe to channels"""
|
||||
pubsub = self.redis_client.pubsub()
|
||||
prefixed_channels = [self._make_key(c) for c in channels]
|
||||
pubsub.subscribe(*prefixed_channels)
|
||||
return pubsub
|
||||
|
||||
# Cache helpers
|
||||
def cache_translation(self, source_text: str, source_lang: str,
|
||||
target_lang: str, translation: str,
|
||||
expire_hours: int = 24) -> bool:
|
||||
"""Cache a translation"""
|
||||
key = self._translation_key(source_text, source_lang, target_lang)
|
||||
data = {
|
||||
'translation': translation,
|
||||
'timestamp': time.time(),
|
||||
'hits': 0
|
||||
}
|
||||
return self.set(key, data, expire=expire_hours * 3600)
|
||||
|
||||
def get_cached_translation(self, source_text: str, source_lang: str,
|
||||
target_lang: str) -> Optional[str]:
|
||||
"""Get cached translation and increment hit counter"""
|
||||
key = self._translation_key(source_text, source_lang, target_lang)
|
||||
data = self.get(key)
|
||||
|
||||
if data and isinstance(data, dict):
|
||||
# Increment hit counter
|
||||
data['hits'] = data.get('hits', 0) + 1
|
||||
self.set(key, data)
|
||||
return data.get('translation')
|
||||
|
||||
return None
|
||||
|
||||
def _translation_key(self, text: str, source_lang: str, target_lang: str) -> str:
|
||||
"""Generate cache key for translation"""
|
||||
# Create a hash of the text to handle long texts
|
||||
text_hash = hashlib.md5(text.encode()).hexdigest()
|
||||
return f"translation:{source_lang}:{target_lang}:{text_hash}"
|
||||
|
||||
# Session management
|
||||
def save_session(self, session_id: str, data: Dict[str, Any],
|
||||
expire_seconds: int = 3600) -> bool:
|
||||
"""Save session data"""
|
||||
key = f"session:{session_id}"
|
||||
return self.set(key, data, expire=expire_seconds)
|
||||
|
||||
def get_session(self, session_id: str) -> Optional[Dict[str, Any]]:
|
||||
"""Get session data"""
|
||||
key = f"session:{session_id}"
|
||||
return self.get(key)
|
||||
|
||||
def delete_session(self, session_id: str) -> bool:
|
||||
"""Delete session data"""
|
||||
key = f"session:{session_id}"
|
||||
return bool(self.delete(key))
|
||||
|
||||
def extend_session(self, session_id: str, expire_seconds: int = 3600) -> bool:
|
||||
"""Extend session expiration"""
|
||||
key = f"session:{session_id}"
|
||||
return self.expire(key, expire_seconds)
|
||||
|
||||
# Rate limiting
|
||||
def check_rate_limit(self, identifier: str, limit: int,
|
||||
window_seconds: int) -> tuple[bool, int]:
|
||||
"""Check rate limit using sliding window"""
|
||||
key = f"rate_limit:{identifier}:{window_seconds}"
|
||||
now = time.time()
|
||||
window_start = now - window_seconds
|
||||
|
||||
pipe = self.pipeline()
|
||||
|
||||
# Remove old entries
|
||||
pipe.zremrangebyscore(self._make_key(key), 0, window_start)
|
||||
|
||||
# Count current entries
|
||||
pipe.zcard(self._make_key(key))
|
||||
|
||||
# Add current request
|
||||
pipe.zadd(self._make_key(key), {str(now): now})
|
||||
|
||||
# Set expiration
|
||||
pipe.expire(self._make_key(key), window_seconds + 1)
|
||||
|
||||
results = pipe.execute()
|
||||
current_count = results[1]
|
||||
|
||||
if current_count >= limit:
|
||||
return False, limit - current_count
|
||||
|
||||
return True, limit - current_count - 1
|
||||
|
||||
# Cleanup
|
||||
def cleanup_expired_keys(self, pattern: str = "*") -> int:
|
||||
"""Clean up expired keys matching pattern"""
|
||||
try:
|
||||
cursor = 0
|
||||
deleted = 0
|
||||
|
||||
while True:
|
||||
cursor, keys = self.redis_client.scan(
|
||||
cursor,
|
||||
match=self._make_key(pattern),
|
||||
count=100
|
||||
)
|
||||
|
||||
for key in keys:
|
||||
ttl = self.redis_client.ttl(key)
|
||||
if ttl == -2: # Key doesn't exist
|
||||
continue
|
||||
elif ttl == -1: # Key exists but no TTL
|
||||
# Set a default TTL of 24 hours for keys without expiration
|
||||
self.redis_client.expire(key, 86400)
|
||||
|
||||
if cursor == 0:
|
||||
break
|
||||
|
||||
return deleted
|
||||
except Exception as e:
|
||||
logger.error(f"Redis cleanup error: {e}")
|
||||
return 0
|
||||
|
||||
|
||||
# Cache decorator
|
||||
def redis_cache(expire_seconds: int = 300, key_prefix: str = ""):
|
||||
"""Decorator to cache function results in Redis"""
|
||||
def decorator(func):
|
||||
@wraps(func)
|
||||
def wrapper(*args, **kwargs):
|
||||
# Get Redis instance from app context
|
||||
from flask import current_app
|
||||
redis_manager = getattr(current_app, 'redis', None)
|
||||
|
||||
if not redis_manager:
|
||||
# No Redis, execute function normally
|
||||
return func(*args, **kwargs)
|
||||
|
||||
# Generate cache key
|
||||
cache_key = f"{key_prefix}:{func.__name__}:"
|
||||
cache_key += hashlib.md5(
|
||||
f"{args}:{kwargs}".encode()
|
||||
).hexdigest()
|
||||
|
||||
# Try to get from cache
|
||||
cached = redis_manager.get(cache_key)
|
||||
if cached is not None:
|
||||
logger.debug(f"Cache hit for {func.__name__}")
|
||||
return cached
|
||||
|
||||
# Execute function and cache result
|
||||
result = func(*args, **kwargs)
|
||||
redis_manager.set(cache_key, result, expire=expire_seconds)
|
||||
|
||||
return result
|
||||
|
||||
return wrapper
|
||||
return decorator
|
365
redis_rate_limiter.py
Normal file
365
redis_rate_limiter.py
Normal file
@ -0,0 +1,365 @@
|
||||
# Redis-based rate limiting implementation
|
||||
import time
|
||||
import logging
|
||||
from functools import wraps
|
||||
from flask import request, jsonify, g
|
||||
import hashlib
|
||||
from typing import Optional, Dict, Tuple
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
class RedisRateLimiter:
|
||||
"""Token bucket rate limiter using Redis for distributed rate limiting"""
|
||||
|
||||
def __init__(self, redis_manager):
|
||||
self.redis = redis_manager
|
||||
|
||||
# Default limits (can be overridden per endpoint)
|
||||
self.default_limits = {
|
||||
'requests_per_minute': 30,
|
||||
'requests_per_hour': 500,
|
||||
'burst_size': 10,
|
||||
'token_refresh_rate': 0.5 # tokens per second
|
||||
}
|
||||
|
||||
# Endpoint-specific limits
|
||||
self.endpoint_limits = {
|
||||
'/transcribe': {
|
||||
'requests_per_minute': 10,
|
||||
'requests_per_hour': 100,
|
||||
'burst_size': 3,
|
||||
'token_refresh_rate': 0.167,
|
||||
'max_request_size': 10 * 1024 * 1024 # 10MB
|
||||
},
|
||||
'/translate': {
|
||||
'requests_per_minute': 20,
|
||||
'requests_per_hour': 300,
|
||||
'burst_size': 5,
|
||||
'token_refresh_rate': 0.333,
|
||||
'max_request_size': 100 * 1024 # 100KB
|
||||
},
|
||||
'/translate/stream': {
|
||||
'requests_per_minute': 10,
|
||||
'requests_per_hour': 150,
|
||||
'burst_size': 3,
|
||||
'token_refresh_rate': 0.167,
|
||||
'max_request_size': 100 * 1024 # 100KB
|
||||
},
|
||||
'/speak': {
|
||||
'requests_per_minute': 15,
|
||||
'requests_per_hour': 200,
|
||||
'burst_size': 3,
|
||||
'token_refresh_rate': 0.25,
|
||||
'max_request_size': 50 * 1024 # 50KB
|
||||
}
|
||||
}
|
||||
|
||||
# Global limits
|
||||
self.global_limits = {
|
||||
'total_requests_per_minute': 1000,
|
||||
'total_requests_per_hour': 10000,
|
||||
'concurrent_requests': 50
|
||||
}
|
||||
|
||||
def get_client_id(self, req) -> str:
|
||||
"""Get unique client identifier"""
|
||||
ip = req.remote_addr or 'unknown'
|
||||
user_agent = req.headers.get('User-Agent', '')
|
||||
|
||||
# Handle proxied requests
|
||||
forwarded_for = req.headers.get('X-Forwarded-For')
|
||||
if forwarded_for:
|
||||
ip = forwarded_for.split(',')[0].strip()
|
||||
|
||||
# Create unique identifier
|
||||
identifier = f"{ip}:{user_agent}"
|
||||
return hashlib.md5(identifier.encode()).hexdigest()
|
||||
|
||||
def get_limits(self, endpoint: str) -> Dict:
|
||||
"""Get rate limits for endpoint"""
|
||||
return self.endpoint_limits.get(endpoint, self.default_limits)
|
||||
|
||||
def is_ip_blocked(self, ip: str) -> bool:
|
||||
"""Check if IP is blocked"""
|
||||
# Check permanent blocks
|
||||
if self.redis.sismember('blocked_ips:permanent', ip):
|
||||
return True
|
||||
|
||||
# Check temporary blocks
|
||||
block_key = f'blocked_ip:{ip}'
|
||||
if self.redis.exists(block_key):
|
||||
return True
|
||||
|
||||
return False
|
||||
|
||||
def block_ip_temporarily(self, ip: str, duration: int = 3600):
|
||||
"""Block IP temporarily"""
|
||||
block_key = f'blocked_ip:{ip}'
|
||||
self.redis.set(block_key, 1, expire=duration)
|
||||
logger.warning(f"IP {ip} temporarily blocked for {duration} seconds")
|
||||
|
||||
def check_global_limits(self) -> Tuple[bool, Optional[str]]:
|
||||
"""Check global rate limits"""
|
||||
now = time.time()
|
||||
|
||||
# Check requests per minute
|
||||
minute_key = 'global:requests:minute'
|
||||
allowed, remaining = self.redis.check_rate_limit(
|
||||
minute_key,
|
||||
self.global_limits['total_requests_per_minute'],
|
||||
60
|
||||
)
|
||||
if not allowed:
|
||||
return False, "Global rate limit exceeded (per minute)"
|
||||
|
||||
# Check requests per hour
|
||||
hour_key = 'global:requests:hour'
|
||||
allowed, remaining = self.redis.check_rate_limit(
|
||||
hour_key,
|
||||
self.global_limits['total_requests_per_hour'],
|
||||
3600
|
||||
)
|
||||
if not allowed:
|
||||
return False, "Global rate limit exceeded (per hour)"
|
||||
|
||||
# Check concurrent requests
|
||||
concurrent_key = 'global:concurrent'
|
||||
current_concurrent = self.redis.get(concurrent_key, 0)
|
||||
if current_concurrent >= self.global_limits['concurrent_requests']:
|
||||
return False, "Too many concurrent requests"
|
||||
|
||||
return True, None
|
||||
|
||||
def check_rate_limit(self, client_id: str, endpoint: str,
|
||||
request_size: int = 0) -> Tuple[bool, Optional[str], Optional[Dict]]:
|
||||
"""Check if request should be allowed"""
|
||||
# Check global limits first
|
||||
global_ok, global_msg = self.check_global_limits()
|
||||
if not global_ok:
|
||||
return False, global_msg, None
|
||||
|
||||
# Get limits for endpoint
|
||||
limits = self.get_limits(endpoint)
|
||||
|
||||
# Check request size if applicable
|
||||
if request_size > 0 and 'max_request_size' in limits:
|
||||
if request_size > limits['max_request_size']:
|
||||
return False, "Request too large", None
|
||||
|
||||
# Token bucket implementation using Redis
|
||||
bucket_key = f'bucket:{client_id}:{endpoint}'
|
||||
now = time.time()
|
||||
|
||||
# Get current bucket state
|
||||
bucket_data = self.redis.hgetall(bucket_key)
|
||||
|
||||
# Initialize bucket if empty
|
||||
if not bucket_data:
|
||||
bucket_data = {
|
||||
'tokens': limits['burst_size'],
|
||||
'last_update': now
|
||||
}
|
||||
else:
|
||||
# Update tokens based on time passed
|
||||
last_update = float(bucket_data.get('last_update', now))
|
||||
time_passed = now - last_update
|
||||
new_tokens = time_passed * limits['token_refresh_rate']
|
||||
|
||||
current_tokens = float(bucket_data.get('tokens', 0))
|
||||
bucket_data['tokens'] = min(
|
||||
limits['burst_size'],
|
||||
current_tokens + new_tokens
|
||||
)
|
||||
bucket_data['last_update'] = now
|
||||
|
||||
# Check sliding window limits
|
||||
minute_allowed, minute_remaining = self.redis.check_rate_limit(
|
||||
f'window:{client_id}:{endpoint}:minute',
|
||||
limits['requests_per_minute'],
|
||||
60
|
||||
)
|
||||
|
||||
if not minute_allowed:
|
||||
return False, "Rate limit exceeded (per minute)", {
|
||||
'retry_after': 60,
|
||||
'limit': limits['requests_per_minute'],
|
||||
'remaining': 0,
|
||||
'reset': int(now + 60)
|
||||
}
|
||||
|
||||
hour_allowed, hour_remaining = self.redis.check_rate_limit(
|
||||
f'window:{client_id}:{endpoint}:hour',
|
||||
limits['requests_per_hour'],
|
||||
3600
|
||||
)
|
||||
|
||||
if not hour_allowed:
|
||||
return False, "Rate limit exceeded (per hour)", {
|
||||
'retry_after': 3600,
|
||||
'limit': limits['requests_per_hour'],
|
||||
'remaining': 0,
|
||||
'reset': int(now + 3600)
|
||||
}
|
||||
|
||||
# Check token bucket
|
||||
if float(bucket_data['tokens']) < 1:
|
||||
retry_after = int(1 / limits['token_refresh_rate'])
|
||||
return False, "Rate limit exceeded (burst)", {
|
||||
'retry_after': retry_after,
|
||||
'limit': limits['burst_size'],
|
||||
'remaining': 0,
|
||||
'reset': int(now + retry_after)
|
||||
}
|
||||
|
||||
# Request allowed - update bucket
|
||||
bucket_data['tokens'] = float(bucket_data['tokens']) - 1
|
||||
|
||||
# Save bucket state
|
||||
self.redis.hset(bucket_key, 'tokens', bucket_data['tokens'])
|
||||
self.redis.hset(bucket_key, 'last_update', bucket_data['last_update'])
|
||||
self.redis.expire(bucket_key, 86400) # Expire after 24 hours
|
||||
|
||||
return True, None, {
|
||||
'limit': limits['requests_per_minute'],
|
||||
'remaining': minute_remaining,
|
||||
'reset': int(now + 60)
|
||||
}
|
||||
|
||||
def increment_concurrent(self):
|
||||
"""Increment concurrent request counter"""
|
||||
self.redis.incr('global:concurrent')
|
||||
|
||||
def decrement_concurrent(self):
|
||||
"""Decrement concurrent request counter"""
|
||||
self.redis.decr('global:concurrent')
|
||||
|
||||
def get_client_stats(self, client_id: str) -> Optional[Dict]:
|
||||
"""Get statistics for a client"""
|
||||
stats = {
|
||||
'requests_last_minute': 0,
|
||||
'requests_last_hour': 0,
|
||||
'buckets': {}
|
||||
}
|
||||
|
||||
# Get request counts from all endpoints
|
||||
for endpoint in self.endpoint_limits.keys():
|
||||
minute_key = f'window:{client_id}:{endpoint}:minute'
|
||||
hour_key = f'window:{client_id}:{endpoint}:hour'
|
||||
|
||||
# This is approximate since we're using sliding windows
|
||||
minute_count = self.redis.scard(minute_key)
|
||||
hour_count = self.redis.scard(hour_key)
|
||||
|
||||
stats['requests_last_minute'] += minute_count
|
||||
stats['requests_last_hour'] += hour_count
|
||||
|
||||
# Get bucket info
|
||||
bucket_key = f'bucket:{client_id}:{endpoint}'
|
||||
bucket_data = self.redis.hgetall(bucket_key)
|
||||
if bucket_data:
|
||||
stats['buckets'][endpoint] = {
|
||||
'tokens': float(bucket_data.get('tokens', 0)),
|
||||
'last_update': float(bucket_data.get('last_update', 0))
|
||||
}
|
||||
|
||||
return stats
|
||||
|
||||
|
||||
def rate_limit(endpoint=None,
|
||||
requests_per_minute=None,
|
||||
requests_per_hour=None,
|
||||
burst_size=None,
|
||||
check_size=False):
|
||||
"""
|
||||
Rate limiting decorator for Flask routes using Redis
|
||||
"""
|
||||
def decorator(f):
|
||||
@wraps(f)
|
||||
def decorated_function(*args, **kwargs):
|
||||
# Get Redis rate limiter from app
|
||||
from flask import current_app
|
||||
|
||||
if not hasattr(current_app, 'redis_rate_limiter'):
|
||||
# No Redis rate limiter, execute function normally
|
||||
return f(*args, **kwargs)
|
||||
|
||||
rate_limiter = current_app.redis_rate_limiter
|
||||
|
||||
# Get client ID
|
||||
client_id = rate_limiter.get_client_id(request)
|
||||
ip = request.remote_addr
|
||||
|
||||
# Check if IP is blocked
|
||||
if rate_limiter.is_ip_blocked(ip):
|
||||
return jsonify({
|
||||
'error': 'IP temporarily blocked due to excessive requests'
|
||||
}), 429
|
||||
|
||||
# Get endpoint
|
||||
endpoint_path = endpoint or request.endpoint
|
||||
|
||||
# Override default limits if specified
|
||||
if any([requests_per_minute, requests_per_hour, burst_size]):
|
||||
limits = rate_limiter.get_limits(endpoint_path).copy()
|
||||
if requests_per_minute:
|
||||
limits['requests_per_minute'] = requests_per_minute
|
||||
if requests_per_hour:
|
||||
limits['requests_per_hour'] = requests_per_hour
|
||||
if burst_size:
|
||||
limits['burst_size'] = burst_size
|
||||
rate_limiter.endpoint_limits[endpoint_path] = limits
|
||||
|
||||
# Check request size if needed
|
||||
request_size = 0
|
||||
if check_size:
|
||||
request_size = request.content_length or 0
|
||||
|
||||
# Check rate limit
|
||||
allowed, message, headers = rate_limiter.check_rate_limit(
|
||||
client_id, endpoint_path, request_size
|
||||
)
|
||||
|
||||
if not allowed:
|
||||
# Log excessive requests
|
||||
logger.warning(f"Rate limit exceeded for {client_id} on {endpoint_path}: {message}")
|
||||
|
||||
# Check if we should temporarily block this IP
|
||||
stats = rate_limiter.get_client_stats(client_id)
|
||||
if stats and stats['requests_last_minute'] > 100:
|
||||
rate_limiter.block_ip_temporarily(ip, 3600) # 1 hour block
|
||||
|
||||
response = jsonify({
|
||||
'error': message,
|
||||
'retry_after': headers.get('retry_after') if headers else 60
|
||||
})
|
||||
response.status_code = 429
|
||||
|
||||
# Add rate limit headers
|
||||
if headers:
|
||||
response.headers['X-RateLimit-Limit'] = str(headers['limit'])
|
||||
response.headers['X-RateLimit-Remaining'] = str(headers['remaining'])
|
||||
response.headers['X-RateLimit-Reset'] = str(headers['reset'])
|
||||
response.headers['Retry-After'] = str(headers['retry_after'])
|
||||
|
||||
return response
|
||||
|
||||
# Track concurrent requests
|
||||
rate_limiter.increment_concurrent()
|
||||
|
||||
try:
|
||||
# Add rate limit info to response
|
||||
g.rate_limit_headers = headers
|
||||
response = f(*args, **kwargs)
|
||||
|
||||
# Add headers to successful response
|
||||
if headers and hasattr(response, 'headers'):
|
||||
response.headers['X-RateLimit-Limit'] = str(headers['limit'])
|
||||
response.headers['X-RateLimit-Remaining'] = str(headers['remaining'])
|
||||
response.headers['X-RateLimit-Reset'] = str(headers['reset'])
|
||||
|
||||
return response
|
||||
finally:
|
||||
rate_limiter.decrement_concurrent()
|
||||
|
||||
return decorated_function
|
||||
return decorator
|
389
redis_session_manager.py
Normal file
389
redis_session_manager.py
Normal file
@ -0,0 +1,389 @@
|
||||
# Redis-based session management system
|
||||
import time
|
||||
import uuid
|
||||
import logging
|
||||
from datetime import datetime
|
||||
from typing import Dict, Any, Optional, List
|
||||
from dataclasses import dataclass, asdict
|
||||
from flask import session, request, g
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@dataclass
|
||||
class SessionInfo:
|
||||
"""Session information stored in Redis"""
|
||||
session_id: str
|
||||
user_id: Optional[str] = None
|
||||
ip_address: Optional[str] = None
|
||||
user_agent: Optional[str] = None
|
||||
created_at: float = None
|
||||
last_activity: float = None
|
||||
request_count: int = 0
|
||||
resource_count: int = 0
|
||||
total_bytes_used: int = 0
|
||||
metadata: Dict[str, Any] = None
|
||||
|
||||
def __post_init__(self):
|
||||
if self.created_at is None:
|
||||
self.created_at = time.time()
|
||||
if self.last_activity is None:
|
||||
self.last_activity = time.time()
|
||||
if self.metadata is None:
|
||||
self.metadata = {}
|
||||
|
||||
|
||||
class RedisSessionManager:
|
||||
"""
|
||||
Session management using Redis for distributed sessions
|
||||
"""
|
||||
def __init__(self, redis_manager, config: Dict[str, Any] = None):
|
||||
self.redis = redis_manager
|
||||
self.config = config or {}
|
||||
|
||||
# Configuration
|
||||
self.max_session_duration = self.config.get('max_session_duration', 3600) # 1 hour
|
||||
self.max_idle_time = self.config.get('max_idle_time', 900) # 15 minutes
|
||||
self.max_resources_per_session = self.config.get('max_resources_per_session', 100)
|
||||
self.max_bytes_per_session = self.config.get('max_bytes_per_session', 100 * 1024 * 1024) # 100MB
|
||||
|
||||
logger.info("Redis session manager initialized")
|
||||
|
||||
def create_session(self, session_id: str = None, user_id: str = None,
|
||||
ip_address: str = None, user_agent: str = None) -> SessionInfo:
|
||||
"""Create a new session"""
|
||||
if not session_id:
|
||||
session_id = str(uuid.uuid4())
|
||||
|
||||
# Check if session already exists
|
||||
existing = self.get_session(session_id)
|
||||
if existing:
|
||||
logger.warning(f"Session {session_id} already exists")
|
||||
return existing
|
||||
|
||||
session_info = SessionInfo(
|
||||
session_id=session_id,
|
||||
user_id=user_id,
|
||||
ip_address=ip_address,
|
||||
user_agent=user_agent
|
||||
)
|
||||
|
||||
# Save to Redis
|
||||
self._save_session(session_info)
|
||||
|
||||
# Add to active sessions set
|
||||
self.redis.sadd('active_sessions', session_id)
|
||||
|
||||
# Update stats
|
||||
self.redis.incr('stats:sessions:created')
|
||||
|
||||
logger.info(f"Created session {session_id}")
|
||||
return session_info
|
||||
|
||||
def get_session(self, session_id: str) -> Optional[SessionInfo]:
|
||||
"""Get a session by ID"""
|
||||
data = self.redis.get(f'session:{session_id}')
|
||||
if not data:
|
||||
return None
|
||||
|
||||
# Update last activity
|
||||
session_info = SessionInfo(**data)
|
||||
session_info.last_activity = time.time()
|
||||
self._save_session(session_info)
|
||||
|
||||
return session_info
|
||||
|
||||
def update_session_activity(self, session_id: str):
|
||||
"""Update session last activity time"""
|
||||
session_info = self.get_session(session_id)
|
||||
if session_info:
|
||||
session_info.last_activity = time.time()
|
||||
session_info.request_count += 1
|
||||
self._save_session(session_info)
|
||||
|
||||
def add_resource(self, session_id: str, resource_type: str,
|
||||
resource_id: str = None, path: str = None,
|
||||
size_bytes: int = 0, metadata: Dict[str, Any] = None) -> bool:
|
||||
"""Add a resource to a session"""
|
||||
session_info = self.get_session(session_id)
|
||||
if not session_info:
|
||||
logger.warning(f"Session {session_id} not found")
|
||||
return False
|
||||
|
||||
# Check limits
|
||||
if session_info.resource_count >= self.max_resources_per_session:
|
||||
logger.warning(f"Session {session_id} reached resource limit")
|
||||
# Clean up oldest resources
|
||||
self._cleanup_oldest_resources(session_id, 1)
|
||||
|
||||
if session_info.total_bytes_used + size_bytes > self.max_bytes_per_session:
|
||||
logger.warning(f"Session {session_id} reached size limit")
|
||||
bytes_to_free = (session_info.total_bytes_used + size_bytes) - self.max_bytes_per_session
|
||||
self._cleanup_resources_by_size(session_id, bytes_to_free)
|
||||
|
||||
# Generate resource ID if not provided
|
||||
if not resource_id:
|
||||
resource_id = str(uuid.uuid4())
|
||||
|
||||
# Store resource info
|
||||
resource_key = f'session:{session_id}:resource:{resource_id}'
|
||||
resource_data = {
|
||||
'resource_id': resource_id,
|
||||
'resource_type': resource_type,
|
||||
'path': path,
|
||||
'size_bytes': size_bytes,
|
||||
'created_at': time.time(),
|
||||
'metadata': metadata or {}
|
||||
}
|
||||
|
||||
self.redis.set(resource_key, resource_data, expire=self.max_session_duration)
|
||||
|
||||
# Add to session's resource set
|
||||
self.redis.sadd(f'session:{session_id}:resources', resource_id)
|
||||
|
||||
# Update session info
|
||||
session_info.resource_count += 1
|
||||
session_info.total_bytes_used += size_bytes
|
||||
self._save_session(session_info)
|
||||
|
||||
# Update global stats
|
||||
self.redis.incr('stats:resources:active')
|
||||
self.redis.incr('stats:bytes:active', size_bytes)
|
||||
|
||||
logger.debug(f"Added {resource_type} resource {resource_id} to session {session_id}")
|
||||
return True
|
||||
|
||||
def remove_resource(self, session_id: str, resource_id: str) -> bool:
|
||||
"""Remove a resource from a session"""
|
||||
# Get resource info
|
||||
resource_key = f'session:{session_id}:resource:{resource_id}'
|
||||
resource_data = self.redis.get(resource_key)
|
||||
if not resource_data:
|
||||
return False
|
||||
|
||||
# Clean up the actual resource (file, etc.)
|
||||
self._cleanup_resource(resource_data)
|
||||
|
||||
# Remove from Redis
|
||||
self.redis.delete(resource_key)
|
||||
self.redis.srem(f'session:{session_id}:resources', resource_id)
|
||||
|
||||
# Update session info
|
||||
session_info = self.get_session(session_id)
|
||||
if session_info:
|
||||
session_info.resource_count -= 1
|
||||
session_info.total_bytes_used -= resource_data.get('size_bytes', 0)
|
||||
self._save_session(session_info)
|
||||
|
||||
# Update stats
|
||||
self.redis.decr('stats:resources:active')
|
||||
self.redis.decr('stats:bytes:active', resource_data.get('size_bytes', 0))
|
||||
self.redis.incr('stats:resources:cleaned')
|
||||
self.redis.incr('stats:bytes:cleaned', resource_data.get('size_bytes', 0))
|
||||
|
||||
logger.debug(f"Removed resource {resource_id} from session {session_id}")
|
||||
return True
|
||||
|
||||
def cleanup_session(self, session_id: str) -> bool:
|
||||
"""Clean up a session and all its resources"""
|
||||
session_info = self.get_session(session_id)
|
||||
if not session_info:
|
||||
return False
|
||||
|
||||
# Get all resources
|
||||
resource_ids = self.redis.smembers(f'session:{session_id}:resources')
|
||||
|
||||
# Clean up each resource
|
||||
for resource_id in resource_ids:
|
||||
self.remove_resource(session_id, resource_id)
|
||||
|
||||
# Remove session data
|
||||
self.redis.delete(f'session:{session_id}')
|
||||
self.redis.delete(f'session:{session_id}:resources')
|
||||
self.redis.srem('active_sessions', session_id)
|
||||
|
||||
# Update stats
|
||||
self.redis.incr('stats:sessions:cleaned')
|
||||
|
||||
logger.info(f"Cleaned up session {session_id}")
|
||||
return True
|
||||
|
||||
def cleanup_expired_sessions(self):
|
||||
"""Clean up sessions that have exceeded max duration"""
|
||||
now = time.time()
|
||||
active_sessions = self.redis.smembers('active_sessions')
|
||||
|
||||
for session_id in active_sessions:
|
||||
session_info = self.get_session(session_id)
|
||||
if session_info and (now - session_info.created_at > self.max_session_duration):
|
||||
logger.info(f"Cleaning up expired session {session_id}")
|
||||
self.cleanup_session(session_id)
|
||||
|
||||
def cleanup_idle_sessions(self):
|
||||
"""Clean up sessions that have been idle too long"""
|
||||
now = time.time()
|
||||
active_sessions = self.redis.smembers('active_sessions')
|
||||
|
||||
for session_id in active_sessions:
|
||||
session_info = self.get_session(session_id)
|
||||
if session_info and (now - session_info.last_activity > self.max_idle_time):
|
||||
logger.info(f"Cleaning up idle session {session_id}")
|
||||
self.cleanup_session(session_id)
|
||||
|
||||
def get_session_info(self, session_id: str) -> Optional[Dict[str, Any]]:
|
||||
"""Get detailed information about a session"""
|
||||
session_info = self.get_session(session_id)
|
||||
if not session_info:
|
||||
return None
|
||||
|
||||
# Get resources
|
||||
resource_ids = self.redis.smembers(f'session:{session_id}:resources')
|
||||
resources = []
|
||||
|
||||
for resource_id in resource_ids:
|
||||
resource_data = self.redis.get(f'session:{session_id}:resource:{resource_id}')
|
||||
if resource_data:
|
||||
resources.append({
|
||||
'resource_id': resource_data['resource_id'],
|
||||
'resource_type': resource_data['resource_type'],
|
||||
'size_bytes': resource_data['size_bytes'],
|
||||
'created_at': datetime.fromtimestamp(resource_data['created_at']).isoformat()
|
||||
})
|
||||
|
||||
return {
|
||||
'session_id': session_info.session_id,
|
||||
'user_id': session_info.user_id,
|
||||
'ip_address': session_info.ip_address,
|
||||
'created_at': datetime.fromtimestamp(session_info.created_at).isoformat(),
|
||||
'last_activity': datetime.fromtimestamp(session_info.last_activity).isoformat(),
|
||||
'duration_seconds': int(time.time() - session_info.created_at),
|
||||
'idle_seconds': int(time.time() - session_info.last_activity),
|
||||
'request_count': session_info.request_count,
|
||||
'resource_count': session_info.resource_count,
|
||||
'total_bytes_used': session_info.total_bytes_used,
|
||||
'resources': resources
|
||||
}
|
||||
|
||||
def get_all_sessions_info(self) -> List[Dict[str, Any]]:
|
||||
"""Get information about all active sessions"""
|
||||
active_sessions = self.redis.smembers('active_sessions')
|
||||
return [
|
||||
self.get_session_info(session_id)
|
||||
for session_id in active_sessions
|
||||
if self.get_session_info(session_id)
|
||||
]
|
||||
|
||||
def get_stats(self) -> Dict[str, Any]:
|
||||
"""Get session manager statistics"""
|
||||
active_sessions = self.redis.scard('active_sessions')
|
||||
|
||||
return {
|
||||
'active_sessions': active_sessions,
|
||||
'total_sessions_created': self.redis.get('stats:sessions:created', 0),
|
||||
'total_sessions_cleaned': self.redis.get('stats:sessions:cleaned', 0),
|
||||
'active_resources': self.redis.get('stats:resources:active', 0),
|
||||
'total_resources_cleaned': self.redis.get('stats:resources:cleaned', 0),
|
||||
'active_bytes': self.redis.get('stats:bytes:active', 0),
|
||||
'total_bytes_cleaned': self.redis.get('stats:bytes:cleaned', 0)
|
||||
}
|
||||
|
||||
def _save_session(self, session_info: SessionInfo):
|
||||
"""Save session info to Redis"""
|
||||
key = f'session:{session_info.session_id}'
|
||||
data = asdict(session_info)
|
||||
self.redis.set(key, data, expire=self.max_session_duration)
|
||||
|
||||
def _cleanup_resource(self, resource_data: Dict[str, Any]):
|
||||
"""Clean up a resource (e.g., delete file)"""
|
||||
import os
|
||||
|
||||
if resource_data.get('resource_type') in ['audio_file', 'temp_file']:
|
||||
path = resource_data.get('path')
|
||||
if path and os.path.exists(path):
|
||||
try:
|
||||
os.remove(path)
|
||||
logger.debug(f"Removed file {path}")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to remove file {path}: {e}")
|
||||
|
||||
def _cleanup_oldest_resources(self, session_id: str, count: int):
|
||||
"""Clean up oldest resources from a session"""
|
||||
resource_ids = list(self.redis.smembers(f'session:{session_id}:resources'))
|
||||
|
||||
# Get resource creation times
|
||||
resources_with_time = []
|
||||
for resource_id in resource_ids:
|
||||
resource_data = self.redis.get(f'session:{session_id}:resource:{resource_id}')
|
||||
if resource_data:
|
||||
resources_with_time.append((resource_id, resource_data.get('created_at', 0)))
|
||||
|
||||
# Sort by creation time and remove oldest
|
||||
resources_with_time.sort(key=lambda x: x[1])
|
||||
for resource_id, _ in resources_with_time[:count]:
|
||||
self.remove_resource(session_id, resource_id)
|
||||
|
||||
def _cleanup_resources_by_size(self, session_id: str, bytes_to_free: int):
|
||||
"""Clean up resources to free up space"""
|
||||
resource_ids = list(self.redis.smembers(f'session:{session_id}:resources'))
|
||||
|
||||
# Get resource sizes
|
||||
resources_with_size = []
|
||||
for resource_id in resource_ids:
|
||||
resource_data = self.redis.get(f'session:{session_id}:resource:{resource_id}')
|
||||
if resource_data:
|
||||
resources_with_size.append((resource_id, resource_data.get('size_bytes', 0)))
|
||||
|
||||
# Sort by size (largest first) and remove until we've freed enough
|
||||
resources_with_size.sort(key=lambda x: x[1], reverse=True)
|
||||
freed_bytes = 0
|
||||
|
||||
for resource_id, size in resources_with_size:
|
||||
if freed_bytes >= bytes_to_free:
|
||||
break
|
||||
freed_bytes += size
|
||||
self.remove_resource(session_id, resource_id)
|
||||
|
||||
|
||||
def init_app(app):
|
||||
"""Initialize Redis session management for Flask app"""
|
||||
# Get Redis manager
|
||||
redis_manager = getattr(app, 'redis', None)
|
||||
if not redis_manager:
|
||||
raise RuntimeError("Redis manager not initialized. Call init_redis() first.")
|
||||
|
||||
config = {
|
||||
'max_session_duration': app.config.get('MAX_SESSION_DURATION', 3600),
|
||||
'max_idle_time': app.config.get('MAX_SESSION_IDLE_TIME', 900),
|
||||
'max_resources_per_session': app.config.get('MAX_RESOURCES_PER_SESSION', 100),
|
||||
'max_bytes_per_session': app.config.get('MAX_BYTES_PER_SESSION', 100 * 1024 * 1024)
|
||||
}
|
||||
|
||||
manager = RedisSessionManager(redis_manager, config)
|
||||
app.redis_session_manager = manager
|
||||
|
||||
# Add before_request handler
|
||||
@app.before_request
|
||||
def before_request_session():
|
||||
# Get or create session
|
||||
session_id = session.get('session_id')
|
||||
if not session_id:
|
||||
session_id = str(uuid.uuid4())
|
||||
session['session_id'] = session_id
|
||||
session.permanent = True
|
||||
|
||||
# Get session from manager
|
||||
user_session = manager.get_session(session_id)
|
||||
if not user_session:
|
||||
user_session = manager.create_session(
|
||||
session_id=session_id,
|
||||
ip_address=request.remote_addr,
|
||||
user_agent=request.headers.get('User-Agent')
|
||||
)
|
||||
|
||||
# Update activity
|
||||
manager.update_session_activity(session_id)
|
||||
|
||||
# Store in g for request access
|
||||
g.user_session = user_session
|
||||
g.session_manager = manager
|
||||
|
||||
logger.info("Redis session management initialized")
|
@ -1,5 +1,10 @@
|
||||
flask
|
||||
flask-cors
|
||||
flask-sqlalchemy
|
||||
flask-migrate
|
||||
flask-jwt-extended
|
||||
flask-bcrypt
|
||||
flask-login
|
||||
requests
|
||||
openai-whisper
|
||||
torch
|
||||
@ -10,3 +15,10 @@ python-dotenv
|
||||
click
|
||||
colorlog
|
||||
psutil
|
||||
redis
|
||||
psycopg2-binary
|
||||
alembic
|
||||
flask-socketio
|
||||
python-socketio
|
||||
eventlet
|
||||
python-dateutil
|
||||
|
32
run_dev_server.sh
Executable file
32
run_dev_server.sh
Executable file
@ -0,0 +1,32 @@
|
||||
#!/bin/bash
|
||||
# Run Talk2Me development server locally
|
||||
|
||||
echo "Starting Talk2Me development server..."
|
||||
echo "=================================="
|
||||
echo "Admin Dashboard: http://localhost:5005/admin"
|
||||
echo " Token: 4CFvwzmeDWhecfuOHYz7Hyb8nQQ="
|
||||
echo ""
|
||||
echo "User Login: http://localhost:5005/login"
|
||||
echo " Username: admin"
|
||||
echo " Password: talk2me123"
|
||||
echo ""
|
||||
echo "API Authentication:"
|
||||
echo " API Key: 6sy2_m8e89FeC2RmUo0CcgufM9b_0OoIwIa8LSEbNhI"
|
||||
echo "=================================="
|
||||
echo ""
|
||||
|
||||
# Kill any existing process on port 5005
|
||||
lsof -ti:5005 | xargs kill -9 2>/dev/null
|
||||
|
||||
# Set environment variables
|
||||
export FLASK_ENV=development
|
||||
export FLASK_DEBUG=1
|
||||
|
||||
# Run with gunicorn for a more production-like environment
|
||||
gunicorn --bind 0.0.0.0:5005 \
|
||||
--workers 1 \
|
||||
--threads 2 \
|
||||
--timeout 120 \
|
||||
--reload \
|
||||
--log-level debug \
|
||||
wsgi:application
|
156
setup_databases.sh
Executable file
156
setup_databases.sh
Executable file
@ -0,0 +1,156 @@
|
||||
#!/bin/bash
|
||||
# Setup script for Redis and PostgreSQL
|
||||
|
||||
echo "Talk2Me Database Setup Script"
|
||||
echo "============================="
|
||||
|
||||
# Check if running as root
|
||||
if [ "$EUID" -eq 0 ]; then
|
||||
echo "Please do not run this script as root"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Function to check if command exists
|
||||
command_exists() {
|
||||
command -v "$1" >/dev/null 2>&1
|
||||
}
|
||||
|
||||
# Check for PostgreSQL
|
||||
echo "Checking PostgreSQL installation..."
|
||||
if command_exists psql; then
|
||||
echo "✓ PostgreSQL is installed"
|
||||
psql --version
|
||||
else
|
||||
echo "✗ PostgreSQL is not installed"
|
||||
echo "Please install PostgreSQL first:"
|
||||
echo " Ubuntu/Debian: sudo apt-get install postgresql postgresql-contrib"
|
||||
echo " MacOS: brew install postgresql"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Check for Redis
|
||||
echo ""
|
||||
echo "Checking Redis installation..."
|
||||
if command_exists redis-cli; then
|
||||
echo "✓ Redis is installed"
|
||||
redis-cli --version
|
||||
else
|
||||
echo "✗ Redis is not installed"
|
||||
echo "Please install Redis first:"
|
||||
echo " Ubuntu/Debian: sudo apt-get install redis-server"
|
||||
echo " MacOS: brew install redis"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Check if Redis is running
|
||||
echo ""
|
||||
echo "Checking Redis server status..."
|
||||
if redis-cli ping > /dev/null 2>&1; then
|
||||
echo "✓ Redis server is running"
|
||||
else
|
||||
echo "✗ Redis server is not running"
|
||||
echo "Starting Redis server..."
|
||||
if command_exists systemctl; then
|
||||
sudo systemctl start redis
|
||||
else
|
||||
redis-server --daemonize yes
|
||||
fi
|
||||
sleep 2
|
||||
if redis-cli ping > /dev/null 2>&1; then
|
||||
echo "✓ Redis server started successfully"
|
||||
else
|
||||
echo "✗ Failed to start Redis server"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Create PostgreSQL database
|
||||
echo ""
|
||||
echo "Setting up PostgreSQL database..."
|
||||
read -p "Enter PostgreSQL username (default: $USER): " PG_USER
|
||||
PG_USER=${PG_USER:-$USER}
|
||||
|
||||
read -p "Enter database name (default: talk2me): " DB_NAME
|
||||
DB_NAME=${DB_NAME:-talk2me}
|
||||
|
||||
# Check if database exists
|
||||
if psql -U "$PG_USER" -lqt | cut -d \| -f 1 | grep -qw "$DB_NAME"; then
|
||||
echo "Database '$DB_NAME' already exists"
|
||||
read -p "Do you want to drop and recreate it? (y/N): " RECREATE
|
||||
if [[ $RECREATE =~ ^[Yy]$ ]]; then
|
||||
echo "Dropping database '$DB_NAME'..."
|
||||
psql -U "$PG_USER" -c "DROP DATABASE $DB_NAME;"
|
||||
echo "Creating database '$DB_NAME'..."
|
||||
psql -U "$PG_USER" -c "CREATE DATABASE $DB_NAME;"
|
||||
fi
|
||||
else
|
||||
echo "Creating database '$DB_NAME'..."
|
||||
psql -U "$PG_USER" -c "CREATE DATABASE $DB_NAME;"
|
||||
fi
|
||||
|
||||
# Create .env file if it doesn't exist
|
||||
if [ ! -f .env ]; then
|
||||
echo ""
|
||||
echo "Creating .env file..."
|
||||
cat > .env << EOF
|
||||
# Database Configuration
|
||||
DATABASE_URL=postgresql://$PG_USER@localhost/$DB_NAME
|
||||
SQLALCHEMY_DATABASE_URI=postgresql://$PG_USER@localhost/$DB_NAME
|
||||
|
||||
# Redis Configuration
|
||||
REDIS_URL=redis://localhost:6379/0
|
||||
REDIS_DECODE_RESPONSES=false
|
||||
REDIS_MAX_CONNECTIONS=50
|
||||
|
||||
# Flask Configuration
|
||||
FLASK_ENV=development
|
||||
SECRET_KEY=$(openssl rand -base64 32)
|
||||
ADMIN_TOKEN=$(openssl rand -base64 24)
|
||||
|
||||
# TTS Configuration
|
||||
TTS_SERVER_URL=http://localhost:5050/v1/audio/speech
|
||||
TTS_API_KEY=your-tts-api-key-here
|
||||
|
||||
# Whisper Configuration
|
||||
WHISPER_MODEL_SIZE=base
|
||||
WHISPER_DEVICE=auto
|
||||
|
||||
# Ollama Configuration
|
||||
OLLAMA_HOST=http://localhost:11434
|
||||
OLLAMA_MODEL=gemma3:27b
|
||||
EOF
|
||||
echo "✓ .env file created"
|
||||
echo "Please update the TTS_API_KEY in .env file"
|
||||
else
|
||||
echo "✓ .env file already exists"
|
||||
fi
|
||||
|
||||
# Install Python dependencies
|
||||
echo ""
|
||||
echo "Installing Python dependencies..."
|
||||
if [ -f "requirements.txt" ]; then
|
||||
pip install -r requirements.txt
|
||||
echo "✓ Python dependencies installed"
|
||||
else
|
||||
echo "✗ requirements.txt not found"
|
||||
fi
|
||||
|
||||
# Initialize database
|
||||
echo ""
|
||||
echo "Initializing database..."
|
||||
python database_init.py
|
||||
|
||||
echo ""
|
||||
echo "Setup complete!"
|
||||
echo ""
|
||||
echo "Next steps:"
|
||||
echo "1. Update the TTS_API_KEY in .env file"
|
||||
echo "2. Run 'python migrations.py init' to initialize migrations"
|
||||
echo "3. Run 'python migrations.py create \"Initial migration\"' to create first migration"
|
||||
echo "4. Run 'python migrations.py run' to apply migrations"
|
||||
echo "5. Backup your current app.py and rename app_with_db.py to app.py"
|
||||
echo "6. Start the application with 'python app.py'"
|
||||
echo ""
|
||||
echo "To run Redis and PostgreSQL as services:"
|
||||
echo " Redis: sudo systemctl enable redis && sudo systemctl start redis"
|
||||
echo " PostgreSQL: sudo systemctl enable postgresql && sudo systemctl start postgresql"
|
693
templates/admin_users.html
Normal file
693
templates/admin_users.html
Normal file
@ -0,0 +1,693 @@
|
||||
<!DOCTYPE html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="UTF-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
||||
<title>User Management - Talk2Me Admin</title>
|
||||
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/css/bootstrap.min.css" rel="stylesheet">
|
||||
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.7.2/font/bootstrap-icons.css">
|
||||
<style>
|
||||
.user-avatar {
|
||||
width: 40px;
|
||||
height: 40px;
|
||||
border-radius: 50%;
|
||||
object-fit: cover;
|
||||
}
|
||||
.status-badge {
|
||||
font-size: 0.875rem;
|
||||
}
|
||||
.action-buttons .btn {
|
||||
padding: 0.25rem 0.5rem;
|
||||
font-size: 0.875rem;
|
||||
}
|
||||
.stats-card {
|
||||
border: none;
|
||||
border-radius: 10px;
|
||||
box-shadow: 0 2px 4px rgba(0,0,0,0.1);
|
||||
}
|
||||
.stats-card .card-body {
|
||||
padding: 1.5rem;
|
||||
}
|
||||
.stats-number {
|
||||
font-size: 2rem;
|
||||
font-weight: bold;
|
||||
color: #4a5568;
|
||||
}
|
||||
.table-responsive {
|
||||
border-radius: 8px;
|
||||
box-shadow: 0 1px 3px rgba(0,0,0,0.1);
|
||||
}
|
||||
.modal-header {
|
||||
background-color: #f8f9fa;
|
||||
}
|
||||
.search-filters {
|
||||
background-color: #f8f9fa;
|
||||
padding: 1rem;
|
||||
border-radius: 8px;
|
||||
margin-bottom: 1rem;
|
||||
}
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
<nav class="navbar navbar-expand-lg navbar-dark bg-dark">
|
||||
<div class="container-fluid">
|
||||
<a class="navbar-brand" href="/admin">Talk2Me Admin</a>
|
||||
<div class="navbar-nav ms-auto">
|
||||
<a class="nav-link" href="/" target="_blank">
|
||||
<i class="bi bi-box-arrow-up-right"></i> Main App
|
||||
</a>
|
||||
<a class="nav-link" href="#" onclick="logout()">
|
||||
<i class="bi bi-box-arrow-right"></i> Logout
|
||||
</a>
|
||||
</div>
|
||||
</div>
|
||||
</nav>
|
||||
|
||||
<div class="container-fluid mt-4">
|
||||
<div class="row">
|
||||
<div class="col-12">
|
||||
<h1 class="mb-4">User Management</h1>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Statistics Cards -->
|
||||
<div class="row mb-4">
|
||||
<div class="col-md-3">
|
||||
<div class="card stats-card">
|
||||
<div class="card-body">
|
||||
<h6 class="text-muted mb-2">Total Users</h6>
|
||||
<div class="stats-number" id="stat-total">0</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="col-md-3">
|
||||
<div class="card stats-card">
|
||||
<div class="card-body">
|
||||
<h6 class="text-muted mb-2">Active Users</h6>
|
||||
<div class="stats-number text-success" id="stat-active">0</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="col-md-3">
|
||||
<div class="card stats-card">
|
||||
<div class="card-body">
|
||||
<h6 class="text-muted mb-2">Suspended Users</h6>
|
||||
<div class="stats-number text-warning" id="stat-suspended">0</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="col-md-3">
|
||||
<div class="card stats-card">
|
||||
<div class="card-body">
|
||||
<h6 class="text-muted mb-2">Admin Users</h6>
|
||||
<div class="stats-number text-primary" id="stat-admins">0</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Search and Filters -->
|
||||
<div class="search-filters">
|
||||
<div class="row">
|
||||
<div class="col-md-4">
|
||||
<input type="text" class="form-control" id="searchInput" placeholder="Search by email, username, or name...">
|
||||
</div>
|
||||
<div class="col-md-2">
|
||||
<select class="form-select" id="roleFilter">
|
||||
<option value="">All Roles</option>
|
||||
<option value="admin">Admin</option>
|
||||
<option value="user">User</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="col-md-2">
|
||||
<select class="form-select" id="statusFilter">
|
||||
<option value="">All Status</option>
|
||||
<option value="active">Active</option>
|
||||
<option value="suspended">Suspended</option>
|
||||
<option value="inactive">Inactive</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="col-md-2">
|
||||
<select class="form-select" id="sortBy">
|
||||
<option value="created_at">Created Date</option>
|
||||
<option value="last_login_at">Last Login</option>
|
||||
<option value="total_requests">Total Requests</option>
|
||||
<option value="username">Username</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="col-md-2">
|
||||
<button class="btn btn-primary w-100" onclick="createUser()">
|
||||
<i class="bi bi-plus-circle"></i> Create User
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Users Table -->
|
||||
<div class="table-responsive">
|
||||
<table class="table table-hover">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>User</th>
|
||||
<th>Role</th>
|
||||
<th>Status</th>
|
||||
<th>Usage</th>
|
||||
<th>Last Login</th>
|
||||
<th>Created</th>
|
||||
<th>Actions</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="usersTableBody">
|
||||
<!-- Users will be loaded here -->
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<!-- Pagination -->
|
||||
<nav aria-label="Page navigation">
|
||||
<ul class="pagination justify-content-center" id="pagination">
|
||||
<!-- Pagination will be loaded here -->
|
||||
</ul>
|
||||
</nav>
|
||||
</div>
|
||||
|
||||
<!-- User Details Modal -->
|
||||
<div class="modal fade" id="userModal" tabindex="-1">
|
||||
<div class="modal-dialog modal-lg">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">User Details</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
|
||||
</div>
|
||||
<div class="modal-body" id="userModalBody">
|
||||
<!-- User details will be loaded here -->
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Create/Edit User Modal -->
|
||||
<div class="modal fade" id="createUserModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title" id="createUserModalTitle">Create User</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<form id="userForm">
|
||||
<div class="mb-3">
|
||||
<label for="userEmail" class="form-label">Email</label>
|
||||
<input type="email" class="form-control" id="userEmail" required>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="userUsername" class="form-label">Username</label>
|
||||
<input type="text" class="form-control" id="userUsername" required>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="userPassword" class="form-label">Password</label>
|
||||
<input type="password" class="form-control" id="userPassword" minlength="8">
|
||||
<small class="text-muted">Leave blank to keep existing password (edit mode)</small>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="userFullName" class="form-label">Full Name</label>
|
||||
<input type="text" class="form-control" id="userFullName">
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="userRole" class="form-label">Role</label>
|
||||
<select class="form-select" id="userRole">
|
||||
<option value="user">User</option>
|
||||
<option value="admin">Admin</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<div class="form-check">
|
||||
<input class="form-check-input" type="checkbox" id="userVerified">
|
||||
<label class="form-check-label" for="userVerified">
|
||||
Email Verified
|
||||
</label>
|
||||
</div>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="userRateLimit" class="form-label">Rate Limit (per minute)</label>
|
||||
<input type="number" class="form-control" id="userRateLimit" value="30">
|
||||
</div>
|
||||
</form>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
|
||||
<button type="button" class="btn btn-primary" onclick="saveUser()">Save User</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/js/bootstrap.bundle.min.js"></script>
|
||||
<script src="https://cdn.jsdelivr.net/npm/axios/dist/axios.min.js"></script>
|
||||
<script>
|
||||
let currentPage = 1;
|
||||
let editingUserId = null;
|
||||
|
||||
// Configure axios defaults for session-based auth
|
||||
axios.defaults.withCredentials = true;
|
||||
|
||||
// Also check if there's a JWT token (for API users)
|
||||
let authToken = localStorage.getItem('auth_token');
|
||||
if (authToken) {
|
||||
axios.defaults.headers.common['Authorization'] = `Bearer ${authToken}`;
|
||||
}
|
||||
|
||||
// For admin token auth, add the admin token header
|
||||
const adminToken = '{{ session.get("admin_token", "") }}';
|
||||
if (adminToken) {
|
||||
axios.defaults.headers.common['X-Admin-Token'] = adminToken;
|
||||
}
|
||||
|
||||
// Load users on page load
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
loadStats();
|
||||
loadUsers();
|
||||
|
||||
// Add event listeners
|
||||
document.getElementById('searchInput').addEventListener('input', debounce(loadUsers, 300));
|
||||
document.getElementById('roleFilter').addEventListener('change', loadUsers);
|
||||
document.getElementById('statusFilter').addEventListener('change', loadUsers);
|
||||
document.getElementById('sortBy').addEventListener('change', loadUsers);
|
||||
});
|
||||
|
||||
function debounce(func, wait) {
|
||||
let timeout;
|
||||
return function executedFunction(...args) {
|
||||
const later = () => {
|
||||
clearTimeout(timeout);
|
||||
func(...args);
|
||||
};
|
||||
clearTimeout(timeout);
|
||||
timeout = setTimeout(later, wait);
|
||||
};
|
||||
}
|
||||
|
||||
async function loadStats() {
|
||||
try {
|
||||
const response = await axios.get('/api/auth/admin/stats/users');
|
||||
const stats = response.data.stats;
|
||||
|
||||
document.getElementById('stat-total').textContent = stats.total_users;
|
||||
document.getElementById('stat-active').textContent = stats.active_users;
|
||||
document.getElementById('stat-suspended').textContent = stats.suspended_users;
|
||||
document.getElementById('stat-admins').textContent = stats.admin_users;
|
||||
} catch (error) {
|
||||
console.error('Failed to load stats:', error);
|
||||
}
|
||||
}
|
||||
|
||||
async function loadUsers(page = 1) {
|
||||
try {
|
||||
currentPage = page;
|
||||
const params = {
|
||||
page: page,
|
||||
per_page: 20,
|
||||
search: document.getElementById('searchInput').value,
|
||||
role: document.getElementById('roleFilter').value,
|
||||
status: document.getElementById('statusFilter').value,
|
||||
sort_by: document.getElementById('sortBy').value,
|
||||
sort_order: 'desc'
|
||||
};
|
||||
|
||||
const response = await axios.get('/api/auth/admin/users', { params });
|
||||
const data = response.data;
|
||||
|
||||
displayUsers(data.users);
|
||||
displayPagination(data.pagination);
|
||||
} catch (error) {
|
||||
console.error('Failed to load users:', error);
|
||||
showAlert('Failed to load users', 'danger');
|
||||
}
|
||||
}
|
||||
|
||||
function displayUsers(users) {
|
||||
const tbody = document.getElementById('usersTableBody');
|
||||
tbody.innerHTML = '';
|
||||
|
||||
users.forEach(user => {
|
||||
const tr = document.createElement('tr');
|
||||
tr.innerHTML = `
|
||||
<td>
|
||||
<div class="d-flex align-items-center">
|
||||
${user.avatar_url ?
|
||||
`<img src="${user.avatar_url}" class="user-avatar me-2" alt="${user.username}">` :
|
||||
`<div class="user-avatar me-2 bg-secondary d-flex align-items-center justify-content-center text-white">
|
||||
${user.username.charAt(0).toUpperCase()}
|
||||
</div>`
|
||||
}
|
||||
<div>
|
||||
<div class="fw-bold">${user.username}</div>
|
||||
<small class="text-muted">${user.email}</small>
|
||||
</div>
|
||||
</div>
|
||||
</td>
|
||||
<td>
|
||||
<span class="badge ${user.role === 'admin' ? 'bg-primary' : 'bg-secondary'}">
|
||||
${user.role}
|
||||
</span>
|
||||
</td>
|
||||
<td>
|
||||
${getStatusBadge(user)}
|
||||
</td>
|
||||
<td>
|
||||
<small>
|
||||
<i class="bi bi-translate"></i> ${user.total_translations}<br>
|
||||
<i class="bi bi-mic"></i> ${user.total_transcriptions}<br>
|
||||
<i class="bi bi-volume-up"></i> ${user.total_tts_requests}
|
||||
</small>
|
||||
</td>
|
||||
<td>
|
||||
<small>${user.last_login_at ? new Date(user.last_login_at).toLocaleString() : 'Never'}</small>
|
||||
</td>
|
||||
<td>
|
||||
<small>${new Date(user.created_at).toLocaleDateString()}</small>
|
||||
</td>
|
||||
<td class="action-buttons">
|
||||
<button class="btn btn-sm btn-info" onclick="viewUser('${user.id}')" title="View Details">
|
||||
<i class="bi bi-eye"></i>
|
||||
</button>
|
||||
<button class="btn btn-sm btn-warning" onclick="editUser('${user.id}')" title="Edit">
|
||||
<i class="bi bi-pencil"></i>
|
||||
</button>
|
||||
${user.is_suspended ?
|
||||
`<button class="btn btn-sm btn-success" onclick="unsuspendUser('${user.id}')" title="Unsuspend">
|
||||
<i class="bi bi-play-circle"></i>
|
||||
</button>` :
|
||||
`<button class="btn btn-sm btn-warning" onclick="suspendUser('${user.id}')" title="Suspend">
|
||||
<i class="bi bi-pause-circle"></i>
|
||||
</button>`
|
||||
}
|
||||
${user.role !== 'admin' ?
|
||||
`<button class="btn btn-sm btn-danger" onclick="deleteUser('${user.id}')" title="Delete">
|
||||
<i class="bi bi-trash"></i>
|
||||
</button>` : ''
|
||||
}
|
||||
</td>
|
||||
`;
|
||||
tbody.appendChild(tr);
|
||||
});
|
||||
}
|
||||
|
||||
function getStatusBadge(user) {
|
||||
if (user.is_suspended) {
|
||||
return '<span class="badge bg-warning status-badge">Suspended</span>';
|
||||
} else if (!user.is_active) {
|
||||
return '<span class="badge bg-secondary status-badge">Inactive</span>';
|
||||
} else if (!user.is_verified) {
|
||||
return '<span class="badge bg-info status-badge">Unverified</span>';
|
||||
} else {
|
||||
return '<span class="badge bg-success status-badge">Active</span>';
|
||||
}
|
||||
}
|
||||
|
||||
function displayPagination(pagination) {
|
||||
const nav = document.getElementById('pagination');
|
||||
nav.innerHTML = '';
|
||||
|
||||
const totalPages = pagination.pages;
|
||||
const currentPage = pagination.page;
|
||||
|
||||
// Previous button
|
||||
const prevLi = document.createElement('li');
|
||||
prevLi.className = `page-item ${currentPage === 1 ? 'disabled' : ''}`;
|
||||
prevLi.innerHTML = `<a class="page-link" href="#" onclick="loadUsers(${currentPage - 1})">Previous</a>`;
|
||||
nav.appendChild(prevLi);
|
||||
|
||||
// Page numbers
|
||||
for (let i = 1; i <= totalPages; i++) {
|
||||
if (i === 1 || i === totalPages || (i >= currentPage - 2 && i <= currentPage + 2)) {
|
||||
const li = document.createElement('li');
|
||||
li.className = `page-item ${i === currentPage ? 'active' : ''}`;
|
||||
li.innerHTML = `<a class="page-link" href="#" onclick="loadUsers(${i})">${i}</a>`;
|
||||
nav.appendChild(li);
|
||||
} else if (i === currentPage - 3 || i === currentPage + 3) {
|
||||
const li = document.createElement('li');
|
||||
li.className = 'page-item disabled';
|
||||
li.innerHTML = '<span class="page-link">...</span>';
|
||||
nav.appendChild(li);
|
||||
}
|
||||
}
|
||||
|
||||
// Next button
|
||||
const nextLi = document.createElement('li');
|
||||
nextLi.className = `page-item ${currentPage === totalPages ? 'disabled' : ''}`;
|
||||
nextLi.innerHTML = `<a class="page-link" href="#" onclick="loadUsers(${currentPage + 1})">Next</a>`;
|
||||
nav.appendChild(nextLi);
|
||||
}
|
||||
|
||||
async function viewUser(userId) {
|
||||
try {
|
||||
const response = await axios.get(`/api/auth/admin/users/${userId}`);
|
||||
const data = response.data;
|
||||
|
||||
const modalBody = document.getElementById('userModalBody');
|
||||
modalBody.innerHTML = `
|
||||
<div class="row">
|
||||
<div class="col-md-6">
|
||||
<h6>User Information</h6>
|
||||
<dl class="row">
|
||||
<dt class="col-sm-4">Username:</dt>
|
||||
<dd class="col-sm-8">${data.user.username}</dd>
|
||||
<dt class="col-sm-4">Email:</dt>
|
||||
<dd class="col-sm-8">${data.user.email}</dd>
|
||||
<dt class="col-sm-4">Full Name:</dt>
|
||||
<dd class="col-sm-8">${data.user.full_name || 'N/A'}</dd>
|
||||
<dt class="col-sm-4">Role:</dt>
|
||||
<dd class="col-sm-8">${data.user.role}</dd>
|
||||
<dt class="col-sm-4">Status:</dt>
|
||||
<dd class="col-sm-8">${getStatusBadge(data.user)}</dd>
|
||||
<dt class="col-sm-4">API Key:</dt>
|
||||
<dd class="col-sm-8">
|
||||
<code>${data.user.api_key}</code>
|
||||
<button class="btn btn-sm btn-secondary ms-2" onclick="copyToClipboard('${data.user.api_key}')">
|
||||
<i class="bi bi-clipboard"></i>
|
||||
</button>
|
||||
</dd>
|
||||
</dl>
|
||||
</div>
|
||||
<div class="col-md-6">
|
||||
<h6>Usage Statistics</h6>
|
||||
<dl class="row">
|
||||
<dt class="col-sm-6">Total Requests:</dt>
|
||||
<dd class="col-sm-6">${data.user.total_requests}</dd>
|
||||
<dt class="col-sm-6">Translations:</dt>
|
||||
<dd class="col-sm-6">${data.user.total_translations}</dd>
|
||||
<dt class="col-sm-6">Transcriptions:</dt>
|
||||
<dd class="col-sm-6">${data.user.total_transcriptions}</dd>
|
||||
<dt class="col-sm-6">TTS Requests:</dt>
|
||||
<dd class="col-sm-6">${data.user.total_tts_requests}</dd>
|
||||
<dt class="col-sm-6">Rate Limits:</dt>
|
||||
<dd class="col-sm-6">
|
||||
${data.user.rate_limit_per_minute}/min<br>
|
||||
${data.user.rate_limit_per_hour}/hour<br>
|
||||
${data.user.rate_limit_per_day}/day
|
||||
</dd>
|
||||
</dl>
|
||||
</div>
|
||||
</div>
|
||||
<hr>
|
||||
<h6>Login History</h6>
|
||||
<div class="table-responsive">
|
||||
<table class="table table-sm">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Date</th>
|
||||
<th>IP Address</th>
|
||||
<th>Method</th>
|
||||
<th>Status</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody>
|
||||
${data.login_history.map(login => `
|
||||
<tr>
|
||||
<td>${new Date(login.login_at).toLocaleString()}</td>
|
||||
<td>${login.ip_address}</td>
|
||||
<td>${login.login_method}</td>
|
||||
<td>${login.success ?
|
||||
'<span class="badge bg-success">Success</span>' :
|
||||
`<span class="badge bg-danger">Failed</span>`
|
||||
}</td>
|
||||
</tr>
|
||||
`).join('')}
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
<hr>
|
||||
<h6>Active Sessions (${data.active_sessions.length})</h6>
|
||||
<div class="table-responsive">
|
||||
<table class="table table-sm">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Session ID</th>
|
||||
<th>Created</th>
|
||||
<th>Last Active</th>
|
||||
<th>IP Address</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody>
|
||||
${data.active_sessions.map(session => `
|
||||
<tr>
|
||||
<td><code>${session.session_id.substr(0, 8)}...</code></td>
|
||||
<td>${new Date(session.created_at).toLocaleString()}</td>
|
||||
<td>${new Date(session.last_active_at).toLocaleString()}</td>
|
||||
<td>${session.ip_address}</td>
|
||||
</tr>
|
||||
`).join('')}
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
`;
|
||||
|
||||
const modal = new bootstrap.Modal(document.getElementById('userModal'));
|
||||
modal.show();
|
||||
} catch (error) {
|
||||
console.error('Failed to load user details:', error);
|
||||
showAlert('Failed to load user details', 'danger');
|
||||
}
|
||||
}
|
||||
|
||||
function createUser() {
|
||||
editingUserId = null;
|
||||
document.getElementById('createUserModalTitle').textContent = 'Create User';
|
||||
document.getElementById('userForm').reset();
|
||||
document.getElementById('userPassword').required = true;
|
||||
const modal = new bootstrap.Modal(document.getElementById('createUserModal'));
|
||||
modal.show();
|
||||
}
|
||||
|
||||
async function editUser(userId) {
|
||||
try {
|
||||
const response = await axios.get(`/api/auth/admin/users/${userId}`);
|
||||
const user = response.data.user;
|
||||
|
||||
editingUserId = userId;
|
||||
document.getElementById('createUserModalTitle').textContent = 'Edit User';
|
||||
document.getElementById('userEmail').value = user.email;
|
||||
document.getElementById('userUsername').value = user.username;
|
||||
document.getElementById('userPassword').value = '';
|
||||
document.getElementById('userPassword').required = false;
|
||||
document.getElementById('userFullName').value = user.full_name || '';
|
||||
document.getElementById('userRole').value = user.role;
|
||||
document.getElementById('userVerified').checked = user.is_verified;
|
||||
document.getElementById('userRateLimit').value = user.rate_limit_per_minute;
|
||||
|
||||
const modal = new bootstrap.Modal(document.getElementById('createUserModal'));
|
||||
modal.show();
|
||||
} catch (error) {
|
||||
console.error('Failed to load user for editing:', error);
|
||||
showAlert('Failed to load user', 'danger');
|
||||
}
|
||||
}
|
||||
|
||||
async function saveUser() {
|
||||
try {
|
||||
const data = {
|
||||
email: document.getElementById('userEmail').value,
|
||||
username: document.getElementById('userUsername').value,
|
||||
full_name: document.getElementById('userFullName').value,
|
||||
role: document.getElementById('userRole').value,
|
||||
is_verified: document.getElementById('userVerified').checked,
|
||||
rate_limit_per_minute: parseInt(document.getElementById('userRateLimit').value)
|
||||
};
|
||||
|
||||
if (editingUserId) {
|
||||
// Update existing user
|
||||
if (document.getElementById('userPassword').value) {
|
||||
data.password = document.getElementById('userPassword').value;
|
||||
}
|
||||
await axios.put(`/api/auth/admin/users/${editingUserId}`, data);
|
||||
showAlert('User updated successfully', 'success');
|
||||
} else {
|
||||
// Create new user
|
||||
data.password = document.getElementById('userPassword').value;
|
||||
await axios.post('/api/auth/admin/users', data);
|
||||
showAlert('User created successfully', 'success');
|
||||
}
|
||||
|
||||
bootstrap.Modal.getInstance(document.getElementById('createUserModal')).hide();
|
||||
loadUsers(currentPage);
|
||||
loadStats();
|
||||
} catch (error) {
|
||||
console.error('Failed to save user:', error);
|
||||
showAlert(error.response?.data?.error || 'Failed to save user', 'danger');
|
||||
}
|
||||
}
|
||||
|
||||
async function suspendUser(userId) {
|
||||
if (!confirm('Are you sure you want to suspend this user?')) return;
|
||||
|
||||
try {
|
||||
const reason = prompt('Enter suspension reason:');
|
||||
if (!reason) return;
|
||||
|
||||
await axios.post(`/api/auth/admin/users/${userId}/suspend`, { reason });
|
||||
showAlert('User suspended successfully', 'success');
|
||||
loadUsers(currentPage);
|
||||
loadStats();
|
||||
} catch (error) {
|
||||
console.error('Failed to suspend user:', error);
|
||||
showAlert('Failed to suspend user', 'danger');
|
||||
}
|
||||
}
|
||||
|
||||
async function unsuspendUser(userId) {
|
||||
if (!confirm('Are you sure you want to unsuspend this user?')) return;
|
||||
|
||||
try {
|
||||
await axios.post(`/api/auth/admin/users/${userId}/unsuspend`);
|
||||
showAlert('User unsuspended successfully', 'success');
|
||||
loadUsers(currentPage);
|
||||
loadStats();
|
||||
} catch (error) {
|
||||
console.error('Failed to unsuspend user:', error);
|
||||
showAlert('Failed to unsuspend user', 'danger');
|
||||
}
|
||||
}
|
||||
|
||||
async function deleteUser(userId) {
|
||||
if (!confirm('Are you sure you want to delete this user? This action cannot be undone.')) return;
|
||||
|
||||
try {
|
||||
await axios.delete(`/api/auth/admin/users/${userId}`);
|
||||
showAlert('User deleted successfully', 'success');
|
||||
loadUsers(currentPage);
|
||||
loadStats();
|
||||
} catch (error) {
|
||||
console.error('Failed to delete user:', error);
|
||||
showAlert('Failed to delete user', 'danger');
|
||||
}
|
||||
}
|
||||
|
||||
function copyToClipboard(text) {
|
||||
navigator.clipboard.writeText(text).then(() => {
|
||||
showAlert('Copied to clipboard', 'success');
|
||||
});
|
||||
}
|
||||
|
||||
function showAlert(message, type) {
|
||||
const alertDiv = document.createElement('div');
|
||||
alertDiv.className = `alert alert-${type} alert-dismissible fade show position-fixed top-0 start-50 translate-middle-x mt-3`;
|
||||
alertDiv.style.zIndex = '9999';
|
||||
alertDiv.innerHTML = `
|
||||
${message}
|
||||
<button type="button" class="btn-close" data-bs-dismiss="alert"></button>
|
||||
`;
|
||||
document.body.appendChild(alertDiv);
|
||||
|
||||
setTimeout(() => {
|
||||
alertDiv.remove();
|
||||
}, 5000);
|
||||
}
|
||||
|
||||
function logout() {
|
||||
localStorage.removeItem('auth_token');
|
||||
window.location.href = '/login';
|
||||
}
|
||||
</script>
|
||||
</body>
|
||||
</html>
|
287
templates/login.html
Normal file
287
templates/login.html
Normal file
@ -0,0 +1,287 @@
|
||||
<!DOCTYPE html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="UTF-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
||||
<title>Login - Talk2Me</title>
|
||||
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/css/bootstrap.min.css" rel="stylesheet">
|
||||
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.7.2/font/bootstrap-icons.css">
|
||||
<style>
|
||||
body {
|
||||
background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
|
||||
height: 100vh;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
}
|
||||
.login-container {
|
||||
background: white;
|
||||
padding: 2rem;
|
||||
border-radius: 10px;
|
||||
box-shadow: 0 10px 30px rgba(0, 0, 0, 0.1);
|
||||
width: 100%;
|
||||
max-width: 400px;
|
||||
}
|
||||
.login-header {
|
||||
text-align: center;
|
||||
margin-bottom: 2rem;
|
||||
}
|
||||
.login-header h1 {
|
||||
color: #333;
|
||||
font-size: 2rem;
|
||||
margin-bottom: 0.5rem;
|
||||
}
|
||||
.login-header p {
|
||||
color: #666;
|
||||
margin: 0;
|
||||
}
|
||||
.form-control:focus {
|
||||
border-color: #667eea;
|
||||
box-shadow: 0 0 0 0.2rem rgba(102, 126, 234, 0.25);
|
||||
}
|
||||
.btn-primary {
|
||||
background-color: #667eea;
|
||||
border-color: #667eea;
|
||||
width: 100%;
|
||||
padding: 0.75rem;
|
||||
font-weight: 500;
|
||||
margin-top: 1rem;
|
||||
}
|
||||
.btn-primary:hover {
|
||||
background-color: #5a67d8;
|
||||
border-color: #5a67d8;
|
||||
}
|
||||
.alert {
|
||||
font-size: 0.875rem;
|
||||
}
|
||||
.divider {
|
||||
text-align: center;
|
||||
margin: 1.5rem 0;
|
||||
position: relative;
|
||||
}
|
||||
.divider::before {
|
||||
content: '';
|
||||
position: absolute;
|
||||
top: 50%;
|
||||
left: 0;
|
||||
right: 0;
|
||||
height: 1px;
|
||||
background: #e0e0e0;
|
||||
}
|
||||
.divider span {
|
||||
background: white;
|
||||
padding: 0 1rem;
|
||||
position: relative;
|
||||
color: #666;
|
||||
font-size: 0.875rem;
|
||||
}
|
||||
.api-key-section {
|
||||
margin-top: 1.5rem;
|
||||
padding-top: 1.5rem;
|
||||
border-top: 1px solid #e0e0e0;
|
||||
}
|
||||
.api-key-display {
|
||||
background: #f8f9fa;
|
||||
padding: 0.75rem;
|
||||
border-radius: 5px;
|
||||
font-family: monospace;
|
||||
font-size: 0.875rem;
|
||||
word-break: break-all;
|
||||
margin-top: 0.5rem;
|
||||
}
|
||||
.back-link {
|
||||
text-align: center;
|
||||
margin-top: 1rem;
|
||||
}
|
||||
.loading {
|
||||
display: none;
|
||||
}
|
||||
.loading.show {
|
||||
display: inline-block;
|
||||
}
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
<div class="login-container">
|
||||
<div class="login-header">
|
||||
<h1><i class="bi bi-translate"></i> Talk2Me</h1>
|
||||
<p>Voice Translation Made Simple</p>
|
||||
</div>
|
||||
|
||||
<div id="alertContainer">
|
||||
{% if error %}
|
||||
<div class="alert alert-danger alert-dismissible fade show" role="alert">
|
||||
{{ error }}
|
||||
<button type="button" class="btn-close" data-bs-dismiss="alert"></button>
|
||||
</div>
|
||||
{% endif %}
|
||||
</div>
|
||||
|
||||
<form id="loginForm" method="POST" action="{{ url_for('login', next=request.args.get('next')) }}">
|
||||
<div class="mb-3">
|
||||
<label for="username" class="form-label">Email or Username</label>
|
||||
<div class="input-group">
|
||||
<span class="input-group-text"><i class="bi bi-person"></i></span>
|
||||
<input type="text" class="form-control" id="username" name="username" required autofocus>
|
||||
</div>
|
||||
</div>
|
||||
<div class="mb-3">
|
||||
<label for="password" class="form-label">Password</label>
|
||||
<div class="input-group">
|
||||
<span class="input-group-text"><i class="bi bi-lock"></i></span>
|
||||
<input type="password" class="form-control" id="password" name="password" required>
|
||||
</div>
|
||||
</div>
|
||||
<button type="submit" class="btn btn-primary">
|
||||
<span class="loading spinner-border spinner-border-sm me-2" role="status"></span>
|
||||
<span class="btn-text">Sign In</span>
|
||||
</button>
|
||||
</form>
|
||||
|
||||
<div class="divider">
|
||||
<span>OR</span>
|
||||
</div>
|
||||
|
||||
<div class="text-center">
|
||||
<p class="mb-2">Use the app without signing in</p>
|
||||
<a href="/" class="btn btn-outline-secondary w-100">
|
||||
<i class="bi bi-arrow-right-circle"></i> Continue as Guest
|
||||
</a>
|
||||
</div>
|
||||
|
||||
<div class="api-key-section" id="apiKeySection" style="display: none;">
|
||||
<h6 class="mb-2">Your API Key</h6>
|
||||
<p class="text-muted small">Use this key to authenticate API requests:</p>
|
||||
<div class="api-key-display" id="apiKeyDisplay">
|
||||
<span id="apiKey"></span>
|
||||
<button class="btn btn-sm btn-outline-secondary float-end" onclick="copyApiKey()">
|
||||
<i class="bi bi-clipboard"></i> Copy
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="back-link">
|
||||
<a href="/" class="text-muted small">
|
||||
<i class="bi bi-arrow-left"></i> Back to App
|
||||
</a>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/js/bootstrap.bundle.min.js"></script>
|
||||
<script src="https://cdn.jsdelivr.net/npm/axios/dist/axios.min.js"></script>
|
||||
<script>
|
||||
// Check if already logged in
|
||||
const authToken = localStorage.getItem('auth_token');
|
||||
if (authToken) {
|
||||
// Verify token is still valid
|
||||
axios.defaults.headers.common['Authorization'] = `Bearer ${authToken}`;
|
||||
axios.get('/api/auth/profile').then(response => {
|
||||
// Token is valid, redirect to app
|
||||
window.location.href = '/';
|
||||
}).catch(() => {
|
||||
// Token invalid, clear it
|
||||
localStorage.removeItem('auth_token');
|
||||
delete axios.defaults.headers.common['Authorization'];
|
||||
});
|
||||
}
|
||||
|
||||
document.getElementById('loginForm').addEventListener('submit', async (e) => {
|
||||
e.preventDefault();
|
||||
|
||||
const username = document.getElementById('username').value;
|
||||
const password = document.getElementById('password').value;
|
||||
const loadingSpinner = document.querySelector('.loading');
|
||||
const btnText = document.querySelector('.btn-text');
|
||||
const submitBtn = e.target.querySelector('button[type="submit"]');
|
||||
|
||||
// Show loading state
|
||||
loadingSpinner.classList.add('show');
|
||||
btnText.textContent = 'Signing in...';
|
||||
submitBtn.disabled = true;
|
||||
|
||||
try {
|
||||
const response = await axios.post('/api/auth/login', {
|
||||
username: username,
|
||||
password: password
|
||||
});
|
||||
|
||||
if (response.data.success) {
|
||||
// Store token
|
||||
const token = response.data.tokens.access_token;
|
||||
localStorage.setItem('auth_token', token);
|
||||
localStorage.setItem('refresh_token', response.data.tokens.refresh_token);
|
||||
localStorage.setItem('user_id', response.data.user.id);
|
||||
localStorage.setItem('username', response.data.user.username);
|
||||
localStorage.setItem('user_role', response.data.user.role);
|
||||
|
||||
// Show API key
|
||||
document.getElementById('apiKey').textContent = response.data.user.api_key;
|
||||
document.getElementById('apiKeySection').style.display = 'block';
|
||||
|
||||
showAlert('Login successful! Redirecting...', 'success');
|
||||
|
||||
// Redirect based on role or next parameter
|
||||
setTimeout(() => {
|
||||
const urlParams = new URLSearchParams(window.location.search);
|
||||
const nextUrl = urlParams.get('next');
|
||||
|
||||
if (nextUrl) {
|
||||
window.location.href = nextUrl;
|
||||
} else if (response.data.user.role === 'admin') {
|
||||
window.location.href = '/admin';
|
||||
} else {
|
||||
window.location.href = '/';
|
||||
}
|
||||
}, 1500);
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Login error:', error);
|
||||
const errorMessage = error.response?.data?.error || 'Login failed. Please try again.';
|
||||
showAlert(errorMessage, 'danger');
|
||||
|
||||
// Reset button state
|
||||
loadingSpinner.classList.remove('show');
|
||||
btnText.textContent = 'Sign In';
|
||||
submitBtn.disabled = false;
|
||||
|
||||
// Clear password field on error
|
||||
document.getElementById('password').value = '';
|
||||
document.getElementById('password').focus();
|
||||
}
|
||||
});
|
||||
|
||||
function showAlert(message, type) {
|
||||
const alertContainer = document.getElementById('alertContainer');
|
||||
const alert = document.createElement('div');
|
||||
alert.className = `alert alert-${type} alert-dismissible fade show`;
|
||||
alert.innerHTML = `
|
||||
${message}
|
||||
<button type="button" class="btn-close" data-bs-dismiss="alert"></button>
|
||||
`;
|
||||
alertContainer.innerHTML = '';
|
||||
alertContainer.appendChild(alert);
|
||||
|
||||
// Auto-dismiss after 5 seconds
|
||||
setTimeout(() => {
|
||||
alert.remove();
|
||||
}, 5000);
|
||||
}
|
||||
|
||||
function copyApiKey() {
|
||||
const apiKey = document.getElementById('apiKey').textContent;
|
||||
navigator.clipboard.writeText(apiKey).then(() => {
|
||||
showAlert('API key copied to clipboard!', 'success');
|
||||
}).catch(() => {
|
||||
showAlert('Failed to copy API key', 'danger');
|
||||
});
|
||||
}
|
||||
|
||||
// Handle Enter key in form fields
|
||||
document.getElementById('username').addEventListener('keypress', (e) => {
|
||||
if (e.key === 'Enter') {
|
||||
document.getElementById('password').focus();
|
||||
}
|
||||
});
|
||||
</script>
|
||||
</body>
|
||||
</html>
|
352
user_rate_limiter.py
Normal file
352
user_rate_limiter.py
Normal file
@ -0,0 +1,352 @@
|
||||
"""User-specific rate limiting that integrates with authentication"""
|
||||
import time
|
||||
import logging
|
||||
from functools import wraps
|
||||
from flask import request, jsonify, g
|
||||
from collections import defaultdict, deque
|
||||
from threading import Lock
|
||||
from datetime import datetime, timedelta
|
||||
|
||||
from auth import get_current_user
|
||||
from auth_models import User
|
||||
from database import db
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class UserRateLimiter:
|
||||
"""Enhanced rate limiter with user-specific limits"""
|
||||
|
||||
def __init__(self, default_limiter):
|
||||
self.default_limiter = default_limiter
|
||||
self.user_buckets = defaultdict(lambda: {
|
||||
'tokens': 0,
|
||||
'last_update': time.time(),
|
||||
'requests': deque(maxlen=1000)
|
||||
})
|
||||
self.lock = Lock()
|
||||
|
||||
def get_user_limits(self, user: User, endpoint: str):
|
||||
"""Get rate limits for a specific user"""
|
||||
# Start with endpoint-specific or default limits
|
||||
base_limits = self.default_limiter.get_limits(endpoint)
|
||||
|
||||
if not user:
|
||||
return base_limits
|
||||
|
||||
# Override with user-specific limits
|
||||
user_limits = {
|
||||
'requests_per_minute': user.rate_limit_per_minute,
|
||||
'requests_per_hour': user.rate_limit_per_hour,
|
||||
'requests_per_day': user.rate_limit_per_day,
|
||||
'burst_size': base_limits.get('burst_size', 10),
|
||||
'token_refresh_rate': user.rate_limit_per_minute / 60.0 # Convert to per-second
|
||||
}
|
||||
|
||||
# Admin users get higher limits
|
||||
if user.is_admin:
|
||||
user_limits['requests_per_minute'] *= 10
|
||||
user_limits['requests_per_hour'] *= 10
|
||||
user_limits['requests_per_day'] *= 10
|
||||
user_limits['burst_size'] *= 5
|
||||
|
||||
return user_limits
|
||||
|
||||
def check_user_rate_limit(self, user: User, endpoint: str, request_size: int = 0):
|
||||
"""Check rate limit for authenticated user"""
|
||||
if not user:
|
||||
# Fall back to IP-based limiting
|
||||
client_id = self.default_limiter.get_client_id(request)
|
||||
return self.default_limiter.check_rate_limit(client_id, endpoint, request_size)
|
||||
|
||||
with self.lock:
|
||||
user_id = str(user.id)
|
||||
limits = self.get_user_limits(user, endpoint)
|
||||
|
||||
# Get or create bucket for user
|
||||
bucket = self.user_buckets[user_id]
|
||||
now = time.time()
|
||||
|
||||
# Update tokens based on time passed
|
||||
time_passed = now - bucket['last_update']
|
||||
new_tokens = time_passed * limits['token_refresh_rate']
|
||||
bucket['tokens'] = min(
|
||||
limits['burst_size'],
|
||||
bucket['tokens'] + new_tokens
|
||||
)
|
||||
bucket['last_update'] = now
|
||||
|
||||
# Clean old requests from sliding windows
|
||||
minute_ago = now - 60
|
||||
hour_ago = now - 3600
|
||||
day_ago = now - 86400
|
||||
|
||||
bucket['requests'] = deque(
|
||||
(t for t in bucket['requests'] if t > day_ago),
|
||||
maxlen=1000
|
||||
)
|
||||
|
||||
# Count requests in windows
|
||||
requests_last_minute = sum(1 for t in bucket['requests'] if t > minute_ago)
|
||||
requests_last_hour = sum(1 for t in bucket['requests'] if t > hour_ago)
|
||||
requests_last_day = len(bucket['requests'])
|
||||
|
||||
# Check limits
|
||||
if requests_last_minute >= limits['requests_per_minute']:
|
||||
return False, "Rate limit exceeded (per minute)", {
|
||||
'retry_after': 60,
|
||||
'limit': limits['requests_per_minute'],
|
||||
'remaining': 0,
|
||||
'reset': int(minute_ago + 60),
|
||||
'scope': 'user'
|
||||
}
|
||||
|
||||
if requests_last_hour >= limits['requests_per_hour']:
|
||||
return False, "Rate limit exceeded (per hour)", {
|
||||
'retry_after': 3600,
|
||||
'limit': limits['requests_per_hour'],
|
||||
'remaining': 0,
|
||||
'reset': int(hour_ago + 3600),
|
||||
'scope': 'user'
|
||||
}
|
||||
|
||||
if requests_last_day >= limits['requests_per_day']:
|
||||
return False, "Rate limit exceeded (per day)", {
|
||||
'retry_after': 86400,
|
||||
'limit': limits['requests_per_day'],
|
||||
'remaining': 0,
|
||||
'reset': int(day_ago + 86400),
|
||||
'scope': 'user'
|
||||
}
|
||||
|
||||
# Check token bucket
|
||||
if bucket['tokens'] < 1:
|
||||
retry_after = int(1 / limits['token_refresh_rate'])
|
||||
return False, "Rate limit exceeded (burst)", {
|
||||
'retry_after': retry_after,
|
||||
'limit': limits['burst_size'],
|
||||
'remaining': 0,
|
||||
'reset': int(now + retry_after),
|
||||
'scope': 'user'
|
||||
}
|
||||
|
||||
# Request allowed
|
||||
bucket['tokens'] -= 1
|
||||
bucket['requests'].append(now)
|
||||
|
||||
# Calculate remaining
|
||||
remaining_minute = limits['requests_per_minute'] - requests_last_minute - 1
|
||||
remaining_hour = limits['requests_per_hour'] - requests_last_hour - 1
|
||||
remaining_day = limits['requests_per_day'] - requests_last_day - 1
|
||||
|
||||
return True, None, {
|
||||
'limit_minute': limits['requests_per_minute'],
|
||||
'limit_hour': limits['requests_per_hour'],
|
||||
'limit_day': limits['requests_per_day'],
|
||||
'remaining_minute': remaining_minute,
|
||||
'remaining_hour': remaining_hour,
|
||||
'remaining_day': remaining_day,
|
||||
'reset': int(minute_ago + 60),
|
||||
'scope': 'user'
|
||||
}
|
||||
|
||||
def get_user_usage_stats(self, user: User):
|
||||
"""Get usage statistics for a user"""
|
||||
if not user:
|
||||
return None
|
||||
|
||||
with self.lock:
|
||||
user_id = str(user.id)
|
||||
if user_id not in self.user_buckets:
|
||||
return {
|
||||
'requests_last_minute': 0,
|
||||
'requests_last_hour': 0,
|
||||
'requests_last_day': 0,
|
||||
'tokens_available': 0
|
||||
}
|
||||
|
||||
bucket = self.user_buckets[user_id]
|
||||
now = time.time()
|
||||
minute_ago = now - 60
|
||||
hour_ago = now - 3600
|
||||
day_ago = now - 86400
|
||||
|
||||
requests_last_minute = sum(1 for t in bucket['requests'] if t > minute_ago)
|
||||
requests_last_hour = sum(1 for t in bucket['requests'] if t > hour_ago)
|
||||
requests_last_day = sum(1 for t in bucket['requests'] if t > day_ago)
|
||||
|
||||
return {
|
||||
'requests_last_minute': requests_last_minute,
|
||||
'requests_last_hour': requests_last_hour,
|
||||
'requests_last_day': requests_last_day,
|
||||
'tokens_available': bucket['tokens'],
|
||||
'last_request': bucket['last_update']
|
||||
}
|
||||
|
||||
def reset_user_limits(self, user: User):
|
||||
"""Reset rate limits for a user (admin action)"""
|
||||
if not user:
|
||||
return False
|
||||
|
||||
with self.lock:
|
||||
user_id = str(user.id)
|
||||
if user_id in self.user_buckets:
|
||||
del self.user_buckets[user_id]
|
||||
return True
|
||||
return False
|
||||
|
||||
|
||||
# Global user rate limiter instance
|
||||
from rate_limiter import rate_limiter as default_rate_limiter
|
||||
user_rate_limiter = UserRateLimiter(default_rate_limiter)
|
||||
|
||||
|
||||
def user_aware_rate_limit(endpoint=None,
|
||||
requests_per_minute=None,
|
||||
requests_per_hour=None,
|
||||
requests_per_day=None,
|
||||
burst_size=None,
|
||||
check_size=False,
|
||||
require_auth=False):
|
||||
"""
|
||||
Enhanced rate limiting decorator that considers user authentication
|
||||
|
||||
Usage:
|
||||
@app.route('/api/endpoint')
|
||||
@user_aware_rate_limit(requests_per_minute=10)
|
||||
def endpoint():
|
||||
return jsonify({'status': 'ok'})
|
||||
"""
|
||||
def decorator(f):
|
||||
@wraps(f)
|
||||
def decorated_function(*args, **kwargs):
|
||||
# Get current user (if authenticated)
|
||||
user = get_current_user()
|
||||
|
||||
# If auth is required but no user, return 401
|
||||
if require_auth and not user:
|
||||
return jsonify({
|
||||
'success': False,
|
||||
'error': 'Authentication required',
|
||||
'code': 'auth_required'
|
||||
}), 401
|
||||
|
||||
# Get endpoint
|
||||
endpoint_path = endpoint or request.endpoint
|
||||
|
||||
# Check request size if needed
|
||||
request_size = 0
|
||||
if check_size:
|
||||
request_size = request.content_length or 0
|
||||
|
||||
# Check rate limit
|
||||
if user:
|
||||
# User-specific rate limiting
|
||||
allowed, message, headers = user_rate_limiter.check_user_rate_limit(
|
||||
user, endpoint_path, request_size
|
||||
)
|
||||
else:
|
||||
# Fall back to IP-based rate limiting
|
||||
client_id = default_rate_limiter.get_client_id(request)
|
||||
allowed, message, headers = default_rate_limiter.check_rate_limit(
|
||||
client_id, endpoint_path, request_size
|
||||
)
|
||||
|
||||
if not allowed:
|
||||
# Log excessive requests
|
||||
identifier = f"user:{user.username}" if user else f"ip:{request.remote_addr}"
|
||||
logger.warning(f"Rate limit exceeded for {identifier} on {endpoint_path}: {message}")
|
||||
|
||||
# Update user stats if authenticated
|
||||
if user:
|
||||
user.last_active_at = datetime.utcnow()
|
||||
db.session.commit()
|
||||
|
||||
response = jsonify({
|
||||
'success': False,
|
||||
'error': message,
|
||||
'retry_after': headers.get('retry_after') if headers else 60
|
||||
})
|
||||
response.status_code = 429
|
||||
|
||||
# Add rate limit headers
|
||||
if headers:
|
||||
if headers.get('scope') == 'user':
|
||||
response.headers['X-RateLimit-Limit'] = str(headers.get('limit_minute', 60))
|
||||
response.headers['X-RateLimit-Remaining'] = str(headers.get('remaining_minute', 0))
|
||||
response.headers['X-RateLimit-Limit-Hour'] = str(headers.get('limit_hour', 1000))
|
||||
response.headers['X-RateLimit-Remaining-Hour'] = str(headers.get('remaining_hour', 0))
|
||||
response.headers['X-RateLimit-Limit-Day'] = str(headers.get('limit_day', 10000))
|
||||
response.headers['X-RateLimit-Remaining-Day'] = str(headers.get('remaining_day', 0))
|
||||
else:
|
||||
response.headers['X-RateLimit-Limit'] = str(headers['limit'])
|
||||
response.headers['X-RateLimit-Remaining'] = str(headers['remaining'])
|
||||
response.headers['X-RateLimit-Reset'] = str(headers['reset'])
|
||||
response.headers['Retry-After'] = str(headers['retry_after'])
|
||||
|
||||
return response
|
||||
|
||||
# Track concurrent requests
|
||||
default_rate_limiter.increment_concurrent()
|
||||
|
||||
try:
|
||||
# Store user in g if authenticated
|
||||
if user:
|
||||
g.current_user = user
|
||||
|
||||
# Add rate limit info to response
|
||||
g.rate_limit_headers = headers
|
||||
response = f(*args, **kwargs)
|
||||
|
||||
# Add headers to successful response
|
||||
if headers and hasattr(response, 'headers'):
|
||||
if headers.get('scope') == 'user':
|
||||
response.headers['X-RateLimit-Limit'] = str(headers.get('limit_minute', 60))
|
||||
response.headers['X-RateLimit-Remaining'] = str(headers.get('remaining_minute', 0))
|
||||
response.headers['X-RateLimit-Limit-Hour'] = str(headers.get('limit_hour', 1000))
|
||||
response.headers['X-RateLimit-Remaining-Hour'] = str(headers.get('remaining_hour', 0))
|
||||
response.headers['X-RateLimit-Limit-Day'] = str(headers.get('limit_day', 10000))
|
||||
response.headers['X-RateLimit-Remaining-Day'] = str(headers.get('remaining_day', 0))
|
||||
else:
|
||||
response.headers['X-RateLimit-Limit'] = str(headers.get('limit', 60))
|
||||
response.headers['X-RateLimit-Remaining'] = str(headers.get('remaining', 0))
|
||||
response.headers['X-RateLimit-Reset'] = str(headers['reset'])
|
||||
|
||||
return response
|
||||
finally:
|
||||
default_rate_limiter.decrement_concurrent()
|
||||
|
||||
return decorated_function
|
||||
return decorator
|
||||
|
||||
|
||||
def get_user_rate_limit_status(user: User = None):
|
||||
"""Get current rate limit status for a user or IP"""
|
||||
if not user:
|
||||
user = get_current_user()
|
||||
|
||||
if user:
|
||||
stats = user_rate_limiter.get_user_usage_stats(user)
|
||||
limits = user_rate_limiter.get_user_limits(user, request.endpoint or '/')
|
||||
|
||||
return {
|
||||
'type': 'user',
|
||||
'identifier': user.username,
|
||||
'limits': {
|
||||
'per_minute': limits['requests_per_minute'],
|
||||
'per_hour': limits['requests_per_hour'],
|
||||
'per_day': limits['requests_per_day']
|
||||
},
|
||||
'usage': stats
|
||||
}
|
||||
else:
|
||||
# IP-based stats
|
||||
client_id = default_rate_limiter.get_client_id(request)
|
||||
stats = default_rate_limiter.get_client_stats(client_id)
|
||||
|
||||
return {
|
||||
'type': 'ip',
|
||||
'identifier': request.remote_addr,
|
||||
'limits': default_rate_limiter.default_limits,
|
||||
'usage': stats
|
||||
}
|
@ -90,6 +90,19 @@ class Validators:
|
||||
|
||||
return True, None
|
||||
|
||||
@staticmethod
|
||||
def validate_email(email: str) -> bool:
|
||||
"""Validate email address format"""
|
||||
if not email or not isinstance(email, str):
|
||||
return False
|
||||
|
||||
# Basic email pattern
|
||||
email_pattern = re.compile(
|
||||
r'^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$'
|
||||
)
|
||||
|
||||
return bool(email_pattern.match(email))
|
||||
|
||||
@staticmethod
|
||||
def validate_url(url: str) -> Optional[str]:
|
||||
"""Validate and sanitize URL"""
|
||||
|
Loading…
Reference in New Issue
Block a user