
📊 Crypto Price Monitoring System

Python Apache Kafka Apache Airflow Apache Flink Discord

Real-time cryptocurrency monitoring system with advanced analytics and intelligent alerts

Architecture

[Architecture diagram]

Features • Installation • Usage • Configuration • Contributing


🌟 Features

🔄 Real-time Monitoring

  • Live cryptocurrency price tracking
  • WebSocket-based data streaming
  • Instant market updates
  • Multi-exchange support
  • Fault-tolerant data processing
  • Scalable architecture
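
The WebSocket streaming above can be pictured with a minimal sketch. The snippet below assumes the websockets library and Binance's public trade stream; the endpoint, symbol, and message fields are illustrative assumptions, not the project's actual producer:

# Hypothetical sketch: subscribe to a Binance trade stream with `websockets`.
# Endpoint, symbol, and message fields are assumptions for illustration.
import asyncio
import json

import websockets

BINANCE_WS = "wss://stream.binance.com:9443/ws/btcusdt@trade"  # assumed public endpoint

async def stream_trades() -> None:
    async with websockets.connect(BINANCE_WS) as ws:
        async for raw in ws:
            event = json.loads(raw)
            # Trade events carry the symbol ("s"), price ("p"), and quantity ("q").
            print(event.get("s"), event.get("p"), event.get("q"))

if __name__ == "__main__":
    asyncio.run(stream_trades())

In the real pipeline these events would be forwarded into Kafka rather than printed.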

📈 Advanced Analytics

  • SMA/EMA calculations
  • Z-score analysis
  • Trend detection
  • Price spike detection
  • Custom threshold monitoring
  • Real-time data processing

🔔 Smart Alerts

  • Discord notifications
  • Custom price thresholds
  • Trend-based alerts
  • Alert history tracking
  • Real-time notification system
  • Configurable conditions

📊 Data Visualization

  • Real-time price charts
  • Technical indicators
  • Market trends
  • Price patterns
  • Performance metrics
  • Custom dashboards

๐Ÿ—๏ธ Architecture

Our system consists of four interconnected components:

1๏ธโƒฃ Data Ingestion Layer

  • ๐Ÿ”น Crypto price feeds for real-time data streaming
  • ๐Ÿ”น Kafka Streams for reliable message delivery
    • Partitioned topics by currency pairs
    • Fault-tolerant message delivery
  • ๐Ÿ”น Airflow Scheduler
    • Automated data collection
    • Data validation and cleaning
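
As a rough sketch of the partitioning idea above, the hypothetical snippet below publishes a price tick keyed by currency pair with kafka-python, so Kafka routes messages for the same pair to the same partition; the topic name and payload shape are assumptions:

# Hypothetical sketch: publish price ticks keyed by currency pair so that
# Kafka partitions them consistently. Topic name and payload are assumptions.
import json

from kafka import KafkaProducer  # kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    key_serializer=lambda k: k.encode("utf-8"),
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

tick = {"symbol": "BTCUSDT", "price": 64250.5, "ts": 1700000000000}
producer.send("crypto.prices", key=tick["symbol"], value=tick)
producer.flush()

Keying by symbol is what makes the per-pair partitioning deterministic with the default partitioner.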

2๏ธโƒฃ Processing Engine

  • ๐Ÿ”น Apache Flink/Spark for stream processing
    • Real-time analytics
    • Fault-tolerant computation
  • ๐Ÿ”น Technical Analysis
    • Moving averages (SMA/EMA)
    • Z-score calculations
  • ๐Ÿ”น Anomaly Detection
    • Price spike detection
    • Pattern recognition
    • Statistical analysis
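
To make the technical-analysis step concrete, here is a minimal pandas sketch of the SMA/EMA calculations listed above; the window size and sample data are illustrative, not the project's defaults:

# Minimal sketch: rolling SMA and EMA over a price series with pandas.
# Window size and sample values are illustrative assumptions.
import pandas as pd

prices = pd.Series([100.0, 101.5, 99.8, 102.3, 103.1, 102.7])

sma_5 = prices.rolling(window=5).mean()          # simple moving average
ema_5 = prices.ewm(span=5, adjust=False).mean()  # exponential moving average

print(pd.DataFrame({"price": prices, "sma_5": sma_5, "ema_5": ema_5}))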

3๏ธโƒฃ Storage Layer

  • ๐Ÿ”น Raw Data Storage (S3/GCS)
    • Historical price data
    • Market events
  • ๐Ÿ”น Processed Data Storage
    • Aggregated metrics
    • Analysis results
  • ๐Ÿ”น Time-Series Data
    • Real-time price data
    • Performance metrics
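
For the raw-data path, a hedged sketch of persisting a batch of ticks to S3 with boto3 might look like this; the bucket name and key layout are assumptions:

# Hypothetical sketch: persist a batch of raw ticks to S3 as newline-delimited JSON.
# Bucket name and key layout are assumptions for illustration.
import json
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
batch = [{"symbol": "BTCUSDT", "price": 64250.5, "ts": 1700000000000}]

key = f"raw/BTCUSDT/{datetime.now(timezone.utc):%Y/%m/%d/%H%M%S}.jsonl"
body = "\n".join(json.dumps(tick) for tick in batch)
s3.put_object(Bucket="crypto-monitoring-raw", Key=key, Body=body.encode("utf-8"))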

4๏ธโƒฃ Visualization & Alerting

  • ๐Ÿ”น Discord Integration
    • Real-time price alerts
    • Anomaly notifications
    • Custom commands
    • Alert management
  • ๐Ÿ”น Tableau Dashboards
    • Real-time data visualization
    • Technical analysis charts
    • Performance monitoring
    • Custom reports

💻 Prerequisites

Before you begin, ensure you have the following installed:

  • Python 3.10+
  • Docker Desktop with WSL 2 integration enabled
  • Apache Airflow
  • Apache Kafka
  • Apache Flink
  • Discord Bot Token
  • Tableau Desktop/Server (optional)

🚀 Installation

1️⃣ Clone the Repository

git clone <repository-url>
cd crypto-price-monitoring

2๏ธโƒฃ Setup Environment

# Run the setup script
chmod +x setup.sh
./setup.sh

3๏ธโƒฃ Configure Environment

# Update the .env file with your configuration
nano .env

# Required Environment Variables:
DISCORD_BOT_TOKEN=your_discord_bot_token
DISCORD_CHANNEL_ID=your_channel_id
AIRFLOW_HOME=/path/to/airflow
TABLEAU_SERVER_URL=your_tableau_server_url  # Optional
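
As a minimal sketch of how these variables could be read at runtime, assuming python-dotenv is installed (the variable names mirror the .env above; the snippet itself is illustrative):

# Minimal sketch: load the .env file and read the required settings.
# Assumes python-dotenv is installed; variable names mirror the .env above.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

DISCORD_BOT_TOKEN = os.environ["DISCORD_BOT_TOKEN"]
DISCORD_CHANNEL_ID = int(os.environ["DISCORD_CHANNEL_ID"])
AIRFLOW_HOME = os.environ.get("AIRFLOW_HOME", "~/airflow")
TABLEAU_SERVER_URL = os.environ.get("TABLEAU_SERVER_URL")  # optional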

🎮 Component Setup

1. Kafka Setup

# Start Kafka using Docker
docker-compose up -d kafka

2. Data Ingestion

# Start the crypto price producer
python src/data_ingestion/kafka_producer/crypto_producer.py

3. Stream Processing

# Start the price processor
python src/processing/stream_processor/price_processor.py

4. Anomaly Detection

# Start the anomaly detector
python src/processing/anomaly_detector/price_anomaly_detector.py

5. Discord Bot

# Start the alert bot
python src/visualization/discord_bot/alert_bot.py

6. Apache Airflow

# Start all services using Docker Compose
docker-compose up -d

# Initialize Airflow database (if needed)
docker exec cryptopricemonitoringsystem-airflow-webserver-1 airflow db init

# Create default admin user (if needed)
docker exec cryptopricemonitoringsystem-airflow-webserver-1 airflow users create \
    --username airflow \
    --password airflow \
    --firstname admin \
    --lastname admin \
    --role Admin \
    --email [email protected]

# Access Airflow Web Interface
# Open http://localhost:8080 in your browser
# Login with:
# - Username: airflow
# - Password: airflow
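
For orientation, a scheduled price-collection DAG could look roughly like the hypothetical sketch below; the DAG id, schedule, and task body are assumptions, not the project's actual DAG:

# Hypothetical sketch of a price-collection DAG; id, schedule, and task are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def collect_prices():
    # Placeholder for fetching prices and publishing them downstream.
    print("collecting prices...")

with DAG(
    dag_id="crypto_price_collection",
    start_date=datetime(2024, 1, 1),
    schedule_interval="*/5 * * * *",  # every five minutes
    catchup=False,
) as dag:
    PythonOperator(task_id="collect_prices", python_callable=collect_prices)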

📱 Usage

💹 Monitor Crypto Prices

  • Automatic price fetching for configured cryptocurrencies
  • Real-time processing through Kafka streams
  • Customizable monitoring intervals

🎯 Anomaly Detection

  • Statistical analysis for price movements
  • Configurable detection thresholds
  • Multiple detection methods:
    • Z-score analysis
    • Price change percentage
    • Pattern recognition
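
A minimal sketch of the two simplest methods above, z-score analysis and price-change percentage; the window and thresholds are illustrative, not the shipped defaults:

# Minimal sketch: flag a new price as anomalous by z-score or percent change.
# Window size and thresholds are illustrative assumptions.
import statistics

def is_anomalous(history: list[float], new_price: float,
                 z_threshold: float = 3.0, pct_threshold: float = 0.05) -> bool:
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    z_score = (new_price - mean) / stdev if stdev else 0.0
    pct_change = abs(new_price - history[-1]) / history[-1]
    return abs(z_score) > z_threshold or pct_change > pct_threshold

print(is_anomalous([100.0, 100.5, 99.8, 100.2, 100.1], 108.0))  # True: ~8% jump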

💬 Discord Commands

  • !status - Check system status
  • !help - Display available commands
  • !alerts - View recent alerts
  • !configure - Configure alert settings
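
As a rough illustration of how such prefix commands are wired up with discord.py (a hypothetical sketch, not the project's actual alert_bot.py):

# Hypothetical sketch of a "!status" command with discord.py;
# not the project's actual alert bot.
import os

import discord
from discord.ext import commands

intents = discord.Intents.default()
intents.message_content = True  # required for prefix commands
bot = commands.Bot(command_prefix="!", intents=intents)

@bot.command(name="status")
async def status(ctx: commands.Context):
    await ctx.send("Monitoring is up and running.")

bot.run(os.environ["DISCORD_BOT_TOKEN"])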

⚙️ Configuration

Key Configuration Files

  • .env - Environment variables
  • config/kafka_config.yaml - Kafka settings
  • config/monitoring_config.yaml - Monitoring parameters
  • config/alert_config.yaml - Alert configurations
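
A small sketch of reading one of these YAML files with PyYAML; the keys shown are hypothetical, so check the files in config/ for the real schema:

# Minimal sketch: load monitoring parameters from YAML with PyYAML.
# The keys below are hypothetical; see config/ for the actual schema.
import yaml

with open("config/monitoring_config.yaml") as f:
    config = yaml.safe_load(f)

pairs = config.get("pairs", ["BTCUSDT", "ETHUSDT"])   # assumed key
interval = config.get("interval_seconds", 60)         # assumed key
print(pairs, interval)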

Customization Options

  • Alert thresholds
  • Monitoring intervals
  • Cryptocurrency pairs
  • Technical indicators
  • Alert formats

🛠️ Development

Code Style

# Format code
black .

# Lint code
flake8

Testing

# Run tests
pytest

# Run with coverage
pytest --cov=src

📊 Monitoring

The system includes comprehensive monitoring:

  • Component health checks
  • Performance metrics
  • Error tracking
  • Alert statistics
  • System logs
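
As one purely illustrative example of what component-level logging for these signals could look like (the project's actual monitoring hooks may differ):

# Purely illustrative logging setup for component health messages;
# the project's actual monitoring hooks may differ.
import logging

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(name)s %(levelname)s %(message)s",
)
log = logging.getLogger("crypto_monitor.processor")

log.info("component started")
log.warning("consumer lag above threshold: %d messages", 1200)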

๐Ÿค Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/AmazingFeature)
  3. Commit changes (git commit -m 'Add AmazingFeature')
  4. Push to branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

๐Ÿ“ License

This project is licensed under the MIT License - see the LICENSE file for details.

๐Ÿ™ Acknowledgments

System Architecture

  • Data Ingestion Layer: WebSocket connection to Binance for real-time crypto prices
  • Message Queue: Apache Kafka for data streaming
  • Stream Processing: Real-time analytics and price monitoring
  • Analytics: Price statistics, moving averages, and alerts

Prerequisites

  • Python 3.10+
  • Docker and Docker Compose
  • WSL2 (if running on Windows)

Setup Instructions

  1. Create and activate a virtual environment:
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
  2. Install dependencies:
pip install -r requirements.txt
  3. Start Kafka services:
docker-compose up -d

Running the System

  1. Start the stream processor (Terminal 1):
source venv/bin/activate
python run_processor.py
  2. Start the WebSocket feed (Terminal 2):
source venv/bin/activate
python run_crypto_feed.py
  3. Start the storage consumer (Terminal 3):
source venv/bin/activate
python run_storage.py
  4. Start the alert processor (Terminal 4):
source venv/bin/activate
python run_alerts.py
  5. Start the API server (Terminal 5):
python run_api.py

Visualization

The system provides two visualization options:

1. Streamlit Dashboard

# Install dependencies (in your virtual environment)
pip install -r requirements.txt

# Create visualization directory
mkdir -p src/visualization

# Start the Streamlit app
python scripts/runners/run_streamlit.py

The Streamlit dashboard will be available at: http://localhost:8501

Features:

  • Real-time price monitoring
  • Interactive charts with Plotly
  • Volume analysis
  • Price change indicators
  • Customizable time ranges
  • Auto-refresh capability
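
A bare-bones sketch of such a dashboard page with Streamlit and Plotly Express (hypothetical; the app started by scripts/runners/run_streamlit.py is the real implementation):

# Hypothetical bare-bones dashboard page with Streamlit and Plotly Express;
# not the project's actual app.
import pandas as pd
import plotly.express as px
import streamlit as st

st.title("Crypto Price Monitor")

symbol = st.selectbox("Symbol", ["BTCUSDT", "ETHUSDT"])
prices = pd.DataFrame({
    "time": pd.date_range("2024-01-01", periods=5, freq="min"),
    "price": [64000, 64120, 63980, 64210, 64300],
})

st.plotly_chart(px.line(prices, x="time", y="price", title=symbol))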

2. Tableau Integration

# Set up Tableau environment
python scripts/tableau/setup_tableau.py

# Start data export service
python scripts/runners/run_tableau_export.py

# Start real-time streaming
python scripts/runners/run_tableau_stream.py

Tableau data will be available in:

  • tableau/data/ - Historical data exports
  • tableau/realtime/ - Real-time streaming data
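
As a sketch of the historical export path, a pandas snippet that drops a CSV into tableau/data/ could look like this; the file name and columns are assumptions:

# Hypothetical sketch: export processed prices as CSV for Tableau to pick up.
# The directory comes from the layout above; file name and columns are assumptions.
from pathlib import Path

import pandas as pd

out_dir = Path("tableau/data")
out_dir.mkdir(parents=True, exist_ok=True)

df = pd.DataFrame({"symbol": ["BTCUSDT"], "price": [64250.5], "ts": [1700000000000]})
df.to_csv(out_dir / "prices_export.csv", index=False)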

Running the Complete System

  1. Start the data pipeline:
# Activate virtual environment
source venv/bin/activate  # On Unix/Mac
# or
.\venv\Scripts\activate  # On Windows

# Start Kafka services
docker-compose up -d

# Start data ingestion
python scripts/runners/run_crypto_feed.py

# Start data processing
python scripts/runners/run_processor.py

# Start data storage
python scripts/runners/run_storage.py
  2. Start additional services:
# Start API server
python scripts/runners/run_api.py

# Start alert service
python scripts/runners/run_alerts.py
  3. Start visualization services:
# Start Streamlit dashboard
python scripts/runners/run_streamlit.py

# Start Tableau data export
python scripts/runners/run_tableau_export.py

# Start Tableau streaming
python scripts/runners/run_tableau_stream.py

Each service should be run in a separate terminal window with the virtual environment activated.
