Real-time cryptocurrency monitoring system with advanced analytics and intelligent alerts
Features • Installation • Usage • Configuration • Contributing
Our system consists of four interconnected layers:

**Data Ingestion**
- 🔹 Crypto price feeds for real-time data streaming
- 🔹 Kafka Streams for reliable message delivery
  - Partitioned topics by currency pair
  - Fault-tolerant message delivery
- 🔹 Airflow Scheduler
  - Automated data collection
  - Data validation and cleaning

**Stream Processing**
- 🔹 Apache Flink/Spark for stream processing
  - Real-time analytics
  - Fault-tolerant computation
- 🔹 Technical Analysis (see the sketch after this list)
  - Moving averages (SMA/EMA)
  - Z-score calculations
- 🔹 Anomaly Detection
  - Price spike detection
  - Pattern recognition
  - Statistical analysis

**Data Storage**
- 🔹 Raw Data Storage (S3/GCS)
  - Historical price data
  - Market events
- 🔹 Processed Data Storage
  - Aggregated metrics
  - Analysis results
- 🔹 Time-Series Data
  - Real-time price data
  - Performance metrics

**Visualization & Alerts**
- 🔹 Discord Integration
  - Real-time price alerts
  - Anomaly notifications
  - Custom commands
  - Alert management
- 🔹 Tableau Dashboards
  - Real-time data visualization
  - Technical analysis charts
  - Performance monitoring
  - Custom reports
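As a rough illustration of the technical-analysis layer, here is a minimal pandas sketch of the SMA/EMA calculations; the column name, window size, and sample data are illustrative assumptions, not the project's actual code:

```python
# Minimal sketch: rolling SMA and EMA over a price series with pandas.
# The 'price' column name and window size are illustrative assumptions.
import pandas as pd

def add_indicators(prices: pd.DataFrame, window: int = 20) -> pd.DataFrame:
    """Append SMA and EMA columns computed from the 'price' column."""
    out = prices.copy()
    out["sma"] = out["price"].rolling(window=window).mean()
    out["ema"] = out["price"].ewm(span=window, adjust=False).mean()
    return out

if __name__ == "__main__":
    df = pd.DataFrame({"price": [100.0, 101.5, 99.8, 102.2, 103.0]})
    print(add_indicators(df, window=3))
```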
Before you begin, ensure you have the following available:
- Python 3.10+
- Docker Desktop with WSL 2 integration enabled
- Apache Airflow
- Apache Kafka
- Apache Flink
- Discord Bot Token
- Tableau Desktop/Server (optional)
1️⃣ Clone the Repository
```bash
git clone <repository-url>
cd crypto-price-monitoring
```
2️⃣ Set Up the Environment
```bash
# Run the setup script
chmod +x setup.sh
./setup.sh
```
3️⃣ Configure the Environment
```bash
# Update the .env file with your configuration
nano .env
```

Required environment variables:

```env
DISCORD_BOT_TOKEN=your_discord_bot_token
DISCORD_CHANNEL_ID=your_channel_id
AIRFLOW_HOME=/path/to/airflow
TABLEAU_SERVER_URL=your_tableau_server_url  # Optional
```
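To catch configuration mistakes early, a short hypothetical check like the one below can fail fast on missing variables. It assumes `python-dotenv` is available, which this README does not confirm:

```python
# Hypothetical startup check for the variables listed in .env above.
# Assumes the python-dotenv package; swap for plain os.environ if unavailable.
import os
from dotenv import load_dotenv

load_dotenv()  # read .env into the process environment

REQUIRED = ["DISCORD_BOT_TOKEN", "DISCORD_CHANNEL_ID", "AIRFLOW_HOME"]
missing = [name for name in REQUIRED if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing required environment variables: {missing}")
```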
To run the components individually:

```bash
# Start Kafka using Docker
docker-compose up -d kafka

# Start the crypto price producer
python src/data_ingestion/kafka_producer/crypto_producer.py

# Start the price processor
python src/processing/stream_processor/price_processor.py

# Start the anomaly detector
python src/processing/anomaly_detector/price_anomaly_detector.py

# Start the alert bot
python src/visualization/discord_bot/alert_bot.py
```
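For orientation, here is a minimal sketch of what a producer like `crypto_producer.py` plausibly does: serialize each tick to JSON and key messages by currency pair so Kafka partitions by pair. The topic name, message shape, and use of `kafka-python` are assumptions:

```python
# Sketch of a price producer: JSON-encode ticks and key them by pair so
# Kafka's partitioner groups each currency pair onto one partition.
import json
import time

from kafka import KafkaProducer  # kafka-python (assumed client library)

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    key_serializer=str.encode,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def publish_tick(pair: str, price: float) -> None:
    # "crypto-prices" is an assumed topic name.
    producer.send("crypto-prices", key=pair,
                  value={"pair": pair, "price": price, "ts": time.time()})

publish_tick("BTCUSDT", 97123.45)
producer.flush()
```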
Or start everything at once with Docker Compose:

```bash
# Start all services using Docker Compose
docker-compose up -d

# Initialize Airflow database (if needed)
docker exec cryptopricemonitoringsystem-airflow-webserver-1 airflow db init

# Create default admin user (if needed)
docker exec cryptopricemonitoringsystem-airflow-webserver-1 airflow users create \
    --username airflow \
    --password airflow \
    --firstname admin \
    --lastname admin \
    --role Admin \
    --email [email protected]
```

Then open the Airflow web interface at http://localhost:8080 and log in with username `airflow` and password `airflow`.
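For reference, an Airflow DAG implementing the scheduled collection could look roughly like the hypothetical sketch below; the `dag_id`, schedule, and task body are illustrative assumptions, not the project's actual DAG:

```python
# Hypothetical DAG: poll prices every five minutes (Airflow 2.4+ API).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def fetch_prices() -> None:
    # Placeholder for the real collection/validation logic.
    print("fetching prices...")

with DAG(
    dag_id="crypto_price_collection",
    start_date=datetime(2024, 1, 1),
    schedule=timedelta(minutes=5),
    catchup=False,
) as dag:
    PythonOperator(task_id="fetch_prices", python_callable=fetch_prices)
```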
Price monitoring features:
- Automatic price fetching for configured cryptocurrencies
- Real-time processing through Kafka streams
- Customizable monitoring intervals
Anomaly detection features:
- Statistical analysis for price movements
- Configurable detection thresholds
- Multiple detection methods (a minimal sketch follows this list):
  - Z-score analysis
  - Price change percentage
  - Pattern recognition
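As a minimal sketch of the first two methods, the check below flags a price whose z-score against a recent window or whose percent change exceeds a threshold; the window and thresholds are illustrative assumptions, not the project's defaults:

```python
# Sketch: flag a tick as anomalous by z-score or percent change.
import statistics

def is_anomalous(history: list[float], latest: float,
                 z_threshold: float = 3.0, pct_threshold: float = 5.0) -> bool:
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    z = (latest - mean) / stdev if stdev else 0.0
    pct_change = abs(latest - history[-1]) / history[-1] * 100
    return abs(z) > z_threshold or pct_change > pct_threshold

window = [100.0, 100.5, 99.8, 100.2, 100.1]
print(is_anomalous(window, 112.0))  # True: large spike vs. the window
```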
Available bot commands (a minimal bot sketch follows this list):
- `!status` - Check system status
- `!help` - Display available commands
- `!alerts` - View recent alerts
- `!configure` - Configure alert settings
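A hypothetical discord.py skeleton for a bot exposing these commands; only `!status` is shown, and the status text is a placeholder:

```python
# Hypothetical bot skeleton for the commands above (discord.py 2.x).
import os

import discord
from discord.ext import commands

intents = discord.Intents.default()
intents.message_content = True  # needed for prefix commands in discord.py 2.x
bot = commands.Bot(command_prefix="!", intents=intents)

@bot.command(name="status")
async def status(ctx: commands.Context) -> None:
    # Placeholder; the real bot would report component health here.
    await ctx.send("All components running.")

bot.run(os.environ["DISCORD_BOT_TOKEN"])
```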
- `.env` - Environment variables
- `config/kafka_config.yaml` - Kafka settings
- `config/monitoring_config.yaml` - Monitoring parameters
- `config/alert_config.yaml` - Alert configurations

These files control (a hypothetical loading sketch follows this list):
- Alert thresholds
- Monitoring intervals
- Cryptocurrency pairs
- Technical indicators
- Alert formats
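A hypothetical example of reading those parameters with PyYAML; the key names are assumptions about the file's layout, not its documented schema:

```python
# Hypothetical config load; key names are assumed, not documented.
import yaml

with open("config/monitoring_config.yaml") as f:
    cfg = yaml.safe_load(f)

pairs = cfg.get("pairs", ["BTCUSDT"])       # assumed key
interval = cfg.get("interval_seconds", 60)  # assumed key
print(f"monitoring {pairs} every {interval}s")
```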
```bash
# Format code
black .

# Lint code
flake8

# Run tests
pytest

# Run with coverage
pytest --cov=src
```
The system includes comprehensive monitoring:
- Component health checks
- Performance metrics
- Error tracking
- Alert statistics
- System logs
- Fork the repository
- Create a feature branch (`git checkout -b feature/AmazingFeature`)
- Commit changes (`git commit -m 'Add AmazingFeature'`)
- Push to branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Yahoo Finance API for cryptocurrency data
- Apache Kafka for stream processing
- Apache Flink for real-time analytics
- Discord.py for bot implementation
- Tableau for visualization capabilities
- Data Ingestion Layer: WebSocket connection to Binance for real-time crypto prices (see the sketch after this list)
- Message Queue: Apache Kafka for data streaming
- Stream Processing: Real-time analytics and price monitoring
- Analytics: Price statistics, moving averages, and alerts
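A minimal sketch of that ingestion layer: subscribe to Binance's public trade stream and print each tick. The `websockets` package is an assumption; the stream URL format is Binance's public one:

```python
# Sketch: consume Binance's public trade stream for one symbol.
import asyncio
import json

import websockets  # assumed client library

async def stream_trades(symbol: str = "btcusdt") -> None:
    url = f"wss://stream.binance.com:9443/ws/{symbol}@trade"
    async with websockets.connect(url) as ws:
        while True:
            trade = json.loads(await ws.recv())
            print(trade["s"], trade["p"])  # symbol and price fields

asyncio.run(stream_trades())
```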
- Python 3.10+
- Docker and Docker Compose
- WSL2 (if running on Windows)
- Create and activate a virtual environment:

```bash
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```
- Install dependencies:

```bash
pip install -r requirements.txt
```
- Start Kafka services:

```bash
docker-compose up -d
```
- Start the stream processor (Terminal 1):

```bash
source venv/bin/activate
python run_processor.py
```

- Start the WebSocket feed (Terminal 2):

```bash
source venv/bin/activate
python run_crypto_feed.py
```

- Start the storage consumer (Terminal 3):

```bash
source venv/bin/activate
python run_storage.py
```

- Start the alert processor (Terminal 4):

```bash
source venv/bin/activate
python run_alerts.py
```

- Start the API server (Terminal 5):

```bash
source venv/bin/activate
python run_api.py
```
The system provides two visualization options:
```bash
# Install dependencies (in your virtual environment)
pip install -r requirements.txt

# Create visualization directory
mkdir -p src/visualization

# Start the Streamlit app
python scripts/runners/run_streamlit.py
```
The Streamlit dashboard will be available at: http://localhost:8501
Features (a minimal dashboard sketch follows this list):
- Real-time price monitoring
- Interactive charts with Plotly
- Volume analysis
- Price change indicators
- Customizable time ranges
- Auto-refresh capability
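For orientation, a dashboard with these features reduces to something like the sketch below; the data here is a stand-in, where the real app presumably reads from the pipeline's storage layer:

```python
# Minimal Streamlit sketch with a Plotly chart; data is faked.
import pandas as pd
import plotly.express as px
import streamlit as st

st.title("Crypto Price Monitor")
pair = st.selectbox("Pair", ["BTCUSDT", "ETHUSDT"])

# Placeholder series; the real app would query the storage layer.
df = pd.DataFrame({
    "ts": pd.date_range("2024-01-01", periods=60, freq="min"),
    "price": [100 + i * 0.1 for i in range(60)],
})

st.plotly_chart(px.line(df, x="ts", y="price", title=pair))
```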
```bash
# Set up Tableau environment
python scripts/tableau/setup_tableau.py

# Start data export service
python scripts/runners/run_tableau_export.py

# Start real-time streaming
python scripts/runners/run_tableau_stream.py
```
Tableau data will be available in the following directories (a hypothetical export sketch follows):
- `tableau/data/` - Historical data exports
- `tableau/realtime/` - Real-time streaming data
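A hypothetical sketch of the export step: write a DataFrame as CSV into `tableau/data/`, where Tableau can pick it up. The helper name and filename scheme are assumptions:

```python
# Hypothetical export helper; writes CSVs into the directory listed above.
from pathlib import Path

import pandas as pd

def export_for_tableau(df: pd.DataFrame, name: str) -> Path:
    out_dir = Path("tableau/data")
    out_dir.mkdir(parents=True, exist_ok=True)
    path = out_dir / f"{name}.csv"
    df.to_csv(path, index=False)
    return path
```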
- Start the data pipeline:

```bash
# Activate virtual environment
source venv/bin/activate  # On Unix/Mac
# or
.\venv\Scripts\activate  # On Windows

# Start Kafka services
docker-compose up -d

# Start data ingestion
python scripts/runners/run_crypto_feed.py

# Start data processing
python scripts/runners/run_processor.py

# Start data storage
python scripts/runners/run_storage.py
```
- Start additional services:

```bash
# Start API server
python scripts/runners/run_api.py

# Start alert service
python scripts/runners/run_alerts.py
```
- Start visualization services:

```bash
# Start Streamlit dashboard
python scripts/runners/run_streamlit.py

# Start Tableau data export
python scripts/runners/run_tableau_export.py

# Start Tableau streaming
python scripts/runners/run_tableau_stream.py
```
Each service should be run in a separate terminal window with the virtual environment activated.