Your crypto assets are scattered across five exchanges, and you're checking each one manually like it's 2017. Stop the madness. Let's build a crypto portfolio tracker that uses Ollama to monitor your balances across multiple exchanges automatically.
Why Manual Crypto Portfolio Tracking Fails
Cryptocurrency traders face three critical problems with manual portfolio monitoring:
Time Consumption: Checking five exchanges takes 20 minutes daily. That's 120 hours yearly spent on repetitive tasks.
Data Inconsistency: Each exchange displays balances differently. You lose track of total portfolio value and allocation percentages.
Missed Opportunities: Price movements happen while you're checking exchange #3. By the time you see the opportunity, it's gone.
Manual tracking creates blind spots in your cryptocurrency portfolio management. You need automated monitoring with intelligent analysis.
Ollama Integration Benefits for Crypto Portfolio Tracking
Ollama transforms raw exchange data into actionable insights. Here's how Ollama-powered monitoring solves these tracking problems:
Real-Time Balance Aggregation
Ollama processes API responses from multiple exchanges simultaneously. It normalizes different data formats into a unified portfolio view.
AI-Powered Portfolio Analysis
The system identifies allocation imbalances and suggests rebalancing strategies. Ollama analyzes your portfolio composition against market trends.
Automated Alert Generation
When portfolio values change beyond set thresholds, Ollama generates intelligent alerts with context about market conditions.
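The threshold check itself is simple to sketch. The function and alert shape below are hypothetical (the tracker built later in this article hands the surrounding context to Ollama rather than hard-coding alert logic), but they show the core comparison:

```python
# Hypothetical sketch: flag assets whose balance moved beyond a threshold
# between two snapshots. Names and alert structure are illustrative.

def detect_threshold_breaches(previous, current, threshold_pct=5.0):
    """Return alerts for assets whose amount changed more than threshold_pct."""
    alerts = []
    for asset, old_amount in previous.items():
        new_amount = current.get(asset, 0)
        if old_amount == 0:
            continue  # avoid division by zero for newly appeared assets
        change_pct = (new_amount - old_amount) / old_amount * 100
        if abs(change_pct) >= threshold_pct:
            alerts.append({
                'asset': asset,
                'change_pct': round(change_pct, 2),
                'direction': 'increase' if change_pct > 0 else 'decrease'
            })
    return alerts

alerts = detect_threshold_breaches({'BTC': 1.0, 'ETH': 10.0},
                                   {'BTC': 1.2, 'ETH': 10.1})
# BTC moved 20%, ETH about 1% — only BTC crosses the default 5% threshold
```

An alert dict like this becomes the context you feed into the Ollama prompt later on.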
Building Your Multi-Exchange Crypto Portfolio Tracker
Prerequisites and Setup Requirements
Before building your automated crypto portfolio management system, ensure you have:
- Python 3.8 or higher installed
- Ollama running locally (download from ollama.ai)
- API keys from your cryptocurrency exchanges
- Basic knowledge of Python and API integration
Core Dependencies Installation
# Install required Python packages
pip install requests pandas ollama python-dotenv ccxt
# Pull the Ollama model for analysis
ollama pull llama2:7b
Exchange API Configuration
Create a secure configuration file for your exchange API credentials:
# config.py
import os
from dotenv import load_dotenv

load_dotenv()

EXCHANGE_CONFIG = {
    'binance': {
        'api_key': os.getenv('BINANCE_API_KEY'),
        'secret': os.getenv('BINANCE_SECRET'),
        'sandbox': False
    },
    'coinbase': {
        'api_key': os.getenv('COINBASE_API_KEY'),
        'secret': os.getenv('COINBASE_SECRET'),
        'sandbox': False
    },
    'kraken': {
        'api_key': os.getenv('KRAKEN_API_KEY'),
        'secret': os.getenv('KRAKEN_SECRET'),
        'sandbox': False
    }
}

# Ollama configuration
OLLAMA_MODEL = "llama2:7b"
OLLAMA_HOST = "http://localhost:11434"
Multi-Exchange Balance Fetching Implementation
Universal Exchange Connector Class
# portfolio_tracker.py
import ccxt
from datetime import datetime

class MultiExchangeConnector:
    def __init__(self, exchange_config):
        self.exchanges = {}
        self.initialize_exchanges(exchange_config)

    def initialize_exchanges(self, config):
        """Initialize connections to all configured exchanges"""
        for exchange_name, credentials in config.items():
            try:
                # Create exchange instance using ccxt
                exchange_class = getattr(ccxt, exchange_name)
                self.exchanges[exchange_name] = exchange_class({
                    'apiKey': credentials['api_key'],
                    'secret': credentials['secret'],
                    'sandbox': credentials['sandbox'],
                    'enableRateLimit': True,
                })
                print(f"✅ Connected to {exchange_name}")
            except Exception as e:
                print(f"❌ Failed to connect to {exchange_name}: {e}")

    def fetch_all_balances(self):
        """Fetch balances from all connected exchanges"""
        all_balances = {}
        for exchange_name, exchange in self.exchanges.items():
            try:
                # Fetch balance data
                balance = exchange.fetch_balance()
                # Filter out zero balances
                non_zero_balances = {
                    currency: amount
                    for currency, amount in balance['total'].items()
                    if amount > 0
                }
                all_balances[exchange_name] = {
                    'balances': non_zero_balances,
                    'timestamp': datetime.now().isoformat(),
                    'status': 'success'
                }
                print(f"📊 Fetched {len(non_zero_balances)} assets from {exchange_name}")
            except Exception as e:
                all_balances[exchange_name] = {
                    'balances': {},
                    'timestamp': datetime.now().isoformat(),
                    'status': f'error: {str(e)}'
                }
                print(f"❌ Error fetching from {exchange_name}: {e}")
        return all_balances
Portfolio Data Processing
class PortfolioProcessor:
    def __init__(self):
        self.portfolio_data = {}

    def aggregate_balances(self, exchange_balances):
        """Combine balances from all exchanges into unified portfolio"""
        aggregated = {}
        for exchange, data in exchange_balances.items():
            if data['status'] == 'success':
                for currency, amount in data['balances'].items():
                    if currency in aggregated:
                        aggregated[currency]['total_amount'] += amount
                        aggregated[currency]['exchanges'].append({
                            'exchange': exchange,
                            'amount': amount
                        })
                    else:
                        aggregated[currency] = {
                            'total_amount': amount,
                            'exchanges': [{
                                'exchange': exchange,
                                'amount': amount
                            }]
                        }
        return aggregated

    def calculate_portfolio_metrics(self, aggregated_balances):
        """Calculate portfolio allocation and diversity metrics"""
        # This would integrate with price APIs to calculate USD values.
        # For demonstration, using placeholder calculations.
        total_assets = len(aggregated_balances)
        exchange_distribution = {}
        for currency, data in aggregated_balances.items():
            for exchange_data in data['exchanges']:
                exchange_name = exchange_data['exchange']
                if exchange_name in exchange_distribution:
                    exchange_distribution[exchange_name] += 1
                else:
                    exchange_distribution[exchange_name] = 1
        return {
            'total_unique_assets': total_assets,
            'exchange_distribution': exchange_distribution,
            'diversification_score': len(exchange_distribution),
            'analysis_timestamp': datetime.now().isoformat()
        }
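The placeholder note in `calculate_portfolio_metrics` mentions integrating price APIs for USD values. The valuation step itself can be sketched as a pure function — here the price map is passed in (in practice it might come from a price API such as ccxt's public `fetch_ticker`, which is an assumption, not code from this tracker):

```python
# Sketch of the USD-valuation step the placeholder above omits.
# usd_prices would come from a price feed; passing it in keeps the
# calculation itself simple and testable.

def value_portfolio_usd(aggregated_balances, usd_prices):
    """Return per-asset USD values and allocation percentages."""
    values = {
        currency: data['total_amount'] * usd_prices.get(currency, 0.0)
        for currency, data in aggregated_balances.items()
    }
    total = sum(values.values())
    allocations = {
        currency: round(value / total * 100, 2) if total else 0.0
        for currency, value in values.items()
    }
    return {'values_usd': values, 'total_usd': total, 'allocations_pct': allocations}

result = value_portfolio_usd(
    {'BTC': {'total_amount': 0.5}, 'ETH': {'total_amount': 4.0}},
    {'BTC': 60000.0, 'ETH': 3000.0},
)
# 0.5 * 60000 = 30000 USD of BTC, 4 * 3000 = 12000 USD of ETH -> 42000 total
```

Feeding these allocation percentages to the analyzer gives Ollama far more to work with than raw token counts.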
Ollama Integration for Portfolio Analysis
AI-Powered Portfolio Insights
# ollama_analyzer.py
import requests

class OllamaPortfolioAnalyzer:
    def __init__(self, ollama_host="http://localhost:11434", model="llama2:7b"):
        self.host = ollama_host
        self.model = model

    def analyze_portfolio_composition(self, portfolio_data):
        """Send portfolio data to Ollama for AI analysis"""
        # Prepare portfolio summary for AI analysis
        portfolio_summary = self._prepare_portfolio_summary(portfolio_data)
        # Create analysis prompt
        prompt = f"""
Analyze this cryptocurrency portfolio data and provide insights:

Portfolio Summary:
{portfolio_summary}

Please provide:
1. Portfolio diversification assessment
2. Exchange risk concentration analysis
3. Suggested improvements
4. Risk factors to monitor

Keep the analysis concise and actionable.
"""
        return self._query_ollama(prompt)

    def _prepare_portfolio_summary(self, portfolio_data):
        """Format portfolio data for AI analysis"""
        summary = []
        if 'aggregated_balances' in portfolio_data:
            summary.append(f"Total unique cryptocurrencies: {len(portfolio_data['aggregated_balances'])}")
            # List top holdings
            for currency, data in list(portfolio_data['aggregated_balances'].items())[:10]:
                exchanges = [ex['exchange'] for ex in data['exchanges']]
                summary.append(f"- {currency}: {data['total_amount']:.4f} across {', '.join(exchanges)}")
        if 'metrics' in portfolio_data:
            metrics = portfolio_data['metrics']
            summary.append(f"\nExchange distribution: {metrics['exchange_distribution']}")
            summary.append(f"Diversification score: {metrics['diversification_score']}")
        return '\n'.join(summary)

    def _query_ollama(self, prompt):
        """Send prompt to Ollama and return response"""
        try:
            response = requests.post(
                f"{self.host}/api/generate",
                json={
                    "model": self.model,
                    "prompt": prompt,
                    "stream": False
                },
                timeout=30
            )
            if response.status_code == 200:
                return response.json()['response']
            else:
                return f"Error: Ollama request failed with status {response.status_code}"
        except Exception as e:
            return f"Error connecting to Ollama: {str(e)}"

    def generate_rebalancing_suggestions(self, portfolio_data, target_allocation=None):
        """Generate specific rebalancing recommendations"""
        prompt = f"""
Based on this portfolio composition, suggest specific rebalancing actions:

{self._prepare_portfolio_summary(portfolio_data)}

Provide:
1. Specific assets to reduce or increase
2. Exchange consolidation opportunities
3. Risk reduction strategies
4. Timeline for implementing changes

Make recommendations specific and actionable.
"""
        return self._query_ollama(prompt)
Complete Portfolio Tracking System
# main.py
from config import EXCHANGE_CONFIG, OLLAMA_MODEL, OLLAMA_HOST
from portfolio_tracker import MultiExchangeConnector, PortfolioProcessor
from ollama_analyzer import OllamaPortfolioAnalyzer
import json
from datetime import datetime

class CryptoPortfolioTracker:
    def __init__(self):
        self.connector = MultiExchangeConnector(EXCHANGE_CONFIG)
        self.processor = PortfolioProcessor()
        self.analyzer = OllamaPortfolioAnalyzer(OLLAMA_HOST, OLLAMA_MODEL)

    def run_portfolio_analysis(self):
        """Execute complete portfolio tracking and analysis"""
        print("🚀 Starting portfolio analysis...")
        # Step 1: Fetch balances from all exchanges
        print("\n📡 Fetching balances from exchanges...")
        exchange_balances = self.connector.fetch_all_balances()
        # Step 2: Aggregate and process data
        print("\n🔄 Processing portfolio data...")
        aggregated_balances = self.processor.aggregate_balances(exchange_balances)
        metrics = self.processor.calculate_portfolio_metrics(aggregated_balances)
        # Step 3: Prepare data for AI analysis
        portfolio_data = {
            'aggregated_balances': aggregated_balances,
            'metrics': metrics,
            'raw_exchange_data': exchange_balances
        }
        # Step 4: Generate AI insights
        print("\n🤖 Generating AI analysis...")
        ai_analysis = self.analyzer.analyze_portfolio_composition(portfolio_data)
        rebalancing_suggestions = self.analyzer.generate_rebalancing_suggestions(portfolio_data)
        # Step 5: Create comprehensive report
        report = self._generate_report(portfolio_data, ai_analysis, rebalancing_suggestions)
        # Step 6: Save and display results
        self._save_report(report)
        self._display_summary(report)
        return report

    def _generate_report(self, portfolio_data, ai_analysis, rebalancing_suggestions):
        """Create comprehensive portfolio report"""
        return {
            'timestamp': datetime.now().isoformat(),
            'portfolio_summary': {
                'total_assets': len(portfolio_data['aggregated_balances']),
                'exchange_count': len(portfolio_data['raw_exchange_data']),
                'diversification_score': portfolio_data['metrics']['diversification_score']
            },
            'detailed_holdings': portfolio_data['aggregated_balances'],
            'ai_insights': ai_analysis,
            'rebalancing_recommendations': rebalancing_suggestions,
            'exchange_status': {
                name: data['status']
                for name, data in portfolio_data['raw_exchange_data'].items()
            }
        }

    def _save_report(self, report):
        """Save report to JSON file"""
        filename = f"portfolio_report_{datetime.now().strftime('%Y%m%d_%H%M%S')}.json"
        with open(filename, 'w') as f:
            json.dump(report, f, indent=2, default=str)
        print(f"💾 Report saved to {filename}")

    def _display_summary(self, report):
        """Display portfolio summary in terminal"""
        print("\n" + "="*60)
        print("📊 PORTFOLIO ANALYSIS SUMMARY")
        print("="*60)
        summary = report['portfolio_summary']
        print(f"Total Unique Assets: {summary['total_assets']}")
        print(f"Connected Exchanges: {summary['exchange_count']}")
        print(f"Diversification Score: {summary['diversification_score']}")
        print("\n🤖 AI INSIGHTS:")
        print("-" * 40)
        print(report['ai_insights'])
        print("\n💡 REBALANCING SUGGESTIONS:")
        print("-" * 40)
        print(report['rebalancing_recommendations'])

if __name__ == "__main__":
    tracker = CryptoPortfolioTracker()
    tracker.run_portfolio_analysis()
Automated Monitoring and Alerting Setup
Scheduling Regular Portfolio Updates
# scheduler.py
import schedule
import time
from main import CryptoPortfolioTracker

class PortfolioScheduler:
    def __init__(self):
        self.tracker = CryptoPortfolioTracker()
        self.setup_schedule()

    def setup_schedule(self):
        """Configure automated portfolio monitoring schedule"""
        # Run analysis every 4 hours (crypto markets trade around the clock)
        schedule.every(4).hours.do(self.run_analysis)
        # Daily comprehensive report at 9 AM
        schedule.every().day.at("09:00").do(self.run_comprehensive_analysis)
        # Weekly portfolio review on Sundays
        schedule.every().sunday.at("10:00").do(self.run_weekly_review)

    def run_analysis(self):
        """Execute standard portfolio analysis"""
        print("⏰ Scheduled analysis starting...")
        try:
            self.tracker.run_portfolio_analysis()
            print("✅ Scheduled analysis completed")
        except Exception as e:
            print(f"❌ Analysis failed: {e}")

    def run_comprehensive_analysis(self):
        """Execute detailed daily analysis"""
        print("📅 Daily comprehensive analysis starting...")
        # Add additional analysis logic here
        self.run_analysis()

    def run_weekly_review(self):
        """Execute weekly portfolio review"""
        print("📊 Weekly portfolio review starting...")
        # Add weekly trending analysis
        self.run_analysis()

    def start_monitoring(self):
        """Start continuous portfolio monitoring"""
        print("🔄 Portfolio monitoring started...")
        print("Press Ctrl+C to stop")
        while True:
            schedule.run_pending()
            time.sleep(60)  # Check every minute

if __name__ == "__main__":
    scheduler = PortfolioScheduler()
    scheduler.start_monitoring()
Portfolio Tracker Deployment Options
Local Development Setup
For testing your crypto portfolio tracker, run these commands:
# Start Ollama service
ollama serve
# Install dependencies
pip install -r requirements.txt
# Run initial analysis
python main.py
# Start automated monitoring
python scheduler.py
Cloud Deployment Configuration
Deploy your tracker to a cloud server for 24/7 monitoring:
# docker-compose.yml
version: '3.8'

services:
  portfolio-tracker:
    build: .
    environment:
      # Pass both keys and secrets — config.py reads both
      - BINANCE_API_KEY=${BINANCE_API_KEY}
      - BINANCE_SECRET=${BINANCE_SECRET}
      - COINBASE_API_KEY=${COINBASE_API_KEY}
      - COINBASE_SECRET=${COINBASE_SECRET}
      - KRAKEN_API_KEY=${KRAKEN_API_KEY}
      - KRAKEN_SECRET=${KRAKEN_SECRET}
    volumes:
      - ./reports:/app/reports
    restart: unless-stopped

  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama
    restart: unless-stopped

volumes:
  ollama:
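The `build: .` line assumes a Dockerfile in the project root, which the article doesn't show. A minimal sketch might look like this (the Python version and entrypoint are assumptions; adjust to your setup):

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# The compose ollama service is reachable as "ollama", not localhost
ENV OLLAMA_HOST=http://ollama:11434
CMD ["python", "scheduler.py"]
```

Note that `config.py` as written hardcodes `OLLAMA_HOST = "http://localhost:11434"`; for the container setup to work you'd read it from the environment instead, e.g. `os.getenv('OLLAMA_HOST', 'http://localhost:11434')`.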
Advanced Portfolio Analytics Features
Risk Assessment Integration
Enhance your tracker with advanced risk metrics:
# risk_analyzer.py
class PortfolioRiskAnalyzer:
    def calculate_concentration_risk(self, portfolio_data):
        """Calculate portfolio concentration risks"""
        # Note: amounts are in native units (BTC, ETH, ...). For meaningful
        # risk figures, convert everything to a common currency (e.g. USD)
        # before comparing across assets.
        exchange_concentration = {}
        # Calculate exchange risk concentration
        for currency, data in portfolio_data['aggregated_balances'].items():
            for exchange_info in data['exchanges']:
                exchange = exchange_info['exchange']
                amount = exchange_info['amount']
                if exchange in exchange_concentration:
                    exchange_concentration[exchange] += amount
                else:
                    exchange_concentration[exchange] = amount
        return {
            'exchange_concentration': exchange_concentration,
            'highest_risk_exchange': max(exchange_concentration, key=exchange_concentration.get),
            'concentration_warnings': self._generate_concentration_warnings(exchange_concentration)
        }

    def _generate_concentration_warnings(self, concentration_data):
        """Generate warnings for high concentration risks"""
        warnings = []
        total_value = sum(concentration_data.values())
        for exchange, value in concentration_data.items():
            percentage = (value / total_value) * 100
            if percentage > 70:
                warnings.append(f"HIGH RISK: {percentage:.1f}% of portfolio on {exchange}")
            elif percentage > 50:
                warnings.append(f"MEDIUM RISK: {percentage:.1f}% of portfolio on {exchange}")
        return warnings
Performance Tracking Integration
Monitor portfolio performance over time:
# performance_tracker.py
import json
import sqlite3
from datetime import datetime, timedelta

class PerformanceTracker:
    def __init__(self, db_path="portfolio_history.db"):
        self.db_path = db_path
        self.init_database()

    def init_database(self):
        """Initialize SQLite database for historical data"""
        conn = sqlite3.connect(self.db_path)
        cursor = conn.cursor()
        cursor.execute('''
            CREATE TABLE IF NOT EXISTS portfolio_snapshots (
                id INTEGER PRIMARY KEY AUTOINCREMENT,
                timestamp TEXT,
                total_assets INTEGER,
                exchange_count INTEGER,
                diversification_score INTEGER,
                portfolio_data TEXT
            )
        ''')
        conn.commit()
        conn.close()

    def save_snapshot(self, portfolio_data):
        """Save current portfolio state to database"""
        conn = sqlite3.connect(self.db_path)
        cursor = conn.cursor()
        cursor.execute('''
            INSERT INTO portfolio_snapshots
            (timestamp, total_assets, exchange_count, diversification_score, portfolio_data)
            VALUES (?, ?, ?, ?, ?)
        ''', (
            datetime.now().isoformat(),
            len(portfolio_data['aggregated_balances']),
            len(portfolio_data['raw_exchange_data']),
            portfolio_data['metrics']['diversification_score'],
            json.dumps(portfolio_data, default=str)
        ))
        conn.commit()
        conn.close()

    def generate_performance_report(self, days=30):
        """Generate performance analysis for specified period"""
        conn = sqlite3.connect(self.db_path)
        cursor = conn.cursor()
        start_date = (datetime.now() - timedelta(days=days)).isoformat()
        cursor.execute('''
            SELECT * FROM portfolio_snapshots
            WHERE timestamp >= ?
            ORDER BY timestamp DESC
        ''', (start_date,))
        results = cursor.fetchall()
        conn.close()
        return self._analyze_performance_trends(results)

    def _analyze_performance_trends(self, historical_data):
        """Analyze trends in portfolio performance"""
        if len(historical_data) < 2:
            return "Insufficient data for trend analysis"
        # Rows are newest-first; columns: id, timestamp, total_assets,
        # exchange_count, diversification_score, portfolio_data
        latest = historical_data[0]
        oldest = historical_data[-1]
        asset_change = latest[2] - oldest[2]
        diversification_change = latest[4] - oldest[4]
        return {
            'snapshots_analyzed': len(historical_data),
            'asset_count_change': asset_change,
            'diversification_change': diversification_change,
            'trend_analysis': f"Asset count changed by {asset_change} with diversification {'improving' if diversification_change > 0 else 'declining'}"
        }
Troubleshooting Common Issues
Exchange API Connection Problems
Problem: API authentication failures or rate limiting
Solution: Implement robust error handling and retry logic:
import time
from functools import wraps

def retry_on_failure(max_retries=3, delay=1):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(max_retries):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    if attempt == max_retries - 1:
                        raise
                    wait = delay * (2 ** attempt)  # Exponential backoff
                    print(f"Attempt {attempt + 1} failed: {e}. Retrying in {wait} seconds...")
                    time.sleep(wait)
        return wrapper
    return decorator

class RobustExchangeConnector(MultiExchangeConnector):
    @retry_on_failure(max_retries=3, delay=2)
    def fetch_exchange_balance(self, exchange_name):
        """Fetch balance with retry logic"""
        return self.exchanges[exchange_name].fetch_balance()
Ollama Model Performance Issues
Problem: Slow AI analysis or memory issues
Solution: Optimize model usage and implement caching:
from datetime import datetime
from ollama_analyzer import OllamaPortfolioAnalyzer

class OptimizedOllamaAnalyzer(OllamaPortfolioAnalyzer):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.analysis_cache = {}

    def analyze_portfolio_composition(self, portfolio_data):
        # Create cache key from portfolio hash
        portfolio_hash = self._hash_portfolio_data(portfolio_data)
        if portfolio_hash in self.analysis_cache:
            cached_result = self.analysis_cache[portfolio_hash]
            if self._is_cache_valid(cached_result['timestamp']):
                return cached_result['analysis']
        # Perform new analysis
        analysis = super().analyze_portfolio_composition(portfolio_data)
        # Cache result
        self.analysis_cache[portfolio_hash] = {
            'analysis': analysis,
            'timestamp': datetime.now()
        }
        return analysis

    def _hash_portfolio_data(self, portfolio_data):
        """Create hash of portfolio composition for caching"""
        # Simplified hash over asset names and exchanges only — note that
        # balance changes alone will NOT invalidate the cache
        assets = list(portfolio_data['aggregated_balances'].keys())
        exchanges = list(portfolio_data['raw_exchange_data'].keys())
        return hash(tuple(sorted(assets + exchanges)))

    def _is_cache_valid(self, timestamp, max_age_minutes=30):
        """Check if cached analysis is still valid"""
        age = datetime.now() - timestamp
        return age.total_seconds() < (max_age_minutes * 60)
Security Best Practices for Portfolio Tracking
API Key Management
Store exchange API keys securely using environment variables:
# .env file (never commit to version control)
BINANCE_API_KEY=your_binance_api_key_here
BINANCE_SECRET=your_binance_secret_here
COINBASE_API_KEY=your_coinbase_api_key_here
COINBASE_SECRET=your_coinbase_secret_here
# Use read-only API permissions when possible
# Enable IP whitelisting on exchange accounts
# Regularly rotate API keys
Data Protection Measures
# security.py
import copy
import json
from cryptography.fernet import Fernet

class PortfolioDataProtection:
    def __init__(self):
        self.encryption_key = self._get_or_create_key()
        self.cipher = Fernet(self.encryption_key)

    def _get_or_create_key(self):
        """Get existing encryption key or create new one"""
        try:
            with open('portfolio.key', 'rb') as key_file:
                return key_file.read()
        except FileNotFoundError:
            key = Fernet.generate_key()
            with open('portfolio.key', 'wb') as key_file:
                key_file.write(key)
            return key

    def encrypt_portfolio_data(self, portfolio_data):
        """Encrypt sensitive portfolio information"""
        json_data = json.dumps(portfolio_data, default=str)
        return self.cipher.encrypt(json_data.encode())

    def decrypt_portfolio_data(self, encrypted_data):
        """Decrypt portfolio data for analysis"""
        decrypted_json = self.cipher.decrypt(encrypted_data).decode()
        return json.loads(decrypted_json)

    def anonymize_holdings(self, portfolio_data):
        """Create anonymized version for AI analysis"""
        # Deep copy so deleting amounts doesn't mutate the original data
        anonymized = copy.deepcopy(portfolio_data)
        # Replace actual amounts with relative percentages
        total_value = sum(
            data['total_amount']
            for data in portfolio_data['aggregated_balances'].values()
        )
        for currency, data in anonymized['aggregated_balances'].items():
            percentage = (data['total_amount'] / total_value) * 100
            data['percentage'] = round(percentage, 2)
            del data['total_amount']  # Remove actual amounts
        return anonymized
Next Steps and Advanced Features
Integration with DeFi Protocols
Extend your tracker to monitor DeFi positions:
# defi_integration.py
class DeFiProtocolMonitor:
    def __init__(self, wallet_addresses):
        self.wallet_addresses = wallet_addresses
        self.supported_protocols = ['uniswap', 'aave', 'compound']

    def fetch_defi_positions(self):
        """Fetch positions from DeFi protocols"""
        # Implementation would use DeFi protocol APIs
        # or blockchain scanning services
        pass

    def integrate_with_portfolio(self, exchange_portfolio):
        """Combine DeFi positions with exchange balances"""
        # Merge DeFi data with existing portfolio
        pass
Mobile App Integration
Create API endpoints for mobile portfolio tracking:
# api_server.py
from flask import Flask, jsonify
from main import CryptoPortfolioTracker

app = Flask(__name__)
tracker = CryptoPortfolioTracker()

@app.route('/api/portfolio/summary')
def get_portfolio_summary():
    """API endpoint for mobile app"""
    try:
        report = tracker.run_portfolio_analysis()
        return jsonify({
            'status': 'success',
            'data': report['portfolio_summary']
        })
    except Exception as e:
        return jsonify({
            'status': 'error',
            'message': str(e)
        }), 500

@app.route('/api/portfolio/full')
def get_full_portfolio():
    """Complete portfolio data for detailed analysis"""
    # Implementation here — return 501 until then so the route
    # responds with valid JSON instead of failing
    return jsonify({'status': 'not_implemented'}), 501

if __name__ == '__main__':
    app.run(debug=True)
Conclusion
This crypto portfolio tracker with Ollama integration solves the critical problem of manual balance monitoring across multiple exchanges. You now have automated data collection, AI-powered analysis, and intelligent rebalancing suggestions.
The system provides real-time insights into your cryptocurrency portfolio management while maintaining security through encrypted data storage and read-only API permissions. Your portfolio tracking evolves from manual checking to intelligent automation.
Key benefits include substantial time savings over manual monitoring, consolidated risk analysis across exchanges, and AI-generated insights for optimization decisions. The automated monitoring system scales with your trading activity and adapts as your holdings change.
Start with the basic implementation and gradually add advanced features like DeFi integration, performance tracking, and mobile connectivity. Your multi-exchange balance tracking system becomes the foundation for smarter cryptocurrency investment decisions.
Ready to automate your crypto portfolio tracking? Clone the code, configure your exchange APIs, and let Ollama transform your raw balance data into actionable investment intelligence.