Ever watched a random altcoin skyrocket 500% in minutes, only to crash harder than your New Year's resolutions? Welcome to the wild world of crypto pump and dump schemes – where fortunes vanish faster than pizza at a developer conference.
Crypto pump and dump prediction through social media analysis can protect your portfolio from these coordinated attacks. This guide shows you how to build an Ollama-powered detection system that spots suspicious patterns before they drain your wallet.
You'll learn to identify manipulation signals, set up automated monitoring, and create early warning systems that give you the edge in volatile crypto markets.
What Are Crypto Pump and Dump Schemes?
Crypto pump and dump schemes involve coordinated efforts to artificially inflate a cryptocurrency's price through misleading marketing, then quickly sell at peak prices. These market manipulation tactics typically target low-cap altcoins with thin trading volumes.
Common Pump and Dump Patterns
Social Media Indicators:
- Sudden surge in mentions across multiple platforms
- Coordinated posting at specific times
- Generic "moon" and rocket emoji spam
- New accounts promoting unknown tokens
- Celebrity endorsements without disclosure
Price Movement Signals:
- Rapid volume spikes (300%+ increase)
- Price jumps without fundamental news
- Immediate sell-offs after peak prices
- Thin order books easily manipulated
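These textual signals can feed a cheap pre-filter before any AI analysis, so the model only sees posts that look suspicious. The keyword list and weights below are illustrative assumptions, not a vetted model:

```python
# A minimal keyword heuristic for pre-filtering posts. The phrases and
# weights are illustrative assumptions - tune them on your own data.
PUMP_KEYWORDS = {
    "moon": 2, "100x": 3, "guaranteed": 3, "act now": 3,
    "pump": 2, "don't miss": 2, "next shib": 2, "🚀": 1,
}

def keyword_pump_score(text: str) -> int:
    """Sum the weights of pump-style phrases found in a post."""
    lowered = text.lower()
    return sum(weight for phrase, weight in PUMP_KEYWORDS.items() if phrase in lowered)

print(keyword_pump_score("🚀🚀 $XYZ to the moon, guaranteed 100x, act now!"))  # 12
print(keyword_pump_score("Quarterly report shows steady validator growth"))   # 0
```

Posts scoring above a threshold get forwarded to the LLM; everything else is discarded, saving inference time.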
Why Ollama AI for Crypto Analysis?
Traditional crypto analysis tools miss the nuanced language patterns that indicate manipulation. Ollama runs open-source large language models on your own hardware, letting you process natural language at scale and pick up the subtle coordination signals that spread across social platforms.
Advantages of Local AI Analysis
Privacy Protection: Your trading strategies stay confidential on local servers.
Real-time Processing: Analyze streams of posts without cloud rate limits – throughput scales with your hardware.
Custom Models: Tailor detection with custom prompts, Modelfiles, or fine-tuned models suited to your needs.
Cost Efficiency: No expensive cloud API fees for continuous monitoring.
Setting Up Your Ollama Crypto Detection System
Prerequisites
Before building your social media crypto analysis system, ensure you have:
- Ollama installed on your local machine
- Python 3.8+ with pandas and requests libraries
- Social media API access (Twitter, Reddit, Telegram)
- Basic understanding of cryptocurrency markets
Installing Required Components
```bash
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Pull the llama2 model for analysis
ollama pull llama2:7b

# Install Python dependencies
pip install pandas requests tweepy praw python-telegram-bot
```
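Before wiring up the collectors, it helps to confirm the Ollama server is actually running and the model is pulled. This stdlib-only sketch queries Ollama's standard `/api/tags` endpoint on the default port 11434 (adjust `base_url` if you changed it):

```python
# check_ollama.py - verify the local Ollama server is up and the model is pulled
import json
import urllib.error
import urllib.request

def ollama_ready(model="llama2:7b", base_url="http://localhost:11434"):
    """Return True if Ollama responds and the model family is available locally."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            tags = json.load(resp)
    except (urllib.error.URLError, OSError):
        return False  # server not reachable
    names = [m.get("name", "") for m in tags.get("models", [])]
    return any(name.startswith(model.split(":")[0]) for name in names)

if __name__ == "__main__":
    print("Ollama ready:", ollama_ready())
```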
Configuring API Access
```python
# config.py - Store your API credentials
import os

TWITTER_CONFIG = {
    'bearer_token': os.getenv('TWITTER_BEARER_TOKEN'),
    'api_key': os.getenv('TWITTER_API_KEY'),
    'api_secret': os.getenv('TWITTER_API_SECRET')
}

REDDIT_CONFIG = {
    'client_id': os.getenv('REDDIT_CLIENT_ID'),
    'client_secret': os.getenv('REDDIT_CLIENT_SECRET'),
    'user_agent': 'CryptoAnalyzer/1.0'
}
```
Building the Social Media Data Collector
Twitter Data Collection
```python
# twitter_collector.py
import tweepy
import pandas as pd
from datetime import datetime, timedelta

class TwitterCollector:
    def __init__(self, config):
        self.client = tweepy.Client(bearer_token=config['bearer_token'])

    def collect_crypto_mentions(self, coin_symbol, hours_back=24):
        """Collect tweets mentioning specific cryptocurrency"""
        # Calculate time range; the recent search endpoint requires end_time
        # to be at least 10 seconds in the past
        end_time = datetime.utcnow() - timedelta(seconds=30)
        start_time = end_time - timedelta(hours=hours_back)

        # Search query targeting crypto discussions
        query = f"${coin_symbol} OR #{coin_symbol} -is:retweet lang:en"

        tweets = tweepy.Paginator(
            self.client.search_recent_tweets,
            query=query,
            tweet_fields=['created_at', 'author_id', 'public_metrics'],
            start_time=start_time,
            end_time=end_time,
            max_results=100
        ).flatten(limit=1000)

        # Convert to structured data
        tweet_data = []
        for tweet in tweets:
            tweet_data.append({
                'id': tweet.id,
                'text': tweet.text,
                'created_at': tweet.created_at,
                'author_id': tweet.author_id,
                'retweet_count': tweet.public_metrics['retweet_count'],
                'like_count': tweet.public_metrics['like_count']
            })
        return pd.DataFrame(tweet_data)
```
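The `created_at` timestamps the collector returns already carry a strong signal on their own: a burst of mentions in the most recent hour versus the trailing average is a classic pump precursor. A stdlib sketch of that check (the 3x `spike_ratio` threshold is an assumption worth tuning):

```python
# Flag a mention-frequency spike from a list of tweet timestamps
from collections import Counter
from datetime import datetime, timedelta, timezone

def mention_spike(timestamps, spike_ratio=3.0):
    """Return (is_spike, ratio) comparing the latest hour to the prior hourly average."""
    if not timestamps:
        return False, 0.0
    # Bucket timestamps into whole hours
    buckets = Counter(ts.replace(minute=0, second=0, microsecond=0) for ts in timestamps)
    hours = sorted(buckets)
    latest = buckets[hours[-1]]
    earlier = [buckets[h] for h in hours[:-1]]
    baseline = sum(earlier) / len(earlier) if earlier else latest
    ratio = latest / baseline if baseline else 0.0
    return ratio >= spike_ratio, ratio
```

Called with `twitter_data['created_at'].tolist()`, this gives a fast numeric companion to the slower LLM analysis.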
Reddit Data Collection
```python
# reddit_collector.py
import praw
import pandas as pd

class RedditCollector:
    def __init__(self, config):
        self.reddit = praw.Reddit(
            client_id=config['client_id'],
            client_secret=config['client_secret'],
            user_agent=config['user_agent']
        )

    def collect_crypto_posts(self, coin_symbol, subreddits=None):
        """Collect Reddit posts about specific cryptocurrency"""
        # Default set here rather than as a mutable default argument
        subreddits = subreddits or ['CryptoCurrency', 'altcoin']
        posts_data = []
        for subreddit_name in subreddits:
            subreddit = self.reddit.subreddit(subreddit_name)
            # Search recent posts
            for post in subreddit.search(coin_symbol, time_filter='day', limit=100):
                posts_data.append({
                    'id': post.id,
                    'title': post.title,
                    'text': post.selftext,
                    'score': post.score,
                    'num_comments': post.num_comments,
                    'created_utc': post.created_utc,
                    'author': str(post.author)
                })
        return pd.DataFrame(posts_data)
```
Creating the Ollama Analysis Engine
Pump Signal Detection Model
```python
# ollama_analyzer.py
import requests

class OllamaAnalyzer:
    def __init__(self, model_name="llama2:7b"):
        self.model_name = model_name
        self.base_url = "http://localhost:11434"

    def analyze_pump_signals(self, text_data):
        """Analyze text for pump and dump indicators"""
        prompt = f"""
        Analyze this social media content for cryptocurrency pump and dump signals.

        Content: {text_data}

        Look for these indicators:
        1. Urgency language ("act now", "limited time")
        2. Unrealistic price predictions
        3. Coordinated messaging patterns
        4. Lack of fundamental analysis
        5. Celebrity or influencer endorsements

        Rate the pump risk from 1-10 and explain your reasoning.
        Format: Risk_Score: X, Reasoning: [explanation]
        """
        response = requests.post(
            f"{self.base_url}/api/generate",
            json={
                "model": self.model_name,
                "prompt": prompt,
                "stream": False
            }
        )
        if response.status_code == 200:
            result = response.json()
            return self._parse_analysis(result['response'])
        return {"risk_score": 0, "reasoning": "Analysis failed"}

    def _parse_analysis(self, response_text):
        """Extract risk score and reasoning from Ollama response"""
        risk_score = 0
        reasoning = ""
        for line in response_text.split('\n'):
            if 'Risk_Score:' in line:
                try:
                    risk_score = int(line.split(':')[1].strip())
                except (ValueError, IndexError):
                    risk_score = 5  # Default medium risk on unparseable output
            elif 'Reasoning:' in line:
                reasoning = line.split(':', 1)[1].strip()
        return {"risk_score": risk_score, "reasoning": reasoning}
```
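LLM output rarely follows a template exactly – the model may answer `Risk Score: 8/10` on one line, or surround the fields with prose. A regex-based parser (a sketch; `parse_risk_response` is a hypothetical drop-in for `_parse_analysis`) tolerates those variations:

```python
# Tolerant parser for "Risk_Score: X, Reasoning: ..." style model replies
import re

def parse_risk_response(response_text):
    """Pull a 1-10 risk score and free-text reasoning out of a model reply."""
    score_match = re.search(r"risk[\s_]*score\s*[:=]\s*(\d+)", response_text, re.IGNORECASE)
    reason_match = re.search(r"reasoning\s*[:=]\s*(.+)", response_text, re.IGNORECASE | re.DOTALL)
    score = int(score_match.group(1)) if score_match else 0
    return {
        # Clamp whatever the model said into the 1-10 band
        "risk_score": max(1, min(score, 10)) if score_match else 0,
        "reasoning": reason_match.group(1).strip() if reason_match else "",
    }

print(parse_risk_response("Sure! Risk Score: 8\nReasoning: identical rocket-emoji posts."))
```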
Sentiment Analysis Enhancement
```python
# sentiment_analyzer.py
import requests

class SentimentAnalyzer:
    def __init__(self, ollama_analyzer):
        self.ollama = ollama_analyzer

    def analyze_sentiment_shift(self, texts, timestamps):
        """Detect rapid sentiment changes indicating manipulation"""
        sentiments = []
        for text in texts:
            prompt = f"""
            Analyze the sentiment of this crypto-related text.
            Text: {text}
            Rate sentiment from -5 (very negative) to +5 (very positive).
            Consider artificial enthusiasm vs genuine excitement.
            Format: Sentiment_Score: X, Authenticity: [genuine/artificial]
            """
            # Send this prompt directly to Ollama; routing it through
            # analyze_pump_signals() would wrap it inside a second prompt
            response = requests.post(
                f"{self.ollama.base_url}/api/generate",
                json={"model": self.ollama.model_name, "prompt": prompt, "stream": False}
            )
            score = 0
            if response.status_code == 200:
                score = self._parse_sentiment(response.json()['response'])
            sentiments.append(score)
        return self._detect_manipulation_patterns(sentiments, timestamps)

    def _parse_sentiment(self, response_text):
        """Extract the numeric sentiment score from the model response"""
        for line in response_text.split('\n'):
            if 'Sentiment_Score:' in line:
                try:
                    return int(line.split(':')[1].split(',')[0].strip())
                except (ValueError, IndexError):
                    return 0
        return 0

    def _detect_manipulation_patterns(self, sentiments, timestamps):
        """Identify coordinated sentiment manipulation"""
        # Look for sudden clusters of strongly positive sentiment,
        # which often precede rapid reversals
        positive_spikes = [i for i, score in enumerate(sentiments) if score >= 4]
        return {
            "manipulation_detected": len(positive_spikes) > 5,
            "spike_indices": positive_spikes,
            "pattern_strength": len(positive_spikes) / len(sentiments) if sentiments else 0
        }
```
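The spike-then-reversal signature itself is easy to detect once the sentiments are numeric. This pure-function sketch (the `spike` and `drop` thresholds are assumptions) flags indices where sentiment jumps high and later collapses:

```python
def detect_spike_reversal(scores, spike=3, drop=4):
    """Return indices where sentiment rises to at least `spike` and later
    falls by at least `drop` - the pump-then-dump signature."""
    reversals = []
    for i, score in enumerate(scores):
        if score >= spike:
            # Any later score falling far enough below this peak marks a reversal
            if any(score - later >= drop for later in scores[i + 1:]):
                reversals.append(i)
    return reversals

# Enthusiasm builds, peaks, then collapses to negative sentiment:
print(detect_spike_reversal([0, 1, 4, 5, 5, -2, -3]))  # [2, 3, 4]
```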
Implementing Real-Time Monitoring
Automated Detection Pipeline
```python
# pump_detector.py
import time
import logging
from datetime import datetime

class PumpDumpDetector:
    def __init__(self, twitter_collector, reddit_collector, ollama_analyzer):
        self.twitter = twitter_collector
        self.reddit = reddit_collector
        self.analyzer = ollama_analyzer
        self.alert_threshold = 7  # Risk score threshold

    def monitor_coin(self, coin_symbol, interval_minutes=30):
        """Continuously monitor a cryptocurrency for pump signals"""
        logging.info(f"Starting monitoring for ${coin_symbol}")
        while True:
            try:
                # Collect recent social media data
                twitter_data = self.twitter.collect_crypto_mentions(coin_symbol, hours_back=2)
                reddit_data = self.reddit.collect_crypto_posts(coin_symbol)

                # Analyze for pump signals
                alerts = self._analyze_data(coin_symbol, twitter_data, reddit_data)

                # Send alerts if high risk detected
                if alerts:
                    self._send_alerts(coin_symbol, alerts)

                # Wait before next analysis
                time.sleep(interval_minutes * 60)
            except Exception as e:
                logging.error(f"Monitoring error: {e}")
                time.sleep(300)  # Wait 5 minutes on error

    def _analyze_data(self, coin_symbol, twitter_data, reddit_data):
        """Analyze collected data for pump indicators"""
        alerts = []

        # Analyze Twitter content (truncated so it fits the model's context window)
        if not twitter_data.empty:
            combined_tweets = ' '.join(twitter_data['text'].tolist())[:8000]
            twitter_analysis = self.analyzer.analyze_pump_signals(combined_tweets)
            if twitter_analysis['risk_score'] >= self.alert_threshold:
                alerts.append({
                    'platform': 'Twitter',
                    'risk_score': twitter_analysis['risk_score'],
                    'reasoning': twitter_analysis['reasoning'],
                    'post_count': len(twitter_data)
                })

        # Analyze Reddit content
        if not reddit_data.empty:
            combined_posts = ' '.join(reddit_data['title'].fillna('').tolist())[:8000]
            reddit_analysis = self.analyzer.analyze_pump_signals(combined_posts)
            if reddit_analysis['risk_score'] >= self.alert_threshold:
                alerts.append({
                    'platform': 'Reddit',
                    'risk_score': reddit_analysis['risk_score'],
                    'reasoning': reddit_analysis['reasoning'],
                    'post_count': len(reddit_data)
                })
        return alerts

    def _send_alerts(self, coin_symbol, alerts):
        """Send pump detection alerts"""
        timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
        for alert in alerts:
            message = f"""
🚨 PUMP ALERT: ${coin_symbol}
Platform: {alert['platform']}
Risk Score: {alert['risk_score']}/10
Posts Analyzed: {alert['post_count']}
Time: {timestamp}
Analysis: {alert['reasoning']}
"""
            print(message)  # Replace with your actual alert system
            logging.warning(f"Pump alert for {coin_symbol}: {alert}")
```
Volume Analysis Integration
```python
# volume_analyzer.py
import requests

class VolumeAnalyzer:
    def __init__(self):
        self.coingecko_url = "https://api.coingecko.com/api/v3"

    def check_volume_spike(self, coin_id, hours_back=24):
        """Check for unusual trading volume spikes"""
        # Get historical volume data. CoinGecko returns hourly granularity
        # automatically for 2-90 day ranges.
        url = f"{self.coingecko_url}/coins/{coin_id}/market_chart"
        params = {'vs_currency': 'usd', 'days': 7}
        response = requests.get(url, params=params)
        if response.status_code != 200:
            return {"volume_spike": False, "multiplier": 0}

        data = response.json()
        volumes = [point[1] for point in data['total_volumes']]
        if len(volumes) <= hours_back:
            # Not enough history to form a baseline
            return {"volume_spike": False, "multiplier": 0}

        # Compare the most recent hour against the pre-window average
        recent_volume = volumes[-1]
        baseline = volumes[:-hours_back]
        avg_volume = sum(baseline) / len(baseline)
        volume_multiplier = recent_volume / avg_volume if avg_volume > 0 else 0

        return {
            "volume_spike": volume_multiplier > 3.0,  # 300% increase threshold
            "multiplier": volume_multiplier,
            "recent_volume": recent_volume,
            "average_volume": avg_volume
        }
```
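The multiplier arithmetic is worth isolating as a pure function so it can be unit-tested without hitting CoinGecko at all. A minimal sketch (`volume_multiplier` and its `recent_window` parameter are names introduced here for illustration):

```python
def volume_multiplier(volumes, recent_window=1):
    """Mean of the most recent `recent_window` points divided by the mean
    of everything before them; 0.0 when there is no usable baseline."""
    if len(volumes) <= recent_window:
        return 0.0
    recent = volumes[-recent_window:]
    baseline = volumes[:-recent_window]
    avg_recent = sum(recent) / len(recent)
    avg_baseline = sum(baseline) / len(baseline)
    return avg_recent / avg_baseline if avg_baseline > 0 else 0.0

# A 4x jump in the latest hour against a flat baseline:
print(volume_multiplier([100, 100, 100, 100, 400]))  # 4.0
```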
Advanced Pattern Recognition
Coordinated Attack Detection
```python
# coordination_detector.py
from collections import defaultdict
import pandas as pd

class CoordinationDetector:
    def __init__(self):
        self.posting_patterns = defaultdict(list)

    def detect_coordinated_posting(self, social_data):
        """Identify coordinated posting patterns across accounts"""
        # Group posts by time windows (15-minute intervals)
        time_groups = defaultdict(list)
        for _, post in social_data.iterrows():
            time_window = self._get_time_window(post['created_at'])
            time_groups[time_window].append({
                'author': post.get('author_id', post.get('author')),
                'text': post['text'] if 'text' in post else post.get('title', ''),
                'engagement': post.get('like_count', post.get('score', 0))
            })

        coordination_signals = []
        for time_window, posts in time_groups.items():
            if len(posts) >= 5:  # Minimum threshold for coordination
                similarity_score = self._calculate_text_similarity(posts)
                author_diversity = len(set(p['author'] for p in posts))
                # Highly similar text from few distinct authors suggests a campaign
                if similarity_score > 0.7 and author_diversity < len(posts) * 0.6:
                    coordination_signals.append({
                        'time_window': time_window,
                        'post_count': len(posts),
                        'similarity_score': similarity_score,
                        'author_diversity': author_diversity / len(posts),
                        'posts': posts
                    })
        return coordination_signals

    def _get_time_window(self, timestamp):
        """Group timestamps into 15-minute windows"""
        if isinstance(timestamp, str):
            timestamp = pd.to_datetime(timestamp)
        # Round down to nearest 15 minutes
        minute = (timestamp.minute // 15) * 15
        return timestamp.replace(minute=minute, second=0, microsecond=0)

    def _calculate_text_similarity(self, posts):
        """Calculate average pairwise Jaccard similarity between posts"""
        texts = [post['text'].lower() for post in posts]
        similarities = []
        for i, text1 in enumerate(texts):
            for text2 in texts[i + 1:]:
                # Jaccard similarity over word sets
                words1 = set(text1.split())
                words2 = set(text2.split())
                if words1 and words2:
                    similarities.append(len(words1 & words2) / len(words1 | words2))
        return sum(similarities) / len(similarities) if similarities else 0
```
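The similarity measure used above is word-level Jaccard: intersection over union of the word sets. A standalone sketch makes the behavior concrete – near-identical copypasta scores high, unrelated text scores near zero:

```python
def jaccard(text1, text2):
    """Word-level Jaccard similarity between two posts."""
    w1, w2 = set(text1.lower().split()), set(text2.lower().split())
    if not w1 or not w2:
        return 0.0
    return len(w1 & w2) / len(w1 | w2)

a = "buy $XYZ now before it moons"
b = "buy $XYZ now before it moons 🚀"
c = "ethereum gas fees dropped after the upgrade"
print(jaccard(a, b))  # high - likely copy-paste
print(jaccard(a, c))  # 0.0 - unrelated
```

The 0.7 threshold in the detector sits well above what organically worded posts typically reach with this metric, though it is worth calibrating on real samples.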
Creating Alert Systems
Multi-Channel Notifications
```python
# alert_system.py
import smtplib
import requests
from datetime import datetime
from email.mime.text import MIMEText

class AlertSystem:
    def __init__(self, config):
        self.email_config = config.get('email', {})
        self.webhook_urls = config.get('webhooks', [])

    def send_pump_alert(self, coin_symbol, analysis_results):
        """Send pump detection alert through multiple channels"""
        alert_data = {
            'coin': coin_symbol,
            'timestamp': datetime.now().isoformat(),
            'overall_risk': max(r.get('risk_score', 0) for r in analysis_results),
            'platforms_affected': len(analysis_results),
            'details': analysis_results
        }
        # Send email alert
        if self.email_config:
            self._send_email_alert(alert_data)
        # Send webhook notifications
        for webhook_url in self.webhook_urls:
            self._send_webhook_alert(webhook_url, alert_data)

    def _send_email_alert(self, alert_data):
        """Send email notification"""
        subject = f"🚨 Pump Alert: ${alert_data['coin']}"
        body = f"""
Cryptocurrency Pump Detection Alert

Coin: ${alert_data['coin']}
Overall Risk Score: {alert_data['overall_risk']}/10
Platforms Affected: {alert_data['platforms_affected']}
Time: {alert_data['timestamp']}

Analysis Details:
"""
        for detail in alert_data['details']:
            body += f"\n{detail['platform']}: Risk {detail['risk_score']}/10"
            body += f"\n  Reasoning: {detail['reasoning']}\n"

        msg = MIMEText(body)
        msg['Subject'] = subject
        msg['From'] = self.email_config['from']
        msg['To'] = self.email_config['to']
        try:
            with smtplib.SMTP(self.email_config['smtp_server'], self.email_config['port']) as server:
                server.starttls()
                server.login(self.email_config['username'], self.email_config['password'])
                server.send_message(msg)
        except Exception as e:
            print(f"Email alert failed: {e}")

    def _send_webhook_alert(self, webhook_url, alert_data):
        """Send webhook notification (Discord, Slack, etc.)"""
        payload = {
            "content": f"🚨 **Pump Alert: ${alert_data['coin']}**",
            "embeds": [{
                "title": "Cryptocurrency Pump Detection",
                "color": 16711680,  # Red
                "fields": [
                    {"name": "Coin", "value": f"${alert_data['coin']}", "inline": True},
                    {"name": "Risk Score", "value": f"{alert_data['overall_risk']}/10", "inline": True},
                    {"name": "Platforms", "value": str(alert_data['platforms_affected']), "inline": True}
                ],
                "timestamp": alert_data['timestamp']
            }]
        }
        try:
            requests.post(webhook_url, json=payload, timeout=10)
        except Exception as e:
            print(f"Webhook alert failed: {e}")
```
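A monitor that fires every cycle will spam your channels during an extended pump. A small throttle in front of `send_pump_alert` suppresses repeats – a sketch, with `AlertThrottle` and the 30-minute cooldown being assumptions you can tune:

```python
import time

class AlertThrottle:
    """Suppress repeat alerts for the same coin within a cooldown window."""
    def __init__(self, cooldown_seconds=1800, clock=time.monotonic):
        self.cooldown = cooldown_seconds
        self.clock = clock  # injectable for testing
        self.last_sent = {}

    def should_send(self, coin_symbol):
        now = self.clock()
        last = self.last_sent.get(coin_symbol)
        if last is not None and now - last < self.cooldown:
            return False  # still cooling down for this coin
        self.last_sent[coin_symbol] = now
        return True
```

In the detector loop: `if throttle.should_send(coin_symbol): alert_system.send_pump_alert(...)`.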
Running Your Detection System
Main Application Setup
```python
# main.py
import logging
from config import TWITTER_CONFIG, REDDIT_CONFIG
from twitter_collector import TwitterCollector
from reddit_collector import RedditCollector
from ollama_analyzer import OllamaAnalyzer
from pump_detector import PumpDumpDetector

def main():
    # Configure logging
    logging.basicConfig(
        level=logging.INFO,
        format='%(asctime)s - %(levelname)s - %(message)s',
        handlers=[
            logging.FileHandler('pump_detector.log'),
            logging.StreamHandler()
        ]
    )

    # Initialize components
    twitter_collector = TwitterCollector(TWITTER_CONFIG)
    reddit_collector = RedditCollector(REDDIT_CONFIG)
    ollama_analyzer = OllamaAnalyzer()

    # Create detector
    detector = PumpDumpDetector(
        twitter_collector,
        reddit_collector,
        ollama_analyzer
    )

    # List of coins to monitor
    watch_list = ['SHIB', 'DOGE', 'PEPE', 'FLOKI']  # Add your coins

    # Note: monitor_coin() loops forever, so this sequential loop only ever
    # reaches the first coin - run each coin in a separate thread for production
    for coin in watch_list:
        print(f"Monitoring ${coin} for pump and dump activity...")
        detector.monitor_coin(coin, interval_minutes=15)

if __name__ == "__main__":
    main()
```
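Because `monitor_coin()` blocks forever, the sequential loop in `main()` never gets past the first coin. A thread-per-coin variant fixes that; `monitor_all` is a hypothetical helper sketched here, not part of the modules above:

```python
# Threaded monitoring sketch: one daemon thread per coin in the watch list
import threading

def monitor_all(detector, watch_list, interval_minutes=15):
    """Start one daemon thread per coin and return the thread handles."""
    threads = []
    for coin in watch_list:
        t = threading.Thread(
            target=detector.monitor_coin,
            args=(coin,),
            kwargs={"interval_minutes": interval_minutes},
            daemon=True,  # don't block interpreter shutdown
            name=f"monitor-{coin}",
        )
        t.start()
        threads.append(t)
    return threads
```

Daemon threads keep the sketch simple; for graceful shutdown you would instead pass a stop event into `monitor_coin` and join the threads.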
Performance Optimization Tips
System Resources:
- Use lightweight Ollama models (7B parameters) for faster analysis
- Implement caching for repeated text analysis
- Batch process social media data every 15-30 minutes
- Store historical patterns for improved accuracy
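Caching is especially effective here because pump campaigns recycle identical copypasta. A sketch keyed by a hash of the input text (`AnalysisCache` is a name introduced for illustration):

```python
import hashlib

class AnalysisCache:
    """Memoize analyses keyed by a hash of the input text, so identical
    copypasta posts only hit the LLM once."""
    def __init__(self, analyze_fn):
        self.analyze_fn = analyze_fn
        self._store = {}
        self.hits = 0

    def analyze(self, text):
        key = hashlib.sha256(text.encode("utf-8")).hexdigest()
        if key in self._store:
            self.hits += 1  # served from cache, no model call
        else:
            self._store[key] = self.analyze_fn(text)
        return self._store[key]
```

Wrap the analyzer once – `cache = AnalysisCache(ollama_analyzer.analyze_pump_signals)` – and call `cache.analyze(text)` everywhere else.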
API Rate Limits:
- Rotate between multiple API keys
- Implement exponential backoff for failed requests
- Cache social media data to reduce API calls
- Use webhooks where available instead of polling
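Exponential backoff with jitter can be sketched in a few lines; the retry counts and base delay below are assumptions to tune per API:

```python
import random
import time

def with_backoff(fn, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Retry fn(), doubling the delay each attempt with random jitter."""
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries, surface the error
            # Jitter spreads retries out so clients don't stampede together
            delay = base_delay * (2 ** attempt) * (0.5 + random.random() / 2)
            sleep(delay)
```

Usage: `data = with_backoff(lambda: requests.get(url, timeout=10).json())`.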
Interpreting Detection Results
Risk Score Interpretation
Risk Levels:
- 1-3 (Low Risk): Normal market activity with organic discussions
- 4-6 (Medium Risk): Increased attention but mixed signals
- 7-8 (High Risk): Strong pump indicators detected across platforms
- 9-10 (Critical Risk): Coordinated manipulation campaign in progress
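These bands map directly onto a small helper that the alerting code can branch on (a sketch; the cutoffs mirror the list above):

```python
def risk_level(score):
    """Map a 1-10 risk score onto low/medium/high/critical bands."""
    if not 1 <= score <= 10:
        raise ValueError("risk score must be between 1 and 10")
    if score <= 3:
        return "low"
    if score <= 6:
        return "medium"
    if score <= 8:
        return "high"
    return "critical"

print(risk_level(8))  # high
```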
False Positive Management
Common False Positives:
- Legitimate news events driving organic excitement
- Major exchange listings or partnerships
- Meme coin communities with natural enthusiasm
- Influencer mentions without coordination
Filtering Techniques:
- Cross-reference with news sources
- Check fundamental analysis metrics
- Verify trading volume patterns
- Monitor account age and authenticity
Action Guidelines
When High Risk Detected:
- Avoid buying during suspected pumps
- Set stop-losses on existing positions
- Monitor order books for large sell walls
- Wait for price stabilization before entering
Legal and Ethical Considerations
Using social media analysis for pump detection serves to protect investors rather than facilitate manipulation. This system helps identify potentially illegal market manipulation schemes that harm retail investors.
Important Disclaimers:
- This tool detects patterns but cannot guarantee accuracy
- Cryptocurrency markets remain highly volatile and risky
- Always conduct your own research before making investment decisions
- Consider this information as one factor among many in your analysis
Regulatory Compliance:
- Pump and dump schemes violate securities laws in most jurisdictions
- Report suspected manipulation to relevant authorities
- Use detection tools for protection, not participation in illegal activities
Advanced Features and Extensions
Machine Learning Integration
```python
# ml_enhancement.py
from sklearn.ensemble import RandomForestClassifier
import numpy as np

class MLPumpDetector:
    def __init__(self):
        self.model = RandomForestClassifier(n_estimators=100)
        self.features = [
            'sentiment_score', 'volume_spike', 'post_frequency',
            'account_age_avg', 'coordination_score', 'price_volatility'
        ]

    def train_model(self, historical_data):
        """Train ML model on historical pump patterns"""
        X = historical_data[self.features]
        y = historical_data['is_pump']  # Binary label: 1 = confirmed pump
        self.model.fit(X, y)

    def predict_pump_probability(self, current_features):
        """Predict probability of ongoing pump"""
        feature_array = np.array(
            [current_features[f] for f in self.features]
        ).reshape(1, -1)
        proba = self.model.predict_proba(feature_array)[0]
        return {
            'pump_probability': proba[1],
            'confidence': max(proba)
        }
```
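The missing piece is turning the outputs of the earlier detectors into labeled training rows. A hypothetical glue sketch – field names mirror the `features` list, and `build_training_row` is a name introduced here:

```python
# Flatten one monitoring snapshot into an ordered feature row + label
FEATURES = [
    'sentiment_score', 'volume_spike', 'post_frequency',
    'account_age_avg', 'coordination_score', 'price_volatility'
]

def build_training_row(snapshot, was_pump):
    """Return (feature_row, label); raises KeyError if a feature is missing."""
    row = [float(snapshot[name]) for name in FEATURES]
    return row, int(was_pump)

snapshot = {
    'sentiment_score': 4.2, 'volume_spike': 1.0, 'post_frequency': 180,
    'account_age_avg': 12.5, 'coordination_score': 0.83, 'price_volatility': 0.41,
}
row, label = build_training_row(snapshot, was_pump=True)
print(row, label)
```

Accumulate these rows (with labels assigned after the fact, once a pump is confirmed or ruled out) into the `historical_data` DataFrame that `train_model` expects.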
Conclusion
Crypto pump and dump prediction through Ollama social media analysis provides a powerful defense against market manipulation. By monitoring coordination patterns, sentiment shifts, and volume spikes, you can protect your investments from predatory schemes.
This comprehensive detection system combines AI-powered text analysis with real-time monitoring to identify manipulation before it impacts your portfolio. The open-source approach ensures you maintain control over your trading intelligence while building expertise in cryptocurrency trading security.
Remember that no detection system is perfect – use these tools as part of a broader risk management strategy that includes diversification, position sizing, and fundamental analysis. Stay vigilant, stay protected, and happy trading!