# ChatGPT Actions integration
Source: https://docs.dexpaprika.com/ai-integration/chatgpt-actions
Integrate DexPaprika data with ChatGPT using OpenAPI actions for real-time crypto and DeFi data access directly in your conversations.
## What are ChatGPT Actions?
ChatGPT Actions allow you to connect ChatGPT to external APIs, enabling it to retrieve real-time data and perform actions beyond its training data. With DexPaprika's ChatGPT Actions integration, you can access live cryptocurrency and DeFi data directly within your ChatGPT conversations.
**Ready-to-use OpenAPI URL:** Visit [mcp.dexpaprika.com/openapi](https://mcp.dexpaprika.com/openapi) to get the pre-configured OpenAPI specification URL for immediate use with ChatGPT Actions.
## Prerequisites
* **ChatGPT Plus or Enterprise subscription** - Actions are only available for paid ChatGPT users
* Access to **ChatGPT's custom GPT creation** interface
* Basic understanding of API integrations (helpful but not required)
***
## Step-by-Step setup guide
### Step 1: Create a new Custom GPT
1. **Log in to ChatGPT** and navigate to your dashboard
2. **Click "Explore"** in the sidebar
3. **Select "Create a GPT"** from the top of the page
4. **Choose "Configure"** tab for manual setup
### Step 2: Configure your GPT
Fill in the basic information for your custom GPT:
* **Name**: `DexPaprika Crypto Assistant`
* **Description**: `Get real-time cryptocurrency and DeFi data across 20+ blockchain networks`
* **Instructions**:
```
You are a cryptocurrency and DeFi data assistant powered by DexPaprika. You can access real-time data about:
- Blockchain networks and their supported DEXes
- Liquidity pools and their metrics (TVL, volume, fees)
- Token prices and market data
- DEX trading activity and analytics
- Cross-chain comparisons and analysis
Always provide accurate, up-to-date information and explain complex DeFi concepts clearly. When showing pool data, include relevant metrics like TVL, 24h volume, and fee tiers when available.
```
### Step 3: Add the DexPaprika Action
1. **Scroll down to the "Actions" section**
2. **Click "Create new action"**
3. **Get the OpenAPI URL** from [mcp.dexpaprika.com/openapi](https://mcp.dexpaprika.com/openapi)
4. **Import the schema**:
* Select **"Import from URL"**
* Paste the OpenAPI URL: `https://mcp.dexpaprika.com/openapi`
* Click **"Import"**
The OpenAPI specification will automatically configure all available DexPaprika endpoints, including networks, pools, tokens, and search functionality.
### Step 4: Configure action settings
1. **Authentication**: Select "None" (DexPaprika API is publicly accessible)
2. **Privacy policy**: Add `https://dexpaprika.com/terms/` if required
3. **Action description**: The description will be auto-filled from the OpenAPI spec
### Step 5: Test and Publish
1. **Click "Test"** to verify the integration works
2. **Try a sample query**: "What are the top 5 liquidity pools on Ethereum?"
3. **Click "Save"** when everything works correctly
4. **Choose visibility**: Keep private or share with others
***
## Usage examples
Once your ChatGPT Action is set up, you can ask questions like:
### Network and DEX queries
* "Which blockchain networks does DexPaprika support?"
* "What are the top DEXes on Solana by trading volume?"
* "Show me all available DEXes on Arbitrum"
### Pool analysis
* "What are the most liquid USDC/ETH pools across all networks?"
* "Find the highest volume pools on Uniswap V3"
* "Compare fees between different USDT/USDC pools"
### Token information
* "What's the current price of SOL across different DEXes?"
* "Show me all pools containing PEPE token"
* "Get detailed information about Chainlink token on Ethereum"
### Market research
* "Find newly created pools with high trading volume"
* "What's the total trading volume on PancakeSwap today?"
* "Compare liquidity between Ethereum and Polygon networks"
### Advanced analytics
* "Analyze arbitrage opportunities between Uniswap and SushiSwap"
* "Show me pools with unusual price movements in the last 24 hours"
* "Find the most profitable liquidity provision opportunities"
**Want to explore more?** Visit [mcp.dexpaprika.com](https://mcp.dexpaprika.com) to test the interface, check out our [tutorials](/tutorials/tutorial_intro) for step-by-step guides, or browse the [API documentation](/api-reference/introduction) for technical details.
***
## Troubleshooting
**Symptoms**: The Action fails to import or queries return connection errors.

**Solutions**:
1. Verify the OpenAPI URL is correctly imported: `https://mcp.dexpaprika.com/openapi`
2. Check that your ChatGPT subscription includes Actions (Plus or Enterprise)
3. Try recreating the Action if import failed
4. Ensure no typos in the OpenAPI URL
**Symptoms**: Queries return empty or irrelevant results.

**Solutions**:
1. Be more specific in your queries (include network names, token symbols)
2. Try different phrasings for your questions
3. Check if the requested data exists (some networks/tokens may have limited pools)
**Symptoms**: Responses are slow or queries time out.

**Solutions**:
1. Break complex queries into smaller, focused questions
2. Ask for specific data rather than broad overviews
3. Be patient - real-time data fetching may take a few seconds
***
## What's next?
* Explore all available endpoints and their capabilities
* Try our Model Context Protocol integration for Claude
* Zero-setup MCP integration for Claude and Cursor
* Learn how to discover newly created liquidity pools
## Need Help?
* Connect with our community for real-time support
* Contact our team for technical assistance
**Custom integrations needed?** Our team can help you build advanced ChatGPT Actions tailored to your specific use cases. [Contact us](mailto:support@coinpaprika.com) to discuss your requirements.
# DexPaprika Hosted MCP Server
Source: https://docs.dexpaprika.com/ai-integration/hosted-mcp-server
Access real-time DeFi data in Claude, Cursor, and other MCP-compatible tools without any installation or setup. Our hosted MCP server provides instant access to comprehensive DEX data across 20+ blockchains.
Having trouble connecting? We're here to help - [reach out](mailto:support@coinpaprika.com) and we'll get you up and running.
## Why use our hosted MCP server?
Skip the installation hassle and get instant access to comprehensive DeFi data. Our hosted MCP server eliminates setup complexity while providing enterprise-grade reliability and performance.
**What you get instantly:**
* **Zero setup required** - just add the URL to your config
* **Multiple transport options** - SSE, streamable HTTP, and JSON-RPC support
* **Always up-to-date** - we handle updates and maintenance
* **Enterprise reliability** - High availability and performance
* **Real-time data** - live prices, volumes, and pool information
* **20+ blockchain networks** - Ethereum, Solana, Base, Arbitrum, and more
Visit [mcp.dexpaprika.com](https://mcp.dexpaprika.com) to explore our hosted MCP server interface and see the available data before integrating.
***
## Quick integration guide
### Claude desktop setup
Adding our hosted MCP server to Claude Desktop takes just two steps: update the configuration file, then restart the app.
Find your Claude Desktop configuration file:
* **macOS**: `~/Library/Application\ Support/Claude/claude_desktop_config.json`
* **Windows**: `%APPDATA%/Claude/claude_desktop_config.json`
* **Linux**: `~/.config/Claude/claude_desktop_config.json`
If the file doesn't exist, create it with this content:
```json
{
  "mcpServers": {
    "dexpaprika": {
      "url": "https://mcp.dexpaprika.com/sse"
    }
  }
}
```
If the file already exists, add the DexPaprika entry to the existing `mcpServers` object:
```json
"dexpaprika": {
  "url": "https://mcp.dexpaprika.com/sse"
}
```
Save the file and restart Claude Desktop. You're ready to go!
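The create-or-merge step above can also be scripted. A minimal sketch, assuming the macOS config path from the list above (adjust for your OS); it preserves any servers already configured:

```python
import json
from pathlib import Path

def add_dexpaprika_server(config_path: Path) -> dict:
    """Insert the hosted DexPaprika MCP server into claude_desktop_config.json,
    creating the file from scratch if it does not exist yet."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    # Keep existing entries; only add or overwrite the "dexpaprika" server.
    config.setdefault("mcpServers", {})["dexpaprika"] = {
        "url": "https://mcp.dexpaprika.com/sse"
    }
    config_path.parent.mkdir(parents=True, exist_ok=True)
    config_path.write_text(json.dumps(config, indent=2))
    return config

# e.g. on macOS:
# add_dexpaprika_server(
#     Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"
# )
```

Restart Claude Desktop afterwards, exactly as in the manual flow.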
### Cursor setup
1. Open Cursor IDE
2. Go to **Settings** (Cmd/Ctrl + ,)
3. Navigate to **Tools & Integrations**
4. Click **New MCP server**
This will open the `mcp.json` file. Add the DexPaprika server configuration:
```json
{
  "mcpServers": {
    "dexpaprika": {
      "url": "https://mcp.dexpaprika.com/sse"
    }
  }
}
```
Save the file and restart Cursor if needed.
### ChatGPT integration
Want to use DexPaprika data in ChatGPT? You can integrate our API directly into ChatGPT using Actions:
1. **Create a custom GPT** in ChatGPT (requires Plus or Enterprise subscription)
2. **Add our OpenAPI specification** from [mcp.dexpaprika.com/openapi](https://mcp.dexpaprika.com/openapi)
3. **Start asking crypto questions** directly in your ChatGPT conversations
**Ready to set up ChatGPT?** Follow our complete [ChatGPT Actions integration guide](/ai-integration/chatgpt-actions) for step-by-step instructions.
***
## Connection options
Our hosted MCP server supports multiple transport protocols to ensure compatibility with different clients and use cases:
### Server-Sent Events (SSE)

**Endpoint:** `https://mcp.dexpaprika.com/sse`
**Best for:** Claude Desktop, Cursor, and most MCP clients\
**Benefits:** Real-time streaming updates, excellent browser compatibility, automatic reconnection\
**Use cases:** Live price monitoring, real-time pool updates, continuous market data feeds
This is the recommended option for most users as it provides the smoothest experience with popular AI tools.
### Streamable HTTP

**Endpoint:** `https://mcp.dexpaprika.com/streamable-http`
**Best for:** Custom applications, web services, and clients that prefer HTTP streaming\
**Benefits:** Standard HTTP protocol, works well with firewalls, easier debugging\
**Use cases:** Integration with existing web infrastructure, corporate environments with strict network policies
Perfect for developers building custom integrations or working in environments where SSE might be restricted.
### JSON-RPC

**Endpoint:** `https://mcp.dexpaprika.com/json-rpc`
**Best for:** Traditional API integrations, batch processing, simple request-response patterns\
**Benefits:** Familiar REST-like interface, stateless communication, easy to cache\
**Use cases:** Periodic data fetching, batch analysis, integration with existing JSON-RPC systems
Ideal for applications that don't need real-time updates and prefer traditional API communication patterns.
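As a concrete illustration of the JSON-RPC option, request bodies follow the standard JSON-RPC 2.0 envelope that MCP uses. A minimal sketch of constructing one (the `tools/list` method name comes from the MCP specification; verify the exact methods your client needs against the server):

```python
import json

def jsonrpc_request(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 request body as a JSON string."""
    body = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        body["params"] = params
    return json.dumps(body)

# Ask an MCP server which tools it exposes:
payload = jsonrpc_request("tools/list")
# POST this payload to https://mcp.dexpaprika.com/json-rpc
# with Content-Type: application/json to receive the tool list.
```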
**Getting started?** Use the SSE endpoint (`https://mcp.dexpaprika.com/sse`) for the best experience with Claude Desktop and Cursor. You can always switch protocols later based on your specific needs.
**Need help choosing?** Visit [mcp.dexpaprika.com](https://mcp.dexpaprika.com) to test different connection methods and see which works best for your setup.
***
## Available features
Our hosted MCP server provides comprehensive access to DeFi data:
### Core data access
* **Multi-chain support** - 20+ blockchain networks including Ethereum, Solana, Base, Arbitrum, Polygon, and more
* **Real-time prices** - Live token prices and market data
* **Liquidity pools** - Detailed pool information, volumes, and fees
* **DEX analytics** - Trading data across major decentralized exchanges
* **Search functionality** - Find tokens, pools, and DEXes across all networks
### Advanced analytics
* **Historical data** - Price history and trading volumes over time
* **Pool monitoring** - Track new pool creation and liquidity changes
* **Cross-chain comparisons** - Compare prices and liquidity across different networks
* **Volume analysis** - Trading volume trends and patterns
***
## Usage examples
Once configured, you can ask Claude or Cursor powerful questions about DeFi data:
### Example queries you can make
| Category | Example Query |
| --------------------------- | ------------------------------------------------------------------------------------------- |
| **Basic market data** | "What are the top 5 liquidity pools by volume today?" |
| **Network information** | "Which blockchain networks does DexPaprika support?" |
| **Token prices** | "What's the current price of SOL across different DEXes?" |
| **Pool analysis** | "Show me the most liquid USDC/ETH pools across all networks and compare their trading fees" |
| **New opportunities** | "Find newly created liquidity pools in the last 24 hours with volume over \$100k" |
| **Cross-chain comparison** | "Compare ETH prices between Ethereum mainnet and Layer 2 solutions like Arbitrum and Base" |
| **DEX performance** | "Which DEX has the highest trading volume on Solana today?" |
| **Token discovery** | "Search for meme tokens with high trading volume in the last hour" |
| **Arbitrage opportunities** | "Find price differences for USDC across different DEXes on Ethereum" |
**Want to explore more?** Visit [mcp.dexpaprika.com](https://mcp.dexpaprika.com) to test the interface, check out our [tutorials](/tutorials/tutorial_intro) for step-by-step guides, or browse the [API documentation](/api-reference/introduction) for technical details.
***
## Benefits over self-hosted solutions
| Feature | Hosted MCP | Self-Hosted MCP |
| --------------- | -------------------- | ------------------------------ |
| **Setup time** | \< 2 minutes | 15-30 minutes |
| **Maintenance** | Zero - we handle it | Regular updates required |
| **Reliability** | High availability | Depends on your infrastructure |
| **Performance** | Optimized hosting | Limited by your server |
| **Updates** | Automatic | Manual intervention |
| **Support** | Professional support | Community only |
***
## Troubleshooting
### Common issues and solutions
**Symptoms**: Claude shows connection errors or timeouts when making requests.
**Solutions**:
1. Check your internet connection
2. Restart Claude Desktop/Cursor
3. Verify the configuration syntax in your config file
4. Try removing and re-adding the server configuration
**Symptoms**: Can't locate the Claude Desktop configuration file.
**Solutions**:
1. Create the directories if they don't exist:
* **macOS**: `mkdir -p ~/Library/Application\ Support/Claude/`
* **Windows**: Create the `Claude` folder in your `%APPDATA%` directory
2. Create the `claude_desktop_config.json` file manually
3. Ensure proper JSON syntax
**Symptoms**: The DexPaprika server doesn't appear in Claude's available tools.
**Solutions**:
1. Verify JSON syntax in your configuration file
2. Restart Claude Desktop completely
3. Check that the server URL is correct: `https://mcp.dexpaprika.com/sse`
4. Try removing and re-adding the server configuration
**Symptoms**: Getting rate limit errors or API failures.
**Solutions**:
1. Our hosted service includes built-in rate limiting protection
2. If you're hitting limits, contact support for assistance
3. Try spacing out your requests if making many in quick succession
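For point 3, spacing requests out can be automated with a small client-side backoff helper. A sketch with illustrative delay values (these are not service guarantees):

```python
import time

def backoff_delays(retries=4, base=1.0, factor=2.0, cap=30.0):
    """Yield capped exponential-backoff delays: 1s, 2s, 4s, 8s by default."""
    delay = base
    for _ in range(retries):
        yield min(delay, cap)
        delay *= factor

def call_with_backoff(fn, retries=4):
    """Run fn, sleeping between failed attempts; the final attempt raises."""
    for delay in backoff_delays(retries):
        try:
            return fn()
        except Exception:
            time.sleep(delay)
    return fn()
```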
***
## API coverage
Our hosted MCP server provides access to all DexPaprika API endpoints:
* Access all supported blockchain networks and their available decentralized exchanges
* Comprehensive pool data including TVL, volume, fees, and token pairs
* Real-time token prices, market data, and detailed token information
* Price history, volume trends, and historical pool performance
* Powerful search across tokens, pools, and DEXes with filtering options
* Live data feeds with the latest market information and trading activity
***
## What's next?
* Explore all available data endpoints and their capabilities
* Learn how to discover newly created liquidity pools and tokens
* Want to run your own MCP server? Check out our self-hosted guide
* Access and analyze historical price and volume data
## Need help?
* Connect with our community and get real-time support from other builders
* Reach out directly for technical support or feature requests
**Looking for custom integrations?** Our team can help you integrate DexPaprika data into any application or workflow. [Contact us](mailto:support@coinpaprika.com) to discuss your specific needs.
# Installing MCP Server for DexPaprika
Source: https://docs.dexpaprika.com/ai-integration/mcp
Integrating DexPaprika data with Claude.ai using Model Context Protocol (MCP)
## What is MCP?
MCP (Model Context Protocol) is an [open protocol standard introduced by Anthropic](https://modelcontextprotocol.io/introduction) that gives AI models a unified way to connect to external data sources and tools. The DexPaprika MCP server leverages this protocol to provide AI assistants like Claude with access to real-time crypto and DeFi market data, enabling advanced conversations about blockchain networks, decentralized exchanges (DEXes), liquidity pools, and tokens across the DeFi ecosystem.
**Looking for an easier setup?** If you prefer not to install anything locally, check out our [DexPaprika Hosted MCP Server](/ai-integration/hosted-mcp-server) for instant access with zero installation required.
## Installation Guide
### Prerequisites
Before installing the DexPaprika MCP server, ensure you have:
* [Node.js](https://nodejs.org/) (v16 or higher) installed on your system
* [npm](https://www.npmjs.com/) (comes with Node.js) or [yarn](https://yarnpkg.com/) package manager
* [Claude Desktop](https://claude.ai/desktop) or [Cursor](https://cursor.sh/) installed if you want to use the MCP server with these applications
### Installation Options
#### Option 1: Global Installation (Recommended)
Installing the DexPaprika MCP server globally makes it available throughout your system:
```bash
npm install -g dexpaprika-mcp
```
After installation, you can start the server by running:
```bash
dexpaprika-mcp
```
#### Option 2: Use with npx (No Installation)
Alternatively, you can run the server directly without installation using npx:
```bash
npx dexpaprika-mcp
```
This is useful for trying out the server without permanently installing it.
### Verification
To verify that your installation was successful, run:
```bash
dexpaprika-mcp --version
```
You should see the current version number of the DexPaprika MCP server.
## Configuration
### Claude Desktop Configuration
To use the DexPaprika MCP server with Claude Desktop:
1. Download and install [Claude Desktop](https://claude.ai/desktop) if you haven't already
2. Locate your Claude Desktop configuration file:
* **macOS**: `~/Library/Application\ Support/Claude/claude_desktop_config.json`
* **Windows**: `%APPDATA%/Claude/claude_desktop_config.json`
3. If the file doesn't exist, create it with the following content:
```json
{
  "mcpServers": {
    "dexpaprika": {
      "command": "dexpaprika-mcp"
    }
  }
}
```
4. If the file already exists, add the DexPaprika configuration to the existing `mcpServers` object:
```json
"dexpaprika": {
  "command": "dexpaprika-mcp"
}
```
5. Save the file and restart Claude Desktop.
6. To verify the configuration, open Claude Desktop and try asking a question about cryptocurrency data, such as "What are the top liquidity pools on Ethereum?"
Make sure you have the latest version of our package by running `npm update -g dexpaprika-mcp`.
### Cursor Configuration
If you're using Cursor IDE with Claude:
1. Download and install [Cursor](https://cursor.sh/) if you haven't already
2. Open Cursor and click on the Claude button in the sidebar
3. Click on the settings icon (⚙️) and select "Add MCP Server"
4. Fill in the following information:
* **Server name**: `dexpaprika`
* **Type**: `command` (select from dropdown)
* **Command to run**: `npx dexpaprika-mcp`
5. Click "Add" to save the configuration
6. Alternatively, Cursor will automatically use any MCP servers configured in Claude Desktop
## Troubleshooting
If you encounter issues with the DexPaprika MCP server:
1. **Server not found errors**:
* Ensure the server is installed correctly using `npm list -g dexpaprika-mcp`
* Try reinstalling with `npm install -g dexpaprika-mcp`
2. **Connection errors**:
* Check that your internet connection is active
* Verify that no firewall is blocking the connection
3. **Configuration errors**:
* Double-check your configuration file syntax
* Ensure the path to the configuration file is correct for your OS
4. **Command not found errors**:
* Ensure Node.js is installed and in your PATH
* Try using the full path to the npm or npx executables
## Features
The DexPaprika MCP server provides access to:
* Blockchain network information across multiple chains
* Decentralized exchange (DEX) data
* Liquidity pool details and metrics
* Token information and market data
* Price and volume analytics for tokens and pools
* Comprehensive search capabilities across DeFi entities
## Available Tools
The DexPaprika MCP server provides the following tools to Claude:
1. **getNetworks** - Retrieve a list of all supported blockchain networks and their metadata
2. **getNetworkDexes** - Get a list of available decentralized exchanges on a specific network
3. **getTopPools** - Get a paginated list of top liquidity pools from all networks
4. **getNetworkPools** - Get a list of top liquidity pools on a specific network
5. **getDexPools** - Get top pools on a specific DEX within a network
6. **getPoolDetails** - Get detailed information about a specific pool on a network
7. **getTokenDetails** - Get detailed information about a specific token on a network
8. **search** - Search for tokens, pools, and DEXes by name or identifier
9. **getStats** - Get high-level statistics about the DexPaprika ecosystem
## Usage Examples
Once configured, you can ask Claude questions about DeFi data. Here are some example prompts:
### General Market Data
* "What are the top 5 liquidity pools across all networks by volume?"
### Network-Specific Queries
* "Which blockchain networks are supported by DexPaprika?"
* "What are the top DEXes on the Solana network?"
* "Show me the top 10 liquidity pools on Ethereum, ordered by volume."
### DEX and Pool Analysis
* "What are the most active pools on Uniswap V3?"
* "Show me details about the USDC/ETH pool on Uniswap V3."
* "Compare the trading volume between PancakeSwap and Uniswap."
### Token Information
* "What's the current price of SOL in the Raydium pool on Solana?"
* "Find all pools that include the SHIB token."
### Search Functionality
* "Search for pools related to 'Bitcoin'"
* "Find tokens with 'Pepe' in their name"
## Advanced Queries
You can also ask Claude to perform more complex analysis:
* "Compare the liquidity and volume of the top 3 DEXes on Ethereum"
* "What's the price difference of ETH between Uniswap and SushiSwap?"
* "Show me the tokens with the highest price volatility in the last 24 hours"
* "Analyze the trading volume trends for BNB on PancakeSwap"
## Support
If you need further assistance, you can:
* Contact DexPaprika support by [email](mailto:support@coinpaprika.com)
* Join the [Discord community](https://discord.gg/DhJge5TUGM)
# Get a list of available dexes on a network.
Source: https://docs.dexpaprika.com/api-reference/dexes/get-a-list-of-available-dexes-on-a-network
get /networks/{network}/dexes
# Getting Started
Source: https://docs.dexpaprika.com/api-reference/introduction
The DexPaprika DEX API provides near real-time data about tokens, liquidity pools, and decentralized exchanges on over 18 blockchains. Below you can find the most popular endpoints that you can use to build your own applications:
Get detailed information about any token on a given network like latest price, liquidity, and trading volume
Access liquidity pool data and trading statistics for a given pool address
Search for tokens, pools, and DEXes
## Quick Start
We will make a GET request to the [Token](/api-reference/tokens/get-a-tokens-latest-data-on-a-network) endpoint to get the latest USD price of TRUMP. All we need is the [network ID](/api-reference/networks/get-a-list-of-available-blockchain-networks) and the token address.
```bash
curl -X GET "https://api.dexpaprika.com/networks/solana/tokens/6p6xgHyF7AeE6TZkSmFsko444wqoP15icUSqi2jfGiPN"
```
This will return latest data about TRUMP (TRUMP) on Solana:
```json Response [expandable]
{
  "id": "6p6xgHyF7AeE6TZkSmFsko444wqoP15icUSqi2jfGiPN",
  "name": "OFFICIAL TRUMP",
  "symbol": "TRUMP",
  "chain": "solana",
  "decimals": 6,
  "total_supply": 1000000000000000,
  "description": "",
  "website": "",
  "explorer": "",
  "added_at": "2025-01-17T23:26:21Z",
  "summary": {
    "price_usd": 13.14091959123721,
    "fdv": 13140919591.23721,
    "liquidity_usd": 149423162.71562782,
    "24h": {
      "volume": 21068441.593753017,
      "volume_usd": 281073979.6428536,
      "sell": 64806,
      "buy": 69661,
      "txns": 134467
    },
    "6h": {
      "volume": 3947009.6413350017,
      "volume_usd": 51291873.69781224,
      "sell": 15636,
      "buy": 16434,
      "txns": 32070
    },
    "1h": {
      "volume": 412765.81969400006,
      "volume_usd": 5443526.242503976,
      "sell": 1965,
      "buy": 1934,
      "txns": 3899
    },
    "30m": {
      "volume": 138223.89740200003,
      "volume_usd": 1826042.0489568561,
      "sell": 666,
      "buy": 942,
      "txns": 1608
    },
    "15m": {
      "volume": 102904.73769500002,
      "volume_usd": 1356865.0163336615,
      "sell": 499,
      "buy": 634,
      "txns": 1133
    },
    "5m": {
      "volume": 16003.968976,
      "volume_usd": 210782.49285099082,
      "sell": 77,
      "buy": 155,
      "txns": 232
    }
  },
  "last_updated": "2025-02-25T13:42:32.093353071Z"
}
```
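The summary fields tie together arithmetically: `fdv` equals `price_usd` multiplied by the decimal-adjusted supply. A quick sanity check against the sample values above:

```python
# Values taken from the sample TRUMP response above.
price_usd = 13.14091959123721
total_supply = 1000000000000000  # raw integer units
decimals = 6

supply = total_supply / 10**decimals  # 1,000,000,000 whole tokens
fdv = price_usd * supply              # fully diluted valuation in USD

assert supply == 1_000_000_000
assert abs(fdv - 13140919591.23721) < 1e-3
```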
## Base URL
All API endpoints use the following base URL:
```
https://api.dexpaprika.com/
```
## Common Use Cases
Here are some popular ways to use our API:
```bash
# Get USDC price and trading data
curl -X GET "https://api.dexpaprika.com/networks/solana/tokens/EPjFWdd5AufqSSqeM2qN1xzybapC8G4wEGGkZwyTDt1v"
```
```bash
# Get all pools where SOL is traded
curl -X GET "https://api.dexpaprika.com/networks/solana/tokens/So11111111111111111111111111111111111111112/pools"
```
```bash
# Search for "Jupiter" token
curl -X GET "https://api.dexpaprika.com/networks/solana/search?query=jupiter"
```
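The same requests can be made from Python using only the standard library. A minimal sketch (error handling omitted; the commented calls mirror the curl examples above):

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://api.dexpaprika.com"

def build_url(path, **params):
    """Join the base URL, endpoint path, and optional query parameters."""
    url = BASE_URL + path
    if params:
        url += "?" + urllib.parse.urlencode(params)
    return url

def get_json(path, **params):
    """GET a DexPaprika endpoint and decode the JSON response."""
    with urllib.request.urlopen(build_url(path, **params)) as resp:
        return json.load(resp)

# Mirrors the curl examples above (uncomment to hit the live API):
# usdc = get_json("/networks/solana/tokens/EPjFWdd5AufqSSqeM2qN1xzybapC8G4wEGGkZwyTDt1v")
# jupiter = get_json("/networks/solana/search", query="jupiter")
```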
## Rate Limits
The API is free to use with a rate limit of:
* 10,000 requests per day
Need higher limits? [Contact us](mailto:support@dexpaprika.com) to discuss enterprise options.
# Get a list of available blockchain networks.
Source: https://docs.dexpaprika.com/api-reference/networks/get-a-list-of-available-blockchain-networks
get /networks
Retrieve a list of all supported blockchain networks, including metadata
like display names and associated details. Ideal for building dropdowns
or querying supported networks for your application.
# Get a pool on a network.
Source: https://docs.dexpaprika.com/api-reference/pools/get-a-pool-on-a-network
get /networks/{network}/pools/{pool_address}
Retrieve detailed information about a specific on-chain pool,
including token pairs, current price data, and volume metrics.
# Get OHLCV data for a pool pair.
Source: https://docs.dexpaprika.com/api-reference/pools/get-ohlcv-data-for-a-pool-pair
get /networks/{network}/pools/{pool_address}/ohlcv
Retrieves Open-High-Low-Close-Volume (OHLCV) data for a specific pool,
potentially over a specified time range.
- **start** is **required** to set the beginning of the data window.
- **end** is optional; if omitted, data is returned for the "start" date only.
- **limit** can control how many data points to retrieve (e.g., maximum of 100).
- **interval** defines the granularity (e.g., 1h, 4h, 1d).
- **inverted_price** indicates whether to invert the main price ratio.
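Assembled into a request URL, a day of hourly candles might look like the sketch below (the pool address is a placeholder, and the ISO date format for `start`/`end` is an assumption to verify against the API reference):

```python
import urllib.parse

def ohlcv_url(network, pool_address, start, end=None, limit=24, interval="1h"):
    """Build the OHLCV endpoint URL described above."""
    params = {"start": start, "limit": limit, "interval": interval}
    if end is not None:
        params["end"] = end
    return (
        f"https://api.dexpaprika.com/networks/{network}"
        f"/pools/{pool_address}/ohlcv?" + urllib.parse.urlencode(params)
    )

# e.g. hourly candles for one day on a (placeholder) Ethereum pool:
url = ohlcv_url("ethereum", "0xPOOL", start="2025-01-01", end="2025-01-02")
```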
# Get top X pools. (DEPRECATED)
Source: https://docs.dexpaprika.com/api-reference/pools/get-top-x-pools
get /pools
**THIS ENDPOINT HAS BEEN DEPRECATED AND WILL BE REMOVED.**
It now returns a `410 Gone` status. Please use [`/networks/{network}/pools`](/api-reference/pools/get-top-x-pools-on-a-network) instead to get top pools for each specific blockchain network.
---
*Original Description (for historical reference):*
Retrieves a paginated list of top pools from all (or specific) networks.
Allows sorting and ordering, providing aggregated volume, price data,
and token details for each pool.
### Migration Examples:
Instead of:
```bash
curl -X GET "https://api.dexpaprika.com/pools"
```
Use network-specific endpoints:
```bash
# Get top pools on Ethereum
curl -X GET "https://api.dexpaprika.com/networks/ethereum/pools"
# Get top pools on Solana
curl -X GET "https://api.dexpaprika.com/networks/solana/pools"
# Get top pools on Fantom
curl -X GET "https://api.dexpaprika.com/networks/fantom/pools"
```
This change provides better performance and more relevant, network-specific results.
# Get top X pools. (DEPRECATED)
Source: https://docs.dexpaprika.com/api-reference/pools/get-top-x-pools-deprecated
get /pools
**THIS ENDPOINT HAS BEEN DEPRECATED AND WILL BE REMOVED.**
It now returns a 410 Gone status. Please refer to our API documentation for alternatives.
---
*Original Description (for historical reference):*
Retrieves a paginated list of top pools from all (or specific) networks.
Allows sorting and ordering, providing aggregated volume, price data,
and token details for each pool.
# Get top X pools on a network.
Source: https://docs.dexpaprika.com/api-reference/pools/get-top-x-pools-on-a-network
get /networks/{network}/pools
Retrieves a paginated list of top pools on a specific network.
Supports sorting and ordering by different parameters. The response
includes volume, price data, and token details for each pool.
# Get top X pools on a network's DEX.
Source: https://docs.dexpaprika.com/api-reference/pools/get-top-x-pools-on-a-networks-dex
get /networks/{network}/dexes/{dex}/pools
Retrieves a paginated list of top pools on a specific network's DEX.
Supports sorting and ordering, returning essential price data and token details.
# Get transactions of a pool on a network. Paging can be used up to 100 pages.
Source: https://docs.dexpaprika.com/api-reference/pools/get-transactions-of-a-pool-on-a-network-paging-can-be-used-up-to-100-pages
get /networks/{network}/pools/{pool_address}/transactions
# Search for tokens, pools, and DEXes
Source: https://docs.dexpaprika.com/api-reference/search/search-for-tokens-pools-and-dexes
get /search
Allows users to search across multiple entities (tokens, pools, and DEXes)
in a single query. Useful for quickly finding resources by name, symbol, or ID.
# Get a token's latest data on a network.
Source: https://docs.dexpaprika.com/api-reference/tokens/get-a-tokens-latest-data-on-a-network
get /networks/{network}/tokens/{token_address}
Retrieves detailed information about a specific token on the given network,
including latest price, metadata, status, and recent summary metrics such as price changes
and volumes over multiple timeframes. The `total_supply` field is a raw integer; to get the actual supply, move the decimal point left by the `decimals` value: actual supply = `total_supply` / 10^`decimals`.
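For instance, with the raw values from the TRUMP quick-start response (`total_supply` = 1000000000000000, `decimals` = 6):

```python
raw_supply = 1000000000000000  # raw integer as returned by the API
decimals = 6

actual_supply = raw_supply / 10**decimals
assert actual_supply == 1_000_000_000  # one billion whole tokens
```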
# Get top X pools for a token.
Source: https://docs.dexpaprika.com/api-reference/tokens/get-top-x-pools-for-a-token
get /networks/{network}/tokens/{token_address}/pools
Retrieves a paginated list of liquidity pools that involve the specified token,
including details like current price, volume in USD, and tokens present in each pool.
Useful for analytics, DEX front-ends, or portfolio tracking.
# Retrieve high-level asset statistics
Source: https://docs.dexpaprika.com/api-reference/utils/retrieve-high-level-asset-statistics
get /stats
Provides a snapshot of the total number of chains, factories, pools,
and tokens tracked by this API. Ideal for overview dashboards or
quick system capacity checks.
# Changelog
Source: https://docs.dexpaprika.com/changelog/changelog
Track all updates and changes to DexPaprika API
This page contains all significant updates, improvements, and bug fixes for the DexPaprika API. We're committed to making our product better with each release. Check our [API reference](/api-reference) for the latest version.
## Enhanced
* **Transaction Schema:**
* Added `token_0_symbol` and `token_1_symbol` fields to transaction objects for explicit token symbol tracking.
* Added `price_0`, `price_1`, `price_0_usd`, and `price_1_usd` fields to transaction objects for detailed price reporting.
* Added `created_at` field to transaction objects for precise transaction timestamping.
## Migration/Compatibility
* No breaking changes to existing endpoints, but clients parsing transaction objects should update their models to support the new fields for full compatibility.
## Deprecated
* **BREAKING**: The [`/pools`](/api-reference/pools/get-top-x-pools) endpoint has been permanently deprecated and now returns `410 Gone`
* Users should migrate to [`/networks/{network}/pools`](/api-reference/pools/get-top-x-pools-on-a-network) to get top pools for each specific network
## Changed
* Enhanced deprecation messaging with clear migration paths for affected endpoints
## Migration Guide
Instead of using the deprecated global pools endpoint:
```
GET /pools
```
Use the network-specific pools endpoint for each blockchain:
```
GET /networks/ethereum/pools
GET /networks/solana/pools
GET /networks/fantom/pools
```
This change provides better performance and more relevant results by focusing on network-specific data.
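The migration boils down to building one URL per network instead of one global URL. A minimal Go sketch (the `api.dexpaprika.com` base URL is an assumption for illustration):

```go
package main

import "fmt"

// poolsURL builds the network-specific replacement for the deprecated
// global /pools endpoint.
func poolsURL(network string) string {
	return fmt.Sprintf("https://api.dexpaprika.com/networks/%s/pools", network)
}

func main() {
	// One request per network of interest replaces the single global call
	for _, network := range []string{"ethereum", "solana", "fantom"} {
		fmt.Println(poolsURL(network))
	}
}
```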
## Added
* Introduced the optional `reorder` query parameter to the [`/networks/{network}/tokens/{token_address}/pools`](/api-reference/tokens/get-top-x-pools-for-a-token) endpoint. This allows clients to reorder pool data so the specified token becomes the primary token for all metrics and calculations.
## Added
* Added operation IDs to all endpoints for better client code generation
* Added fully diluted valuation (`fdv`) field to token responses
* Added detailed license information and contact details in API specification
* Added `buy_usd` and `sell_usd` fields to provide monetary values for trades
* Added `last_price_usd_change` field to all time intervals (24h, 6h, 1h, 30m, 15m, 5m, 1m)
* Added organized API tags with descriptions for better navigation
## Changed
* Updated OpenAPI specification from 3.0.3 to 3.1.0 for improved documentation
* Changed [OHLCV endpoint](/api-reference/pools/get-ohlcv-data-for-a-pool-pair) response format from wrapped object to direct array of records (Breaking Change)
* Renamed `buy`/`sell` fields to `buys`/`sells` for consistency (Breaking Change)
* Changed [network](/api-reference/networks) response format from object to array for cleaner consumption
* Updated network schema with improved field naming (`display_name` instead of just `name`)
* Standardized ID field in Network schema to use string identifiers instead of numeric IDs
* Enhanced token schema with additional fields: chain, total\_supply, added\_at, last\_updated
* Improved parameter documentation with examples and clearer descriptions
## Fixed
* Improved consistency in representing null values in responses
* Updated example responses to more accurately reflect actual API behavior
* Fixed formatting inconsistencies in API documentation
## Improved
* Enhanced OHLCV schema with time\_open and time\_close fields to clearly define candlestick periods
* Improved validation for time intervals and limits in OHLCV endpoint
* Added explicit error response documentation for the /stats endpoint
* Enhanced descriptions for all endpoints and parameters
## Added
* Support for buy/sell volume metrics across all time intervals
* Added transaction counts to pool details
* Added initial support for price tracking
## Changed
* Improved error messaging with more specific error codes
* Enhanced documentation with more descriptive examples
## Fixed
* Corrected timestamp format inconsistencies across endpoints
* Fixed incorrect price calculations in some edge cases
## Added
* Initial public beta release
* Support for Solana network
* Basic endpoints for [networks](/api-reference/networks), [DEXes](/api-reference/dexes/get-a-list-of-available-dexes-on-a-network), [pools](/api-reference/pools/get-top-x-pools), and [tokens](/api-reference/tokens/get-a-tokens-latest-data-on-a-network)
* [Search functionality](/api-reference/search/search-for-tokens-pools-and-dexes) across tokens, pools, and DEXes
* OHLCV data for historical price tracking
* Transaction history for pools
## Added
* Pool and token details endpoints
* Initial version of search functionality
## Changed
* Improved error handling and response formats
* Enhanced documentation with examples
## Added
* Initial development release
* Preliminary endpoints for pools and DEXes
# DexPaprika Go SDK
Source: https://docs.dexpaprika.com/get-started/sdk-go
The official Go client library for the DexPaprika API, providing easy access to decentralized exchange data across multiple blockchain networks
## Installation
```bash
go get github.com/coinpaprika/dexpaprika-sdk-go
```
## Prerequisites
* Go 1.24 or higher
* Connection to the internet to access the DexPaprika API
* No API key required
## Quick Example: Get Token Price
```go
package main
import (
	"context"
	"fmt"
	"log"
	"time"

	"github.com/coinpaprika/dexpaprika-sdk-go/dexpaprika"
)

func main() {
	// Create client
	client := dexpaprika.NewClient()

	// Create context with timeout
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// Get WETH token details on Ethereum
	token, err := client.Tokens.GetDetails(ctx, "ethereum", "0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2")
	if err != nil {
		log.Fatalf("Error getting token details: %v", err)
	}

	fmt.Printf("%s: $%.2f\n", token.Name, token.PriceUSD)
	// Output: Wrapped Ether: $3245.67
}
```
## API Methods Reference
Parameters marked with an asterisk (\*) are required.
### client.Networks.List(ctx)
**Endpoint:** [GET `/networks`](/api-reference/networks/get-a-list-of-available-blockchain-networks)
Gets all supported blockchain networks including Ethereum, Solana, etc.
**Parameters:**
* `ctx`\* - Context for API request
**Returns:** Network IDs, names, and related information. [Response Structure](/api-reference/networks/get-a-list-of-available-blockchain-networks).
```go
networks, err := client.Networks.List(ctx)
```
### client.Networks.ListDexes(ctx, networkId, options)
**Endpoint:** [GET `/networks/{network}/dexes`](/api-reference/dexes/get-a-list-of-available-dexes-on-a-network)
Gets all DEXes on a specific network.
**Parameters:**
* `ctx`\* - Context for API request
* `networkId`\* - ID of the network (e.g., 'ethereum', 'solana')
* `options` - ListOptions containing pagination parameters:
* `Page` - Page number for pagination (starts at 0)
* `Limit` - Number of results per page
**Returns:** DEX IDs, names, pool counts, and volume information. [Response Structure](/api-reference/dexes/get-a-list-of-available-dexes-on-a-network).
```go
options := &dexpaprika.ListOptions{
Page: 0,
Limit: 10,
}
dexes, err := client.Networks.ListDexes(ctx, "ethereum", options)
```
### client.Pools.List(ctx, options)
**Endpoint:** [GET `/pools`](/api-reference/pools/get-top-x-pools)
Gets top pools across all networks with pagination. Note that the Changelog marks the underlying global `/pools` endpoint as deprecated (it returns `410 Gone`); prefer `ListByNetwork` for network-specific results.
**Parameters:**
* `ctx`\* - Context for API request
* `options` - ListOptions containing pagination and sorting parameters:
* `Page` - Page number for pagination (starts at 0)
* `Limit` - Number of results per page
* `Sort` - Sort direction ('asc' or 'desc')
* `OrderBy` - Field to sort by ('volume\_usd', 'liquidity\_usd', etc.)
**Returns:** Paginated list of pool objects with pricing data. [Response Structure](/api-reference/pools/get-top-x-pools).
```go
options := &dexpaprika.ListOptions{
Limit: 10,
OrderBy: "volume_usd",
Sort: "desc",
}
pools, err := client.Pools.List(ctx, options)
```
***
### client.Pools.ListByNetwork(ctx, networkId, options)
**Endpoint:** [GET `/networks/{network}/pools`](/api-reference/pools/get-top-x-pools-on-a-network)
Gets pools on a specific network with pagination and sorting options.
**Parameters:**
* `ctx`\* - Context for API request
* `networkId`\* - ID of the network
* `options` - ListOptions containing pagination and sorting parameters
**Returns:** Paginated list of pools for the given network. [Response Structure](/api-reference/pools/get-top-x-pools-on-a-network).
```go
options := &dexpaprika.ListOptions{
Limit: 5,
OrderBy: "volume_usd",
Sort: "desc",
}
pools, err := client.Pools.ListByNetwork(ctx, "ethereum", options)
```
***
### client.Pools.ListByDex(ctx, networkId, dexId, options)
**Endpoint:** [GET `/networks/{network}/dexes/{dex}/pools`](/api-reference/pools/get-top-x-pools-on-a-networks-dex)
Gets pools on a specific DEX within a network with pagination.
**Parameters:**
* `ctx`\* - Context for API request
* `networkId`\* - ID of the network
* `dexId`\* - ID of the DEX
* `options` - ListOptions containing pagination and sorting parameters
**Returns:** Paginated list of pools for the specified DEX. [Response Structure](/api-reference/pools/get-top-x-pools-on-a-networks-dex).
```go
options := &dexpaprika.ListOptions{
Limit: 10,
OrderBy: "volume_usd",
Sort: "desc",
}
pools, err := client.Pools.ListByDex(ctx, "ethereum", "uniswap_v2", options)
```
***
### client.Pools.GetDetails(ctx, networkId, poolAddress, options)
**Endpoint:** [GET `/networks/{network}/pools/{pool_address}`](/api-reference/pools/get-a-pool-on-a-network)
Gets detailed information about a specific pool.
**Parameters:**
* `ctx`\* - Context for API request
* `networkId`\* - ID of the network
* `poolAddress`\* - On-chain address of the pool
* `options` - PoolDetailOptions containing:
* `Inversed` - Whether to invert the price ratio (boolean)
**Returns:** Detailed pool information including tokens, volumes, liquidity, and more. [Response Structure](/api-reference/pools/get-a-pool-on-a-network).
```go
options := &dexpaprika.PoolDetailOptions{
Inversed: false,
}
poolDetails, err := client.Pools.GetDetails(ctx, "ethereum", "0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc", options)
```
***
### client.Pools.GetTransactions(ctx, networkId, poolAddress, options)
**Endpoint:** [GET `/networks/{network}/pools/{pool_address}/transactions`](/api-reference/pools/get-transactions-of-a-pool-on-a-network-paging-can-be-used-up-to-100-pages)
Gets transaction history for a specific pool with pagination.
**Parameters:**
* `ctx`\* - Context for API request
* `networkId`\* - ID of the network
* `poolAddress`\* - On-chain address of the pool
* `options` - ListOptions containing pagination parameters
**Returns:** List of transactions with details about tokens, amounts, and timestamps. [Response Structure](/api-reference/pools/get-transactions-of-a-pool-on-a-network-paging-can-be-used-up-to-100-pages).
```go
options := &dexpaprika.ListOptions{
Limit: 20,
Page: 0,
}
transactions, err := client.Pools.GetTransactions(ctx, "ethereum", "0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc", options)
```
***
### client.Pools.GetOHLCV(ctx, networkId, poolAddress, options)
**Endpoint:** [GET `/networks/{network}/pools/{pool_address}/ohlcv`](/api-reference/pools/get-ohlcv-data-for-a-pool-pair)
Gets OHLCV (Open, High, Low, Close, Volume) chart data for a pool.
**Parameters:**
* `ctx`\* - Context for API request
* `networkId`\* - ID of the network
* `poolAddress`\* - On-chain address of the pool
* `options` - OHLCVOptions containing:
* `Start`\* - Start time (time.Time or string ISO format)
* `End` - End time (optional)
* `Limit` - Number of data points to return
* `Interval` - Time interval ('1h', '6h', '24h', etc.)
* `Inversed` - Whether to invert the price ratio (boolean)
**Returns:** Array of OHLCV data points for the specified time range and interval. [Response Structure](/api-reference/pools/get-ohlcv-data-for-a-pool-pair).
```go
// Start time 7 days ago
startTime := time.Now().AddDate(0, 0, -7)
options := &dexpaprika.OHLCVOptions{
Start: startTime,
Limit: 100,
Interval: "1h",
Inversed: false,
}
ohlcv, err := client.Pools.GetOHLCV(ctx, "ethereum", "0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc", options)
```
### client.Tokens.GetDetails(ctx, networkId, tokenAddress)
**Endpoint:** [GET `/networks/{network}/tokens/{token_address}`](/api-reference/tokens/get-a-tokens-latest-data-on-a-network)
Gets comprehensive token information.
**Parameters:**
* `ctx`\* - Context for API request
* `networkId`\* - ID of the network
* `tokenAddress`\* - Token contract address
**Returns:** Token details including price, market cap, volume, and metadata. [Response Structure](/api-reference/tokens/get-a-tokens-latest-data-on-a-network).
```go
token, err := client.Tokens.GetDetails(ctx, "ethereum", "0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2")
```
***
### client.Tokens.GetPools(ctx, networkId, tokenAddress, options, additionalTokenAddress)
**Endpoint:** [GET `/networks/{network}/tokens/{token_address}/pools`](/api-reference/tokens/get-top-x-pools-for-a-token)
Gets pools containing a specific token.
**Parameters:**
* `ctx`\* - Context for API request
* `networkId`\* - ID of the network
* `tokenAddress`\* - Token contract address
* `options` - ListOptions containing pagination and sorting parameters
* `additionalTokenAddress` - Optional second token address to filter for pairs with both tokens
**Returns:** List of pools where the queried token is found. [Response Structure](/api-reference/tokens/get-top-x-pools-for-a-token).
```go
options := &dexpaprika.ListOptions{
Limit: 10,
Page: 0,
OrderBy: "volume_usd",
Sort: "desc",
}
// Get all WETH-USDC pools
pools, err := client.Tokens.GetPools(
ctx,
"ethereum",
"0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2", // WETH
options,
"0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48", // USDC
)
```
### client.Search.Search(ctx, query)
**Endpoint:** [GET `/search`](/api-reference/search/search-for-tokens-pools-and-dexes)
Searches across tokens, pools, and DEXes using a query string.
**Parameters:**
* `ctx`\* - Context for API request
* `query`\* - Search query string
**Returns:** Matching entities from all categories (tokens, pools, DEXes). [Response Structure](/api-reference/search/search-for-tokens-pools-and-dexes).
```go
results, err := client.Search.Search(ctx, "ethereum")
```
### client.Utils.GetStats(ctx)
**Endpoint:** [GET `/stats`](/api-reference/utils/retrieve-high-level-asset-statistics)
Gets platform-wide statistics.
**Parameters:**
* `ctx`\* - Context for API request
**Returns:** Counts of chains, DEXes, pools, and tokens indexed. [Response Structure](/api-reference/utils/retrieve-high-level-asset-statistics).
```go
stats, err := client.Utils.GetStats(ctx)
```
## Complete Example
```go
package main
import (
	"context"
	"fmt"
	"log"
	"time"

	"github.com/coinpaprika/dexpaprika-sdk-go/dexpaprika"
)

func main() {
	// Initialize client
	client := dexpaprika.NewClient()

	// Create context with timeout
	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	// Get Ethereum network details
	networks, err := client.Networks.List(ctx)
	if err != nil {
		log.Fatalf("Error fetching networks: %v", err)
	}

	var ethereum *dexpaprika.Network
	for _, network := range networks {
		if network.ID == "ethereum" {
			ethereum = &network
			break
		}
	}
	if ethereum == nil {
		log.Fatal("Ethereum network not found")
	}
	fmt.Printf("Found %s with %d DEXes\n", ethereum.DisplayName, ethereum.DexesCount)

	// Get WETH token details
	weth, err := client.Tokens.GetDetails(
		ctx,
		"ethereum",
		"0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2",
	)
	if err != nil {
		log.Fatalf("Error fetching token details: %v", err)
	}
	fmt.Printf("%s price: $%.2f\n", weth.Name, weth.PriceUSD)

	// Find WETH/USDC pools
	options := &dexpaprika.ListOptions{
		Limit:   5,
		Page:    0,
		OrderBy: "volume_usd",
		Sort:    "desc",
	}
	pools, err := client.Tokens.GetPools(
		ctx,
		"ethereum",
		"0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2", // WETH
		options,
		"0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48", // USDC
	)
	if err != nil {
		log.Fatalf("Error fetching pools: %v", err)
	}

	// Show top pools
	fmt.Println("Top WETH/USDC pools:")
	for _, pool := range pools.Pools {
		fmt.Printf("%s: $%.2f 24h volume\n", pool.DexName, pool.VolumeUSD)
	}
}
```
## Advanced Features
### Error Handling
```go
import (
	"context"
	"errors"
	"fmt"

	"github.com/coinpaprika/dexpaprika-sdk-go/dexpaprika"
)

func handleAPIErrors() {
	client := dexpaprika.NewClient()
	ctx := context.Background()

	// Attempt to get a token with an invalid address
	_, err := client.Tokens.GetDetails(ctx, "ethereum", "0xinvalidaddress")
	if err != nil {
		// Check for specific error types
		var apiErr *dexpaprika.APIError
		if errors.As(err, &apiErr) {
			switch apiErr.StatusCode {
			case 404:
				fmt.Println("Token not found")
			case 429:
				fmt.Println("Rate limit exceeded, retry after a delay")
			case 500:
				fmt.Println("Server error, retry may succeed")
			default:
				fmt.Printf("API error: %s\n", apiErr.Message)
			}
		} else {
			// Handle non-API errors (like network issues)
			fmt.Printf("Non-API error: %v\n", err)
		}

		// Check if error is retryable
		if dexpaprika.IsRetryable(err) {
			fmt.Println("This error is retryable")
		}
	}
}
```
### Caching
```go
import (
	"context"
	"fmt"
	"time"

	"github.com/coinpaprika/dexpaprika-sdk-go/dexpaprika"
)

func useCaching() {
	// Create a regular client
	client := dexpaprika.NewClient()

	// Create a cached client with 5-minute TTL
	cachedClient := dexpaprika.NewCachedClient(client, nil, 5*time.Minute)

	// Create context
	ctx := context.Background()

	// First call - hits the API
	startTime := time.Now()
	networks, err := cachedClient.GetNetworks(ctx)
	if err != nil {
		fmt.Printf("Error: %v\n", err)
		return
	}
	fmt.Printf("First call (API): %d networks, took %v\n", len(networks), time.Since(startTime))

	// Second call - served from cache (much faster)
	startTime = time.Now()
	networks, err = cachedClient.GetNetworks(ctx)
	if err != nil {
		fmt.Printf("Error: %v\n", err)
		return
	}
	fmt.Printf("Second call (cached): %d networks, took %v\n", len(networks), time.Since(startTime))
}
```
### Pagination Helpers
```go
import (
	"context"
	"fmt"
	"time"

	"github.com/coinpaprika/dexpaprika-sdk-go/dexpaprika"
)

func usePagination() {
	client := dexpaprika.NewClient()
	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()

	// Create a paginator for Ethereum pools with 50 items per page
	options := &dexpaprika.ListOptions{
		Limit:   50,
		OrderBy: "volume_usd",
		Sort:    "desc",
	}
	paginator := dexpaprika.NewPoolsPaginator(client, options).ForNetwork("ethereum")

	// Count total pools processed
	totalPools := 0

	// Process first 3 pages (or fewer if there aren't that many)
	for i := 0; i < 3 && paginator.HasNextPage(); i++ {
		// Get next page of results
		if err := paginator.GetNextPage(ctx); err != nil {
			fmt.Printf("Error getting page: %v\n", err)
			break
		}

		// Process current page
		pools := paginator.GetCurrentPage()
		totalPools += len(pools)

		// Print the first few pools on each page
		fmt.Printf("=== Page %d ===\n", i+1)
		for j, pool := range pools {
			if j >= 3 {
				fmt.Printf("...and %d more pools\n", len(pools)-3)
				break
			}
			fmt.Printf("%s: $%.2f 24h volume\n", pool.DexName, pool.VolumeUSD)
		}
	}

	fmt.Printf("Processed %d pools total\n", totalPools)
}
```
### Custom Configuration
```go
import (
	"net/http"
	"time"

	"github.com/coinpaprika/dexpaprika-sdk-go/dexpaprika"
)

func configureClient() *dexpaprika.Client {
	// Create a client with custom configuration
	client := dexpaprika.NewClient(
		// Custom HTTP client with longer timeout
		dexpaprika.WithHTTPClient(&http.Client{
			Timeout: 60 * time.Second,
		}),
		// Custom retry configuration (5 retries with backoff)
		dexpaprika.WithRetryConfig(5, 2*time.Second, 30*time.Second),
		// Rate limiting to 3 requests per second
		dexpaprika.WithRateLimit(3.0),
		// Custom user agent
		dexpaprika.WithUserAgent("MyApp/1.0 DexPaprikaClient"),
	)
	return client
}
```
## Resources
* [GitHub Repository](https://github.com/coinpaprika/dexpaprika-sdk-go)
* [DexPaprika Website](https://dexpaprika.com)
* [API Reference](/api-reference/introduction)
## API Status
The DexPaprika API provides consistent data with stable endpoints. No API key is currently required to access the service. We aim to maintain backward compatibility and provide notice of any significant changes.
# DexPaprika PHP SDK
Source: https://docs.dexpaprika.com/get-started/sdk-php
The official PHP client library for the DexPaprika API, providing easy access to decentralized exchange data across multiple blockchain networks
## Installation
```bash
# Using Composer
composer require coinpaprika/dexpaprika-sdk-php
# From source
git clone https://github.com/coinpaprika/dexpaprika-sdk-php.git
cd dexpaprika-sdk-php
composer install
```
## Prerequisites
* PHP 7.4 or higher
* [Composer](https://getcomposer.org/)
* `ext-json` PHP extension
* Connection to the internet to access the DexPaprika API
* No API key required
## Quick Example: Get Token Price
```php
<?php
require 'vendor/autoload.php';

use DexPaprika\Client;

$client = new Client();
$weth = $client->tokens->getTokenDetails('ethereum', '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2');
echo "{$weth['name']}: \${$weth['price_usd']}";
// Output: Wrapped Ether: $3245.67
```
## API Methods Reference
Parameters marked with an asterisk (\*) are required.
### client->networks->getNetworks()
**Endpoint:** [GET `/networks`](/api-reference/networks/get-a-list-of-available-blockchain-networks)
Gets all supported blockchain networks including Ethereum, Solana, etc.
**Parameters:** None
**Returns:** Network IDs, names, and related information. [Response Structure](/api-reference/networks/get-a-list-of-available-blockchain-networks).
```php
// Get all networks
$networks = $client->networks->getNetworks();
echo "Found " . count($networks['networks']) . " networks";
```
### client->networks->findNetwork(networkId)
Gets a network by its ID.
**Parameters:**
* `networkId`\* - ID of the network to find (e.g., 'ethereum')
**Returns:** The network information or null if not found.
```php
// Find Ethereum network
$ethereum = $client->networks->findNetwork('ethereum');
if ($ethereum) {
echo "Found network: {$ethereum['display_name']}";
}
```
### client->networks->getNetworkDexes(networkId, options)
**Endpoint:** [GET `/networks/{network}/dexes`](/api-reference/dexes/get-a-list-of-available-dexes-on-a-network)
Gets all DEXes on a specific network.
**Parameters:**
* `networkId`\* - ID of the network (e.g., 'ethereum', 'solana')
* `options` - Additional options:
* `page` - Page number for pagination (starts at 0)
* `limit` - Number of results per page
**Returns:** DEX IDs, names, pool counts, and volume information. [Response Structure](/api-reference/dexes/get-a-list-of-available-dexes-on-a-network).
```php
// Get all DEXes on Ethereum
$dexes = $client->networks->getNetworkDexes('ethereum', ['limit' => 10]);
echo "Found " . count($dexes['dexes']) . " DEXes on Ethereum";
```
### client->networks->findDex(networkId, dexId)
Finds a specific DEX on a network by its ID.
**Parameters:**
* `networkId`\* - ID of the network
* `dexId`\* - ID of the DEX to find
**Returns:** The DEX information or null if not found.
```php
// Find Uniswap V3 on Ethereum
$dex = $client->networks->findDex('ethereum', 'uniswap-v3');
if ($dex) {
echo "Found DEX: {$dex['name']}";
}
```
### client->pools->getTopPools(options)
**Endpoint:** [GET `/pools`](/api-reference/pools/get-top-x-pools)
Gets top pools across all networks with pagination. Note that the Changelog marks the underlying global `/pools` endpoint as deprecated (it returns `410 Gone`); prefer `getNetworkPools` for network-specific results.
**Parameters:**
* `options` - Additional options:
* `page` - Page number for pagination (starts at 0)
* `limit` - Number of results per page
* `orderBy` - Field to sort by ('volume\_usd', 'liquidity\_usd', etc.)
* `sort` - Sort direction ('asc' or 'desc')
**Returns:** Paginated list of pool objects with pricing data. [Response Structure](/api-reference/pools/get-top-x-pools).
```php
// Get top 10 pools by volume
$topPools = $client->pools->getTopPools([
'limit' => 10,
'orderBy' => 'volume_usd',
'sort' => 'desc'
]);
echo "Top pool: {$topPools['pools'][0]['dex_name']}";
```
### client->pools->getNetworkPools(networkId, options)
**Endpoint:** [GET `/networks/{network}/pools`](/api-reference/pools/get-top-x-pools-on-a-network)
Gets pools on a specific network with pagination and sorting options.
**Parameters:**
* `networkId`\* - ID of the network
* `options` - Additional options:
* `page` - Page number for pagination (starts at 0)
* `limit` - Number of results per page
* `orderBy` - Field to sort by ('volume\_usd', 'liquidity\_usd', etc.)
* `sort` - Sort direction ('asc' or 'desc')
**Returns:** Paginated list of pools for the given network. [Response Structure](/api-reference/pools/get-top-x-pools-on-a-network).
```php
// Get top 5 pools on Ethereum by volume
$pools = $client->pools->getNetworkPools('ethereum', [
'limit' => 5,
'orderBy' => 'volume_usd',
'sort' => 'desc'
]);
echo "Found " . count($pools['pools']) . " pools on Ethereum";
```
### client->pools->getDexPools(networkId, dexId, options)
**Endpoint:** [GET `/networks/{network}/dexes/{dex}/pools`](/api-reference/pools/get-top-x-pools-on-a-networks-dex)
Gets pools on a specific DEX within a network with pagination.
**Parameters:**
* `networkId`\* - ID of the network
* `dexId`\* - ID of the DEX
* `options` - Additional options:
* `page` - Page number for pagination (starts at 0)
* `limit` - Number of results per page
* `orderBy` - Field to sort by ('volume\_usd', 'liquidity\_usd', etc.)
* `sort` - Sort direction ('asc' or 'desc')
**Returns:** Paginated list of pools for the specified DEX. [Response Structure](/api-reference/pools/get-top-x-pools-on-a-networks-dex).
```php
// Get top 10 Uniswap V3 pools on Ethereum
$uniswapPools = $client->pools->getDexPools('ethereum', 'uniswap-v3', [
'limit' => 10,
'orderBy' => 'volume_usd',
'sort' => 'desc'
]);
echo "Found " . count($uniswapPools['pools']) . " Uniswap V3 pools";
```
### client->pools->getPoolDetails(networkId, poolAddress, options)
**Endpoint:** [GET `/networks/{network}/pools/{pool_address}`](/api-reference/pools/get-a-pool-on-a-network)
Gets detailed information about a specific pool.
**Parameters:**
* `networkId`\* - ID of the network
* `poolAddress`\* - On-chain address of the pool
* `options` - Additional options:
* `inversed` - Whether to invert the price ratio (boolean)
**Returns:** Detailed pool information including tokens, volumes, liquidity, and more. [Response Structure](/api-reference/pools/get-a-pool-on-a-network).
```php
// Get details for a specific pool (WETH/USDC on Uniswap V2)
$pool = $client->pools->getPoolDetails(
'ethereum',
'0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc',
['inversed' => false]
);
echo "Pool: {$pool['tokens'][0]['symbol']}/{$pool['tokens'][1]['symbol']}";
```
### client->pools->getPoolTransactions(networkId, poolAddress, options)
**Endpoint:** [GET `/networks/{network}/pools/{pool_address}/transactions`](/api-reference/pools/get-transactions-of-a-pool-on-a-network-paging-can-be-used-up-to-100-pages)
Gets transaction history for a specific pool with pagination.
**Parameters:**
* `networkId`\* - ID of the network
* `poolAddress`\* - On-chain address of the pool
* `options` - Additional options:
* `page` - Page number for pagination
* `limit` - Number of transactions per page
* `cursor` - Transaction ID used for cursor-based pagination
**Returns:** List of transactions with details about tokens, amounts, and timestamps. [Response Structure](/api-reference/pools/get-transactions-of-a-pool-on-a-network-paging-can-be-used-up-to-100-pages).
```php
// Get the latest 20 transactions for a pool
$transactions = $client->pools->getPoolTransactions(
'ethereum',
'0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc',
['limit' => 20]
);
$latestTxTime = date('Y-m-d H:i:s', $transactions['transactions'][0]['block_timestamp']);
echo "Latest transaction: {$latestTxTime}";
```
### client->pools->getPoolOHLCV(networkId, poolAddress, start, options)
**Endpoint:** [GET `/networks/{network}/pools/{pool_address}/ohlcv`](/api-reference/pools/get-ohlcv-data-for-a-pool-pair)
Gets OHLCV (Open, High, Low, Close, Volume) chart data for a pool.
**Parameters:**
* `networkId`\* - ID of the network
* `poolAddress`\* - On-chain address of the pool
* `start`\* - Start time (ISO date string, YYYY-MM-DD, or Unix timestamp)
* `options` - Additional options:
* `end` - End time (optional)
* `limit` - Number of data points to return
* `interval` - Time interval ('1m', '5m', '15m', '30m', '1h', '6h', '12h', '24h')
* `inversed` - Whether to invert the price ratio (boolean)
**Returns:** Array of OHLCV data points for the specified time range and interval. [Response Structure](/api-reference/pools/get-ohlcv-data-for-a-pool-pair).
```php
// Get OHLCV data for the past 7 days with 1-hour intervals
$endDate = date('Y-m-d');
$startDate = date('Y-m-d', strtotime('-7 days'));
$ohlcv = $client->pools->getPoolOHLCV(
'ethereum',
'0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc',
$startDate,
[
'end' => $endDate,
'interval' => '1h',
'limit' => 168 // 24 * 7 hours
]
);
echo "Received " . count($ohlcv) . " OHLCV data points";
```
### client->tokens->getTokenDetails(networkId, tokenAddress)
**Endpoint:** [GET `/networks/{network}/tokens/{token_address}`](/api-reference/tokens/get-a-tokens-latest-data-on-a-network)
Gets comprehensive token information.
**Parameters:**
* `networkId`\* - ID of the network
* `tokenAddress`\* - Token contract address
**Returns:** Token details including price, market cap, volume, and metadata. [Response Structure](/api-reference/tokens/get-a-tokens-latest-data-on-a-network).
```php
// Get WETH token details
$weth = $client->tokens->getTokenDetails(
'ethereum',
'0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2'
);
echo "{$weth['name']} price: \${$weth['price_usd']}";
```
### client->tokens->getTokenPools(networkId, tokenAddress, options)
**Endpoint:** [GET `/networks/{network}/tokens/{token_address}/pools`](/api-reference/tokens/get-top-x-pools-for-a-token)
Gets pools containing a specific token.
**Parameters:**
* `networkId`\* - ID of the network
* `tokenAddress`\* - Token contract address
* `options` - Additional options:
* `page` - Page number for pagination (starts at 0)
* `limit` - Number of results per page
* `sort` - Sort direction ('asc' or 'desc')
* `orderBy` - Field to sort by ('volume\_usd', 'price\_usd', etc.)
* `address` - Optional second token address to filter for pairs with both tokens
**Returns:** List of pools where the queried token is found. [Response Structure](/api-reference/tokens/get-top-x-pools-for-a-token).
```php
// Get WETH/USDC pools
$pools = $client->tokens->getTokenPools(
'ethereum',
'0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2', // WETH
[
'limit' => 5,
'orderBy' => 'volume_usd',
'sort' => 'desc',
'address' => '0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48' // USDC
]
);
foreach ($pools['pools'] as $pool) {
echo "{$pool['dex_name']}: $" . number_format($pool['volume_usd'], 2) . " 24h volume\n";
}
```
### client->tokens->findToken(networkId, tokenSymbol)
Attempts to find a token by its symbol.
**Parameters:**
* `networkId`\* - ID of the network
* `tokenSymbol`\* - Token symbol to search for
**Returns:** Token details if found.
```php
// Find WETH token by symbol
try {
$token = $client->tokens->findToken('ethereum', 'WETH');
echo "Found token {$token['name']} at {$token['address']}";
} catch (Exception $e) {
echo "Token not found";
}
```
### client->search->search(query)
**Endpoint:** [GET `/search`](/api-reference/search/search-for-tokens-pools-and-dexes)
Searches across tokens, pools, and DEXes using a query string.
**Parameters:**
* `query`\* - Search query string
**Returns:** Matching entities from all categories (tokens, pools, DEXes). [Response Structure](/api-reference/search/search-for-tokens-pools-and-dexes).
```php
// Search for "ethereum" across all entities
$results = $client->search->search("ethereum");
echo "Found {$results['summary']['total_tokens']} tokens\n";
echo "Found {$results['summary']['total_pools']} pools\n";
echo "Found {$results['summary']['total_dexes']} dexes\n";
```
### client->stats->getStats()
**Endpoint:** [GET `/stats`](/api-reference/utils/retrieve-high-level-asset-statistics)
Gets platform-wide statistics.
**Parameters:** None
**Returns:** Counts of chains, DEXes, pools, and tokens indexed. [Response Structure](/api-reference/utils/retrieve-high-level-asset-statistics).
```php
// Get platform statistics
$stats = $client->stats->getStats();
echo "Total chains: {$stats['chains']}\n";
echo "Total DEXes: {$stats['dexes']}\n";
echo "Total pools: {$stats['pools']}\n";
echo "Total tokens: {$stats['tokens']}\n";
```
## Complete Example
```php
<?php
require 'vendor/autoload.php';

use DexPaprika\Client;
use DexPaprika\Exception\DexPaprikaApiException; // adjust the namespace to match the SDK

function main() {
    try {
        $client = new Client();

        // Get Ethereum network details
        $networks = $client->networks->getNetworks();
        $ethereum = null;
        foreach ($networks['networks'] as $network) {
            if ($network['id'] === 'ethereum') {
                $ethereum = $network;
                break;
            }
        }
        echo "Found {$ethereum['display_name']} with {$ethereum['dexes_count']} DEXes\n";

        // Get WETH token details
        $weth = $client->tokens->getTokenDetails(
            'ethereum',
            '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2'
        );
        echo "{$weth['name']} price: \${$weth['price_usd']}\n";

        // Find WETH/USDC pools
        $pools = $client->tokens->getTokenPools(
            'ethereum',
            '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2', // WETH
            [
                'limit' => 5,
                'orderBy' => 'volume_usd',
                'sort' => 'desc',
                'address' => '0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48' // USDC
            ]
        );

        // Show top pools
        echo "Top WETH/USDC pools:\n";
        foreach ($pools['pools'] as $pool) {
            echo "{$pool['dex_name']}: $" . number_format($pool['volume_usd'], 2) . " 24h volume\n";
        }
    } catch (DexPaprikaApiException $e) {
        echo "API Error: {$e->getMessage()}\n";
    } catch (Exception $e) {
        echo "Error: {$e->getMessage()}\n";
    }
}
main();
```
## Advanced Features
### Error Handling
```php
<?php

use DexPaprika\Client;
use DexPaprika\Config;

$client = new Client();

// Basic error handling
try {
    $token = $client->tokens->getTokenDetails('ethereum', '0xinvalidaddress');
} catch (NotFoundException $e) {
    echo "Token not found: {$e->getMessage()}\n";
} catch (RateLimitException $e) {
    echo "Rate limit exceeded. Try again in: {$e->getRetryAfter()} seconds\n";
} catch (ServerException $e) {
    echo "Server error occurred: {$e->getMessage()}\n";
} catch (NetworkException $e) {
    echo "Network error: {$e->getMessage()}\n";
} catch (DexPaprikaApiException $e) {
    echo "API error: {$e->getMessage()} (Code: {$e->getCode()})\n";
} catch (Exception $e) {
    echo "General error: {$e->getMessage()}\n";
}

// The SDK automatically retries on these status codes:
// 408 (Request Timeout), 429 (Too Many Requests),
// 500, 502, 503, 504 (Server Errors)

// Custom retry configuration
$config = new Config();
$config->setMaxRetries(4); // Number of retry attempts (default: 3)
$config->setRetryDelays([100, 500, 1000, 5000]); // Delay in milliseconds

$client = new Client(null, null, false, $config);

// All API requests will now use these retry settings
try {
    $networks = $client->networks->getNetworks();
} catch (Exception $e) {
    echo "Failed after multiple retries: {$e->getMessage()}\n";
}
```
### Caching System
```php
<?php

use DexPaprika\Client;

$client = new Client();

// Enable caching
$client->setupCache();

// First call - hits the API
$startTime = microtime(true);
$networks = $client->networks->getNetworks();
$firstCallTime = microtime(true) - $startTime;
echo "First call (API): " . count($networks['networks']) . " networks, took {$firstCallTime}s\n";

// Second call - served from cache (much faster)
$startTime = microtime(true);
$networks = $client->networks->getNetworks();
$secondCallTime = microtime(true) - $startTime;
echo "Second call (cached): " . count($networks['networks']) . " networks, took {$secondCallTime}s\n";

$speedup = round($firstCallTime / $secondCallTime, 1);
echo "Cache speedup: {$speedup}x\n";

// You can skip the cache when you need fresh data
$client->getConfig()->setCacheEnabled(false);
$freshNetworks = $client->networks->getNetworks();
$client->getConfig()->setCacheEnabled(true);

// Custom cache configuration
$cache = new FilesystemCache('/path/to/custom/cache');
$client->setupCache($cache, 300); // 5 minutes TTL

// Clear the entire cache
$client->getConfig()->getCache()->clear();
```
### Pagination Helper
```php
<?php

use DexPaprika\Client;

function fetchAllPools($networkId) {
    $client = new Client();
    $allPools = [];
    $page = 0;
    $limit = 50;
    $hasMore = true;

    while ($hasMore) {
        $response = $client->pools->getNetworkPools(
            $networkId,
            [
                'page' => $page,
                'limit' => $limit,
                'sort' => 'desc',
                'orderBy' => 'volume_usd'
            ]
        );
        $allPools = array_merge($allPools, $response['pools']);

        // Check if there are more pages
        $hasMore = count($response['pools']) == $limit;
        $page++;

        // Adding a small delay to avoid hitting rate limits
        usleep(100000); // 100ms
    }

    echo "Fetched a total of " . count($allPools) . " pools on {$networkId}\n";
    return $allPools;
}

// Alternative approach using built-in paginator
function fetchAllPoolsWithPaginator($networkId) {
    $client = new Client();
    $allPools = [];

    // Create a paginator for network pools
    $paginator = $client->createPaginator(
        $client->pools,
        'getNetworkPools',
        [
            $networkId,
            [
                'limit' => 50,
                'orderBy' => 'volume_usd',
                'sort' => 'desc'
            ]
        ]
    );

    // Iterate through all pages (or up to a maximum)
    $paginator->setMaxPages(10); // Optional: limit to 10 pages
    foreach ($paginator as $page => $response) {
        $allPools = array_merge($allPools, $response['pools']);
        echo "Processed page {$page}, got " . count($response['pools']) . " pools\n";
    }

    echo "Fetched a total of " . count($allPools) . " pools on {$networkId}\n";
    return $allPools;
}

// Usage example
$ethereumPools = fetchAllPools('ethereum');
// or
$ethereumPools = fetchAllPoolsWithPaginator('ethereum');
```
### Working with Objects
```php
<?php

use DexPaprika\Client;
use DexPaprika\Config;

// Configure the client to return objects instead of arrays
$config = new Config();
$config->setResponseFormat('object');
$client = new Client(null, null, true, $config);
// Get pool details
$pool = $client->pools->getPoolDetails(
    'ethereum',
    '0x88e6a0c2ddd26feeb64f039a2c41296fcb3f5640' // USDC/WETH Uniswap v3 pool
);
// Access pool properties as object properties
echo "Pool: {$pool->tokens[0]->symbol}/{$pool->tokens[1]->symbol}\n";
echo "Volume (24h): \${$pool->day->volume_usd}\n";
echo "Transactions (24h): {$pool->day->txns}\n";
echo "Price: \${$pool->last_price_usd}\n";
// Time interval data is available for multiple timeframes
echo "1h price change: {$pool->hour1->last_price_usd_change}%\n";
echo "24h price change: {$pool->day->last_price_usd_change}%\n";
// Combining with array-based response for specific API calls
$client->getConfig()->setResponseFormat('array');
$networks = $client->networks->getNetworks();
// Back to object mode
$client->getConfig()->setResponseFormat('object');
$token = $client->tokens->getTokenDetails('ethereum', '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2');
echo "{$token->name} price: \${$token->price_usd}\n";
```
## Resources
* [GitHub Repository](https://github.com/coinpaprika/dexpaprika-sdk-php)
* [DexPaprika Website](https://dexpaprika.com)
* [API Reference](/api-reference/introduction)
* [Discord Community](https://discord.gg/DhJge5TUGM)
## API Status
The DexPaprika API provides consistent data with stable endpoints. No API key is currently required to access the service. We aim to maintain backward compatibility and provide notice of any significant changes.
# DexPaprika Python SDK
Source: https://docs.dexpaprika.com/get-started/sdk-python
The official Python client library for the DexPaprika API, providing easy access to decentralized exchange data across multiple blockchain networks
## Installation
```bash
# Using pip
pip install dexpaprika-sdk
# Using poetry
poetry add dexpaprika-sdk
# From source
git clone https://github.com/coinpaprika/dexpaprika-sdk-python.git
cd dexpaprika-sdk-python
pip install -e .
```
## Prerequisites
* Python 3.8 or higher
* Connection to the internet to access the DexPaprika API
* No API key required
## Quick Example: Get Token Price
```python
from dexpaprika_sdk import DexPaprikaClient
from dexpaprika_sdk.models import TokenDetails # Type hint example
# Create client and get WETH price on Ethereum
client = DexPaprikaClient()
token: TokenDetails = client.tokens.get_details("ethereum", "0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2")
print(f"{token.name}: ${token.price_usd}")
# Output: Wrapped Ether: $3245.67
```
## API Methods Reference
Parameters marked with an asterisk (\*) are required.
### client.networks.list()
**Endpoint:** [GET `/networks`](/api-reference/networks/get-a-list-of-available-blockchain-networks)
Gets all supported blockchain networks including Ethereum, Solana, etc.
**Parameters:** None
**Returns:** Network IDs, names, and related information. [Response Structure](/api-reference/networks/get-a-list-of-available-blockchain-networks).
```python
# Get all networks
networks = client.networks.list()
print(f"Found {len(networks)} networks")
```
### client.dexes.list\_by\_network(network\_id, page, limit)
**Endpoint:** [GET `/networks/{network}/dexes`](/api-reference/dexes/get-a-list-of-available-dexes-on-a-network)
Gets all DEXes on a specific network.
**Parameters:**
* `network_id`\* - ID of the network (e.g., 'ethereum', 'solana')
* `page` - Page number for pagination (starts at 0)
* `limit` - Number of results per page
**Returns:** DEX IDs, names, pool counts, and volume information. [Response Structure](/api-reference/dexes/get-a-list-of-available-dexes-on-a-network).
```python
# Get all DEXes on Ethereum
dexes = client.dexes.list_by_network("ethereum")
print(f"Found {len(dexes.dexes)} DEXes on Ethereum")
```
### client.pools.list(page, limit, sort, order\_by)
**Endpoint:** [GET `/pools`](/api-reference/pools/get-top-x-pools)
Gets top pools across all networks with pagination.
**Parameters:**
* `page` - Page number for pagination (starts at 0)
* `limit` - Number of results per page
* `sort` - Sort direction ('asc' or 'desc')
* `order_by` - Field to sort by ('volume\_usd', 'liquidity\_usd', etc.)
**Returns:** Paginated list of pool objects with pricing data. [Response Structure](/api-reference/pools/get-top-x-pools).
```python
# Get top 10 pools by volume
top_pools = client.pools.list(limit=10, order_by="volume_usd", sort="desc")
print(f"Top pool: {top_pools.pools[0].dex_name}")
```
***
### client.pools.list\_by\_network(network\_id, page, limit, sort, order\_by)
**Endpoint:** [GET `/networks/{network}/pools`](/api-reference/pools/get-top-x-pools-on-a-network)
Gets pools on a specific network with pagination and sorting options.
**Parameters:**
* `network_id`\* - ID of the network
* `page` - Page number for pagination (starts at 0)
* `limit` - Number of results per page
* `sort` - Sort direction ('asc' or 'desc')
* `order_by` - Field to sort by ('volume\_usd', 'liquidity\_usd', etc.)
**Returns:** Paginated list of pools for the given network. [Response Structure](/api-reference/pools/get-top-x-pools-on-a-network).
```python
# Get top 5 pools on Ethereum by volume
pools = client.pools.list_by_network(
    network_id="ethereum",
    limit=5,
    order_by="volume_usd",
    sort="desc"
)
print(f"Found {len(pools.pools)} pools on Ethereum")
```
***
### client.pools.list\_by\_dex(network\_id, dex\_id, page, limit, sort, order\_by)
**Endpoint:** [GET `/networks/{network}/dexes/{dex}/pools`](/api-reference/pools/get-top-x-pools-on-a-networks-dex)
Gets pools on a specific DEX within a network with pagination.
**Parameters:**
* `network_id`\* - ID of the network
* `dex_id`\* - ID of the DEX
* `page` - Page number for pagination (starts at 0)
* `limit` - Number of results per page
* `sort` - Sort direction ('asc' or 'desc')
* `order_by` - Field to sort by ('volume\_usd', 'liquidity\_usd', etc.)
**Returns:** Paginated list of pools for the specified DEX. [Response Structure](/api-reference/pools/get-top-x-pools-on-a-networks-dex).
```python
# Get top 10 Uniswap V2 pools on Ethereum
uniswap_pools = client.pools.list_by_dex(
    network_id="ethereum",
    dex_id="uniswap_v2",
    limit=10,
    order_by="volume_usd",
    sort="desc"
)
print(f"Found {len(uniswap_pools.pools)} Uniswap V2 pools")
```
***
### client.pools.get\_details(network\_id, pool\_address, inversed)
**Endpoint:** [GET `/networks/{network}/pools/{pool_address}`](/api-reference/pools/get-a-pool-on-a-network)
Gets detailed information about a specific pool.
**Parameters:**
* `network_id`\* - ID of the network
* `pool_address`\* - On-chain address of the pool
* `inversed` - Whether to invert the price ratio (boolean)
**Returns:** Detailed pool information including tokens, volumes, liquidity, and more. [Response Structure](/api-reference/pools/get-a-pool-on-a-network).
```python
from dexpaprika_sdk.models import PoolDetails # Type hint example
# Get details for a specific pool (WETH/USDC on Uniswap V2)
pool: PoolDetails = client.pools.get_details(
    network_id="ethereum",
    pool_address="0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc",
    inversed=False
)
print(f"Pool: {pool.tokens[0].symbol}/{pool.tokens[1].symbol}")
```
***
### client.pools.get\_transactions(network\_id, pool\_address, page, limit)
**Endpoint:** [GET `/networks/{network}/pools/{pool_address}/transactions`](/api-reference/pools/get-transactions-of-a-pool-on-a-network-paging-can-be-used-up-to-100-pages)
Gets transaction history for a specific pool with pagination.
**Parameters:**
* `network_id`\* - ID of the network
* `pool_address`\* - On-chain address of the pool
* `page` - Page number for pagination
* `limit` - Number of transactions per page
**Returns:** List of transactions with details about tokens, amounts, and timestamps. [Response Structure](/api-reference/pools/get-transactions-of-a-pool-on-a-network-paging-can-be-used-up-to-100-pages).
```python
from datetime import datetime

# Get the latest 20 transactions for a pool
transactions = client.pools.get_transactions(
    network_id="ethereum",
    pool_address="0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc",
    limit=20
)
latest_tx_time = datetime.fromtimestamp(transactions.transactions[0].block_timestamp).strftime('%Y-%m-%d %H:%M:%S')
print(f"Latest transaction: {latest_tx_time}")
```
***
### client.pools.get\_ohlcv(network\_id, pool\_address, start, end, limit, interval, inversed)
**Endpoint:** [GET `/networks/{network}/pools/{pool_address}/ohlcv`](/api-reference/pools/get-ohlcv-data-for-a-pool-pair)
Gets OHLCV (Open, High, Low, Close, Volume) chart data for a pool.
**Parameters:**
* `network_id`\* - ID of the network
* `pool_address`\* - On-chain address of the pool
* `start`\* - Start time (ISO date string, YYYY-MM-DD, or Unix timestamp)
* `end` - End time (optional)
* `limit` - Number of data points to return
* `interval` - Time interval ('1m', '5m', '15m', '30m', '1h', '6h', '12h', '24h')
* `inversed` - Whether to invert the price ratio (boolean)
**Returns:** Array of OHLCV data points for the specified time range and interval. [Response Structure](/api-reference/pools/get-ohlcv-data-for-a-pool-pair).
```python
from datetime import datetime, timedelta
# Get OHLCV data for the past 7 days with 1-hour intervals
end_date = datetime.now()
start_date = end_date - timedelta(days=7)
ohlcv = client.pools.get_ohlcv(
    network_id="ethereum",
    pool_address="0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc",
    start=start_date.strftime("%Y-%m-%d"),
    end=end_date.strftime("%Y-%m-%d"),
    interval="1h",
    limit=168  # 24 * 7 hours
)
print(f"Received {len(ohlcv)} OHLCV data points")
```
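The returned candles can be post-processed locally, for example to smooth the close series with a simple moving average. A minimal sketch; the `closes` list is assumed to be extracted from the OHLCV points (e.g. their `close` values), and the helper itself is not part of the SDK:

```python
def sma(closes, window):
    """Simple moving average over a list of closing prices."""
    if window <= 0 or window > len(closes):
        raise ValueError("window must be between 1 and len(closes)")
    return [
        sum(closes[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(closes))
    ]

# closes could be built as [point.close for point in ohlcv]
print(sma([10.0, 11.0, 12.0, 13.0], 2))  # [10.5, 11.5, 12.5]
```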
### client.tokens.get\_details(network\_id, token\_address)
**Endpoint:** [GET `/networks/{network}/tokens/{token_address}`](/api-reference/tokens/get-a-tokens-latest-data-on-a-network)
Gets comprehensive token information.
**Parameters:**
* `network_id`\* - ID of the network
* `token_address`\* - Token contract address
**Returns:** Token details including price, market cap, volume, and metadata. [Response Structure](/api-reference/tokens/get-a-tokens-latest-data-on-a-network).
```python
# Get WETH token details
weth = client.tokens.get_details(
    network_id="ethereum",
    token_address="0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2"
)
print(f"{weth.name} price: ${weth.price_usd}")
```
***
### client.tokens.get\_pools(network\_id, token\_address, page, limit, sort, order\_by, address)
**Endpoint:** [GET `/networks/{network}/tokens/{token_address}/pools`](/api-reference/tokens/get-top-x-pools-for-a-token)
Gets pools containing a specific token.
**Parameters:**
* `network_id`\* - ID of the network
* `token_address`\* - Token contract address
* `page` - Page number for pagination (starts at 0)
* `limit` - Number of results per page
* `sort` - Sort direction ('asc' or 'desc')
* `order_by` - Field to sort by ('volume\_usd', 'price\_usd', etc.)
* `address` - Optional second token address to filter for pairs with both tokens
**Returns:** List of pools where the queried token is found. [Response Structure](/api-reference/tokens/get-top-x-pools-for-a-token).
```python
# Get WETH/USDC pools
pools = client.tokens.get_pools(
    network_id="ethereum",
    token_address="0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2",  # WETH
    limit=5,
    order_by="volume_usd",
    sort="desc",
    address="0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48"  # USDC
)
for pool in pools.pools:
    print(f"{pool.dex_name}: ${pool.volume_usd:,.2f} 24h volume")
```
### client.search.search(query)
**Endpoint:** [GET `/search`](/api-reference/search/search-for-tokens-pools-and-dexes)
Searches across tokens, pools, and DEXes using a query string.
**Parameters:**
* `query`\* - Search query string
**Returns:** Matching entities from all categories (tokens, pools, DEXes). [Response Structure](/api-reference/search/search-for-tokens-pools-and-dexes).
```python
# Search for "ethereum" across all entities
results = client.search.search("ethereum")
print(f"Found {len(results.tokens)} tokens")
print(f"Found {len(results.pools)} pools")
print(f"Found {len(results.dexes)} dexes")
```
### client.utils.get\_stats()
**Endpoint:** [GET `/stats`](/api-reference/utils/retrieve-high-level-asset-statistics)
Gets platform-wide statistics.
**Parameters:** None
**Returns:** Counts of chains, DEXes, pools, and tokens indexed. [Response Structure](/api-reference/utils/retrieve-high-level-asset-statistics).
```python
# Get platform statistics
stats = client.utils.get_stats()
print(f"Total chains: {stats.chains}")
print(f"Total DEXes: {stats.dexes}")
print(f"Total pools: {stats.pools}")
print(f"Total tokens: {stats.tokens}")
```
## Complete Example
```python
from dexpaprika_sdk import DexPaprikaClient
def main():
    # Initialize client
    client = DexPaprikaClient()

    # Get Ethereum network details
    networks = client.networks.list()
    ethereum = next((n for n in networks if n.id == "ethereum"), None)
    print(f"Found {ethereum.display_name} with {ethereum.dexes_count} DEXes")

    # Get WETH token details
    weth = client.tokens.get_details(
        network_id="ethereum",
        token_address="0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2"
    )
    print(f"{weth.name} price: ${weth.price_usd}")

    # Find WETH/USDC pools
    pools = client.tokens.get_pools(
        network_id="ethereum",
        token_address="0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2",  # WETH
        limit=5,
        order_by="volume_usd",
        sort="desc",
        address="0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48"  # USDC
    )

    # Show top pools
    print("Top WETH/USDC pools:")
    for pool in pools.pools:
        print(f"{pool.dex_name}: ${pool.volume_usd:,.2f} 24h volume")

if __name__ == "__main__":
    main()
```
## Advanced Features
### Error Handling
```python
from dexpaprika_sdk import DexPaprikaClient
import requests
# Basic error handling
try:
    client = DexPaprikaClient()
    token = client.tokens.get_details("ethereum", "0xinvalidaddress")
except Exception as e:
    if "404" in str(e):
        print("Token not found")
    elif "429" in str(e):
        print("Rate limit exceeded")
    else:
        print(f"An error occurred: {e}")

# The SDK automatically retries on these status codes:
# 408 (Request Timeout), 429 (Too Many Requests),
# 500, 502, 503, 504 (Server Errors)

# Custom retry configuration
client = DexPaprikaClient(
    max_retries=4,  # Number of retry attempts (default: 4)
    backoff_times=[0.1, 0.5, 1.0, 5.0]  # Backoff times in seconds
)

# All API requests will now use these retry settings
try:
    networks = client.networks.list()
except requests.exceptions.RetryError as e:
    print(f"Failed after multiple retries: {e}")
```
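The retry behavior described above amounts to a loop over the configured backoff times. The following is an illustrative sketch, not the SDK's actual internals; a `RuntimeError` carrying a `status` attribute stands in for whatever HTTP error type the client raises:

```python
import time

def call_with_retries(fn, backoff_times=(0.1, 0.5, 1.0, 5.0),
                      retryable=(408, 429, 500, 502, 503, 504)):
    """Retry fn() while it raises an error carrying a retryable HTTP status."""
    for delay in (*backoff_times, None):  # None marks the final attempt
        try:
            return fn()
        except RuntimeError as exc:  # stand-in for the client's HTTP error type
            status = getattr(exc, "status", None)
            if status not in retryable or delay is None:
                raise  # non-retryable status, or retries exhausted
            time.sleep(delay)
```

With four backoff times, the function makes at most five attempts: one initial call plus one retry after each delay.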
### Caching System
```python
from dexpaprika_sdk import DexPaprikaClient
import time
# The SDK includes built-in caching by default
# Demonstration of cache behavior
# Create a regular client with default cache (5 minutes TTL)
client = DexPaprikaClient()
# First call - hits the API
start_time = time.time()
networks = client.networks.list()
first_call_time = time.time() - start_time
print(f"First call (API): {len(networks)} networks, took {first_call_time:.4f}s")
# Second call - served from cache (much faster)
start_time = time.time()
networks = client.networks.list()
second_call_time = time.time() - start_time
print(f"Second call (cached): {len(networks)} networks, took {second_call_time:.4f}s")
print(f"Cache speedup: {first_call_time / second_call_time:.1f}x")
# You can skip the cache when you need fresh data
fresh_networks = client.networks._get("/networks", skip_cache=True)
# Clear the entire cache
client.clear_cache()
# Clear cache only for specific endpoints
client.clear_cache(endpoint_prefix="/networks")
# Different types of data have different cache durations:
# - Network data: 24 hours
# - Pool data: 5 minutes
# - Token data: 10 minutes
# - Statistics: 15 minutes
# - Other data: 5 minutes (default)
```
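Conceptually, the cache maps an endpoint key to a value plus an expiry deadline, with the TTL varying by data type as listed above. A minimal TTL-cache sketch of that idea (not the SDK's implementation):

```python
import time

class TTLCache:
    """Minimal time-to-live cache: each entry expires after its own TTL."""

    def __init__(self):
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # expired entries are dropped on access
            return None
        return value

    def set(self, key, value, ttl_seconds):
        self._store[key] = (value, time.monotonic() + ttl_seconds)

cache = TTLCache()
cache.set("/networks", ["ethereum", "solana"], ttl_seconds=24 * 3600)
print(cache.get("/networks"))  # ['ethereum', 'solana']
```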
### Pagination Helper
```python
from dexpaprika_sdk import DexPaprikaClient
import time
def fetch_all_pools(network_id):
    client = DexPaprikaClient()
    all_pools = []
    page = 0
    limit = 50
    has_more = True

    while has_more:
        response = client.pools.list_by_network(
            network_id=network_id,
            page=page,
            limit=limit,
            sort="desc",
            order_by="volume_usd"
        )
        all_pools.extend(response.pools)

        # Check if there are more pages
        has_more = len(response.pools) == limit
        page += 1

        # Adding a small delay to avoid hitting rate limits
        time.sleep(0.1)

    print(f"Fetched a total of {len(all_pools)} pools on {network_id}")
    return all_pools
# Usage example
ethereum_pools = fetch_all_pools("ethereum")
```
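The fetch-until-short-page loop above generalizes to a lazy generator, which avoids holding every page in memory at once. A sketch with an injected `fetch_page` callable; the callable is a stand-in, and with the SDK it could wrap `client.pools.list_by_network(...).pools`:

```python
def iter_pages(fetch_page, limit=50):
    """Yield pages lazily; fetch_page(page, limit) returns a list.

    Iteration ends when a page comes back empty or shorter than the limit.
    """
    page = 0
    while True:
        items = fetch_page(page, limit)
        if not items:
            return
        yield items
        if len(items) < limit:
            return
        page += 1

# Demonstration with an in-memory data source instead of the API
data = list(range(120))
pages = list(iter_pages(lambda page, limit: data[page * limit:(page + 1) * limit], limit=50))
print([len(p) for p in pages])  # [50, 50, 20]
```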
### Parameter Validation
```python
from dexpaprika_sdk import DexPaprikaClient
# The SDK automatically validates parameters before making API requests
client = DexPaprikaClient()
# Invalid parameter examples will raise helpful error messages
try:
    # Invalid network ID
    client.pools.list_by_network(network_id="", limit=5)
except ValueError as e:
    print(e)  # "network_id is required"

try:
    # Invalid sort parameter
    client.pools.list(sort="invalid_sort")
except ValueError as e:
    print(e)  # "sort must be one of: asc, desc"

try:
    # Invalid limit parameter
    client.pools.list(limit=500)
except ValueError as e:
    print(e)  # "limit must be at most 100"
```
### Working with Models
```python
from dexpaprika_sdk import DexPaprikaClient
from dexpaprika_sdk.models import PoolDetails # Type hint example
client = DexPaprikaClient()
# Get pool details
pool: PoolDetails = client.pools.get_details(
    network_id="ethereum",
    pool_address="0x88e6a0c2ddd26feeb64f039a2c41296fcb3f5640"  # USDC/WETH Uniswap v3 pool
)
# Access pool properties with type checking and auto-completion
print(f"Pool: {pool.tokens[0].symbol}/{pool.tokens[1].symbol}")
print(f"Volume (24h): ${pool.day.volume_usd:.2f}")
print(f"Transactions (24h): {pool.day.txns}")
print(f"Price: ${pool.last_price_usd:.4f}")
# Time interval data is available for multiple timeframes
print(f"1h price change: {pool.hour1.last_price_usd_change:.2f}%")
print(f"24h price change: {pool.day.last_price_usd_change:.2f}%")
# All API responses are converted to typed Pydantic models
# This provides automatic validation, serialization/deserialization,
# and IDE auto-completion support through Python type hints
```
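To illustrate the shape of typed access without pulling in Pydantic, here is a dataclass sketch whose field names are taken from the examples above; the real SDK models are Pydantic classes with more fields, so treat this purely as a structural illustration:

```python
from dataclasses import dataclass

@dataclass
class TimeIntervalMetrics:
    """Per-interval metrics, mirroring fields like pool.day.volume_usd."""
    volume_usd: float
    txns: int
    last_price_usd_change: float

@dataclass
class Pool:
    """Simplified pool shape with nested interval data."""
    last_price_usd: float
    day: TimeIntervalMetrics

pool = Pool(
    last_price_usd=3245.67,
    day=TimeIntervalMetrics(volume_usd=1_000_000.0, txns=4200, last_price_usd_change=-1.2),
)
print(f"24h volume: ${pool.day.volume_usd:,.2f}")  # 24h volume: $1,000,000.00
```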
## Resources
* [GitHub Repository](https://github.com/coinpaprika/dexpaprika-sdk-python)
* [DexPaprika Website](https://dexpaprika.com)
* [API Reference](/api-reference/introduction)
* [Discord Community](https://discord.gg/DhJge5TUGM)
## API Status
The DexPaprika API provides consistent data with stable endpoints. No API key is currently required to access the service. We aim to maintain backward compatibility and provide notice of any significant changes.
# DexPaprika SDK
Source: https://docs.dexpaprika.com/get-started/sdk-ts
JavaScript client library for accessing decentralized exchange data across multiple blockchain networks
## Prerequisites
* Node.js 14.0.0 or higher
* Connection to the internet to access the DexPaprika API
* No API key required
## Installation
Install the DexPaprika SDK using your preferred package manager:
```bash
# using npm
npm install @dexpaprika/sdk
# using yarn
yarn add @dexpaprika/sdk
# using pnpm
pnpm add @dexpaprika/sdk
```
## Quick Example
```javascript
import { DexPaprikaClient } from '@dexpaprika/sdk';
// Initialize the client
const client = new DexPaprikaClient();
// Get the price of Wrapped Ether (WETH) on Ethereum
async function getWethPrice() {
  const wethAddress = '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2';
  const token = await client.tokens.getDetails('ethereum', wethAddress);
  console.log(`Current WETH price: $${token.price_usd}`);
}
getWethPrice();
```
## API Methods Reference
All API methods in the DexPaprika SDK follow a consistent pattern, accepting required parameters first, followed by an optional `options` object for additional configuration. This options pattern provides flexibility while keeping the API clean and easy to use.
For example, most listing methods accept pagination and sorting options:
```javascript
// Get pools with custom pagination and sorting
const pools = await client.pools.listByNetwork('ethereum', {
  page: 0,              // start at first page
  limit: 20,            // get 20 results
  sort: 'desc',         // sort descending
  orderBy: 'volume_usd' // sort by volume
});
```
### client.networks.list()
**Endpoint:** [GET `/networks`](/api-reference/networks/get-networks)
Retrieves all supported blockchain networks and their metadata.
**Parameters:** None
**Returns:** Array of network objects containing network ID, name, and other information.
```javascript
const networks = await client.networks.list();
console.log(`Supported networks: ${networks.length}`);
```
***
### client.networks.getDexes(networkId, options)
**Endpoint:** [GET `/networks/{network}/dexes`](/api-reference/networks/get-dexes-of-a-network)
Retrieves all DEXes on a specific network.
**Parameters:**
* `networkId`\* - Network ID (e.g., "ethereum", "solana")
* `options` - Optional configuration:
* `page` - Page number for pagination
* `limit` - Number of DEXes per page
**Returns:** Paginated list of DEX objects with name, ID, and metadata.
```javascript
const dexes = await client.networks.getDexes('ethereum', { limit: 20 });
dexes.data.forEach(dex => console.log(dex.name));
```
### client.dexes.get(networkId, dexId)
**Endpoint:** [GET `/networks/{network}/dexes/{dex}`](/api-reference/dexes/get-a-dex-on-a-network)
Gets information about a specific DEX on a network.
**Parameters:**
* `networkId`\* - Network ID (e.g., "ethereum", "solana")
* `dexId`\* - DEX identifier (e.g., "uniswap\_v2")
**Returns:** DEX details including name, website, API endpoints if available, and statistics.
```javascript
const dex = await client.dexes.get('ethereum', 'uniswap_v2');
console.log(`${dex.name} stats: Volume $${dex.volume_usd_24h}, Pools: ${dex.pool_count}`);
```
### client.pools.list(options)
**Endpoint:** [GET `/pools`](/api-reference/pools/get-top-x-pools)
Gets top pools across all networks with pagination.
**Parameters:**
* `options` - Options object for pagination and sorting:
* `page` - Page number for pagination
* `limit` - Number of pools per page
* `sort` - Sort direction ('asc' or 'desc')
* `orderBy` - Field to sort by ('volume\_usd', 'price\_usd', etc.)
**Returns:** Paginated list of pool objects with pricing data.
```javascript
const pools = await client.pools.list({
  limit: 10,
  orderBy: 'volume_usd',
  sort: 'desc'
});
console.log(`Top pool: ${pools.data[0].token0.symbol}/${pools.data[0].token1.symbol}`);
```
***
### client.pools.listByNetwork(networkId, options)
**Endpoint:** [GET `/networks/{network}/pools`](/api-reference/pools/get-top-x-pools-on-a-network)
Gets pools on a specific network with pagination and sorting options.
**Parameters:**
* `networkId`\* - ID of the network
* `options` - Options object for pagination and sorting:
* `page` - Page number for pagination
* `limit` - Number of pools per page
* `sort` - Sort direction ('asc' or 'desc')
* `orderBy` - Field to sort by ('volume\_usd', 'price\_usd', etc.)
**Returns:** Paginated list of pools for the given network.
```javascript
const ethereumPools = await client.pools.listByNetwork('ethereum', {
  limit: 5,
  orderBy: 'volume_usd',
  sort: 'desc'
});
console.log(`Found ${ethereumPools.meta.total} pools on Ethereum`);
```
***
### client.pools.listByDex(networkId, dexId, options)
**Endpoint:** [GET `/networks/{network}/dexes/{dex}/pools`](/api-reference/pools/get-top-x-pools-on-a-networks-dex)
Gets pools on a specific DEX within a network with pagination.
**Parameters:**
* `networkId`\* - ID of the network
* `dexId`\* - ID of the DEX
* `options` - Options object for pagination and sorting:
* `page` - Page number for pagination
* `limit` - Number of pools per page
* `sort` - Sort direction ('asc' or 'desc')
* `orderBy` - Field to sort by ('volume\_usd', 'price\_usd', etc.)
**Returns:** Paginated list of pools for the specified DEX.
```javascript
const uniswapPools = await client.pools.listByDex('ethereum', 'uniswap_v2', {
  limit: 10,
  orderBy: 'volume_usd',
  sort: 'desc'
});
console.log(`Top Uniswap V2 pool: ${uniswapPools.data[0].token0.symbol}/${uniswapPools.data[0].token1.symbol}`);
```
***
### client.pools.getDetails(networkId, poolAddress, options)
**Endpoint:** [GET `/networks/{network}/pools/{pool_address}`](/api-reference/pools/get-a-pool-on-a-network)
Gets detailed information about a specific pool.
**Parameters:**
* `networkId`\* - ID of the network
* `poolAddress`\* - On-chain address of the pool
* `options` - Options object:
* `inversed` - Whether to invert the price ratio (boolean)
**Returns:** Detailed pool information including tokens, volumes, liquidity, and more.
```javascript
// Get details for a specific pool (WETH/USDC on Uniswap V2)
const pool = await client.pools.getDetails(
  'ethereum',
  '0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc',
  { inversed: false }
);
console.log(`Pool: ${pool.token0.symbol}/${pool.token1.symbol}`);
```
***
### client.pools.getTransactions(networkId, poolAddress, options)
**Endpoint:** [GET `/networks/{network}/pools/{pool_address}/transactions`](/api-reference/pools/get-transactions-of-a-pool-on-a-network-paging-can-be-used-up-to-100-pages)
Gets transaction history for a specific pool with pagination.
**Parameters:**
* `networkId`\* - ID of the network
* `poolAddress`\* - On-chain address of the pool
* `options` - Options object for pagination:
* `page` - Page number for pagination
* `limit` - Number of transactions per page
* `cursor` - Transaction ID for cursor-based pagination
**Returns:** List of transactions with details about tokens, amounts, and timestamps.
```javascript
// Get the latest 20 transactions for a pool
const transactions = await client.pools.getTransactions(
  'ethereum',
  '0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc',
  { limit: 20 }
);
console.log(`Latest transaction: ${transactions.data[0].hash}`);
```
***
### client.pools.getOHLCV(networkId, poolAddress, options)
**Endpoint:** [GET `/networks/{network}/pools/{pool_address}/ohlcv`](/api-reference/pools/get-ohlcv-data-for-a-pool-pair)
Gets OHLCV (Open, High, Low, Close, Volume) chart data for a pool.
**Parameters:**
* `networkId`\* - ID of the network
* `poolAddress`\* - On-chain address of the pool
* `options`\* - OHLCV options object:
* `start`\* - Start time (ISO date string or timestamp)
* `end` - End time (optional)
* `limit` - Number of data points to return
* `interval` - Time interval ('1h', '6h', '24h', etc.)
* `inversed` - Whether to invert the price ratio (boolean)
**Returns:** Array of OHLCV data points for the specified time range and interval.
```javascript
// Get hourly price data for the past week
const startDate = new Date();
startDate.setDate(startDate.getDate() - 7);
const ohlcvData = await client.pools.getOHLCV(
  'ethereum',
  '0xb4e16d0168e52d35cacd2c6185b44281ec28c9dc',
  {
    start: startDate.toISOString(),
    interval: '1h',
    limit: 168 // 7 days * 24 hours
  }
);
console.log(`Data points: ${ohlcvData.length}`);
console.log(`Current price: ${ohlcvData[ohlcvData.length-1].close}`);
```
### client.tokens.getDetails(networkId, tokenAddress)
**Endpoint:** [GET `/networks/{network}/tokens/{token_address}`](/api-reference/tokens/get-a-token-on-a-network)
Gets detailed information about a specific token on a network.
**Parameters:**
* `networkId`\* - ID of the network
* `tokenAddress`\* - Token address or identifier
**Returns:** Detailed token information including price, volume, and metadata.
```javascript
// Get details for WETH on Ethereum
const weth = await client.tokens.getDetails(
  'ethereum',
  '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2'
);
console.log(`${weth.name} (${weth.symbol}): $${weth.price_usd}`);
```
***
### client.tokens.getPools(networkId, tokenAddress, options)
**Endpoint:** [GET `/networks/{network}/tokens/{token_address}/pools`](/api-reference/tokens/get-token-pools-on-a-network)
Gets a list of liquidity pools that include the specified token.
**Parameters:**
* `networkId`\* - ID of the network
* `tokenAddress`\* - Token address or identifier
* `options` - Options object for filtering, pagination and sorting:
* `page` - Page number for pagination
* `limit` - Number of pools per page
* `sort` - Sort direction ('asc' or 'desc')
* `orderBy` - Field to sort by ('volume\_usd', 'liquidity\_usd', etc.)
* `pairWith` - Optional second token address to filter for specific pairs
**Returns:** Paginated list of pools containing the specified token.
```javascript
// Find WETH/USDC pools on Ethereum
const wethAddress = '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2';
const usdcAddress = '0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48';
const pools = await client.tokens.getPools(
'ethereum',
wethAddress,
{
limit: 5,
orderBy: 'volume_usd',
sort: 'desc',
pairWith: usdcAddress
}
);
pools.data.forEach(pool => {
console.log(`${pool.dex.name}: $${pool.volume_usd_24h} 24h volume`);
});
```
### client.search.search(query)
**Endpoint:** [GET `/search`](/api-reference/search/search-for-tokens-pools-and-dexes)
Searches for tokens, pools, and DEXes by name or identifier.
**Parameters:**
* `query`\* - Search term (e.g., "uniswap", "bitcoin", or a token address)
**Returns:** Search results organized by category (tokens, pools, DEXes).
```javascript
// Search for "ethereum"
const results = await client.search.search('ethereum');
console.log(`Found ${results.tokens.length} tokens`);
console.log(`Found ${results.pools.length} pools`);
console.log(`Found ${results.dexes.length} DEXes`);
```
### client.utils.getStats()
**Endpoint:** [GET `/stats`](/api-reference/utils/get-stats)
Gets high-level statistics about the DexPaprika ecosystem.
**Parameters:** None
**Returns:** Statistics about chains, DEXes, pools, and tokens.
```javascript
const stats = await client.utils.getStats();
console.log(`Networks: ${stats.networks}`);
console.log(`DEXes: ${stats.dexes}`);
console.log(`Pools: ${stats.pools}`);
console.log(`Tokens: ${stats.tokens}`);
```
## Complete Example
```javascript
import { DexPaprikaClient } from '@dexpaprika/sdk';
async function main() {
// Initialize client
const client = new DexPaprikaClient();
// Get Ethereum network details
const networks = await client.networks.list();
const ethereum = networks.find(n => n.id === 'ethereum');
console.log(`Found ${ethereum.name} with ${ethereum.dexes_count} DEXes`);
// Get WETH token details
const weth = await client.tokens.getDetails(
'ethereum',
'0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2'
);
console.log(`${weth.name} price: $${weth.price_usd}`);
// Find WETH/USDC pools
const pools = await client.tokens.getPools(
'ethereum',
'0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2', // WETH
{
limit: 5,
sort: 'desc',
orderBy: 'volume_usd',
pairWith: '0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48' // USDC
}
);
// Show top pools
console.log('Top WETH/USDC pools:');
pools.data.forEach(pool => {
console.log(`${pool.dex.name}: $${pool.volume_usd_24h.toLocaleString()} 24h volume`);
});
}
main().catch(console.error);
```
## Advanced Features
### Error Handling
```javascript
import { DexPaprikaClient, parseError } from '@dexpaprika/sdk';
// Basic error handling
try {
const client = new DexPaprikaClient();
const token = await client.tokens.getDetails('ethereum', '0xinvalidaddress');
} catch (error) {
// Using the helper to extract the most relevant error message
console.error('Error:', parseError(error));
// Or handle specific error cases manually
if (error.response?.status === 404) {
console.error('Resource not found');
} else if (error.response?.status === 429) {
console.error('Rate limit exceeded');
}
}
// The SDK automatically retries on these status codes:
// 408 (Request Timeout), 429 (Too Many Requests),
// 500, 502, 503, 504 (Server Errors)
// Custom retry configuration
const client = new DexPaprikaClient('https://api.dexpaprika.com', {}, {
retry: {
maxRetries: 3,
delaySequenceMs: [200, 500, 1000],
retryableStatuses: [429, 500, 503]
}
});
```
### Caching
```javascript
import { DexPaprikaClient, Cache } from '@dexpaprika/sdk';
// The SDK includes built-in caching by default
// This example shows how to configure it
// Configure caching with custom settings
const client = new DexPaprikaClient('https://api.dexpaprika.com', {}, {
cache: {
ttl: 60 * 1000, // 1 minute cache TTL (default is 5 minutes)
maxSize: 100, // Store up to 100 responses (default is 1000)
enabled: true // Enable caching (enabled by default)
}
});
// Demonstration of cache behavior
async function demonstrateCaching() {
console.time('First call');
await client.networks.list(); // Makes an API request
console.timeEnd('First call');
console.time('Second call');
await client.networks.list(); // Returns cached result
console.timeEnd('Second call');
// You can also manage the cache manually
client.clearCache(); // Clear all cached data
console.log(client.cacheSize); // Get current cache size
client.setCacheEnabled(false); // Disable caching
}
// Using the Cache class directly
const manualCache = new Cache({
ttl: 30 * 1000, // 30 second TTL
maxSize: 50 // Store maximum 50 items
});
manualCache.set('myKey', { data: 'example data' });
const data = manualCache.get('myKey');
manualCache.delete('myKey');
manualCache.clear();
```
### Pagination Helper
```javascript
import { DexPaprikaClient } from '@dexpaprika/sdk';
async function fetchAllPools(networkId) {
const client = new DexPaprikaClient();
const allPools = [];
let page = 0;
const limit = 50;
let hasMore = true;
while (hasMore) {
const response = await client.pools.listByNetwork(
networkId,
{
page,
limit,
sort: 'desc',
orderBy: 'volume_usd'
}
);
allPools.push(...response.data);
// Check if there are more pages
hasMore = response.data.length === limit;
page++;
// Adding a small delay to avoid hitting rate limits
await new Promise(resolve => setTimeout(resolve, 100));
}
console.log(`Fetched a total of ${allPools.length} pools on ${networkId}`);
return allPools;
}
// Usage example
fetchAllPools('ethereum').catch(console.error);
```
### Custom Configuration
```javascript
import { DexPaprikaClient } from '@dexpaprika/sdk';
import axios from 'axios';
// Create a custom axios instance
const axiosInstance = axios.create({
timeout: 60000, // 60 second timeout
headers: {
'User-Agent': 'MyApp/1.0 DexPaprikaSDK'
}
});
// Create client with custom configuration
const client = new DexPaprikaClient(
'https://api.dexpaprika.com', // Base URL (optional)
axiosInstance, // Custom axios instance (optional)
{
// Retry configuration (optional)
retry: {
maxRetries: 5,
delaySequenceMs: [100, 500, 1000, 2000, 5000],
retryableStatuses: [429, 500, 502, 503, 504]
},
// Cache configuration (optional)
cache: {
ttl: 10 * 60 * 1000, // 10 minutes TTL
maxSize: 500, // Store up to 500 responses
enabled: true // Enable caching
}
}
);
// Usage example
async function fetchWithCustomClient() {
try {
const networks = await client.networks.list();
console.log(`Fetched ${networks.length} networks with custom client`);
} catch (err) {
console.error('Error:', err);
}
}
```
## Resources
* [GitHub Repository](https://github.com/coinpaprika/dexpaprika-sdk-js)
* [DexPaprika Website](https://dexpaprika.com)
* [API Reference](/api-reference/introduction)
## API Status
The DexPaprika API provides consistent data with stable endpoints. No API key is currently required to access the service. We aim to maintain backward compatibility and provide notice of any significant changes.
# Welcome to DexPaprika API documentation!
Source: https://docs.dexpaprika.com/introduction
DexPaprika provides real-time access to comprehensive DEX data, including token prices, liquidity pools, and trading volumes. Whether you're building a trading dashboard, portfolio tracker, or DeFi application, we've got you covered.
Jump straight into the API documentation and start making requests within minutes
Learn step-by-step how to integrate DexPaprika into your applications
## Let's make your first API call in 3 steps!
You can also make your first request directly in our API Playground available in the API Reference section. Visit [GET Token](/api-reference/tokens/get-a-tokens-latest-data-on-a-network) to try it out.
In the next steps, we will make a GET request to the [Token](/api-reference/tokens/get-a-tokens-latest-data-on-a-network) endpoint to get the latest price of SOL in USD. All we need is the network ID and the token address.
* Network ID: `solana`
* Token address: `So11111111111111111111111111111111111111112`
You can find the list of all supported networks in the [Networks](/api-reference/networks/get-a-list-of-available-blockchain-networks) endpoint.
Simply make a GET request to the following endpoint to fetch the latest data about any token. In this case, we will use the network ID and token address specified above to populate the endpoint in the format `https://api.dexpaprika.com/networks/{network_id}/tokens/{address}`.
```bash bash
curl -X GET "https://api.dexpaprika.com/networks/solana/tokens/So11111111111111111111111111111111111111112"
```
```python python.py
import requests
response = requests.get("https://api.dexpaprika.com/networks/solana/tokens/So11111111111111111111111111111111111111112")
print(response.json())
```
```javascript javascript.js
fetch("https://api.dexpaprika.com/networks/solana/tokens/So11111111111111111111111111111111111111112")
.then(response => response.json())
.then(data => console.log(data));
```
This will return a fairly lengthy response with the latest data about Solana (SOL), similar to the one below. Since the response contains a lot of data, in the next step we will extract only the price of SOL in USD.
```json Response [expandable]
{
"id": "So11111111111111111111111111111111111111112",
"name": "Wrapped SOL",
"symbol": "SOL",
"chain": "solana",
"decimals": 9,
"total_supply": 0,
"description": "",
"website": "",
"explorer": "",
"added_at": "2024-10-04T08:30:05Z",
"summary": {
"main_pool": "Czfq3xZZDmsdGdUyrNLtRhGc47cXcZtLG4crryfu44zE",
"price": 164.9536657855954,
"price_usd": 165.0363416231933,
"liquidity_usd": 21277129298.9034355594499,
"24h": {
"volume": 18207227.90283451,
"volume_usd": 3172926820.723157,
"sell": 11953376,
"buy": 8641848,
"txns": 20595327
},
"6h": {
"volume": 5237173.683970005,
"volume_usd": 882506167.8594747,
"sell": 3309487,
"buy": 2416834,
"txns": 5726323
},
"1h": {
"volume": 670087.840859647,
"volume_usd": 119985216.45169246,
"sell": 519919,
"buy": 343819,
"txns": 863738
},
"30m": {
"volume": 327613.1220672249,
"volume_usd": 54505021.9229673,
"sell": 245850,
"buy": 168366,
"txns": 414216
},
"15m": {
"volume": 168834.195930299,
"volume_usd": 28068199.473847672,
"sell": 124337,
"buy": 82098,
"txns": 206435
},
"5m": {
"volume": 64118.999946111,
"volume_usd": 10629376.9487243,
"sell": 45027,
"buy": 30474,
"txns": 75501
}
},
"last_updated": "2025-02-18T20:36:46.946687298Z"
}
```
As you can see, the response is very rich in data. You can extract only the price of SOL in USD by navigating to the `summary` object (*or any other object*) and then to the `price_usd` key (*or any other key*), simply by modifying the previous GET request:
```bash bash
curl -s "https://api.dexpaprika.com/networks/solana/tokens/So11111111111111111111111111111111111111112" | jq '.summary.price_usd'
```
```python python.py
import requests
response = requests.get("https://api.dexpaprika.com/networks/solana/tokens/So11111111111111111111111111111111111111112")
print(response.json()["summary"]["price_usd"])
```
```javascript javascript.js
fetch("https://api.dexpaprika.com/networks/solana/tokens/So11111111111111111111111111111111111111112")
.then(response => response.json())
.then(data => {
const onlyPrice = data.summary.price_usd;
console.log(onlyPrice);
});
```
And as a result you will get the price of SOL in USD:
```json summary.price_usd
165.0363416231933
```
Congratulations! You just successfully retrieved the latest price of SOL in USD!
## Next Steps
Jump straight into the API documentation and start making requests within minutes
Learn step-by-step how to integrate DexPaprika into your applications
## Popular Use Cases
Get real-time and historical price data for any Solana token
Access detailed liquidity pool data across major Solana DEXes
Analyze trading volumes, price changes, and market trends
Compare prices and liquidity across different DEXes
## Get Support
We're here to help you succeed with DexPaprika.
Connect with our community and get real-time support
Share your experience and help us improve
**Looking for enterprise solutions?** We offer dedicated support, higher rate limits, and custom features.
[Contact our team](mailto:support@coinpaprika.com) to learn more.
# Build a Crypto Price Alert Bot
Source: https://docs.dexpaprika.com/tutorials/crypto-alert-bot
Learn how to create a real-time cryptocurrency price alert system using DexPaprika API and Telegram
The DexPaprika API provides reliable data access. If you find any issues or have suggestions for improvement, please [contact us](mailto:support@coinpaprika.com).
## Overview
This tutorial guides you through building a **real-time cryptocurrency price alert system** that monitors prices using the **DexPaprika API** and sends notifications to your Telegram when price thresholds are met. Perfect for traders and developers who want to stay updated on market movements without constant manual checking.
The complete code for this tutorial is available on [GitHub](https://github.com/coinpaprika/tutorials/tree/main/crypto-alert-bot).
***
## Features
* Track any cryptocurrency available on DexPaprika API
* Set custom price thresholds for buy/sell opportunities
* Get instant alerts when prices go above or below your targets
* Configure check intervals to match your trading strategy
* Receive notifications directly on Telegram
***
## Prerequisites
* Node.js (v14 or higher)
* npm (Node Package Manager)
* A Telegram account
* A Telegram Bot (created using BotFather)
***
## Step 1: Create Your Telegram Bot
1. Open Telegram and search for "BotFather" (@BotFather)
2. Start a chat and send the command `/newbot`
3. Follow the instructions to create your bot
4. Save the **bot token** BotFather provides you
***
## Step 2: Get Your Telegram Chat ID
1. Start a conversation with your new bot
2. Send any message to your bot
3. Visit this URL in your browser (replace with your actual token):
```bash
https://api.telegram.org/botYOUR_BOT_TOKEN/getUpdates
```
4. Find the `"chat":{"id":XXXXXXXXX,` value in the response — this is your **chat ID**
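If you'd rather grab the chat ID from Node.js than from the browser, a small script can call the same `getUpdates` endpoint. This is a sketch: `YOUR_BOT_TOKEN` is a placeholder for the token BotFather gave you, and the built-in `fetch` requires Node 18+.

```javascript
// Pull the unique chat IDs out of a getUpdates response payload
function extractChatIds(payload) {
  const ids = (payload.result || [])
    .map(update => update.message?.chat?.id)
    .filter(id => id !== undefined);
  return [...new Set(ids)];
}

// Ask the Telegram Bot API for recent updates and print the chat IDs
async function printChatIds(token) {
  const res = await fetch(`https://api.telegram.org/bot${token}/getUpdates`);
  const payload = await res.json();
  console.log('Chat IDs:', extractChatIds(payload));
}

// Uncomment and insert your real token to run:
// printChatIds('YOUR_BOT_TOKEN').catch(console.error);
```

Remember to send your bot at least one message first, otherwise `getUpdates` returns an empty `result` array.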
***
## Step 3: Set Up the Project
1. Clone the repository or set up a new project:
```bash
git clone https://github.com/coinpaprika/tutorials.git
cd tutorials/crypto-alert-bot
# OR
mkdir crypto-alert-bot
cd crypto-alert-bot
npm init -y
```
2. Install required dependencies:
```bash
npm install axios dotenv node-telegram-bot-api
```
3. Create the following files in your project directory:
* `.env` (configuration file)
* `index.js` (main application)
***
## Step 4: Configure Your Settings
Create a `.env` file in the project directory with the following parameters:
```
# Telegram Bot Token (Get this from BotFather on Telegram)
TELEGRAM_BOT_TOKEN=your_telegram_bot_token_here
# Telegram Chat ID (The chat where alerts will be sent)
TELEGRAM_CHAT_ID=your_telegram_chat_id_here
# Cryptocurrency to track (token address)
CRYPTO_TOKEN_ID=So11111111111111111111111111111111111111112
CRYPTO_NETWORK=solana
# Price threshold for alert (in USD)
TARGET_PRICE=135
# Alert type: "above" or "below" - to trigger when price goes above or below target
ALERT_TYPE=above
# How often to check price (in minutes)
CHECK_INTERVAL=1
```
Make sure to replace placeholder values with your actual configuration details.
***
## Step 5: Create the Alert Bot
Create an `index.js` file with the following code:
```javascript expandable [expandable]
require('dotenv').config();
const axios = require('axios');
const TelegramBot = require('node-telegram-bot-api');
// Configuration from .env file
const TELEGRAM_BOT_TOKEN = process.env.TELEGRAM_BOT_TOKEN;
const TELEGRAM_CHAT_ID = process.env.TELEGRAM_CHAT_ID;
const CRYPTO_TOKEN_ID = process.env.CRYPTO_TOKEN_ID;
const CRYPTO_NETWORK = process.env.CRYPTO_NETWORK;
const TARGET_PRICE = parseFloat(process.env.TARGET_PRICE);
const ALERT_TYPE = (process.env.ALERT_TYPE || '').toLowerCase(); // Guard against a missing value before validation
const CHECK_INTERVAL = parseInt(process.env.CHECK_INTERVAL) * 60 * 1000; // Convert minutes to milliseconds
// Initialize Telegram Bot
const bot = new TelegramBot(TELEGRAM_BOT_TOKEN);
// Validate configuration
if (!TELEGRAM_BOT_TOKEN || !TELEGRAM_CHAT_ID || !CRYPTO_TOKEN_ID || !CRYPTO_NETWORK ||
isNaN(TARGET_PRICE) || !ALERT_TYPE || isNaN(CHECK_INTERVAL)) {
console.error('Invalid configuration. Please check your .env file.');
process.exit(1);
}
// Send startup message
bot.sendMessage(TELEGRAM_CHAT_ID,
`🤖 Crypto Alert Bot Started!\n\n` +
`Monitoring: ${CRYPTO_TOKEN_ID} on ${CRYPTO_NETWORK}\n` +
`Alert when price goes ${ALERT_TYPE} $${TARGET_PRICE}\n` +
`Checking every ${CHECK_INTERVAL / 60000} minute(s)`
);
// Variables to track alert state
let alertSent = false;
let lastPrice = 0;
// Main function to check price and send alerts
async function checkPrice() {
try {
// Fetch current price from DexPaprika API
const response = await axios.get(
`https://api.dexpaprika.com/networks/${CRYPTO_NETWORK}/tokens/${CRYPTO_TOKEN_ID}`
);
// Extract price from response
const currentPrice = response.data.summary.price_usd;
lastPrice = currentPrice;
console.log(`Current price of ${CRYPTO_TOKEN_ID}: $${currentPrice}`);
// Check if alert condition is met
let shouldAlert = false;
if (ALERT_TYPE === 'above' && currentPrice > TARGET_PRICE) {
shouldAlert = true;
} else if (ALERT_TYPE === 'below' && currentPrice < TARGET_PRICE) {
shouldAlert = true;
}
// Send alert if condition is met and no alert was sent before
if (shouldAlert && !alertSent) {
const message =
`🚨 PRICE ALERT 🚨\n\n` +
`${response.data.name} (${response.data.symbol})\n` +
`Current Price: $${currentPrice}\n` +
`Target: ${ALERT_TYPE} $${TARGET_PRICE}\n\n` +
`Condition met! 🎯`;
bot.sendMessage(TELEGRAM_CHAT_ID, message);
alertSent = true;
console.log('Alert sent!');
}
// Reset alert flag if price goes back on the other side of the threshold
if ((ALERT_TYPE === 'above' && currentPrice < TARGET_PRICE) ||
(ALERT_TYPE === 'below' && currentPrice > TARGET_PRICE)) {
alertSent = false;
}
} catch (error) {
console.error('Error checking price:', error.message);
// Send error notification if API fails
if (error.response) {
bot.sendMessage(TELEGRAM_CHAT_ID,
`⚠️ Error checking price: ${error.response.status} - ${error.response.statusText}`);
} else {
bot.sendMessage(TELEGRAM_CHAT_ID,
`⚠️ Error checking price: ${error.message}`);
}
}
}
// Run the price check immediately
checkPrice();
// Then set up interval to check regularly
setInterval(checkPrice, CHECK_INTERVAL);
console.log(`Bot is running. Checking ${CRYPTO_TOKEN_ID} every ${CHECK_INTERVAL / 60000} minute(s).`);
```
***
## Step 6: Finding the Right Token Address
Need to track a different token? Use DexPaprika API to find its address:
1. List available networks:
```bash
curl -X GET "https://api.dexpaprika.com/networks" | jq
```
2. Search for your token:
```bash
curl -X GET "https://api.dexpaprika.com/search?query=YOUR_TOKEN_NAME" | jq
```
***
## Step 7: Running the Bot
1. Start the bot:
```bash
node index.js
```
2. You'll receive a confirmation message on Telegram.
3. The bot will check prices at your specified interval.
4. When your price condition is met, you'll get an alert.
***
## Running as a Background Service
### On Linux/Mac:
```bash
npm install -g pm2
pm2 start index.js --name crypto-alert
pm2 save
```
### On Windows:
```bash
npm install -g forever
forever start index.js
```
***
## How It Works
1. The application connects to DexPaprika API to retrieve real-time token prices
2. It compares the current price against your target threshold
3. When the condition is met, it sends an immediate alert via the Telegram Bot API
4. The process repeats based on your configured check interval
***
## Troubleshooting
* Not receiving messages? Double-check your bot token and chat ID
* Ensure your network/token combination is valid in DexPaprika
* Check console output for any error messages
***
## Next Steps
Extend the code to monitor multiple tokens or set different thresholds.
Build a visual interface to manage your alerts and view price history.
**Share Your Extensions!** Built something cool by extending this tutorial? We'd love to see it!
Share your work on our [Discord](https://discord.gg/DhJge5TUGM) - your tutorial might be featured on our website.
Ideas to try: smart trend alerts, multi-token tracking, or historical data analysis.
***
## Get Support
Connect with our community and get real-time support.
Share your experience and help us improve.
# Fetching Token Prices
Source: https://docs.dexpaprika.com/tutorials/fetch-token-price
Learn how to retrieve the price of any token using DexPaprika API with simple curl commands.
The DexPaprika API provides reliable data access. If you find any issues or have suggestions for improvement, please [contact us](mailto:support@coinpaprika.com).
## Fetching Token Prices
This tutorial will walk you through retrieving the **latest price of any token** using **DexPaprika API**. We will use simple `cURL` commands to interact with the API.
You can test the API directly in the documentation without writing any code. Visit the [API Reference](/api-reference/introduction) to try it out.
***
## Step 1: Get Available Networks
To fetch token data, you need the **network ID**. Use this request to get a list of supported blockchain networks:
```bash
curl -X GET "https://api.dexpaprika.com/networks" | jq
```
The response will include networks like **Solana, Base, Aptos, Ethereum**, etc. Choose the one you need.
You can find the full list of supported networks in the [Networks API](/api-reference/networks/get-a-list-of-available-blockchain-networks).
***
## Step 2: Find a Token Address
If you don't have a token address, you can search for it using the API:
```bash
curl -X GET "https://api.dexpaprika.com/search?query=YOUR_INPUT"
```
Replace `YOUR_INPUT` with the phrase you're looking for. The response will return matching tokens and pools along with their addresses.
***
## Step 3: Fetch the Token Price
Once you have the **network ID** and **token address**, use the following request to get the token's price:
```bash
curl -X GET "https://api.dexpaprika.com/networks/{network}/tokens/{token_address}"
```
Replace:
* `{network}` with the network (e.g., `solana`, `base`)
* `{token_address}` with the address found in Step 2
The response will return data like this:
```json Response [expandable]
{
"id": "JUPyiwrYJFskUPiHa7hkeR8VUtAeFoSYbKedZNsDvCN",
"name": "Jupiter",
"symbol": "JUP",
"chain": "solana",
"decimals": 6,
"total_supply": 9999979509174084,
"description": "",
"website": "",
"explorer": "",
"added_at": "2024-09-11T04:37:20Z",
"summary": {
"price_usd": 0.6863252359922881,
"fdv": 6863238296.5519485244070243816004,
"liquidity_usd": 25575125.4495078768612017,
"24h": {
"volume": 80207699.45778705,
"volume_usd": 55796800.523819186,
"sell": 106864,
"buy": 57315,
"txns": 164179
},
"6h": {
"volume": 11540575.177305005,
"volume_usd": 8037337.331943456,
"sell": 17801,
"buy": 9926,
"txns": 27727
},
"1h": {
"volume": 2766848.754695,
"volume_usd": 1900837.0877990713,
"sell": 3484,
"buy": 2082,
"txns": 5566
},
"30m": {
"volume": 1394651.1182109998,
"volume_usd": 954794.8829624434,
"sell": 1907,
"buy": 1198,
"txns": 3105
},
"15m": {
"volume": 373588.37757400004,
"volume_usd": 255759.0367338767,
"sell": 742,
"buy": 316,
"txns": 1058
},
"5m": {
"volume": 109317.68508500002,
"volume_usd": 74963.92390747965,
"sell": 265,
"buy": 86,
"txns": 351
}
},
"last_updated": "2025-02-26T13:11:25.858732857Z"
}
```
***
## Step 4: Extract Only the Price
If you only need the **token price in USD**, you can filter the response using `jq`:
```bash
curl -X GET "https://api.dexpaprika.com/networks/{network}/tokens/{token_address}" | jq '.summary.price_usd'
```
This will return only the price:
```json
0.6863252359922881
```
***
## Next Steps
Jump straight into the API documentation and start making requests within minutes.
Learn more ways to integrate DexPaprika into your applications.
## Get Support
Connect with our community and get real-time support.
Share your experience and help us improve.
# Find New Crypto Pools and Token Launches
Source: https://docs.dexpaprika.com/tutorials/find-new-pools
Discover newly created liquidity pools across 20+ blockchains. Perfect for finding new tokens, arbitrage opportunities, and emerging DeFi projects early.
Hitting any snags? We've got your back - [reach out](mailto:support@coinpaprika.com) and we'll help you get this working.
## Why Monitor New Pools?
New liquidity pools are where the action happens in DeFi. They signal new token launches, liquidity migrations, and fresh trading opportunities. Being first to spot them can be a serious advantage.
**What you can catch early:**
* **New token launches** before they hit major trackers
* **Arbitrage opportunities** between pools
* **Liquidity migrations** to better DEXes
* **Emerging projects** before they explode
**TL;DR**: Jump to [Step 2](#step-2-get-newest-pools) if you want to start pulling new pools immediately.
***
## Step 1: Pick Your Networks
First, see what blockchains you want to monitor using the [Networks API](/api-reference/networks/get-a-list-of-available-blockchain-networks):
```bash
curl "https://api.dexpaprika.com/networks" | jq '.[] | {id: .id, name: .display_name}'
```
***
## Step 2: Get Newest Pools
Here's the money shot - getting pools sorted by creation date using the [Network Pools API](/api-reference/pools/get-top-x-pools-on-a-network):
```bash
curl "https://api.dexpaprika.com/networks/ethereum/pools?orderBy=created_at&sort=desc&limit=10" | jq
```
### What These Parameters Do:
| Parameter | Effect | Pro Tip |
| -------------------- | -------------------------- | --------------------------------------- |
| `orderBy=created_at` | Sort by when pool was made | Only way to find truly new pools |
| `sort=desc` | Newest first | Use `asc` for historical analysis |
| `limit=10` | How many pools to return | Max is 100, but 10-20 is usually enough |
| `page=0` | Pagination | Increase for deeper history |
### What You Get:
```json
{
"pools": [
{
"id": "0x462229e7fc9e6cab0ebbd643cc6dfef2a5261ee9",
"dex_name": "Uniswap V2",
"created_at": "2025-06-09T09:28:35Z",
"volume_usd": 767.17,
"transactions": 5,
"tokens": [
{
"symbol": "🧸TEDDYS",
"name": "🧸TeddySwap",
"fdv": 18769.56
},
{
"symbol": "WETH",
"fdv": 6617036250.33
}
]
}
]
}
```
**Red flags to watch for:**
* Crazy high FDV on new tokens (possible honeypot)
* Zero transactions after hours (might be a test pool)
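These heuristics are easy to turn into code. The sketch below screens a list of pools from the response above for red flags; the field names (`volume_usd`, `transactions`, `tokens[].fdv`) follow the sample response, and the thresholds are illustrative, not official guidance.

```javascript
// Naive red-flag screen for newly created pools.
// Thresholds are illustrative; tune them for your own strategy.
function flagSuspiciousPools(pools, { maxTokenFdv = 1e8, minTxns = 1 } = {}) {
  return pools
    .map(pool => {
      const flags = [];
      // Crazy high FDV on a token in a brand-new pool (possible honeypot).
      // Note: established quote tokens like WETH also have huge FDVs,
      // so pair this with an allowlist in practice.
      if (pool.tokens?.some(t => typeof t.fdv === 'number' && t.fdv > maxTokenFdv)) {
        flags.push('high-fdv');
      }
      // Zero transactions (might be a test pool)
      if ((pool.transactions ?? 0) < minTxns) {
        flags.push('no-activity');
      }
      return { id: pool.id, flags };
    })
    .filter(p => p.flags.length > 0);
}
```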
***
## Focus on Specific DEXes
Want to monitor just Uniswap or Raydium? First get the DEX list using the [Network DEXes API](/api-reference/dexes/get-a-list-of-available-dexes-on-a-network):
```bash
curl "https://api.dexpaprika.com/networks/ethereum/dexes" | jq '.[] | {id: .id, name: .name}'
```
Then filter new pools by DEX using the [DEX Pools API](/api-reference/pools/get-top-x-pools-on-a-networks-dex):
```bash
curl "https://api.dexpaprika.com/networks/ethereum/dexes/uniswap_v3/pools?orderBy=created_at&sort=desc&limit=10" | jq
```
***
## Multi-Chain Monitoring
Real pros monitor multiple chains simultaneously. Here's how:
### Ethereum
```bash
curl "https://api.dexpaprika.com/networks/ethereum/pools?orderBy=created_at&sort=desc&limit=5" | jq '.pools[] | {chain: "ethereum", pool: .id, created: .created_at, volume: .volume_usd, tokens: [.tokens[].symbol]}'
```
### Solana
```bash
curl "https://api.dexpaprika.com/networks/solana/pools?orderBy=created_at&sort=desc&limit=5" | jq '.pools[] | {chain: "solana", pool: .id, created: .created_at, volume: .volume_usd, tokens: [.tokens[].symbol]}'
```
### Base
```bash
curl "https://api.dexpaprika.com/networks/base/pools?orderBy=created_at&sort=desc&limit=5" | jq '.pools[] | {chain: "base", pool: .id, created: .created_at, volume: .volume_usd, tokens: [.tokens[].symbol]}'
```
**Pro tip**: Use different limits based on chain activity. Solana might need `limit=20` because of memecoin amounts, while Ethereum `limit=5` catches the important stuff.
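The same multi-chain sweep works from Node.js if you'd rather not shell out to `curl`. This is a sketch against the public REST endpoint: the per-chain limits are the illustrative values suggested above, and the built-in `fetch` requires Node 18+.

```javascript
// Shape one raw pool object into the summary we care about
function summarizePool(chain, pool) {
  return {
    chain,
    pool: pool.id,
    created: pool.created_at,
    volume: pool.volume_usd,
    tokens: (pool.tokens || []).map(t => t.symbol),
  };
}

// Per-chain limits: busier chains get a wider net
const CHAINS = { ethereum: 5, base: 5, solana: 20 };

async function sweep() {
  for (const [chain, limit] of Object.entries(CHAINS)) {
    const url = `https://api.dexpaprika.com/networks/${chain}/pools` +
                `?orderBy=created_at&sort=desc&limit=${limit}`;
    const res = await fetch(url);
    if (!res.ok) throw new Error(`${chain}: HTTP ${res.status}`);
    const { pools } = await res.json();
    console.log(`--- ${chain} ---`);
    pools.forEach(p => console.log(summarizePool(chain, p)));
  }
}

// Uncomment to run:
// sweep().catch(console.error);
```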
***
## Smart Filtering Strategies
### Only High-Activity Pools
Skip the dead pools - focus on ones with real trading:
```bash
curl "https://api.dexpaprika.com/networks/ethereum/pools?orderBy=created_at&sort=desc&limit=50" | jq '.pools[] | select(.volume_usd > 1000 and .transactions > 10) | {id: .id, created: .created_at, volume: .volume_usd, txns: .transactions}'
```
### Last 24 Hours Only
Filter by timestamp to get super fresh pools:
```bash
# Get pools from the last 24 hours (GNU date shown; on macOS try: date -v-1d -Iseconds)
curl "https://api.dexpaprika.com/networks/ethereum/pools?orderBy=created_at&sort=desc&limit=100" | jq --arg yesterday "$(date -d '1 day ago' -Iseconds)" '.pools[] | select(.created_at > $yesterday) | {id: .id, age: .created_at, volume: .volume_usd}'
```
### New Token Launches
Look for pools where the tokens themselves are brand new:
```bash
curl "https://api.dexpaprika.com/networks/ethereum/pools?orderBy=created_at&sort=desc&limit=20" | jq '.pools[] | select(.tokens[] | .added_at > "2025-01-25T00:00:00Z") | {pool: .id, pool_age: .created_at, new_tokens: [.tokens[] | select(.added_at > "2025-01-25T00:00:00Z") | .symbol]}'
```
***
## Production Monitoring Setup
### Simple Monitoring Script
Here's a bash script that checks for new pools every 5 minutes:
```bash
#!/bin/bash
# Monitor new pools across multiple chains
CHAINS=("ethereum" "solana" "base")
MIN_VOLUME=5000
while true; do
echo "🔍 Checking for new pools at $(date)"
for chain in "${CHAINS[@]}"; do
echo "--- $chain ---"
# Get new pools with decent volume
curl -s "https://api.dexpaprika.com/networks/$chain/pools?orderBy=created_at&sort=desc&limit=10" | \
jq --argjson min_vol $MIN_VOLUME '.pools[] | select(.volume_usd > $min_vol) | {
pool_id: .id,
chain: "'$chain'",
created: .created_at,
volume: .volume_usd,
tokens: [.tokens[].symbol]
}'
done
echo "⏰ Sleeping for 5 minutes..."
sleep 300
done
```
### Alert on High-Volume New Pools
Catch the big moves automatically:
```bash
# Check for new pools with serious volume
NEW_POOLS=$(curl -s "https://api.dexpaprika.com/networks/ethereum/pools?orderBy=created_at&sort=desc&limit=20" | jq '.pools[] | select(.volume_usd > 50000)')
if [ -n "$NEW_POOLS" ]; then
echo "🚨 HIGH VOLUME NEW POOL DETECTED!"
echo "$NEW_POOLS"
# Add your notification here (Discord webhook, Slack, email, etc.)
# curl -X POST "YOUR_DISCORD_WEBHOOK" -d "{\"content\": \"New high-volume pool: $NEW_POOLS\"}"
fi
```
***
## Troubleshooting Common Issues
### "No New Pools Found"
Check if you're looking at the right timeframe:
```bash
# See when the last pool was created
curl "https://api.dexpaprika.com/networks/ethereum/pools?orderBy=created_at&sort=desc&limit=1" | jq '.pools[0].created_at'
```
### "Too Much Spam"
Filter out low-quality pools:
```bash
# More restrictive filtering
curl "https://api.dexpaprika.com/networks/solana/pools?orderBy=created_at&sort=desc&limit=50" | jq '.pools[] | select(.transactions > 20 and .volume_usd > 5000 and (.tokens[0].fdv < 100000000 or .tokens[1].fdv < 100000000))'
```
### "Missing Opportunities"
You might need to check more frequently or cast a wider net:
```bash
# Check multiple DEXes on one chain
for dex in uniswap_v3 uniswap_v2 sushiswap; do
echo "=== $dex ==="
curl -s "https://api.dexpaprika.com/networks/ethereum/dexes/$dex/pools?orderBy=created_at&sort=desc&limit=3" | jq '.pools[0] | {dex: "'$dex'", created: .created_at, tokens: [.tokens[].symbol]}'
done
```
***
## What's Next?
Analyze new pools with price history.
Monitor trading activity in real-time.
Get comprehensive pool information and metadata.
Find pools, tokens, and DEXes across all networks.
## Need Help?
Share strategies and get help from other builders.
Stuck on something? We'll help you figure it out.
# Local crypto analytics with DuckDB
Source: https://docs.dexpaprika.com/tutorials/local-analytics-with-duckdb
Build a powerful, local crypto analytics database with DuckDB. This tutorial guides you through creating an ETL pipeline for Uniswap v3 data to run complex SQL queries instantly.
## The local powerhouse: A core pattern for on-chain analysis
Why make thousands of slow, rate-limited API calls when you can run complex SQL queries instantly on your own machine? This tutorial introduces the most effective pattern for crypto data analysis: creating a local, high-performance copy of a complete on-chain dataset. By fetching the data once, you unlock unlimited, high-speed analytical capabilities.
Looking for other analytics solutions? Check out our full list of [API Tutorials](/tutorials/tutorial_intro) for more step-by-step guides.
Our "free tier" isn't about delayed or incomplete data; it's about providing a **full, live, but scope-limited dataset** for you to master. We're giving you the tools, but it's up to you to use them in the way that makes the most sense for your project.
```mermaid
graph TD;
subgraph "Step 1: Data Pipeline";
A["Run Python ETL Script"] --> B{"Fetches Pool & OHLCV Data"};
B --> C["DexPaprika API"];
C --> B;
B --> D["Local DuckDB File (uniswap_v3.db)"];
end
subgraph "Step 2 & 3: Analysis";
D --> E{"Query the Database"};
E --> F["Option A: Directly with SQL"];
E --> G["Option B: AI Assistant via MCP"];
end
```
**The goal:**
By the end of this guide, you will have a local `uniswap_v3.db` file containing the top Uniswap v3 pools on Ethereum and their recent trading history. You will be able to:
1. Run a robust, high-performance ETL script that pulls a complete dataset from the DexPaprika API.
2. Perform complex SQL queries on this data instantly, without rate limits.
3. Understand a professional workflow for acquiring and analyzing on-chain data.
**Why this is a foundational skill:**
* **Eliminates Rate Limiting:** Instead of thousands of small, repetitive API calls, you perform one efficient batch download.
* **Unlocks True Analytical Power:** Run complex joins, aggregations, and window functions that are impossible with a simple API.
* **Creates a Foundation:** The skills you learn here can be applied to any data source, preparing you for more advanced, real-time analysis.
1. Create a Python script to fetch complete Uniswap v3 pool and OHLCV data.
2. Populate your database and run complex SQL queries to find insights.
3. Connect your database to an AI assistant for natural language queries.
***
## Step 1: Build your local data pipeline
First, let's create a Python script to act as our ETL (Extract, Transform, Load) pipeline. This script will fetch the top pools (by volume) for **Uniswap v3 on Ethereum** along with their recent trading history, then load everything into a local DuckDB database file. It leverages two key endpoints: the [Top Pools on a DEX endpoint](/api-reference/pools/get-top-x-pools-on-a-networks-dex) to discover pools, and the [Pool OHLCV Data endpoint](/api-reference/pools/get-ohlcv-data-for-a-pool-pair) to fetch historical price data.
Create a new file named `build_uniswap_db.py`.
```python build_uniswap_db.py [expandable]
import duckdb
import pandas as pd
from datetime import datetime, timedelta
import logging
import os
import asyncio
import aiohttp
from typing import List, Dict
# --- Configuration ---
API_BASE_URL = "https://api.dexpaprika.com"
NETWORK = "ethereum"
DEX_ID = "uniswap_v3"
HISTORY_DAYS = 14 # Default days of OHLCV data to fetch
DB_FILE = "dbs/uniswap_v3.db"
INTERVAL = "1h" # 1-hour intervals
OHLCV_API_LIMIT = 100 # Max records per API call
TOP_POOLS_LIMIT = 500 # Focus on top 500 pools by volume
CONCURRENT_REQUESTS = 3 # Number of concurrent API requests
BATCH_SIZE = 15 # Number of pools to process in each batch
# Setup logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
async def fetch_with_retry(session: aiohttp.ClientSession, url: str, params: Dict = None, retries=5, backoff_factor=1.0) -> Dict:
"""Generic async fetch function with exponential backoff."""
for attempt in range(retries):
try:
async with session.get(url, params=params, timeout=30) as response:
response.raise_for_status()
return await response.json()
except (aiohttp.ClientError, asyncio.TimeoutError) as e:
if attempt == retries - 1:
logging.error(f"Final attempt failed for {url}: {e}")
raise
sleep_time = backoff_factor * (2 ** attempt)
logging.warning(f"Request to {url} failed: {e}. Retrying in {sleep_time:.2f}s...")
await asyncio.sleep(sleep_time)
return {}
async def get_top_dex_pools(session: aiohttp.ClientSession, network: str, dex_id: str) -> List[Dict]:
"""Fetches top pools for a given DEX, handling pagination asynchronously."""
logging.info(f"Fetching top {TOP_POOLS_LIMIT} pools for {dex_id} on {network}...")
all_pools = []
page = 1
while len(all_pools) < TOP_POOLS_LIMIT:
url = f"{API_BASE_URL}/networks/{network}/dexes/{dex_id}/pools"
params = {"page": page, "limit": 100, "order_by": "volume_usd", "sort": "desc"}
try:
data = await fetch_with_retry(session, url, params=params)
pools = data.get('pools', [])
if not pools:
break
all_pools.extend(pools)
logging.info(f"Fetched page {page}, got {len(pools)} pools. Total: {len(all_pools)}")
page += 1
if len(all_pools) >= TOP_POOLS_LIMIT:
all_pools = all_pools[:TOP_POOLS_LIMIT]
break
await asyncio.sleep(0.5) # Be respectful to the API
except Exception as e:
logging.error(f"Error fetching page {page} for {dex_id} pools: {e}")
break
logging.info(f"Finished fetching pools. Total found: {len(all_pools)}")
return all_pools
async def get_pool_ohlcv(session: aiohttp.ClientSession, pool_address: str, pool_created_at: str, semaphore: asyncio.Semaphore) -> List[Dict]:
"""
Fetches 1-hour OHLCV data for a pool using an intelligent date range and dynamic windowing.
"""
async with semaphore:
logging.info(f"Fetching OHLCV for pool {pool_address}...")
final_end_time = datetime.utcnow()
# Use the later of: pool creation date or the default history window
start_time = final_end_time - timedelta(days=HISTORY_DAYS)
if pool_created_at:
try:
pool_creation = datetime.strptime(pool_created_at, '%Y-%m-%dT%H:%M:%SZ')
if pool_creation > start_time:
start_time = pool_creation
except (ValueError, TypeError):
logging.warning(f"Could not parse creation date '{pool_created_at}', using default {HISTORY_DAYS} days.")
all_ohlcv = []
current_start_time = start_time
# Calculate how much time each API call can cover
interval_hours = 1 # Based on "1h" interval
time_delta_per_call = timedelta(hours=OHLCV_API_LIMIT * interval_hours)
while current_start_time < final_end_time:
batch_end_time = min(current_start_time + time_delta_per_call, final_end_time)
url = f"{API_BASE_URL}/networks/{NETWORK}/pools/{pool_address}/ohlcv"
params = {
"start": current_start_time.strftime('%Y-%m-%dT%H:%M:%SZ'),
"end": batch_end_time.strftime('%Y-%m-%dT%H:%M:%SZ'),
"interval": INTERVAL,
"limit": OHLCV_API_LIMIT
}
try:
batch_data = await fetch_with_retry(session, url, params=params)
if batch_data:
for record in batch_data:
record['network'] = NETWORK
record['pool_address'] = pool_address
avg_price = (record.get('open', 0) + record.get('close', 0)) / 2
record['volume_usd'] = record.get('volume', 0) * avg_price if avg_price > 0 else 0
all_ohlcv.extend(batch_data)
except Exception as e:
logging.warning(f"Could not fetch OHLCV batch for {pool_address}: {e}")
current_start_time = batch_end_time
await asyncio.sleep(0.75) # Small delay to be respectful
logging.info(f"Successfully fetched {len(all_ohlcv)} OHLCV records for {pool_address}")
return all_ohlcv
async def main():
"""Main ETL function to build the local DuckDB database."""
os.makedirs("dbs", exist_ok=True)
async with aiohttp.ClientSession() as session:
pools = await get_top_dex_pools(session, NETWORK, DEX_ID)
all_ohlcv_data = []
semaphore = asyncio.Semaphore(CONCURRENT_REQUESTS)
for i in range(0, len(pools), BATCH_SIZE):
batch = pools[i:i+BATCH_SIZE]
tasks = [get_pool_ohlcv(session, p.get('id'), p.get('created_at'), semaphore) for p in batch if p.get('id')]
batch_num = (i // BATCH_SIZE) + 1
total_batches = (len(pools) + BATCH_SIZE - 1) // BATCH_SIZE
logging.info(f"--- Processing batch {batch_num}/{total_batches} ---")
results = await asyncio.gather(*tasks)
for res in results:
all_ohlcv_data.extend(res)
if i + BATCH_SIZE < len(pools):
logging.info(f"--- Finished batch {batch_num}, sleeping for 10 seconds ---")
await asyncio.sleep(10)
logging.info("ETL process finished. Loading data into DuckDB.")
con = duckdb.connect(database=DB_FILE, read_only=False)
if pools:
for pool in pools:
tokens = pool.get('tokens', [])
pool['token0_symbol'] = tokens[0]['symbol'] if len(tokens) > 0 else None
pool['token1_symbol'] = tokens[1]['symbol'] if len(tokens) > 1 else None
pools_df = pd.DataFrame(pools)
pools_df = pools_df[['id', 'dex_name', 'volume_usd', 'created_at', 'token0_symbol', 'token1_symbol']]
pools_df = pools_df.rename(columns={'id': 'address', 'volume_usd': 'volume_24h_usd'})
con.execute("CREATE OR REPLACE TABLE pools AS SELECT * FROM pools_df")
logging.info(f"Loaded {len(pools_df)} records into 'pools' table.")
if all_ohlcv_data:
ohlcv_df = pd.DataFrame(all_ohlcv_data)
ohlcv_df['timestamp'] = pd.to_datetime(ohlcv_df['time_close'])
ohlcv_df = ohlcv_df[['timestamp', 'network', 'pool_address', 'open', 'high', 'low', 'close', 'volume_usd']]
con.execute("CREATE OR REPLACE TABLE pool_ohlcv AS SELECT * FROM ohlcv_df")
logging.info(f"Loaded {len(ohlcv_df)} records into 'pool_ohlcv' table.")
logging.info("Database build complete. Summary:")
print(con.execute("SHOW TABLES").fetchdf())
print("\nPools Sample:")
print(con.execute("SELECT * FROM pools LIMIT 5").fetchdf())
print("\nOHLCV Sample:")
print(con.execute("SELECT * FROM pool_ohlcv ORDER BY timestamp DESC LIMIT 5").fetchdf())
con.close()
if __name__ == "__main__":
# Ensure you have the required libraries:
# pip install pandas duckdb aiohttp
asyncio.run(main())
```
A simple, sequential script is great for learning, but real-world data fetching demands a more robust approach. Here is what the script above uses to run reliably:
* **Asynchronous Operations:** With `asyncio` and `aiohttp`, the script issues many API requests concurrently instead of one by one, which dramatically shortens the total run time.
* **Dynamic Windowing:** The `get_pool_ohlcv` function calculates how large a time window each API call can cover, so the full history of every pool is fetched in as few requests as possible.
* **Concurrency Control & Throttling:** An `asyncio.Semaphore`, combined with carefully tuned `BATCH_SIZE` and `asyncio.sleep()` calls, keeps the request rate safely below the API's limits.
* **Resiliency:** The `fetch_with_retry` function automatically retries failed requests with an exponential backoff delay, making the pipeline resilient to transient network issues.
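The dynamic-windowing math is easy to sanity-check. With 1-hour candles and a 100-record limit per call, each request can cover at most 100 hours of history, so the default 14-day window takes four OHLCV calls per pool:

```python
import math
from datetime import timedelta

OHLCV_API_LIMIT = 100  # max records per API call (matches the script's config)
HISTORY_DAYS = 14
interval_hours = 1     # "1h" candles

# Each call can cover at most limit * interval worth of history
window = timedelta(hours=OHLCV_API_LIMIT * interval_hours)
history = timedelta(days=HISTORY_DAYS)
calls_per_pool = math.ceil(history / window)

print(window)          # 4 days, 4:00:00 (i.e. 100 hours)
print(calls_per_pool)  # 4
```

Multiply that by 500 pools and it becomes clear why the script batches requests and sleeps between batches.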
### **Required libraries**
Before running the script, make sure you have the necessary Python libraries installed.
```bash
pip install pandas duckdb aiohttp
```
***
## Step 2: Run the pipeline and query with SQL
Now, execute the script from your terminal. It will fetch the top 500 Uniswap v3 pools on Ethereum and their recent trading history, then create a `uniswap_v3.db` file in a new `dbs` directory. This may take several minutes, but it will be significantly faster than a purely sequential script.
```bash
python build_uniswap_db.py
```
### **Querying your new database**
Once the script completes, you have a powerful local database at your fingertips. You can now use any SQL client that supports DuckDB, or Python itself, to perform instant, complex analysis. In step 3, we will connect the database to an AI assistant for natural language queries.
If you want to query the database with a Python script, create a new file named `query_duckdb.py` and paste the following code into it.
```python query_duckdb.py [expandable]
import duckdb
import pandas as pd
import time
# Connect to the DuckDB database file
con = duckdb.connect(database='dbs/uniswap_v3.db', read_only=True)
print("=== DuckDB Uniswap v3 Analytics ===\n")
# --- Query 1: Database Summary ---
print("--- 1. Database Summary ---")
pool_count = con.execute("SELECT COUNT(*) FROM pools").fetchone()[0]
ohlcv_count = con.execute("SELECT COUNT(*) FROM pool_ohlcv").fetchone()[0]
print(f"Total Pools Loaded: {pool_count}")
print(f"Total OHLCV Records: {ohlcv_count:,}\n")
# --- Query 2: Top 10 Pools by 24h Volume ---
print("--- 2. Top 10 Pools by 24h Volume ---")
start_time = time.time()
top_pools_df = con.execute("""
SELECT
address,
token0_symbol,
token1_symbol,
volume_24h_usd
FROM pools
ORDER BY volume_24h_usd DESC
LIMIT 10
""").fetchdf()
print(top_pools_df)
print(f"Query executed in {time.time() - start_time:.4f} seconds\n")
# --- Query 3: Peak Trading Hours ---
print("--- 3. Peak Trading Hours (UTC) ---")
start_time = time.time()
hourly_volume_df = con.execute("""
SELECT
EXTRACT(hour FROM timestamp) AS hour_of_day,
SUM(volume_usd) AS total_volume_usd
FROM pool_ohlcv
WHERE volume_usd > 0 AND volume_usd < 1000000000 -- Defensive filter against outliers
GROUP BY hour_of_day
ORDER BY total_volume_usd DESC
""").fetchdf()
# Format the volume for better readability
hourly_volume_df['total_volume_usd'] = hourly_volume_df['total_volume_usd'].map('${:,.2f}'.format)
print(hourly_volume_df)
print(f"Query executed in {time.time() - start_time:.4f} seconds\n")
con.close()
```
Now, execute the script from your terminal:
```bash
python query_duckdb.py
```
***
## Step 3: AI-powered analysis with an MCP server
While you can use any SQL client to query your database, the real power comes from connecting it to an AI assistant. By using a Model Context Protocol (MCP) server, you can enable your assistant to directly query the `uniswap_v3.db` file you created. This allows you to ask for insights in plain English instead of writing SQL.
For this, we will use `mcp-server-duckdb`, an open-source MCP server for DuckDB.
### **Install the DuckDB MCP server**
You can install the server easily using `npx`:
```bash
npx -y @smithery/cli install mcp-server-duckdb --client claude
```
### **Configure your AI assistant**
Next, you need to tell your AI assistant how to run the server. Add the following to your `claude_desktop_config.json` file.
If you see a "Server disconnected" error after restarting your AI assistant, it means the application cannot find the `uvx` or `npx` command. This happens because the application doesn't share the same `PATH` environment variable as your terminal.
**To fix this, you must use the full, absolute path to the command.**
1. Find the absolute path by running `which uvx` or `which npx` in your terminal.
2. Copy the output (e.g., `/Users/yourname/.local/bin/uvx` or `/opt/homebrew/bin/npx`).
3. Use that full path as the `command` value in the JSON configuration below.
The example below uses `uvx`, which is recommended. Make sure to replace the placeholder path segments in the configuration (your username in the `command` path, and the project directory prefix in the `--db-path` argument) with the actual absolute paths on your machine.
```json
{
"mcpServers": {
"duckdb-crypto": {
"command": "/Users//.local/bin/uvx",
"args": [
"mcp-server-duckdb",
"--db-path",
"/dbs/uniswap_v3.db",
"--readonly"
]
}
}
}
```
Now, when you start your AI assistant, it will have the tools to query your local Uniswap V3 database. You can ask it things like:
* *"Using the duckdb-crypto tool, find the 5 pools with the highest 24-hour volume."*
* *"What was the hourly volatility for the top pool yesterday?"*
***
## What you've built: From API calls to analytics powerhouse
By completing this tutorial, you have successfully transitioned from being a passive data consumer to an active data analyst. You've replaced the slow, restrictive pattern of making individual API calls with a fast, powerful, and scalable local analytics workflow.
**Key achievements:**
* **Built a professional ETL pipeline:** You have a reusable, high-performance Python script that can create a comprehensive local database from any supported DEX and network.
* **Unlocked high-speed SQL:** You can now perform complex analytical queries on a rich dataset in milliseconds, directly on your machine.
* **Mastered a foundational workflow:** This "local-first" data strategy is a cornerstone of professional data analysis. It enables deeper exploration, from high-level market trends down to individual wallet behaviors.
* **Created a reusable asset:** Your `uniswap_v3.db` file is a valuable, reusable asset for any future analysis, dashboarding, or AI integration project.
When your project grows and you need to explore other data solutions, check out our full list of [API Tutorials](/tutorials/tutorial_intro) for more advanced guides.
# Scalable time-series analytics with InfluxDB
Source: https://docs.dexpaprika.com/tutorials/real-time-analytics-with-influxdb
Harness InfluxDB, a best-in-class time-series database, to build a powerful crypto analytics pipeline that scales from real-time monitoring to large historical datasets.
## From real-time monitoring to historical analysis
While InfluxDB is a champion of real-time data, its power extends far beyond live dashboards. It provides a highly optimized engine for storing and querying massive time-series datasets, making it a perfect middle-ground between the local analytics of DuckDB and the enterprise scale of ClickHouse.
Looking for other analytics solutions? Check out our full list of [API Tutorials](/tutorials/tutorial_intro) for more step-by-step guides.
This tutorial will guide you through building a production-grade ETL pipeline to populate InfluxDB with a substantial historical dataset, enabling both high-performance queries and real-time visualization.
```mermaid
graph TD;
subgraph "Step 1: Infrastructure";
A["Setup InfluxDB & Grafana (Docker)"];
end
subgraph "Step 2: Data Ingestion";
B["Run Python ETL script"];
C["DexPaprika API"];
D["InfluxDB Bucket ('crypto-data')"];
B --"Fetches 14 days of 1h data for 500 pools"--> C;
B --"Writes time-series data"--> D;
end
subgraph "Step 3 & 4: Analysis & Visualization";
E["Option A: Programmatic queries (Python)"];
F["Option B: Live dashboards (Grafana)"];
D --> E;
D --> F;
end
A --> B;
```
**The goal:**
By the end of this guide, you will have a scalable analytics pipeline that can:
1. Ingest 1-hour OHLCV data for the top 500 Uniswap v3 pools over a 14-day period.
2. Run complex time-series analysis using Python and the Flux language.
3. Visualize the data in a live-updating Grafana dashboard.
1. Get InfluxDB and Grafana running in seconds with Docker.
2. Create a robust data pipeline for ingesting large time-series datasets.
3. Analyze your time-series data with InfluxDB's Python client.
4. Build a real-time dashboard to monitor crypto pools.
***
## Step 1: Setting Up InfluxDB and Grafana
We'll use Docker Compose to spin up both services. First, create a new directory named `INFLUXDB` in your project root. Inside that directory, create a file named `docker-compose.yml` with the following content.
```yml INFLUXDB/docker-compose.yml
services:
influxdb:
image: influxdb:2.7
container_name: influxdb
ports:
- "8087:8086"
volumes:
- influxdb-data:/var/lib/influxdb2
environment:
- DOCKER_INFLUXDB_INIT_MODE=setup
- DOCKER_INFLUXDB_INIT_USERNAME=my-user
- DOCKER_INFLUXDB_INIT_PASSWORD=my-password
- DOCKER_INFLUXDB_INIT_ORG=my-org
- DOCKER_INFLUXDB_INIT_BUCKET=crypto-data
- DOCKER_INFLUXDB_INIT_ADMIN_TOKEN=my-super-secret-token
grafana:
image: grafana/grafana:latest
container_name: grafana
ports:
- "3000:3000"
volumes:
- grafana-data:/var/lib/grafana
volumes:
influxdb-data:
grafana-data:
```
Run the following command from the root of the project to start the containers:
```bash
docker-compose -f INFLUXDB/docker-compose.yml up -d
```
Once running, you can access:
* **InfluxDB UI:** `http://localhost:8087`
* **Grafana UI:** `http://localhost:3000` (login with `admin`/`admin`)
Use the token `my-super-secret-token` to connect to InfluxDB.
We use port `8087` for InfluxDB to avoid potential conflicts with other services that might be using the default port `8086`.
***
## Step 2: Build the Python ETL Pipeline
Create a new file named `INFLUXDB/build_influxdb_db.py`. This script is built to be robust and efficient, capable of ingesting large volumes of time-series data from the DexPaprika API into your InfluxDB instance. It leverages two key endpoints: the [Top Pools on a DEX endpoint](/api-reference/pools/get-top-x-pools-on-a-networks-dex) to discover pools, and the [Pool OHLCV Data endpoint](/api-reference/pools/get-ohlcv-data-for-a-pool-pair) to fetch historical price data.
```python INFLUXDB/build_influxdb_db.py [expandable]
import influxdb_client
from influxdb_client.client.write_api import SYNCHRONOUS
import requests
from datetime import datetime, timedelta, timezone
import time
import logging
import asyncio
import aiohttp
from typing import List, Dict
import math
# --- Configuration ---
API_BASE_URL = "https://api.dexpaprika.com"
NETWORK = "ethereum"
DEX_ID = "uniswap_v3"
HISTORY_DAYS = 14 # Fetch 14 days of OHLCV data
TOP_POOLS_LIMIT = 500 # Focus on top 500 pools by volume
BATCH_SIZE = 15 # Process pools in smaller batches
CONCURRENT_REQUESTS = 3 # Concurrent requests for API calls
OHLCV_API_LIMIT = 100 # API limit for OHLCV requests
INTERVAL = "1h" # 1-hour intervals
# InfluxDB Configuration
INFLUX_URL = "http://localhost:8087"
INFLUX_TOKEN = "my-super-secret-token"
INFLUX_ORG = "my-org"
INFLUX_BUCKET = "crypto-data"
# Setup logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
async def fetch_with_retry(session: aiohttp.ClientSession, url: str, params: Dict = None, retries=5, backoff_factor=1.0):
"""Generic async fetch function with exponential backoff."""
for attempt in range(retries):
try:
async with session.get(url, params=params, timeout=30) as response:
response.raise_for_status()
return await response.json()
except (aiohttp.ClientError, asyncio.TimeoutError) as e:
if attempt == retries - 1:
logging.error(f"Final attempt failed for {url}: {e}")
raise
sleep_time = backoff_factor * (2 ** attempt)
logging.warning(f"Request to {url} failed: {e}. Retrying in {sleep_time:.2f}s...")
await asyncio.sleep(sleep_time)
return {}
class InfluxDBETL:
def __init__(self):
self.client = influxdb_client.InfluxDBClient(url=INFLUX_URL, token=INFLUX_TOKEN, org=INFLUX_ORG)
self.write_api = self.client.write_api(write_options=SYNCHRONOUS)
self.api_semaphore = asyncio.Semaphore(CONCURRENT_REQUESTS)
self._ensure_bucket_exists()
def _ensure_bucket_exists(self):
"""Checks if the bucket exists and creates it if not."""
logging.info(f"Ensuring bucket '{INFLUX_BUCKET}' exists...")
buckets_api = self.client.buckets_api()
bucket = buckets_api.find_bucket_by_name(INFLUX_BUCKET)
if not bucket:
logging.warning(f"Bucket '{INFLUX_BUCKET}' not found. Creating it...")
buckets_api.create_bucket(bucket_name=INFLUX_BUCKET, org=INFLUX_ORG)
logging.info(f"Bucket '{INFLUX_BUCKET}' created successfully.")
else:
logging.info(f"Bucket '{INFLUX_BUCKET}' already exists.")
def clear_bucket_data(self):
"""Deletes all data from the 'ohlcv' measurement in the bucket."""
logging.info(f"Clearing existing data from measurement 'ohlcv' in bucket '{INFLUX_BUCKET}'...")
try:
delete_api = self.client.delete_api()
start = "1970-01-01T00:00:00Z"
stop = datetime.now(timezone.utc).strftime('%Y-%m-%dT%H:%M:%SZ')
delete_api.delete(start, stop, '_measurement="ohlcv"', bucket=INFLUX_BUCKET, org=INFLUX_ORG)
logging.info("Existing data cleared successfully.")
except Exception as e:
logging.error(f"Could not clear data from bucket: {e}")
async def fetch_top_pools(self) -> List[Dict]:
"""Fetch top pools by volume from the specified DEX, handling pagination."""
logging.info(f"Fetching top {TOP_POOLS_LIMIT} pools for {DEX_ID} on {NETWORK}...")
all_pools = []
page = 0
async with aiohttp.ClientSession() as session:
while len(all_pools) < TOP_POOLS_LIMIT:
url = f"{API_BASE_URL}/networks/{NETWORK}/dexes/{DEX_ID}/pools"
params = {"page": page, "limit": 100, "order_by": "volume_usd", "sort": "desc"}
try:
data = await fetch_with_retry(session, url, params=params)
pools = data.get('pools', [])
if not pools:
break
all_pools.extend(pools)
logging.info(f"Fetched page {page}, got {len(pools)} pools. Total: {len(all_pools)}")
page += 1
if len(all_pools) >= TOP_POOLS_LIMIT:
all_pools = all_pools[:TOP_POOLS_LIMIT]
break
await asyncio.sleep(0.5) # Be respectful to the API
except Exception as e:
logging.error(f"Error fetching page {page}: {e}")
break
logging.info(f"Finished fetching pools. Total: {len(all_pools)}")
return all_pools
async def fetch_pool_ohlcv_paginated(self, session: aiohttp.ClientSession, pool_address: str) -> List[Dict]:
"""Fetch complete OHLCV data for a pool using intelligent, dynamic windowing."""
async with self.api_semaphore:
final_end_time = datetime.now(timezone.utc)
current_start_time = final_end_time - timedelta(days=HISTORY_DAYS)
all_ohlcv = []
try:
if 'h' in INTERVAL:
interval_value = int(INTERVAL.replace('h', ''))
time_delta_per_call = timedelta(hours=OHLCV_API_LIMIT * interval_value)
elif 'm' in INTERVAL:
interval_value = int(INTERVAL.replace('m', ''))
time_delta_per_call = timedelta(minutes=OHLCV_API_LIMIT * interval_value)
else:
raise ValueError(f"Unsupported interval format: {INTERVAL}")
except ValueError as e:
logging.error(f"Invalid INTERVAL format: {e}. Defaulting to 1 hour.")
time_delta_per_call = timedelta(hours=OHLCV_API_LIMIT * 1)
total_expected_calls = math.ceil((final_end_time - current_start_time) / time_delta_per_call) if time_delta_per_call.total_seconds() > 0 else 0
call_count = 0
while current_start_time < final_end_time:
call_count += 1
batch_end_time = min(current_start_time + time_delta_per_call, final_end_time)
logging.info(f" [Pool {pool_address}] Fetching window {call_count}/{total_expected_calls}: {current_start_time.date()} to {batch_end_time.date()}")
url = f"{API_BASE_URL}/networks/{NETWORK}/pools/{pool_address}/ohlcv"
params = {
"start": current_start_time.strftime('%Y-%m-%dT%H:%M:%SZ'),
"end": batch_end_time.strftime('%Y-%m-%dT%H:%M:%SZ'),
"interval": INTERVAL,
"limit": OHLCV_API_LIMIT
}
try:
batch_data = await fetch_with_retry(session, url, params=params)
if batch_data:
for record in batch_data:
record['network'] = NETWORK
record['pool_address'] = pool_address
all_ohlcv.extend(batch_data)
except Exception as e:
logging.warning(f"Could not fetch OHLCV batch for {pool_address}: {e}")
current_start_time = batch_end_time
await asyncio.sleep(0.75) # Crucial delay to prevent rate-limiting
logging.info(f"Pool {pool_address}: collected {len(all_ohlcv)} OHLCV records.")
return all_ohlcv
async def fetch_pool_ohlcv_batch(self, pool_addresses: List[str]) -> List[Dict]:
"""Fetch OHLCV data for multiple pools concurrently."""
logging.info(f"Fetching {INTERVAL} OHLCV for {len(pool_addresses)} pools...")
all_ohlcv = []
async with aiohttp.ClientSession() as session:
tasks = [self.fetch_pool_ohlcv_paginated(session, addr) for addr in pool_addresses]
results = await asyncio.gather(*tasks, return_exceptions=True)
for i, result in enumerate(results):
if isinstance(result, list):
all_ohlcv.extend(result)
elif isinstance(result, Exception):
logging.warning(f"OHLCV fetch failed for pool {pool_addresses[i]}: {result}")
return all_ohlcv
def load_ohlcv_data(self, ohlcv_data: List[Dict], pools_map: Dict):
"""Load OHLCV data into InfluxDB."""
if not ohlcv_data:
logging.warning("No OHLCV data to load.")
return
points = []
for record in ohlcv_data:
pool_id = record.get('pool_address')
pair = pools_map.get(pool_id, "Unknown/Unknown")
point = (
influxdb_client.Point("ohlcv")
.tag("pool_id", pool_id)
.tag("pair", pair)
.field("open", float(record['open']))
.field("high", float(record['high']))
.field("low", float(record['low']))
.field("close", float(record['close']))
.field("volume", float(record.get('volume', 0)))
.time(record['time_close'])
)
points.append(point)
if points:
self.write_api.write(bucket=INFLUX_BUCKET, org=INFLUX_ORG, record=points)
logging.info(f"Wrote {len(points)} data points to InfluxDB.")
async def run_etl(self):
"""Run the complete ETL process."""
self.clear_bucket_data()
logging.info(f"Starting InfluxDB ETL process for top {TOP_POOLS_LIMIT} pools...")
pools = await self.fetch_top_pools()
if pools:
pools_map = {
pool['id']: f"{pool['tokens'][0]['symbol']}/{pool['tokens'][1]['symbol']}"
for pool in pools if len(pool.get('tokens', [])) >= 2
}
pool_addresses = [pool['id'] for pool in pools if pool.get('id')]
for i in range(0, len(pool_addresses), BATCH_SIZE):
batch_addresses = pool_addresses[i:i + BATCH_SIZE]
batch_num = (i // BATCH_SIZE) + 1
total_batches = (len(pool_addresses) + BATCH_SIZE - 1) // BATCH_SIZE
logging.info(f"Processing OHLCV batch {batch_num}/{total_batches} ({len(batch_addresses)} pools)")
ohlcv_data = await self.fetch_pool_ohlcv_batch(batch_addresses)
self.load_ohlcv_data(ohlcv_data, pools_map)
if i + BATCH_SIZE < len(pool_addresses):
logging.info(f"--- Finished batch {batch_num}, sleeping for 10 seconds ---")
await asyncio.sleep(10)
logging.info("ETL process completed!")
async def main():
etl = InfluxDBETL()
await etl.run_etl()
if __name__ == "__main__":
# pip install influxdb-client aiohttp requests pandas
asyncio.run(main())
```
This script is built for performance and reliability, using several best practices common in data pipelines:
* **Asynchronous operations:** By using `asyncio` and `aiohttp`, the script can make many API requests concurrently instead of one by one.
* **Dynamic windowing:** The `fetch_pool_ohlcv_paginated` function calculates how much data to request per API call to ensure complete history is fetched efficiently.
* **Concurrency control & throttling:** An `asyncio.Semaphore`, combined with carefully tuned `BATCH_SIZE` and `asyncio.sleep()` calls, prevents API rate-limiting.
* **Resiliency:** The `fetch_with_retry` function automatically retries failed requests with an exponential backoff delay.
* **Data integrity:** The script automatically clears old data from the bucket before each run to ensure a clean, consistent dataset.
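Each `influxdb_client.Point` the script builds is serialized to InfluxDB line protocol before being written. A single OHLCV record from this pipeline looks roughly like this (hypothetical pool address and prices):

```
ohlcv,pair=WETH/USDC,pool_id=0xabc123 open=3500.1,high=3520.5,low=3490.2,close=3510.8,volume=120000.0 1704070800000000000
```

The format is: measurement name, comma-separated tags, a space, comma-separated fields, a space, and a nanosecond-precision timestamp. Tags (`pool_id`, `pair`) are indexed, which is what makes the `filter(fn: (r) => r.pair == ...)` queries in Step 3 fast.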
### **Setup Python Environment**
Before running the ETL script, it's a critical best practice to create an isolated Python environment to manage dependencies.
1. **Create a virtual environment:**
Open your terminal in the project root and run:
```bash
python3 -m venv venv
```
2. **Activate the environment:**
* On macOS and Linux:
```bash
source venv/bin/activate
```
* On Windows:
```bash
.\venv\Scripts\activate
```
Your terminal prompt should now be prefixed with `(venv)`, indicating that the virtual environment is active.
3. **Install required libraries:**
Now, install the necessary Python packages inside the activated environment:
```bash
pip install influxdb-client aiohttp requests pandas
```
Run the script to start streaming data into your InfluxDB instance. This may take several minutes.
```bash
python INFLUXDB/build_influxdb_db.py
```
***
## Step 3: Programmatic Analysis with Python
While the InfluxDB UI is great for exploration, the real power comes from programmatic access. The `influxdb-client` library for Python allows you to run complex Flux queries and integrate the data into other tools or scripts.
We've created a `query_influxdb.py` script to demonstrate how to connect to your database and perform analysis on the hourly data.
```python INFLUXDB/query_influxdb.py [expandable]
import influxdb_client
import pandas as pd
# --- InfluxDB Configuration ---
INFLUX_URL = "http://localhost:8087"
INFLUX_TOKEN = "my-super-secret-token"
INFLUX_ORG = "my-org"
INFLUX_BUCKET = "crypto-data"
def run_flux_query(client: influxdb_client.InfluxDBClient, query: str):
"""Helper function to execute a Flux query and return a pandas DataFrame."""
try:
query_api = client.query_api()
result = query_api.query_data_frame(query, org=INFLUX_ORG)
if isinstance(result, list): # Handle multiple dataframes in result
return pd.concat(result, ignore_index=True) if result else pd.DataFrame()
return result
except Exception as e:
print(f"Error running query: {e}")
return pd.DataFrame()
def main():
"""Connects to InfluxDB and runs sample analytics queries."""
client = influxdb_client.InfluxDBClient(url=INFLUX_URL, token=INFLUX_TOKEN, org=INFLUX_ORG)
print("=== InfluxDB Python Analytics Demo ===\n")
# --- Query 1: Find available trading pairs ---
print("--- 1. Finding available trading pairs ---")
list_pairs_query = f'''
import "influxdata/influxdb/schema"
schema.tagValues(
bucket: "{INFLUX_BUCKET}",
tag: "pair",
start: -14d
)
'''
pairs_df = run_flux_query(client, list_pairs_query)
if not pairs_df.empty:
available_pairs = pairs_df['_value'].tolist()
print(f"Found {len(available_pairs)} pairs. Examples: {available_pairs[:5]}")
# Use the first available pair for subsequent queries
target_pair = available_pairs[0]
else:
print("No pairs found. Please run the ingestion script first.")
print("Using 'WETH/USDC' as a placeholder for query examples.")
target_pair = "WETH/USDC" # Fallback for demo
print(f"\n--- Using pair '{target_pair}' for next queries ---\n")
# --- Query 2: Get raw data for the target pool ---
print(f"--- 2. Raw OHLCV data for {target_pair} ---")
raw_data_query = f'''
from(bucket: "{INFLUX_BUCKET}")
|> range(start: -3d) // Limit to last 3 days for brevity
|> filter(fn: (r) => r._measurement == "ohlcv")
|> filter(fn: (r) => r.pair == "{target_pair}")
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> sort(columns: ["_time"], desc: true)
|> limit(n: 10)
'''
raw_df = run_flux_query(client, raw_data_query)
print("Last 10 records:")
if not raw_df.empty and all(c in raw_df.columns for c in ['_time', 'open', 'high', 'low', 'close', 'volume']):
print(raw_df[['_time', 'open', 'high', 'low', 'close', 'volume']])
else:
print("Could not retrieve raw data. Please check if the ingestion was successful.")
print("\n")
# --- Query 3: Calculate 12-hour moving average ---
print(f"--- 3. 12-Hour moving average for {target_pair} close price ---")
moving_avg_query = f'''
from(bucket: "{INFLUX_BUCKET}")
|> range(start: -14d)
|> filter(fn: (r) => r._measurement == "ohlcv" and r._field == "close" and r.pair == "{target_pair}")
|> timedMovingAverage(every: 1h, period: 12h)
|> sort(columns: ["_time"], desc: true)
|> limit(n: 10)
'''
ma_df = run_flux_query(client, moving_avg_query)
print("Last 10 moving average values:")
if not ma_df.empty and all(c in ma_df.columns for c in ['_time', '_value']):
print(ma_df[['_time', '_value']])
else:
print("Could not retrieve moving average data.")
if __name__ == "__main__":
# Ensure you have the required libraries:
# pip install influxdb-client pandas
main()
```
Run the script to see how you can query your data with Python:
```bash
python INFLUXDB/query_influxdb.py
```
***
## Step 4: Visualizing Data in Grafana
1. Open Grafana at `http://localhost:3000`.
2. Go to **Connections** > **Add new connection** > **InfluxDB**.
3. Configure the connection:
* **Name**: InfluxDB\_Crypto
* **Query Language**: Flux
* **URL**: `http://influxdb:8086` (use the Docker service name)
* Under **Auth**, toggle **Basic auth** off.
* In the **Custom HTTP Headers** section, add a header:
* **Header**: `Authorization`
* **Value**: `Token my-super-secret-token`
* Enter your InfluxDB **Organization** (`my-org`) and default **Bucket** (`crypto-data`).
4. Click **Save & Test**. You should see a "Bucket found" confirmation.
5. Now, let's create a dashboard. In the left-hand menu, click the **+** icon and select **Dashboard**.
6. Click on the **Add new panel** button.
7. In the new panel view, ensure your `InfluxDB_Crypto` data source is selected at the top.
8. Below the graph, you'll see a query editor. You can switch to the **Script editor** by clicking the pencil icon on the right.
9. Paste the following query into the editor. This query will plot the raw closing price for the `AAVE/USDC` trading pair.
```flux
from(bucket: "crypto-data")
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r._measurement == "ohlcv" and r._field == "close" and r.pair == "AAVE/USDC")
|> sort(columns: ["_time"], desc: false)
```
**Troubleshooting Tip:** If you initially see "No data," there are two common reasons:
1. **Time Range:** Ensure the time picker at the top right is set to "Last 7 days" or wider, not a shorter period like "Last 6 hours."
2. **Trading Pair:** The default `WETH/USDC` pair used in the original tutorial may not have been in the top 500 pools fetched by the script. The query above uses `AAVE/USDC`, which is more likely to be present. You can find other available pairs by running the `query_influxdb.py` script.
10. At the top right of the dashboard page, set the time range to **Last 7 days** to ensure you see all the historical data you ingested.
11. You should now see the data appear in the panel. You can give the panel a title (e.g., "AAVE/USDC Close Price") in the settings on the right.
12. Click **Apply** to save the panel to your dashboard. You can now add more panels for other queries.
***
## What You've Built
You now have a powerful, scalable analytics pipeline for time-series crypto data. You've combined a production-grade Python ETL script with the industry-standard tools for time-series data storage (InfluxDB) and visualization (Grafana).
**Key achievements:**
* **Built a production-ready ETL pipeline:** You have a reusable, high-performance Python script that can populate a time-series database from any supported DEX.
* **Unlocked programmatic time-series analysis:** You can now perform complex analytical queries on large time-series datasets using Python and Flux.
* **Mastered a scalable analytics workflow:** This pipeline provides a solid foundation for building live dashboards, conducting in-depth market research, and developing sophisticated monitoring or trading algorithms.
* **Enabled live data visualization:** You've connected your database to Grafana, the leading open-source tool for observability and data visualization.
# Get Token Price History and OHLCV Data
Source: https://docs.dexpaprika.com/tutorials/retrieve-historical-data
Get historical crypto price data (OHLCV) for any token across 20+ blockchains. Perfect for building charts, backtesting strategies, and price analysis.
Having trouble with the API? We're here to help - [drop us a line](mailto:support@coinpaprika.com) and we'll get you sorted.
## Why You Need Historical Price Data
Building a crypto app? You'll likely need price charts, volatility analysis, or backtesting data. That's where OHLCV (Open, High, Low, Close, Volume) data comes in - it's the backbone of any serious crypto application.
**Common use cases:**
* **Price charts** in trading apps
* **Backtesting** trading strategies
* **Volatility analysis** for risk management
* **Historical performance** dashboards
* **Market research** and analytics
**Quick Start**: If you know your token already, jump to [Step 2](#step-2-get-price-history-data) to grab the data immediately.
***
## Step 1: Find Your Token's Trading Pool
Here's the thing - historical data comes from actual trading pools, not tokens directly. This makes sense because prices happen where people trade.
### Quick Search (Recommended)
The fastest way to find what you need using the [Search API](/api-reference/search/search-for-tokens-pools-and-dexes):
```bash
curl "https://api.dexpaprika.com/search?query=USDC" | jq
```
**Pro tip**: Search returns tokens, pools, and exchanges. Look for pools with high volume - they'll have the most reliable price data.
### If You Have the Token Address
Skip the search and go straight to pools using the [Token Pools API](/api-reference/tokens/get-top-x-pools-for-a-token):
```bash
curl "https://api.dexpaprika.com/networks/ethereum/tokens/0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48/pools" | jq
```
Always pick pools with decent volume (>\$10k daily). Low-volume pools can have weird price spikes that don't reflect real market conditions.
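If you're scripting this step, the same volume filter is easy to apply in Python. Here's a minimal sketch - the `filter_liquid_pools` helper is our own convention, not part of any SDK, and in a real script the `pools` list would come from the Token Pools endpoint above:

```python
def filter_liquid_pools(pools, min_volume_usd=10_000):
    """Keep only pools whose reported 24h volume clears the threshold."""
    return [p for p in pools if p.get("volume_usd", 0) > min_volume_usd]

# In practice, `pools` is the "pools" array returned by:
#   GET https://api.dexpaprika.com/networks/ethereum/tokens/<address>/pools
sample_pools = [
    {"id": "0xaaa", "volume_usd": 2_500_000.0},  # healthy volume
    {"id": "0xbbb", "volume_usd": 4_200.0},      # too thin - skip it
]
print([p["id"] for p in filter_liquid_pools(sample_pools)])
```

Dropping thin pools up front saves you from chasing phantom price spikes later.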
***
## Step 2: Get Price History Data
Now for the good stuff. Here's how to pull historical OHLCV data using the [Pool OHLCV API](/api-reference/pools/get-ohlcv-data-for-a-pool-pair):
```bash
curl "https://api.dexpaprika.com/networks/ethereum/pools/0x88e6a0c2ddd26feeb64f039a2c41296fcb3f5640/ohlcv?start=2025-01-01&limit=30&interval=24h&inversed=true" | jq
```
*(This tutorial uses the USDC/ETH pool on Ethereum as an example.)*
### What These Parameters Do:
| Parameter | What It Does | Example |
| ---------- | ------------------------------ | ------------------------------ |
| `start` | When to start collecting data | `2025-01-01` or Unix timestamp |
| `limit` | How many data points (max 366) | `30` for 30 days |
| `interval` | Time between each point | `24h`, `1h`, `5m`, etc. |
| `end` | When to stop (optional) | `2025-01-31` |
| `inversed` | Flip the price ratio | `true` for ETH/USDC → USDC/ETH |
### What You Get Back:
```json
[
{
"time_open": "2025-01-30T00:00:00Z",
"time_close": "2025-01-31T00:00:00Z",
"open": 3115.1614508315265,
"high": 3277.717757396331,
"low": 3097.803416632386,
"close": 3250.184016286268,
"volume": 226403988
}
]
```
Each data point gives you everything you need for candlestick charts or analysis.
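To turn that response into something chartable, load it into pandas. A quick sketch using the sample record above - the column names match the response shape, while the derived `return_pct` and `range_pct` columns are our own additions:

```python
import pandas as pd

# One candle in the response shape shown above; a real call returns many.
candles = [
    {"time_open": "2025-01-30T00:00:00Z", "time_close": "2025-01-31T00:00:00Z",
     "open": 3115.16, "high": 3277.72, "low": 3097.80,
     "close": 3250.18, "volume": 226403988},
]

df = pd.DataFrame(candles)
df["time_open"] = pd.to_datetime(df["time_open"])

# Two derived metrics most charts and screeners need.
df["return_pct"] = (df["close"] - df["open"]) / df["open"] * 100
df["range_pct"] = (df["high"] - df["low"]) / df["low"] * 100
print(df[["time_open", "return_pct", "range_pct"]].round(2))
```

From here, a candlestick library like `mplfinance` or `plotly` can consume the frame directly.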
***
## Time Intervals That Actually Matter
Choose based on what you're building:
**For Trading Apps:**
* `1m`, `5m` - Real-time trading
* `1h`, `4h` - Swing trading
* `24h` - Position trading
**For Analytics Dashboards:**
* `24h` - Daily summaries
* Use daily data and aggregate for weekly/monthly views
**For Research:**
* `24h` with longer date ranges (up to 1 year)
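The "aggregate daily data" advice above is a one-liner with pandas. A sketch with synthetic daily candles standing in for `/ohlcv` output fetched with `interval=24h`:

```python
import pandas as pd

# Synthetic daily candles; real ones come from the /ohlcv endpoint.
daily = pd.DataFrame({
    "time_open": pd.date_range("2025-01-06", periods=14, freq="D"),
    "open":   [100.0 + i for i in range(14)],
    "high":   [105.0 + i for i in range(14)],
    "low":    [95.0 + i for i in range(14)],
    "close":  [102.0 + i for i in range(14)],
    "volume": [1_000] * 14,
}).set_index("time_open")

# Roll daily candles into weekly ones: first open, max high,
# min low, last close, summed volume.
weekly = daily.resample("7D").agg(
    {"open": "first", "high": "max", "low": "min", "close": "last", "volume": "sum"}
)
print(weekly)
```

Aggregating locally keeps your API usage down and works for monthly views too (swap `"7D"` for `"MS"`).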
***
## Production Tips That'll Save You Time
### Cache Aggressively
Historical data doesn't change - cache it locally:
```bash
# Check when pool was created to avoid requesting non-existent data
curl "https://api.dexpaprika.com/networks/ethereum/pools/0x88e6a0c2ddd26feeb64f039a2c41296fcb3f5640" | jq '.created_at'
```
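A simple file cache goes a long way here. A sketch of the idea - the `fetch_ohlcv_cached` helper and `.ohlcv_cache` directory are our own conventions, not part of the API:

```python
import hashlib
import json
import pathlib

import requests

CACHE_DIR = pathlib.Path(".ohlcv_cache")
CACHE_DIR.mkdir(exist_ok=True)

def fetch_ohlcv_cached(network, pool, **params):
    """Fetch OHLCV data, reusing a local file cache for repeat requests.

    Closed historical candles never change, so caching them is safe;
    only the still-open current interval ever needs a refresh.
    """
    raw = json.dumps([network, pool, params], sort_keys=True)
    cache_file = CACHE_DIR / (hashlib.sha256(raw.encode()).hexdigest() + ".json")
    if cache_file.exists():
        return json.loads(cache_file.read_text())
    url = f"https://api.dexpaprika.com/networks/{network}/pools/{pool}/ohlcv"
    resp = requests.get(url, params=params, timeout=30)
    resp.raise_for_status()
    data = resp.json()
    cache_file.write_text(json.dumps(data))
    return data
```

Any request with the same network, pool, and parameters is served from disk on the second call.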
### Handle Data Gaps
Some pools have quiet periods. Here's how to deal with gaps:
```bash
# Filter out low-volume periods that might have unreliable prices
curl "https://api.dexpaprika.com/networks/ethereum/pools/0x88e6a0c2ddd26feeb64f039a2c41296fcb3f5640/ohlcv?start=2025-01-01&limit=30&inversed=true" | jq '.[] | select(.volume > 1000)'
```
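Beyond filtering, you can also repair gaps client-side. One common approach is to reindex to a full calendar and forward-fill the close, sketched here with synthetic data in place of a real `/ohlcv` response:

```python
import pandas as pd

# Candles with a missing day - a quiet period where nothing traded.
df = pd.DataFrame({
    "time_open": pd.to_datetime(["2025-01-01", "2025-01-02", "2025-01-04"]),
    "close": [100.0, 101.0, 103.0],
}).set_index("time_open")

# Reindex to a complete daily calendar, then forward-fill the close so
# charts don't render false gaps or drops to zero.
full = df.reindex(pd.date_range("2025-01-01", "2025-01-04", freq="D"))
full["close"] = full["close"].ffill()
print(full)
```

Forward-filling is fine for close prices; leave volume as zero (or NaN) for the missing periods, since no trading actually happened.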
### Multi-Pool Strategy
For major tokens, cross-reference data from multiple pools:
```bash
# Get all USDC pools to compare price consistency
curl "https://api.dexpaprika.com/networks/ethereum/tokens/0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48/pools?limit=5" | jq '.pools[] | .id'
```
### Rate Limiting
Don't hammer the API. We suggest batching your requests and caching the results:
```javascript
// Example: Batch multiple token histories
const tokens = ['USDC', 'WETH', 'USDT'];
const historyPromises = tokens.map(token =>
fetch(`/api/history/${token}`).then(r => r.json())
);
const allHistories = await Promise.all(historyPromises);
```
***
## Troubleshooting Common Issues
### "Empty Response"
Pool might not have data for your date range:
```bash
# Check pool age first
curl "https://api.dexpaprika.com/networks/ethereum/pools/0x88e6a0c2ddd26feeb64f039a2c41296fcb3f5640" | jq '.created_at'
```
### "Weird Price Spikes"
You hit a low-liquidity period. Switch to shorter intervals or higher-volume pools:
```bash
# Use 6h intervals to smooth out anomalies
curl "https://api.dexpaprika.com/networks/ethereum/pools/0x88e6a0c2ddd26feeb64f039a2c41296fcb3f5640/ohlcv?start=2025-01-01&interval=6h&limit=50&inversed=true"
```
### "Upside-Down Prices"
You need the inverted ratio:
```bash
# Flip from TOKEN/ETH to ETH/TOKEN
curl "https://api.dexpaprika.com/networks/ethereum/pools/0x88e6a0c2ddd26feeb64f039a2c41296fcb3f5640/ohlcv?start=2025-01-01&inversed=true&limit=10"
```
***
## What's Next?
* Get liquidity, fees, and other pool metrics.
* Combine historical data with live prices.
## Need Help?
* Ask questions and see what others are building.
* Hit a wall? We'll help you debug it - [drop us a line](mailto:support@coinpaprika.com).
# Crypto analytics with ClickHouse
Source: https://docs.dexpaprika.com/tutorials/scaling-with-clickhouse
Take your crypto analytics to the next level with ClickHouse. This guide shows you how to build a production-grade data pipeline for massive datasets and lightning-fast queries.
## From local to production scale
When your analytics needs grow beyond a single machine and you require a database designed for production scale, it's time to consider ClickHouse. ClickHouse is built for handling billions of rows with sub-second query times, making it perfect for production analytics, real-time dashboards, and enterprise-grade data analysis.
Looking for other analytics solutions? Check out our full list of [API Tutorials](/tutorials/tutorial_intro) for more step-by-step guides.
**Why ClickHouse for crypto analytics?**
* **Massive scale:** Built to handle billions of rows and petabytes of data, far beyond the scope of local, in-process databases.
* **Lightning speed:** Optimized columnar storage delivers queries that are 10-100x faster than traditional row-based systems.
* **Real-time ingestion:** Built for continuous data streaming and updates.
* **Production ready:** Used by companies like Uber, Cloudflare, and Spotify for analytics at scale.
```mermaid
graph TD;
subgraph "Step 1: Infrastructure";
A["Setup ClickHouse Server"];
end
subgraph "Step 2: Data Ingestion";
B["Run Python ETL script"];
C["DexPaprika API"];
D["ClickHouse database ('crypto_analytics')"];
B --"Fetches pool & OHLCV data"--> C;
B --"Loads millions of rows"--> D;
end
subgraph "Step 3 & 4: Analysis";
E["Option A: Direct SQL queries"];
F["Option B: AI assistant via MCP"];
D --> E;
D --> F;
end
A --> B;
```
**The goal:**
By the end of this guide, you will have a production-grade ClickHouse setup that can:
1. Ingest 15-minute OHLCV data for the top 250 Uniswap v3 pools (7 days of history)
2. Handle real-time data updates via streaming
3. Run complex analytical queries in milliseconds
4. Enable AI-powered analysis through MCP server integration
***
## Step 1: Setting up ClickHouse
### **Option A: Local installation (recommended for learning)**
Install ClickHouse locally for development and testing:
```bash
# macOS
brew install clickhouse
```
**macOS specifics: Cask installation**
The `brew install clickhouse` command now installs a Cask, not a standard formula. This provides a single `clickhouse` binary that acts as a multi-tool for both the server and client. Commands that refer to `clickhouse-server` or `brew services` will not work.
Use the following commands instead:
```bash
# To start the server on macOS (runs in the foreground):
clickhouse server
# To connect with the client in a new terminal:
clickhouse client
```
```bash
# Ubuntu/Debian
sudo apt-get install -y apt-transport-https ca-certificates dirmngr
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 8919F6BD2B48D754
echo "deb https://packages.clickhouse.com/deb stable main" | sudo tee /etc/apt/sources.list.d/clickhouse.list
sudo apt-get update
sudo apt-get install -y clickhouse-server clickhouse-client
# Start the server
sudo systemctl start clickhouse-server
```
### **Option B: ClickHouse Cloud (recommended for production)**
For production workloads, use [ClickHouse Cloud](https://clickhouse.com/cloud):
1. Sign up for a free trial at clickhouse.com/cloud
2. Create a new service
3. Note your connection details (host, port, username, password)
**Moving forward:**
The rest of this tutorial will assume you are using a **local ClickHouse installation (Option A)**. The Python scripts are configured for this by default. If you choose to use ClickHouse Cloud, remember to update the `CLICKHOUSE_HOST`, `CLICKHOUSE_PORT`, `CLICKHOUSE_USER`, and `CLICKHOUSE_PASSWORD` variables in the scripts accordingly.
### **Test your connection**
```bash
# Local installation (macOS)
clickhouse client
# Local installation (Linux)
clickhouse-client
# ClickHouse Cloud
clickhouse-client --host your-host.clickhouse.cloud --port 9440 --user default --password your-password --secure
```
***
## Step 2: Build the production ETL pipeline
Create a new file named `build_clickhouse_db.py`. This script efficiently handles high-frequency data from the top 250 pools, incorporating robust error handling and API management strategies. It leverages two key endpoints: the [Top Pools on a DEX endpoint](/api-reference/pools/get-top-x-pools-on-a-networks-dex) to discover pools, and the [Pool OHLCV Data endpoint](/api-reference/pools/get-ohlcv-data-for-a-pool-pair) to fetch historical price data.
```python build_clickhouse_db.py [expandable]
import requests
import pandas as pd
import clickhouse_connect
from datetime import datetime, timedelta
import logging
import time
import asyncio
import aiohttp
from typing import List, Dict
import json
import math
# --- Configuration ---
API_BASE_URL = "https://api.dexpaprika.com"
NETWORK = "ethereum"
DEX_ID = "uniswap_v3"
HISTORY_DAYS = 7 # Fetch 7 days of OHLCV data
TOP_POOLS_LIMIT = 250 # Focus on top 250 pools by volume
BATCH_SIZE = 15 # Process pools in smaller batches
CONCURRENT_REQUESTS = 4 # Concurrent requests for API calls
OHLCV_API_LIMIT = 100 # API limit for OHLCV requests
INTERVAL = "15m" # 15-minute intervals
# ClickHouse Configuration
CLICKHOUSE_HOST = "localhost" # or your ClickHouse Cloud host
CLICKHOUSE_PORT = 8123
CLICKHOUSE_USER = "default"
CLICKHOUSE_PASSWORD = "" # Set if using ClickHouse Cloud
CLICKHOUSE_DATABASE = "crypto_analytics"
# Setup logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
async def fetch_with_retry(session, url, params=None, retries=5, backoff_factor=0.5):
"""Generic fetch function with exponential backoff."""
for attempt in range(retries):
try:
async with session.get(url, params=params, timeout=30) as response:
response.raise_for_status()
return await response.json()
except (aiohttp.ClientError, asyncio.TimeoutError) as e:
if attempt == retries - 1:
logging.error(f"Final attempt failed for {url}: {e}")
raise
sleep_time = backoff_factor * (2 ** attempt)
logging.warning(f"Request to {url} failed: {e}. Retrying in {sleep_time:.2f}s...")
await asyncio.sleep(sleep_time)
class ClickHouseETL:
def __init__(self):
# Connect without a database first to ensure it exists
with clickhouse_connect.get_client(
host=CLICKHOUSE_HOST,
port=CLICKHOUSE_PORT,
username=CLICKHOUSE_USER,
password=CLICKHOUSE_PASSWORD
) as client:
client.command(f"CREATE DATABASE IF NOT EXISTS {CLICKHOUSE_DATABASE}")
# Now, connect to the specific database for table operations
self.client = clickhouse_connect.get_client(
host=CLICKHOUSE_HOST,
port=CLICKHOUSE_PORT,
username=CLICKHOUSE_USER,
password=CLICKHOUSE_PASSWORD,
database=CLICKHOUSE_DATABASE
)
self.api_semaphore = asyncio.Semaphore(CONCURRENT_REQUESTS)
self.setup_database()
def setup_database(self):
"""Create tables optimized for 15-minute interval data."""
logging.info("Setting up ClickHouse tables...")
# Create pools table with ReplacingMergeTree to handle duplicates
pools_schema = """
CREATE TABLE IF NOT EXISTS pools (
address String,
dex_name String,
volume_24h_usd Float64,
created_at DateTime,
token0_symbol Nullable(String),
token1_symbol Nullable(String),
pair Nullable(String) MATERIALIZED if(isNotNull(token0_symbol) AND isNotNull(token1_symbol), concat(token0_symbol, '-', token1_symbol), NULL),
created_date Date MATERIALIZED toDate(created_at),
volume_rank UInt32
) ENGINE = ReplacingMergeTree(created_at)
ORDER BY (address, volume_24h_usd, created_at)
PARTITION BY toYYYYMM(created_date)
"""
self.client.command(pools_schema)
# Create OHLCV table optimized for time-series analytics
ohlcv_schema = """
CREATE TABLE IF NOT EXISTS pool_ohlcv (
timestamp DateTime,
network String,
pool_address String,
open Float64,
high Float64,
low Float64,
close Float64,
volume_usd Float64,
date Date MATERIALIZED toDate(timestamp),
hour UInt8 MATERIALIZED toHour(timestamp),
minute UInt8 MATERIALIZED toMinute(timestamp),
quarter_hour UInt8 MATERIALIZED intDiv(toMinute(timestamp), 15)
) ENGINE = ReplacingMergeTree(timestamp)
ORDER BY (pool_address, timestamp)
PARTITION BY (date, network)
"""
self.client.command(ohlcv_schema)
logging.info("Database and tables setup complete.")
async def fetch_top_pools(self) -> List[Dict]:
"""Fetch top pools by volume from the specified DEX, handling pagination."""
logging.info(f"Fetching top {TOP_POOLS_LIMIT} pools for {DEX_ID} on {NETWORK}...")
all_pools = []
page = 0
async with aiohttp.ClientSession() as session:
while len(all_pools) < TOP_POOLS_LIMIT:
url = f"{API_BASE_URL}/networks/{NETWORK}/dexes/{DEX_ID}/pools"
params = {"page": page, "limit": 100, "order_by": "volume_usd", "sort": "desc"}
try:
data = await fetch_with_retry(session, url, params=params)
pools = data.get('pools', [])
if not pools:
break
all_pools.extend(pools)
logging.info(f"Fetched page {page}, got {len(pools)} pools. Total: {len(all_pools)}")
page += 1
if len(all_pools) >= TOP_POOLS_LIMIT:
all_pools = all_pools[:TOP_POOLS_LIMIT]
break
await asyncio.sleep(0.5) # Be respectful to the API
except Exception as e:
logging.error(f"Error fetching page {page}: {e}")
break
logging.info(f"Finished fetching pools. Total: {len(all_pools)}")
return all_pools
async def fetch_pool_ohlcv_paginated(self, session: aiohttp.ClientSession, pool_address: str) -> List[Dict]:
"""Fetch complete OHLCV data for a pool using intelligent, dynamic windowing."""
async with self.api_semaphore:
final_end_time = datetime.utcnow()
current_start_time = final_end_time - timedelta(days=HISTORY_DAYS)
all_ohlcv = []
try:
interval_minutes = int(INTERVAL.replace('m', ''))
minutes_per_call = OHLCV_API_LIMIT * interval_minutes
time_delta_per_call = timedelta(minutes=minutes_per_call)
except ValueError:
logging.error(f"Invalid INTERVAL format: {INTERVAL}. Defaulting to 15 minutes.")
interval_minutes = 15
time_delta_per_call = timedelta(minutes=OHLCV_API_LIMIT * 15)
while current_start_time < final_end_time:
batch_end_time = min(current_start_time + time_delta_per_call, final_end_time)
url = f"{API_BASE_URL}/networks/{NETWORK}/pools/{pool_address}/ohlcv"
params = {
"start": current_start_time.strftime('%Y-%m-%dT%H:%M:%SZ'),
"end": batch_end_time.strftime('%Y-%m-%dT%H:%M:%SZ'),
"interval": INTERVAL,
"limit": OHLCV_API_LIMIT
}
try:
batch_data = await fetch_with_retry(session, url, params=params)
if batch_data:
for record in batch_data:
record['network'] = NETWORK
record['pool_address'] = pool_address
if 'volume_usd' not in record:
avg_price = (record.get('open', 0) + record.get('close', 0)) / 2
record['volume_usd'] = record.get('volume', 0) * avg_price if avg_price > 0 else 0
all_ohlcv.extend(batch_data)
except Exception as e:
logging.warning(f"Could not fetch OHLCV batch for {pool_address}: {e}")
current_start_time = batch_end_time
await asyncio.sleep(0.75) # Crucial delay to prevent rate-limiting
logging.info(f"Pool {pool_address}: collected {len(all_ohlcv)} OHLCV records.")
return all_ohlcv
async def fetch_pool_ohlcv_batch(self, pool_addresses: List[str]) -> List[Dict]:
"""Fetch OHLCV data for multiple pools concurrently."""
logging.info(f"Fetching {INTERVAL} OHLCV for {len(pool_addresses)} pools...")
all_ohlcv = []
async with aiohttp.ClientSession() as session:
tasks = [self.fetch_pool_ohlcv_paginated(session, addr) for addr in pool_addresses]
results = await asyncio.gather(*tasks, return_exceptions=True)
for i, result in enumerate(results):
if isinstance(result, list):
all_ohlcv.extend(result)
elif isinstance(result, Exception):
logging.warning(f"OHLCV fetch failed for pool {pool_addresses[i]}: {result}")
return all_ohlcv
def load_pools_data(self, pools: List[Dict]):
"""Load pools data into ClickHouse with volume ranking."""
if not pools: return
logging.info("Processing and loading pools data...")
for i, pool in enumerate(pools):
tokens = pool.get('tokens', [])
pool['token0_symbol'] = tokens[0]['symbol'] if len(tokens) > 0 else None
pool['token1_symbol'] = tokens[1]['symbol'] if len(tokens) > 1 else None
pool['volume_rank'] = i + 1
df = pd.DataFrame(pools)
df = df[['id', 'dex_name', 'volume_usd', 'created_at', 'token0_symbol', 'token1_symbol', 'volume_rank']]
df = df.rename(columns={'id': 'address', 'volume_usd': 'volume_24h_usd'})
df['created_at'] = pd.to_datetime(df['created_at'])
self.client.insert_df('pools', df)
logging.info(f"Loaded {len(df)} pools into 'pools' table.")
def load_ohlcv_data(self, ohlcv_data: List[Dict]):
"""Load OHLCV data into ClickHouse."""
if not ohlcv_data: return
logging.info(f"Processing and loading {len(ohlcv_data)} OHLCV records...")
df = pd.DataFrame(ohlcv_data)
df['timestamp'] = pd.to_datetime(df['time_close'])
df = df[['timestamp', 'network', 'pool_address', 'open', 'high', 'low', 'close', 'volume_usd']]
self.client.insert_df('pool_ohlcv', df)
logging.info(f"Loaded {len(df)} records into 'pool_ohlcv' table.")
async def run_etl(self):
"""Run the complete ETL process."""
logging.info(f"Starting ClickHouse ETL process for top {TOP_POOLS_LIMIT} pools...")
pools = await self.fetch_top_pools()
if pools:
self.load_pools_data(pools)
pool_addresses = [pool['id'] for pool in pools if pool.get('id')]
for i in range(0, len(pool_addresses), BATCH_SIZE):
batch_addresses = pool_addresses[i:i + BATCH_SIZE]
batch_num = (i // BATCH_SIZE) + 1
total_batches = (len(pool_addresses) + BATCH_SIZE - 1) // BATCH_SIZE
logging.info(f"Processing OHLCV batch {batch_num}/{total_batches} ({len(batch_addresses)} pools)")
ohlcv_data = await self.fetch_pool_ohlcv_batch(batch_addresses)
self.load_ohlcv_data(ohlcv_data)
if i + BATCH_SIZE < len(pool_addresses):
logging.info(f"--- Finished batch {batch_num}, sleeping for 10 seconds ---")
await asyncio.sleep(10)
logging.info("ETL process completed!")
pool_count = self.client.command("SELECT COUNT() FROM pools")
ohlcv_count = self.client.command("SELECT COUNT() FROM pool_ohlcv")
unique_pools_with_data = self.client.command("SELECT COUNT(DISTINCT pool_address) FROM pool_ohlcv")
avg_records = ohlcv_count / unique_pools_with_data if unique_pools_with_data > 0 else 0
logging.info(f"Final counts - Pools: {pool_count}, OHLCV records: {ohlcv_count:,}")
logging.info(f"Coverage - {unique_pools_with_data} pools with data, avg {avg_records:.1f} records/pool.")
async def main():
etl = ClickHouseETL()
await etl.run_etl()
if __name__ == "__main__":
# pip install clickhouse-connect aiohttp pandas requests
asyncio.run(main())
```
This script is designed for performance and reliability, applying several practices common in production data pipelines:
* **Asynchronous operations:** By using `asyncio` and `aiohttp`, the script can make many API requests concurrently instead of one by one.
* **Dynamic windowing:** The `fetch_pool_ohlcv_paginated` function sizes each request's time window from `OHLCV_API_LIMIT` and the candle interval, so no call asks for more rows than the API can return.
* **Concurrency control & throttling:** An `asyncio.Semaphore`, combined with carefully tuned `BATCH_SIZE` and `asyncio.sleep()` calls, makes sure we don't hit the rate limit.
* **Resiliency:** The `fetch_with_retry` function automatically retries failed requests with an exponential backoff delay.
### **Required libraries**
```bash
pip install clickhouse-connect aiohttp pandas requests
```
***
## Step 3: Lightning-fast analytics (Optional)
Once your database is populated, you can query it directly using any ClickHouse-compatible SQL client or a Python script. While the next step (AI Integration) is recommended for the most powerful analysis, running queries directly is a great way to verify your data.
You can create a file named `query_clickhouse.py` to see how fast ClickHouse can process complex analytical queries on the millions of rows you've ingested.
```python query_clickhouse.py [expandable]
import clickhouse_connect
import pandas as pd
import time
client = clickhouse_connect.get_client(
host='localhost', port=8123, database='crypto_analytics'
)
print("=== ClickHouse 15-Minute Analytics Demo ===\n")
# Query 1: Top Pools by Volume with Data Coverage
print("--- Top 10 Pools by Volume (with data coverage) ---")
start_time = time.time()
result1 = client.query_df("""
SELECT
p.pair,
p.address,
max(p.volume_24h_usd) as volume_24h,
min(p.volume_rank) as best_rank,
count(o.timestamp) as ohlcv_records
FROM pools p
LEFT JOIN pool_ohlcv o ON p.address = o.pool_address
GROUP BY p.pair, p.address
ORDER BY volume_24h DESC
LIMIT 10
""")
print(result1)
print(f"Query executed in {time.time() - start_time:.3f} seconds\n")
# Query 2: 15-Minute Volume Patterns Analysis
print("--- Volume Patterns by 15-Minute Intervals ---")
result2 = client.query_df("""
SELECT
hour,
quarter_hour * 15 as minute_of_hour,
COUNT(DISTINCT pool_address) as active_pools,
round(avg(volume_usd), 2) as avg_15min_volume,
round(sum(volume_usd), 2) as total_15min_volume
FROM pool_ohlcv
WHERE volume_usd > 0 AND volume_usd < 1000000000 -- Defensive filter
GROUP BY hour, quarter_hour
ORDER BY total_15min_volume DESC
""")
print(result2.head(10))
# Query 3: High-Frequency Price Action Analysis (Top 5 Pools)
print("\n--- High-Frequency Volatility Analysis (Top 5 Pools) ---")
result3 = client.query_df("""
WITH top_pools AS (
SELECT address from pools ORDER BY volume_24h_usd DESC LIMIT 5
),
pool_volatility AS (
SELECT
o.pool_address,
p.pair,
(o.high - o.low) / o.low * 100 as interval_volatility
FROM pool_ohlcv o
JOIN pools p ON o.pool_address = p.address
WHERE o.pool_address IN (SELECT address FROM top_pools) AND o.low > 0
)
SELECT
pair,
pool_address,
avg(interval_volatility) as avg_15min_volatility,
max(interval_volatility) as max_15min_volatility
FROM pool_volatility
GROUP BY pair, pool_address
ORDER BY avg_15min_volatility DESC
""")
print(result3.head(15))
# Query 4: Peak Trading Hours Analysis
print("\n--- Peak Trading Hours Analysis ---")
result4 = client.query_df("""
SELECT
hour,
COUNT(DISTINCT pool_address) as active_pools,
round(sum(volume_usd), 2) as hourly_volume,
round(avg(volume_usd), 2) as avg_15min_volume,
round(avg((high - low) / low * 100), 4) as avg_volatility_pct
FROM pool_ohlcv
WHERE volume_usd > 0 AND low > 0 AND volume_usd < 1000000000 -- Defensive filter
GROUP BY hour
ORDER BY hourly_volume DESC
""")
print(result4.head(10))
# Query 5: Database performance and storage stats
print("\n--- Database Performance Stats ---")
stats = client.query_df("""
SELECT
table as table_name,
formatReadableSize(sum(bytes_on_disk)) as size,
sum(rows) as row_count,
formatReadableSize(sum(bytes_on_disk)/sum(rows)) as avg_row_size
FROM system.parts
WHERE database = 'crypto_analytics' AND active = 1
GROUP BY table
""")
print(stats)
# Query 6: Data quality check
print("\n--- Data Quality Summary ---")
quality_check = client.query_df("""
SELECT
'Total Records' as metric, toString(COUNT(*)) as value
FROM pool_ohlcv
UNION ALL SELECT
'Date Range', concat(toString(MIN(date)), ' to ', toString(MAX(date)))
FROM pool_ohlcv
UNION ALL SELECT
'Unique Pools with Data', toString(COUNT(DISTINCT pool_address))
FROM pool_ohlcv
UNION ALL SELECT
'Avg Records per Day', toString(ROUND(COUNT(*) / nullif(COUNT(DISTINCT date), 0)))
FROM pool_ohlcv
UNION ALL SELECT
'Expected 15min Intervals vs Actual',
concat(
toString(dateDiff('minute', MIN(timestamp), MAX(timestamp)) / 15),
' vs ',
toString(count())
)
FROM pool_ohlcv
""")
print(quality_check)
```
You can run the script by executing it from your terminal:
```bash
python query_clickhouse.py
```
Now, let's move on to the recommended final step: connecting your database to an AI assistant.
***
## Step 4: AI-powered analysis with an MCP server
Enable seamless analysis of your local ClickHouse database through the [ClickHouse MCP Server](https://github.com/ClickHouse/mcp-clickhouse). This allows AI assistants like Claude Desktop to connect to your database, list tables, and run `SELECT` queries securely.
### 1. Install the MCP server
The server is a Python package that can be installed via `pip`:
```bash
pip install clickhouse-mcp-server
```
### 2. Configure your AI client
Next, configure your AI client (e.g., Claude Desktop) to use the server. You'll need to edit its configuration file.
* **macOS:** `~/Library/Application Support/Claude/claude_desktop_config.json`
* **Windows:** `%APPDATA%/Claude/claude_desktop_config.json`
Add the following JSON block to the `mcpServers` section of the file. This tells the client how to run the server and provides the connection details for your local ClickHouse instance.
**Finding the command path**
The most common point of failure is an incorrect command path. The command should be the **absolute path** to the `clickhouse-mcp-server` executable that `pip` installed.
Find this path by running `which clickhouse-mcp-server` in your terminal and use the output in the `command` field below.
```json
{
"mcpServers": {
"clickhouse-mcp-server": {
"command": "/path/to/your/clickhouse-mcp-server",
"args": [],
"env": {
"CLICKHOUSE_HOST": "localhost",
"CLICKHOUSE_USER": "default",
"CLICKHOUSE_PASSWORD": "",
"CLICKHOUSE_DATABASE": "crypto_analytics",
"CLICKHOUSE_SECURE": "false"
}
}
}
}
```
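If you'd rather not hand-edit the JSON, the entry can be generated programmatically. This is a sketch, not part of the tutorial scripts: it uses Python's `shutil.which` to resolve the absolute command path (mirroring `which clickhouse-mcp-server`) and prints an `mcpServers` entry for you to paste in.

```python
import json
import shutil

def build_mcp_entry(command_path: str, database: str = "crypto_analytics") -> dict:
    """Build the mcpServers entry for a local, unsecured ClickHouse instance."""
    return {
        "clickhouse-mcp-server": {
            "command": command_path,
            "args": [],
            "env": {
                "CLICKHOUSE_HOST": "localhost",
                "CLICKHOUSE_USER": "default",
                "CLICKHOUSE_PASSWORD": "",
                "CLICKHOUSE_DATABASE": database,
                "CLICKHOUSE_SECURE": "false",
            },
        }
    }

# Resolve the absolute path; fall back to the placeholder if not installed yet.
path = shutil.which("clickhouse-mcp-server") or "/path/to/your/clickhouse-mcp-server"
print(json.dumps(build_mcp_entry(path), indent=2))
```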
### 3. Restart and analyze
Save the configuration file and **restart your AI client**. Once restarted, you can start asking it to analyze the data in your `crypto_analytics` database.
### Troubleshooting & important notes
* **"Server disconnected" error:** This almost always means the `command` path in your configuration is incorrect. Double-check the absolute path using `which clickhouse-mcp-server`.
* **AI connects to the `default` database:** The AI client may sometimes connect to the `default` database on its own, even when `crypto_analytics` is specified in the config. When that happens, it will see no tables.
* **Solution: Be explicit:** To ensure the AI works correctly, always specify the database in your prompt. This overrides the AI's tendency to use the default.
**Good example prompts:**
* "Using the clickhouse-mcp-server, **connect to the `crypto_analytics` database** and then list the tables."
* "**In the `crypto_analytics` database**, show me the top 10 pools by volume from the `pools` table."
* "Calculate the average daily volume for the top 5 most volatile pools **from the `crypto_analytics` database**."
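Another way to sidestep the default-database pitfall is to fully qualify table names with the database. A query of the kind the assistant might run for a top-pools-by-volume question (the `volume` column name is assumed to match the schema built earlier in this tutorial):

```sql
-- database.table qualification works no matter which database the session opened
SELECT pool_address, SUM(volume) AS total_volume
FROM crypto_analytics.pool_ohlcv
GROUP BY pool_address
ORDER BY total_volume DESC
LIMIT 10;
```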
***
## What you've built: A production-grade analytics pipeline
Congratulations! You've built a scalable crypto analytics pipeline with ClickHouse: you've ingested a large OHLCV dataset and enabled a powerful AI assistant to query and analyze it securely.
**Key achievements:**
* **Built a production-ready ETL pipeline:** You have a reusable, high-performance Python script that can create a comprehensive, multi-million row database from any supported DEX and network.
* **Unlocked lightning-fast SQL:** You can now perform complex analytical queries on a massive dataset in milliseconds, directly on your machine.
* **Mastered a scalable workflow:** This "local-first" data strategy, combined with ClickHouse's power, provides a solid foundation for building real-time dashboards, conducting in-depth market research, and developing sophisticated trading algorithms.
* **Enabled secure AI analysis:** By connecting your database to an AI assistant via an MCP server, you've created a powerful and secure way to explore your data using natural language.
# API Tutorials
Source: https://docs.dexpaprika.com/tutorials/tutorial_intro
Learn how to use the DexPaprika API with step-by-step tutorials.
## Available Tutorials
| Tutorial | Description |
| ---------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------- |
| [Fetching Token Prices](/tutorials/fetch-token-price) | Learn how to retrieve the price of any token using the DexPaprika API. |
| [Build Simple Alert System](/tutorials/crypto-alert-bot) | Learn how to create a real-time cryptocurrency price alert system using DexPaprika API & Telegram. |
| [Retrieve Historical Data](/tutorials/retrieve-historical-data) | Learn how to get OHLCV (Open, High, Low, Close, Volume) historical price data for any token. |
| [Find New Pools](/tutorials/find-new-pools) | Learn how to discover newly created liquidity pools on any network by sorting pools by creation date. |
| [Local Analytics with DuckDB](/tutorials/local-analytics-with-duckdb) | Build a high-performance local analytics database with DuckDB to instantly query Uniswap v3 data. |
| [Crypto Analytics with ClickHouse](/tutorials/scaling-with-clickhouse) | Create a production-grade ClickHouse pipeline for massive datasets and lightning-fast queries. |
| More tutorials coming soon | Stay tuned for additional guides. |
Do you have an interesting implementation you'd like to share? We'd be happy to feature your tutorial using our API. [Reach out to us on Discord!](https://discord.gg/mS4cWp6a)
## Get Support
Connect with our community and get real-time support.
Share your experience and help us improve the API.