Export & Rate Limits
While DeepBlock’s GraphQL API is powerful for interactive queries and integrations, enterprises often need to work with large datasets or perform bulk analysis. We provide options to accommodate those needs, as well as policies to ensure fairness and stability:
Bulk Data Export
For analyses that require crunching entire datasets (cold data), DeepBlock offers export functionality. For example, if you need all transactions of a specific contract for the past year, or the full list of addresses that interacted with a protocol since its inception, doing this through the API alone might be slow or hit rate limits. Instead, we can provide a data dump (e.g., a CSV, Parquet, or database snapshot) of that subset of the knowledge graph. Enterprise users can request exports via the dashboard or support, and we’ll prepare the files for secure download.
These exports can then be loaded into your own data warehouse or analyzed offline with your preferred tools.
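For instance, a Parquet export can be analyzed directly with standard tooling before (or instead of) loading it into a warehouse. The sketch below is illustrative only: the file name and the columns (tx_hash, timestamp, value) are hypothetical, and the actual schema is documented alongside each export.

```python
# Minimal sketch: offline analysis of a DeepBlock Parquet export.
# File name and column names are hypothetical placeholders.
import pandas as pd

# Parquet preserves types and compresses well for large exports.
df = pd.read_parquet("contract_transactions_2023.parquet")

# Example: monthly transaction counts and total transferred value.
df["timestamp"] = pd.to_datetime(df["timestamp"])
monthly = df.groupby(df["timestamp"].dt.to_period("M")).agg(
    tx_count=("tx_hash", "count"),
    total_value=("value", "sum"),
)
print(monthly.head())
```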
Snapshots for Cold Storage
Similar to exports, we can periodically provide snapshots of certain data (like a monthly snapshot of all token balances, or end-of-day summaries of key metrics). This is useful if you want to maintain an in-house archive of historical data sourced from DeepBlock for compliance or deep historical analysis. We ensure these snapshots are consistent and labeled by date, so you know exactly what point in time they represent.
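As an illustration, one way to keep such an archive in-house is to store each snapshot under its snapshot date and load the file that corresponds to the point in time you care about. The directory layout and ISO-date file naming below are assumptions of this sketch, not a DeepBlock convention.

```python
# Minimal sketch: loading the snapshot in effect on or before a given date.
# Assumes files are stored as snapshots/token_balances/YYYY-MM-DD.parquet.
from datetime import date
from pathlib import Path

import pandas as pd

ARCHIVE = Path("snapshots/token_balances")  # hypothetical local archive

def load_snapshot_as_of(as_of: date) -> pd.DataFrame:
    """Return the most recent snapshot taken on or before `as_of`."""
    candidates = sorted(ARCHIVE.glob("*.parquet"))  # ISO dates sort chronologically
    eligible = [p for p in candidates if date.fromisoformat(p.stem) <= as_of]
    if not eligible:
        raise FileNotFoundError(f"No snapshot on or before {as_of} in {ARCHIVE}")
    return pd.read_parquet(eligible[-1])

balances = load_snapshot_as_of(date(2024, 3, 15))  # e.g. loads the end-of-February snapshot
```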
Rate Limits
Our API has rate limits to protect the system for all users. The exact numbers may vary, but as an example, a standard key might allow X requests per second and Y requests per day. The limits are set high enough that typical analytical usage, and even moderate automated usage, won’t hit them.
However, for very data-intensive tasks (like pulling thousands of records in one go), we encourage using pagination and being mindful of these limits. If you exceed the limits, the API returns a friendly error asking you to slow down or to contact us for an increase.
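As a rough illustration, a client can combine pagination with exponential backoff when asked to slow down. The endpoint URL, the query and field names, and the assumption that rate-limited requests return HTTP 429 are all placeholders in this sketch; consult the API reference for the exact schema and error format.

```python
# Minimal sketch: paginated GraphQL fetching with backoff on rate limits.
import time
import requests

ENDPOINT = "https://api.deepblock.example/graphql"   # hypothetical URL
HEADERS = {"Authorization": "Bearer <YOUR_API_KEY>"}

QUERY = """
query Transfers($contract: String!, $limit: Int!, $offset: Int!) {
  transfers(contract: $contract, limit: $limit, offset: $offset) {
    txHash
    value
    timestamp
  }
}
"""  # hypothetical query shape

def fetch_all(contract: str, page_size: int = 500):
    offset, backoff = 0, 1.0
    while True:
        resp = requests.post(
            ENDPOINT,
            json={"query": QUERY,
                  "variables": {"contract": contract, "limit": page_size, "offset": offset}},
            headers=HEADERS,
            timeout=30,
        )
        if resp.status_code == 429:          # rate-limited: wait, then retry
            time.sleep(backoff)
            backoff = min(backoff * 2, 60)   # exponential backoff, capped at 60s
            continue
        resp.raise_for_status()
        page = resp.json()["data"]["transfers"]
        if not page:                         # empty page means we're done
            return
        yield from page
        offset += page_size
        backoff = 1.0                        # reset backoff after a successful page
```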
Enterprise Rate Limit Increases
Enterprise partners often receive higher rate limits or dedicated throughput. If you have a mission-critical application that needs to make a large volume of queries, we can allocate dedicated resources or mirror infrastructure closer to your region for low latency.
Essentially, we can tailor access to your use case (this can be part of an enterprise SLA). Just discuss your needs with our team.
Query Complexity Limits
In addition to raw request counts, we also guard against extremely expensive single queries (e.g., a query that tries to return millions of objects in one response or perform a massive join without filters). In practice, we enforce timeouts or result caps on a single query’s execution. If a query times out or exceeds complexity bounds, you’ll get an error.
In such cases, we often advise breaking the query into smaller pieces or using an export. The documentation offers guidance on writing efficient queries (for example, always filtering by date ranges or specific contracts when possible, rather than scanning the entire chain history in one go).
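One simple pattern for breaking a query into smaller pieces is to split a long time range into windows and issue one filtered query per window. The window size and the idea of from/to timestamp filters below are assumptions for illustration; adapt them to the fields the schema actually exposes.

```python
# Minimal sketch: chunking a year-long range into weekly windows so each
# query stays within complexity and timeout bounds.
from datetime import datetime, timedelta, timezone

def date_windows(start: datetime, end: datetime, days: int = 7):
    """Yield (window_start, window_end) pairs covering [start, end)."""
    cursor = start
    step = timedelta(days=days)
    while cursor < end:
        yield cursor, min(cursor + step, end)
        cursor += step

start = datetime(2023, 1, 1, tzinfo=timezone.utc)
end = datetime(2024, 1, 1, tzinfo=timezone.utc)

for window_start, window_end in date_windows(start, end):
    # Issue one filtered query per window instead of scanning the whole year,
    # e.g. transfers(from: window_start, to: window_end, contract: ...).
    ...
```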
API Usage Monitoring
You’ll have access to a dashboard that shows your API usage – calls made, data volume, etc. This helps you track whether you’re nearing any limits. It’s also useful for cost management if, in the future, you’re on a paid plan that charges by usage. This transparency means no surprises when it comes to hitting a ceiling.
In summary, DeepBlock provides flexibility for both real-time queries and big-data analysis. Use the GraphQL API for day-to-day interactive work and integration, and leverage our export/snapshot services for heavy lifting. We’re here to support your data needs, and if something isn’t feasible via the self-serve API due to volume, we’ll work with you to get the data to you in the best way possible.