Tips

Filtering

While any field can be used to filter responses, filtering on chainName and timestamp whenever possible will speed up query time.
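For instance, a transactions query that filters on both fields might look like the following sketch. The query shape and the "Polygon" chain value mirror the Historical Data example below; adjust field names to match your schema.

```python
# Sketch: filter transactions on chainName and timestamp, the two fields
# that keep queries on the fast path.
FILTERED_QUERY = """
query ($start: timestamptz, $end: timestamptz) {
  transactions(
    where: {_and: {chainName: {_eq: "Polygon"},
                   timestamp: {_gte: $start, _lt: $end}}}
    limit: 100
  ) {
    timestamp
    chainName
    volumeUSD
  }
}
"""


def filter_variables(start_iso: str, end_iso: str) -> dict:
    """Build the variable payload for FILTERED_QUERY from ISO timestamps."""
    return {"start": start_iso, "end": end_iso}
```

The variables dict can then be passed to `client.execute(..., variable_values=...)` as in the Historical Data example below.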

Ordering

As with filtering, ordering by timestamp or chainName will provide the fastest queries.
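An order_by clause on timestamp, as used in the Historical Data example below, keeps sorting on the fast path. A small sketch that builds such a query (the lowercase asc/desc enum values are taken from the example below):

```python
# Sketch: order results by timestamp for the fastest sort path.
def ordered_transactions_query(direction: str = "desc") -> str:
    """Return a transactions query ordered by timestamp.

    direction must be "asc" or "desc" (lowercase enum values, as in the
    Historical Data example below).
    """
    if direction not in ("asc", "desc"):
        raise ValueError("direction must be 'asc' or 'desc'")
    return f"""
    query {{
      transactions(order_by: {{timestamp: {direction}}}, limit: 100) {{
        timestamp
        volumeUSD
      }}
    }}
    """
```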

Linked Types

While using Linked Types/Object Types is possible, it significantly slows down query response time.

For example, avoid using the fills type inside a transactions query, and vice versa.
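In practice, that means issuing two flat queries and joining the results client-side rather than nesting one type inside the other. A hypothetical sketch of the client-side join (the `transactionHash` join key is an assumption; use whatever field actually links the two types in your schema):

```python
# Sketch: rather than nesting fills inside transactions (slow), fetch
# each type flat and join client-side. The join key "transactionHash"
# is an assumption, not a confirmed schema field.
def join_fills(transactions, fills, key="transactionHash"):
    """Attach each transaction's fills under a 'fills' list."""
    fills_by_key = {}
    for fill in fills:
        fills_by_key.setdefault(fill[key], []).append(fill)
    return [{**tx, "fills": fills_by_key.get(tx[key], [])}
            for tx in transactions]
```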

Pagination

Responses are limited to 1000 items per request. To go beyond that limit, use ordering together with limit and offset. If the offset goes past the number of existing items, the response will be an empty array.
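The limit/offset loop can be sketched as follows. Here `fetch_page` is a hypothetical stand-in for a real call such as `client.execute(...)` in the Historical Data example below:

```python
# Sketch: page through results 1000 at a time until an empty array
# signals the offset has passed the last existing item.
PAGE_SIZE = 1000


def fetch_all(fetch_page):
    """fetch_page(offset, limit) -> list of items; a hypothetical
    callable wrapping the real GraphQL client call."""
    items, offset = [], 0
    while True:
        page = fetch_page(offset, PAGE_SIZE)
        if not page:  # empty array: offset is past the end
            break
        items.extend(page)
        offset += PAGE_SIZE
    return items
```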

If the result set is very large, as in a historical export, the best approach is to iterate first through chainName, then through timestamp, and then paginate. See Historical Data for an example.

Historical Data

Sometimes historical data is needed for reporting purposes, and retrieving all of it can take a long time. It is recommended to paginate along with relevant filters on chainName and timestamp. The following code snippet is an example.

example_historical_data.py
from gql import gql, Client
from gql.transport.aiohttp import AIOHTTPTransport

import psycopg2 as pg
from urllib.parse import urlparse

from datetime import datetime, timedelta

# helper function
def connect_pg(uri):
    # example database connection function for postgres server
    result = urlparse(uri)
    user = result.username
    password = result.password
    db = result.path[1:]
    host = result.hostname
    conn_string = f"host={host} dbname={db} user={user} password={password}"
    conn = pg.connect(conn_string)
    return conn


def main():
    url = 'https://api.0x.org/data/v0'
    headers = {"0x-api-key": "Your API Key Here"}

    transport = AIOHTTPTransport(url=url, headers=headers)
    client = Client(transport=transport)

    sample_query = gql("""
        query ($offset_number:Int!, $time_variable_lower:timestamptz, $time_variable_higher:timestamptz){
          transactions(offset: $offset_number, limit: 1000, order_by: {timestamp: desc}, 
          where: {_and: {timestamp: {_gte: $time_variable_lower, _lt: $time_variable_higher}, chainName: {_eq: "Polygon"}}}) {
            timestamp
            affiliate
            app
            baseFeePerGas
            burntGasFees
            burntGasFeesUSD
            calledFunction
            feeRecipient
            gasFees
            gasFeesUSD
            gasLimit
            gasPrice
            gasUsed
            hasDirect
            hasLimitOrder
            hasRFQ
            isGasless
            isMultiplex
            isMutihop
            liquiditySource
            maker
            makerAmount
            makerTokenPriceUSD
            makerTokenSymbol
            makerVolumeUSD
            maxFeePerGas
            maxPriorityFeePerGas
            nativeOrderType
            reimbursedGasFees
            reimbursedGasFeesUSD
            router
            takerAmount
            takerTokenPriceUSD
            takerTokenSymbol
            takerVolumeUSD
            tipGasFees
            tipGasFeesUSD
            transformerFeeRecipient
            transformerFeeToken
            transformerFeeTokenAmount
            transformerFeeTokenSymbol
            transformerFeeVolumeUSD
            type
            volumeUSD
          }
          }
        """)

    all_responses = []
    starting_time = datetime(2022, 8, 1)
    # it is faster to iterate through a time period
    for _ in range(18):  # 18 one-day windows starting from starting_time
        incremental_time = starting_time + timedelta(days=1)
        query_starting_time = starting_time.isoformat()
        query_ending_time = incremental_time.isoformat()
        starting_time = incremental_time
        start_number = 0
        while True:
            vars_list = {
                "offset_number": start_number,
                "time_variable_lower": query_starting_time,
                "time_variable_higher": query_ending_time,
            }
            response = client.execute(sample_query, variable_values=vars_list)
            if not response['transactions']:
                break  # empty page: offset is past the last item for this window
            all_responses.extend(response['transactions'])
            start_number += 1000
    # all_responses can now be exported to an external database such as
    # Postgres (see the connect_pg helper above) for all historical data


if __name__ == "__main__":
    main()
