While any field can be used to filter responses, it is recommended to filter on chainName and timestamp whenever possible to speed up query times.
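For instance, a `where` clause that narrows results by `chainName` and a `timestamp` range might look like the following sketch (the chain name, time window, and selected fields are illustrative assumptions):

```python
# Sketch of a filtered query. The transactions root field and the
# chainName / timestamp filter arguments follow the example later in
# this guide; the specific values here are placeholders.
FILTERED_QUERY = """
query {
  transactions(
    where: {
      chainName: {_eq: "Ethereum"},
      timestamp: {_gte: "2022-08-01T00:00:00", _lt: "2022-08-02T00:00:00"}
    }
  ) {
    timestamp
    chainName
    volumeUSD
  }
}
"""
```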
Ordering
As with filtering, timestamp and chainName will provide the fastest queries.
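As a sketch, ordering newest-first on the indexed timestamp field (the selected fields are illustrative):

```python
# Sketch: order_by on timestamp, the fastest field to sort on.
ORDERED_QUERY = """
query {
  transactions(order_by: {timestamp: desc}, limit: 10) {
    timestamp
    chainName
  }
}
"""
```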
Linked Types
While using Linked Types/Object Types is possible, it significantly slows down query response time.
For example, avoid using the fills type inside a transaction query, and vice versa.
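One way to avoid nesting is to fetch each type with a flat query and join the results client-side. A minimal sketch of such a join, assuming a shared key; the key name `transactionHash` is a hypothetical placeholder, not confirmed by this guide:

```python
# Instead of nesting fills inside a transaction query (slow), fetch each
# type flat and group locally. The join key is an assumption.
def join_by_key(transactions, fills, key="transactionHash"):
    """Attach each fill to its parent transaction client-side."""
    fills_by_key = {}
    for fill in fills:
        fills_by_key.setdefault(fill[key], []).append(fill)
    return [
        {**tx, "fills": fills_by_key.get(tx[key], [])}
        for tx in transactions
    ]
```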
Pagination
Responses are limited to 1000 items per request. To go beyond that limit, use Ordering together with limit and offset. If the offset goes beyond the number of existing items, the response will be an empty array.
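The limit/offset loop can be sketched generically. Here `fetch_page` is a hypothetical stand-in for a real `client.execute` call:

```python
def paginate(fetch_page, page_size=1000):
    """Collect all items by advancing the offset until an empty page.

    fetch_page is a hypothetical callable (offset, limit) -> list of
    items, standing in for a real client.execute call.
    """
    items = []
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        if not page:  # an offset past the end returns an empty array
            break
        items.extend(page)
        offset += page_size
    return items
```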
If the result set is very large, as in a historical export, it is best to iterate first through chainName, then timestamp, and then use pagination. See Historical Data for an example.
Historical Data
Sometimes historical data is needed for reporting purposes, and processing all of it can take a long time. It is recommended to paginate along with relevant filters on chainName and timestamp. The following code snippet is an example.
example_historical_data.py
```python
from gql import gql, Client
from gql.transport.aiohttp import AIOHTTPTransport
import psycopg2 as pg
from urllib.parse import urlparse
from datetime import datetime, timedelta

# helper function
def connect_pg(uri):
    # example database connection function for postgres server
    result = urlparse(uri)
    user = result.username
    password = result.password
    db = result.path[1:]
    host = result.hostname
    conn_string = f"host={host} dbname={db} user={user} password={password}"
    conn = pg.connect(conn_string)
    return conn

def main():
    url = 'https://api.0x.org/data/v0'
    headers = {"0x-api-key": "Your API Key Here"}
    transport = AIOHTTPTransport(url=url, headers=headers)
    client = Client(transport=transport)
    sample_query = gql("""
        query ($offset_number: Int!, $time_variable_lower: timestamptz, $time_variable_higher: timestamptz) {
          transactions(
            offset: $offset_number,
            limit: 1000,
            order_by: {timestamp: desc},
            where: {_and: {timestamp: {_gte: $time_variable_lower, _lt: $time_variable_higher}, chainName: {_eq: "Polygon"}}}
          ) {
            timestamp affiliate app baseFeePerGas burntGasFees burntGasFeesUSD
            calledFunction feeRecipient gasFees gasFeesUSD gasLimit gasPrice
            gasUsed hasDirect hasLimitOrder hasRFQ isGasless isMultiplex
            isMutihop liquiditySource maker makerAmount makerTokenPriceUSD
            makerTokenSymbol makerVolumeUSD maxFeePerGas maxPriorityFeePerGas
            nativeOrderType reimbursedGasFees reimbursedGasFeesUSD router
            takerAmount takerTokenPriceUSD takerTokenSymbol takerVolumeUSD
            tipGasFees tipGasFeesUSD transformerFeeRecipient transformerFeeToken
            transformerFeeTokenAmount transformerFeeTokenSymbol
            transformerFeeVolumeUSD type volumeUSD
          }
        }
    """)
    all_responses = []
    starting_time = datetime(2022, 8, 1)
    # it is faster to iterate through a time period one day at a time
    for date_var in range(0, 18):
        incremental_time = starting_time + timedelta(1)
        query_starting_time = starting_time.isoformat()
        query_ending_time = incremental_time.isoformat()
        starting_time = incremental_time
        start_number = 0
        while True:
            vars_list = {
                "offset_number": start_number,
                "time_variable_lower": query_starting_time,
                "time_variable_higher": query_ending_time,
            }
            response = client.execute(sample_query, variable_values=vars_list)
            all_responses.extend(response['transactions'])
            start_number += 1000
            if not response['transactions']:
                break
    # can export the json from all_responses to an external database
    # like postgres for all historical data

if __name__ == "__main__":
    main()
```