Processing Large Datasets with the Batch API

Convert hundreds or thousands of legal land descriptions to GPS coordinates using the Township Canada Batch API. Includes chunking, error handling, and examples in Node.js and Python.

What You'll Build

By the end of this guide, you'll have a working script that reads a list of legal land descriptions—potentially thousands of them—and converts each one to GPS coordinates using the Township Canada Batch API. You'll handle the 100-record-per-request limit by chunking your data, aggregate the responses into a single output file, and add the rate limit handling needed for production use.

The same pattern applies in reverse: if you're starting from coordinates and need legal land descriptions, the reverse batch endpoint follows an identical workflow.

Prerequisites

  • A Township Canada Batch API key — subscribe at /api
  • Node.js 18+ or Python 3.8+
  • Basic familiarity with REST APIs and JSON

The Batch API is available on its own subscription ($40/month) and is separate from the Search and Autocomplete APIs. See the API Integration Guide for a full overview of available endpoints and pricing tiers.

The Batch Endpoint

The Batch API exposes two endpoints under https://developer.townshipcanada.com:

Forward (LLD → coordinates)

POST /batch/legal-location
X-API-Key: YOUR_API_KEY
Content-Type: application/json

["NE-7-102-19-W4", "1-24-60-1-W5", "7-66-4-W6"]

Reverse (coordinates → LLD)

POST /batch/coordinates
X-API-Key: YOUR_API_KEY
Content-Type: application/json

{
  "coordinates": [[-110.086743843, 54.28602155], [-110.011880321, 54.336941143]],
  "survey_system": "DLS",
  "unit": "Quarter Section"
}

Both endpoints accept a maximum of 100 records per request; payloads with more than 100 records are rejected with an error.

Response format

Each successful request returns a GeoJSON FeatureCollection. Every input record produces two features in the response: a MultiPolygon representing the grid boundary, and a Point representing the centroid.

{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "geometry": { "type": "MultiPolygon", "coordinates": [[...]] },
      "properties": {
        "unit": "Quarter Section",
        "shape": "grid",
        "province": "Alberta",
        "search_term": "NE-7-102-19-W4",
        "legal_location": "NE-7-102-19-W4",
        "survey_system": "DLS"
      }
    },
    {
      "type": "Feature",
      "geometry": { "type": "Point", "coordinates": [-111.644676, 56.535938] },
      "properties": {
        "unit": "Quarter Section",
        "shape": "centroid",
        "search_term": "NE-7-102-19-W4",
        "legal_location": "NE-7-102-19-W4",
        "survey_system": "DLS",
        "province": "Alberta"
      }
    }
  ]
}

Key details to note:

  • The legal_location property identifies which input record produced the feature.
  • The shape property distinguishes "grid" (boundary polygon) from "centroid" (point).
  • Geometry type is MultiPolygon for boundaries, not Polygon.
  • A dataset of 500 LLDs therefore yields 1,000 features in total (500 grid boundaries and 500 centroids) once all chunked responses are aggregated.
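Because every record maps to exactly two features, a common first step is to index the response by `legal_location` and `shape`. A minimal sketch in Python, using trimmed-down versions of the features shown above:

```python
# Index a batch response by input LLD. The sample features below are
# abbreviated; a real response includes full MultiPolygon geometry.
response = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "MultiPolygon", "coordinates": []},
            "properties": {"shape": "grid", "legal_location": "NE-7-102-19-W4"},
        },
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [-111.644676, 56.535938]},
            "properties": {"shape": "centroid", "legal_location": "NE-7-102-19-W4"},
        },
    ],
}

# Map each input LLD to its centroid coordinates ([lon, lat]).
centroids = {
    f["properties"]["legal_location"]: f["geometry"]["coordinates"]
    for f in response["features"]
    if f["properties"]["shape"] == "centroid"
}

print(centroids["NE-7-102-19-W4"])  # [-111.644676, 56.535938]
```

The same pattern with `shape == "grid"` gives you the boundary polygons instead.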

Chunking Large Datasets

Since each request is limited to 100 records, any dataset larger than 100 LLDs needs to be split into chunks before sending. For a list of 750 legal land descriptions, that's 8 requests (7 of 100 records, 1 of 50).
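The request count for any input size is a ceiling division by the chunk size:

```python
import math

CHUNK_SIZE = 100  # Batch API per-request limit

def num_requests(record_count):
    """Number of batch requests needed to cover record_count LLDs."""
    return math.ceil(record_count / CHUNK_SIZE)

print(num_requests(750))  # 8: seven full chunks of 100, plus one of 50
```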

The simplest approach is a sequential loop: send chunk 1, wait for the response, send chunk 2, and so on. This avoids rate limit issues and keeps memory usage low. For time-sensitive workflows, you can run a small number of chunks concurrently — but stay within your subscription's rate limit (1 req/sec on Build tier, 5 req/sec on Scale, 25 req/sec on Enterprise).

Error Handling

The API returns HTTP errors for request-level problems (invalid JSON, missing auth header, oversized payload). Within a valid response, individual records that could not be resolved will simply be absent from the features array — there is no per-record error object. After processing, compare the number of unique legal_location values in your output against your input count to identify records that returned no results.
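On a small made-up sample, that reconciliation check looks like this (both full scripts later in this guide include the same logic):

```python
# Made-up sample: two inputs submitted, but only the first resolved,
# so only its two features appear in the response.
input_llds = ["NE-7-102-19-W4", "NE-99-999-19-W4"]
features = [
    {"properties": {"legal_location": "NE-7-102-19-W4", "shape": "grid"}},
    {"properties": {"legal_location": "NE-7-102-19-W4", "shape": "centroid"}},
]

# Any input LLD with no matching legal_location returned no results.
resolved = {f["properties"]["legal_location"] for f in features}
missing = [lld for lld in input_llds if lld not in resolved]

print(missing)  # ['NE-99-999-19-W4']
```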

Node.js Example

This script reads a JSON array of legal land descriptions from input.json, processes them in batches of 100, and writes the aggregated GeoJSON to output.json.

const fs = require("fs");

const API_KEY = process.env.TOWNSHIP_CANADA_API_KEY;
const BASE_URL = "https://developer.townshipcanada.com";
const CHUNK_SIZE = 100;
const DELAY_MS = 1100; // slightly over 1 second to stay under 1 req/sec on Build tier

function chunk(array, size) {
  const chunks = [];
  for (let i = 0; i < array.length; i += size) {
    chunks.push(array.slice(i, i + size));
  }
  return chunks;
}

function sleep(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function batchConvert(llds) {
  const response = await fetch(`${BASE_URL}/batch/legal-location`, {
    method: "POST",
    headers: {
      "X-API-Key": API_KEY,
      "Content-Type": "application/json"
    },
    body: JSON.stringify(llds)
  });

  if (!response.ok) {
    throw new Error(`HTTP ${response.status}: ${await response.text()}`);
  }

  return response.json();
}

async function main() {
  const input = JSON.parse(fs.readFileSync("input.json", "utf8"));
  console.log(`Processing ${input.length} records...`);

  const chunks = chunk(input, CHUNK_SIZE);
  const allFeatures = [];

  for (let i = 0; i < chunks.length; i++) {
    console.log(`Chunk ${i + 1}/${chunks.length} (${chunks[i].length} records)`);

    const result = await batchConvert(chunks[i]);
    allFeatures.push(...result.features);

    // Avoid hitting rate limits between chunks (skip delay after last chunk)
    if (i < chunks.length - 1) {
      await sleep(DELAY_MS);
    }
  }

  const output = {
    type: "FeatureCollection",
    features: allFeatures
  };

  fs.writeFileSync("output.json", JSON.stringify(output, null, 2));
  console.log(`Done. ${allFeatures.length} features written to output.json`);

  // Report any input records with no matching output
  const resolved = new Set(allFeatures.map((f) => f.properties.legal_location));
  const missing = input.filter((lld) => !resolved.has(lld));
  if (missing.length > 0) {
    console.warn(`${missing.length} records returned no results:`, missing);
  }
}

main().catch(console.error);

Run it with:

TOWNSHIP_CANADA_API_KEY=your_key_here node batch-convert.js

Your input.json should be a plain JSON array of LLD strings:

["NE-7-102-19-W4", "SW-12-45-22-W4", "1-24-60-1-W5", "7-66-4-W6"]

Python Example

The same workflow using the requests library:

import json
import os
import time

import requests

API_KEY = os.environ["TOWNSHIP_CANADA_API_KEY"]
BASE_URL = "https://developer.townshipcanada.com"
CHUNK_SIZE = 100
DELAY_SECONDS = 1.1  # slightly over 1 second for Build tier rate limit


def chunk(lst, size):
    for i in range(0, len(lst), size):
        yield lst[i : i + size]


def batch_convert(llds):
    response = requests.post(
        f"{BASE_URL}/batch/legal-location",
        headers={
            "X-API-Key": API_KEY,
            "Content-Type": "application/json",
        },
        json=llds,
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


def main():
    with open("input.json") as f:
        input_llds = json.load(f)

    print(f"Processing {len(input_llds)} records...")

    chunks = list(chunk(input_llds, CHUNK_SIZE))
    all_features = []

    for i, batch in enumerate(chunks):
        print(f"Chunk {i + 1}/{len(chunks)} ({len(batch)} records)")

        result = batch_convert(batch)
        all_features.extend(result["features"])

        # Pause between chunks to respect rate limits
        if i < len(chunks) - 1:
            time.sleep(DELAY_SECONDS)

    output = {"type": "FeatureCollection", "features": all_features}

    with open("output.json", "w") as f:
        json.dump(output, f, indent=2)

    print(f"Done. {len(all_features)} features written to output.json")

    # Identify input records with no matching output
    resolved = {f["properties"]["legal_location"] for f in all_features}
    missing = [lld for lld in input_llds if lld not in resolved]
    if missing:
        print(f"{len(missing)} records returned no results: {missing}")


if __name__ == "__main__":
    main()

Install the dependency and run:

pip install requests
TOWNSHIP_CANADA_API_KEY=your_key_here python batch_convert.py

Rate Limiting

Each API tier has a per-second rate limit:

Tier         Rate Limit    Monthly Requests
Build        1 req/sec     1,000
Scale        5 req/sec     10,000
Enterprise   25 req/sec    100,000

The examples above use a 1.1-second delay between chunks, which works safely on the Build tier. On Scale or Enterprise, you can reduce or remove the delay, or process multiple chunks concurrently. If you receive an HTTP 429 response, back off and retry after a short wait.
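A minimal backoff sketch, assuming `send` is whatever function performs one batch request and returns the HTTP status plus parsed body (the names here are placeholders, not part of the API):

```python
import time

def post_with_backoff(send, max_retries=5, base_delay=1.0):
    """Call send() until it succeeds, backing off exponentially on HTTP 429."""
    for attempt in range(max_retries):
        status, body = send()
        if status != 429:
            return status, body
        time.sleep(base_delay * (2 ** attempt))  # wait 1s, 2s, 4s, ...
    raise RuntimeError("still rate limited after retries")

# Stub request that is rate limited twice, then succeeds, to show the flow.
calls = {"n": 0}

def fake_send():
    calls["n"] += 1
    return (429, None) if calls["n"] < 3 else (200, {"features": []})

status, body = post_with_backoff(fake_send, base_delay=0.01)
print(status)  # 200
```

In a real script, `send` would wrap the `fetch` or `requests.post` call from the examples above for a single chunk.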

For large one-off migrations, running overnight is often the simplest approach — no concurrency required, no risk of hitting limits.

Next Steps

For questions or support with your integration, contact us.