AfterShip is a shipment tracking platform that helps e-commerce businesses track packages across 1,100+ carriers worldwide. It provides real-time tracking updates, delivery estimates, and analytics to improve post-purchase customer experience and operational efficiency.

Configuring AfterShip as a Source

In the Sources tab, click the “Add source” button at the top right of your screen. Then select the AfterShip option from the list of connectors. Click Next, and you’ll be prompted to add your account access.

1. Add account access

You’ll need an AfterShip API key to authenticate. The following configurations are available:
  • API Key: Your AfterShip API key used to authenticate API requests.
  • Start Date: The earliest date from which records will be synced, based on their last modification date.
Step 1: Log in to AfterShip

Go to AfterShip and log in to your account.
Step 2: Navigate to API Settings

  1. Click on your profile icon in the top-right corner
  2. Select Settings
  3. In the left sidebar, click on API Keys under the “Integrations” section
Step 3: Generate a new API Key

  1. Click Generate API Key
  2. Give your API key a descriptive name (e.g., “Nekt Integration”)
  3. Click Generate
  4. Copy the generated API key immediately
Important: The API key is only shown once. Copy it immediately and store it securely. If you lose it, you’ll need to generate a new one.
Once you’re done, click Next.

2. Select streams

Choose which data streams you want to sync. For faster extractions, select only the streams that are relevant to your analysis.
Tip: you can find a stream more quickly by typing its name.
Select the streams and click Next.

3. Configure data streams

Customize how you want your data to appear in your catalog. Select the layer where the data will be placed, a folder to organize it inside that layer, a name for each table (which will contain the fetched data), and the type of sync.
  • Layer: choose one of the existing layers in your catalog. This is where your new extracted tables will appear once the extraction runs successfully.
  • Folder: a folder can be created inside the selected layer to group all tables created from this new data source.
  • Table name: we suggest a name, but feel free to customize it. You can also add a prefix to all tables at once to speed up the process.
  • Sync Type: you can choose between INCREMENTAL and FULL_TABLE.
    • Incremental: every time the extraction happens, we’ll get only the new or updated data based on the updated_at field.
    • Full table: every time the extraction happens, we’ll get the current state of all data.
Once you are done configuring, click Next.

4. Configure data source

Describe your data source (up to 140 characters) so it is easy to identify within your organization. To define your Trigger, consider how often you want data to be extracted from this source. This usually depends on how frequently you need the table data updated (every day, once a week, or only at specific times). Optionally, you can define some additional settings:
  • Configure Delta Log Retention to determine how long we should store old states of this table as it gets updated. Read more about this resource here.
  • Determine when to execute an Additional Full Sync. This complements the incremental extractions, periodically ensuring that your data is fully synchronized with your source.
Once you are ready, click Next to finalize the setup.

5. Check your new source

You can view your new source on the Sources page. If needed, manually trigger the source extraction by clicking on the arrow button. Once executed, your data will appear in your Catalog.
Note that data only appears in your Catalog after at least one successful source run.
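
After that first successful run, a quick way to confirm the data landed is to run a simple check on the new table in Explorer. This is a minimal sketch assuming the default layer and table name used in the examples later in this guide (nekt_raw.aftership_trackings); adjust it to the layer, folder, and table name you chose:

SELECT
    COUNT(*) AS total_trackings,        -- total rows extracted so far
    MAX(created_at) AS newest_tracking  -- most recently created tracking in the table
FROM
    nekt_raw.aftership_trackings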

Streams and Fields

Below you’ll find all available data streams from AfterShip and their corresponding fields:

Trackings

Stream containing shipment tracking information with full tracking history.
Key Fields:
  • id - Unique identifier for the tracking
  • legacy_id - Legacy tracking ID
  • tracking_number - The carrier tracking number
  • slug - Courier code (e.g., “ups”, “fedex”, “dhl”)
  • active - Whether the tracking is currently active
  • created_at - When the tracking was created
  • updated_at - When the tracking was last updated (replication key)
Order Information:
  • order_id - Associated order ID
  • order_number - Order number
  • order_id_path - Order IDs path
  • order_date - Date of the order
  • order_tags[] - Tags associated with the order
  • order_promised_delivery_date - Promised delivery date
Origin Location:
  • origin_country_region - Origin country/region code
  • origin_state - Origin state
  • origin_city - Origin city
  • origin_postal_code - Origin postal code
  • origin_raw_location - Raw origin location string
Destination Location:
  • destination_country_region - Destination country/region code
  • destination_state - Destination state
  • destination_city - Destination city
  • destination_postal_code - Destination postal code
  • destination_raw_location - Raw destination location string
  • courier_destination_country_region - Courier-reported destination
Delivery Status:
  • tag - Current tracking status tag (e.g., “Delivered”, “InTransit”, “OutForDelivery”)
  • subtag - Detailed status subtag
  • subtag_message - Human-readable status message
  • on_time_status - On-time delivery status
  • on_time_difference - Difference from promised delivery time (in days)
  • signed_by - Name of person who signed for delivery
  • return_to_sender - Whether package was returned to sender
Shipment Details:
  • shipment_type - Type of shipment
  • shipment_package_count - Number of packages
  • shipment_pickup_date - Pickup date
  • shipment_delivery_date - Actual delivery date
  • shipping_method - Shipping method used
  • shipment_tags[] - Tags for the shipment
  • shipment_weight.value - Weight value
  • shipment_weight.unit - Weight unit (kg, lb, etc.)
  • transit_time - Transit time in days
Estimated Delivery:
  • courier_estimated_delivery_date.estimated_delivery_date - Courier’s estimated delivery
  • courier_estimated_delivery_date.estimated_delivery_date_min - Minimum estimate
  • courier_estimated_delivery_date.estimated_delivery_date_max - Maximum estimate
  • aftership_estimated_delivery_date - AfterShip’s predicted delivery date
  • custom_estimated_delivery_date - Custom estimated delivery date
  • first_estimated_delivery.datetime - First estimated delivery datetime
  • first_estimated_delivery.source - Source of first estimate
  • latest_estimated_delivery.datetime - Latest estimated delivery datetime
  • latest_estimated_delivery.revise_reason - Reason for revision
Checkpoints (Tracking History):
  • checkpoints[] - Array of tracking events:
    • checkpoints[].checkpoint_time - Timestamp of the event
    • checkpoints[].tag - Status tag at checkpoint
    • checkpoints[].subtag - Detailed subtag
    • checkpoints[].subtag_message - Status message
    • checkpoints[].message - Checkpoint description
    • checkpoints[].location - Full location string
    • checkpoints[].city - City
    • checkpoints[].state - State
    • checkpoints[].country_region - Country/region code
    • checkpoints[].country_region_name - Country/region name
    • checkpoints[].postal_code - Postal code
    • checkpoints[].coordinate.latitude - Latitude
    • checkpoints[].coordinate.longitude - Longitude
    • checkpoints[].slug - Courier slug
    • checkpoints[].source - Data source
    • checkpoints[].created_at - When checkpoint was created
    • checkpoints[].events[] - Events at this checkpoint
Customer Information:
  • customers[] - Array of customers:
    • customers[].name - Customer name
    • customers[].email - Customer email
    • customers[].phone_number - Phone number
    • customers[].language - Preferred language
    • customers[].role - Role (buyer or receiver)
Notifications:
  • subscribed_emails[] - Subscribed email addresses
  • subscribed_smses[] - Subscribed SMS numbers
Delivery Attempts:
  • first_attempted_at - First delivery attempt timestamp
  • failed_delivery_attempts - Number of failed attempts
  • delivery_type - Type of delivery
  • delivery_location_type - Type of delivery location
  • signature_requirement - Signature requirement info
  • pickup_location - Pickup location (for pickup deliveries)
  • pickup_note - Pickup note
Tracking Links:
  • courier_tracking_link - Direct link to carrier tracking
  • courier_redirect_link - Courier redirect link
  • aftership_tracking_url - AfterShip tracking page URL
  • aftership_tracking_order_url - AfterShip order tracking URL
Additional Fields:
  • title - Tracking title (usually order number or product name)
  • note - Notes about the tracking
  • source - Source of the tracking data
  • language - Language of the tracking record
  • tracked_count - Number of times tracked
  • tracking_account_number - Carrier account number
  • tracking_key - Tracking key
  • tracking_ship_date - Ship date (YYYY-MM-DD)
  • courier_connection_id - Courier connection ID
  • location_id - Location ID
  • carbon_emissions - Estimated carbon emissions
  • custom_fields.item_names - Custom field for item names
  • first_mile - First mile information
  • last_mile - Last mile information
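
As a quick illustration of how these fields combine, the sketch below compares AfterShip’s predicted delivery date with the actual delivery date for delivered shipments. It assumes the nekt_raw.aftership_trackings table name used in the queries later in this guide, that both date fields hold dates or ISO-formatted strings, and Trino/Presto-style DATE_DIFF; other engines may name this function differently, so adjust the casts and function to your setup:

SELECT
    tracking_number,
    slug AS carrier,
    aftership_estimated_delivery_date,
    shipment_delivery_date,
    -- positive values mean the shipment arrived later than predicted
    DATE_DIFF('day',
              CAST(aftership_estimated_delivery_date AS DATE),
              CAST(shipment_delivery_date AS DATE)) AS estimate_error_days
FROM
    nekt_raw.aftership_trackings
WHERE
    tag = 'Delivered'
    AND aftership_estimated_delivery_date IS NOT NULL
    AND shipment_delivery_date IS NOT NULL
ORDER BY
    updated_at DESC
LIMIT 50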

Data Model

The following diagram illustrates the structure of the AfterShip data. The Trackings stream is the main entity containing all shipment tracking information including nested checkpoints and customer data.

Use Cases for Data Analysis

This guide outlines valuable business intelligence use cases when consolidating AfterShip data, along with ready-to-use SQL queries that you can run on Explorer.

1. Delivery Performance Overview

Track delivery performance across carriers and identify trends in on-time delivery.
Business Value:
  • Monitor carrier performance and reliability
  • Identify delivery delays and their causes
  • Optimize carrier selection based on performance data
WITH delivery_metrics AS (
    SELECT
        slug AS carrier,
        tag AS status,
        COUNT(*) AS total_shipments,
        AVG(transit_time) AS avg_transit_days,
        SUM(CASE WHEN on_time_status = 'on_time' THEN 1 ELSE 0 END) AS on_time_count,
        SUM(CASE WHEN tag = 'Delivered' THEN 1 ELSE 0 END) AS delivered_count
    FROM
        nekt_raw.aftership_trackings
    WHERE
        created_at >= CURRENT_DATE - INTERVAL '30' DAY
    GROUP BY
        slug,
        tag
)
SELECT
    carrier,
    status,
    total_shipments,
    ROUND(avg_transit_days, 1) AS avg_transit_days,
    on_time_count,
    ROUND(on_time_count * 100.0 / NULLIF(delivered_count, 0), 1) AS on_time_rate_pct
FROM
    delivery_metrics
ORDER BY
    total_shipments DESC

carrier   status      total_shipments   avg_transit_days   on_time_count   on_time_rate_pct
ups       Delivered   1,245             3.2                1,156           92.9
fedex     Delivered   892               2.8                845             94.7
dhl       InTransit   456               4.1                -               -
usps      Delivered   378               4.5                312             82.5

2. Shipment Status Distribution

Analyze current shipment statuses to understand your pipeline and identify potential issues.
Business Value:
  • Monitor shipments in transit
  • Identify stuck or problematic shipments
  • Track exception rates by carrier
SELECT
    tag AS status,
    subtag AS detailed_status,
    COUNT(*) AS shipment_count,
    ROUND(COUNT(*) * 100.0 / SUM(COUNT(*)) OVER (), 1) AS percentage
FROM
    nekt_raw.aftership_trackings
WHERE
    active = true
GROUP BY
    tag,
    subtag
ORDER BY
    shipment_count DESC

status           detailed_status                shipment_count   percentage
InTransit        InTransit_PickedUp             523              35.2
InTransit        InTransit_Departed             312              21.0
OutForDelivery   OutForDelivery_ToBeDelivered   178              12.0
Delivered        Delivered_Signed               156              10.5
Exception        Exception_Delayed              89               6.0

3. Destination Analysis

Analyze shipment destinations to understand geographic distribution and regional performance.
Business Value:
  • Identify top shipping destinations
  • Analyze regional delivery performance
  • Plan logistics and carrier partnerships by region
SELECT
    destination_country_region AS country,
    destination_state AS state,
    COUNT(*) AS total_shipments,
    AVG(transit_time) AS avg_transit_days,
    SUM(CASE WHEN tag = 'Delivered' THEN 1 ELSE 0 END) AS delivered,
    SUM(CASE WHEN tag = 'Exception' THEN 1 ELSE 0 END) AS exceptions
FROM
    nekt_raw.aftership_trackings
WHERE
    destination_country_region IS NOT NULL
    AND created_at >= CURRENT_DATE - INTERVAL '30' DAY
GROUP BY
    destination_country_region,
    destination_state
ORDER BY
    total_shipments DESC
LIMIT 20

country   state        total_shipments   avg_transit_days   delivered   exceptions
US        California   456               2.8                423         12
US        Texas        312               3.1                287         8
US        New York     289               2.5                271         5
US        Florida      234               3.4                212         11
CA        Ontario      178               4.2                156         7

Implementation Notes

Incremental Sync Support

  • The Trackings stream supports incremental sync using the updated_at field as the replication key
  • Only trackings modified since the last sync will be fetched, improving extraction performance (see the sanity check below)
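
A quick sanity check after each incremental run is to look at the most recent updated_at in the extracted table; successive runs should move this value forward. A minimal sketch, reusing the table name from the queries above:

SELECT
    MAX(updated_at) AS latest_synced_update,  -- approximate incremental high-water mark
    SUM(CASE WHEN updated_at >= CURRENT_DATE - INTERVAL '1' DAY THEN 1 ELSE 0 END) AS updated_last_24h
FROM
    nekt_raw.aftership_trackings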

Nested Data Structures

  • The checkpoints array contains the full tracking history with timestamps, locations, and status updates
  • The customers array may contain multiple customers (buyer and receiver)
  • Use JSON functions or UNNEST to analyze nested checkpoint and customer data, as shown in the sketch after this list
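
For example, here is a minimal sketch that flattens the checkpoints array into one row per tracking event. It assumes a Spark-style engine behind Explorer and uses EXPLODE; on a Trino/Presto-style engine you would use CROSS JOIN UNNEST instead, and the exact syntax depends on how the array is stored in your catalog:

SELECT
    t.tracking_number,
    t.slug AS carrier,
    cp.checkpoint_time,           -- timestamp of the tracking event
    cp.tag AS checkpoint_status,  -- status at this checkpoint
    cp.message,
    cp.city,
    cp.country_region
FROM
    nekt_raw.aftership_trackings AS t
    LATERAL VIEW EXPLODE(t.checkpoints) exploded AS cp
ORDER BY
    t.tracking_number,
    cp.checkpoint_time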

Tracking Status Tags

AfterShip uses standardized tags to represent shipment status:
  • Pending - Tracking created, no updates yet
  • InfoReceived - Carrier received shipment information
  • InTransit - Shipment is in transit
  • OutForDelivery - Out for delivery
  • AttemptFail - Delivery attempt failed
  • Delivered - Successfully delivered
  • AvailableForPickup - Available at pickup location
  • Exception - Exception occurred (delay, customs, etc.)
  • Expired - Tracking expired with no delivery confirmation
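
Using these tags, a simple way to surface shipments that may need attention is to filter on the exception-related statuses. A minimal sketch, reusing the table name from the queries above:

SELECT
    tracking_number,
    slug AS carrier,
    tag AS status,
    subtag_message,
    failed_delivery_attempts,
    updated_at
FROM
    nekt_raw.aftership_trackings
WHERE
    tag IN ('Exception', 'AttemptFail', 'Expired')
    AND active = true
ORDER BY
    updated_at DESC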