> ## Documentation Index
> Fetch the complete documentation index at: https://docs.nekt.com/llms.txt
> Use this file to discover all available pages before exploring further.

# AfterShip as a data source

> Bring data from AfterShip to Nekt.

AfterShip is a shipment tracking platform that helps e-commerce businesses track packages across 1,100+ carriers worldwide. It provides real-time tracking updates, delivery estimates, and analytics to improve post-purchase customer experience and operational efficiency.

<img height="50" src="https://mintcdn.com/nekt/xqgrcq1t4bxrxVbv/assets/logo/logo-aftership.png?fit=max&auto=format&n=xqgrcq1t4bxrxVbv&q=85&s=94abe48ba7f34b7be526b49294d74f8a" data-path="assets/logo/logo-aftership.png" />

## Configuring AfterShip as a Source

In the [Sources](https://app.nekt.ai/sources) tab, click on the "Add source" button located on the top right of your screen. Then, select the AfterShip option from the list of connectors.

Click **Next** and you'll be prompted to add your access.

### 1. Add account access

You'll need an AfterShip API key to authenticate. The following configurations are available:

* **API Key**: Your AfterShip API key used to authenticate API requests.

* **Start Date**: The earliest date from which records will be synced, based on their last modification date.

<AccordionGroup>
  <Accordion title="How to get your AfterShip API Key">
    <Steps>
      <Step title="Log in to AfterShip">
        Go to [AfterShip](https://www.aftership.com/) and log in to your account.
      </Step>

      <Step title="Navigate to API Settings">
        1. Click on your profile icon in the top-right corner
        2. Select **Settings**
        3. In the left sidebar, click on **API Keys** under the "Integrations" section
      </Step>

      <Step title="Generate a new API Key">
        1. Click **Generate API Key**
        2. Give your API key a descriptive name (e.g., "Nekt Integration")
        3. Click **Generate**
        4. Copy the generated API key immediately

        <Warning>
          **Important:** The API key is only shown once. Copy it immediately and store it securely. If you lose it, you'll need to generate a new one.
        </Warning>
      </Step>
    </Steps>
  </Accordion>
</AccordionGroup>
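Before adding the key to Nekt, you may want to sanity-check it against the AfterShip API directly. The sketch below only builds the request; the base URL, the API version path, and the `as-api-key` header name are assumptions about AfterShip's public API and may differ for your account or API version:

```python
# Hypothetical sketch: base URL, version path, and header name are assumptions
# about AfterShip's API, not confirmed values.
API_BASE = "https://api.aftership.com/tracking/2024-04"

def build_trackings_request(api_key: str) -> dict:
    """Return the URL and headers for a GET /trackings call (illustrative only)."""
    return {
        "url": f"{API_BASE}/trackings",
        "headers": {
            "as-api-key": api_key,  # header name is an assumption
            "Content-Type": "application/json",
        },
    }

req = build_trackings_request("your-api-key-here")
print(req["url"])
```

If a GET with these headers returns an authentication error, regenerate the key in AfterShip and try again.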

Once you're done, click **Next**.

### 2. Select streams

Choose which data streams you want to sync. For faster extractions, select only the streams that are relevant to your analysis.

> Tip: You can find a stream more quickly by typing its name.

Select the streams and click **Next**.

### 3. Configure data streams

Customize how your data will appear in your catalog: select the layer where the data will be placed, a folder to organize it within that layer, a name for each table (which will hold the fetched data), and the type of sync.

* **Layer**: choose among the existing layers in your catalog. This is where your new extracted tables will appear once the extraction runs successfully.
* **Folder**: a folder can be created inside the selected layer to group all tables created from this new data source.
* **Table name**: we suggest a name, but feel free to customize it. You can also add a **prefix** to all tables at once to speed this up!
* **Sync Type**: you can choose between INCREMENTAL and FULL\_TABLE.
  * Incremental: every time the extraction happens, we'll get only the new or updated data based on the `updated_at` field.
  * Full table: every time the extraction happens, we'll get the current state of all data.
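The difference between the two sync types can be sketched as a filter on the `updated_at` replication key. This is illustrative pseudologic only, not Nekt's actual implementation:

```python
from datetime import datetime

def select_records(records, sync_type, cursor=None):
    """Return the records a sync run would fetch (illustrative sketch).

    FULL_TABLE: everything, on every run.
    INCREMENTAL: only records modified after the stored cursor
    (the high-water mark from the previous run).
    """
    if sync_type == "FULL_TABLE" or cursor is None:
        return records
    return [r for r in records if r["updated_at"] > cursor]

records = [
    {"id": "a", "updated_at": datetime(2024, 1, 1)},
    {"id": "b", "updated_at": datetime(2024, 3, 1)},
]
cursor = datetime(2024, 2, 1)  # last successful sync's high-water mark
print([r["id"] for r in select_records(records, "INCREMENTAL", cursor)])  # ['b']
print(len(select_records(records, "FULL_TABLE", cursor)))                 # 2
```

Incremental syncs are therefore much cheaper on large trackings tables, at the cost of relying on `updated_at` being accurate.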

Once you are done configuring, click **Next**.

### 4. Configure data source

Describe your data source (in 140 characters or fewer) so it's easy to identify within your organization.

To define your [Trigger](https://docs.nekt.com/get-started/core-concepts/triggers), consider how often you want data to be extracted from this source. This decision usually depends on how frequently you need the new table data updated (every day, once a week, or only at specific times).

Optionally, you can define some additional settings:

* Configure Delta Log Retention to determine how long we should store old states of this table as it gets updated. Read more about this resource [here](https://docs.nekt.com/get-started/core-concepts/resource-control).
* Determine when to execute an **Additional [Full Sync](https://docs.nekt.com/get-started/core-concepts/types-of-sync#additional-full-sync)**. This will complement the incremental data extractions, ensuring that your data is completely synchronized with your source every once in a while.

Once you are ready, click **Next** to finalize the setup.

### 5. Check your new source

You can view your new source on the [Sources](https://app.nekt.ai/sources) page. If needed, manually trigger the source extraction by clicking on the arrow button. Once executed, your data will appear in your Catalog.

<Warning>For you to be able to see it on your [Catalog](https://app.nekt.ai/catalog), you need at least one successful source run.</Warning>

# Streams and Fields

Below you'll find all available data streams from AfterShip and their corresponding fields:

<AccordionGroup>
  <Accordion title="Trackings">
    Stream containing shipment tracking information with full tracking history.

    **Key Fields:**

    * `id` - Unique identifier for the tracking
    * `legacy_id` - Legacy tracking ID
    * `tracking_number` - The carrier tracking number
    * `slug` - Courier code (e.g., "ups", "fedex", "dhl")
    * `active` - Whether the tracking is currently active
    * `created_at` - When the tracking was created
    * `updated_at` - When the tracking was last updated (replication key)

    **Order Information:**

    * `order_id` - Associated order ID
    * `order_number` - Order number
    * `order_id_path` - Order IDs path
    * `order_date` - Date of the order
    * `order_tags[]` - Tags associated with the order
    * `order_promised_delivery_date` - Promised delivery date

    **Origin Location:**

    * `origin_country_region` - Origin country/region code
    * `origin_state` - Origin state
    * `origin_city` - Origin city
    * `origin_postal_code` - Origin postal code
    * `origin_raw_location` - Raw origin location string

    **Destination Location:**

    * `destination_country_region` - Destination country/region code
    * `destination_state` - Destination state
    * `destination_city` - Destination city
    * `destination_postal_code` - Destination postal code
    * `destination_raw_location` - Raw destination location string
    * `courier_destination_country_region` - Courier-reported destination

    **Delivery Status:**

    * `tag` - Current tracking status tag (e.g., "Delivered", "InTransit", "OutForDelivery")
    * `subtag` - Detailed status subtag
    * `subtag_message` - Human-readable status message
    * `on_time_status` - On-time delivery status
    * `on_time_difference` - Difference from promised delivery time (in days)
    * `signed_by` - Name of person who signed for delivery
    * `return_to_sender` - Whether package was returned to sender

    **Shipment Details:**

    * `shipment_type` - Type of shipment
    * `shipment_package_count` - Number of packages
    * `shipment_pickup_date` - Pickup date
    * `shipment_delivery_date` - Actual delivery date
    * `shipping_method` - Shipping method used
    * `shipment_tags[]` - Tags for the shipment
    * `shipment_weight.value` - Weight value
    * `shipment_weight.unit` - Weight unit (kg, lb, etc.)
    * `transit_time` - Transit time in days

    **Estimated Delivery:**

    * `courier_estimated_delivery_date.estimated_delivery_date` - Courier's estimated delivery
    * `courier_estimated_delivery_date.estimated_delivery_date_min` - Minimum estimate
    * `courier_estimated_delivery_date.estimated_delivery_date_max` - Maximum estimate
    * `aftership_estimated_delivery_date` - AfterShip's predicted delivery date
    * `custom_estimated_delivery_date` - Custom estimated delivery date
    * `first_estimated_delivery.datetime` - First estimated delivery datetime
    * `first_estimated_delivery.source` - Source of first estimate
    * `latest_estimated_delivery.datetime` - Latest estimated delivery datetime
    * `latest_estimated_delivery.revise_reason` - Reason for revision

    **Checkpoints (Tracking History):**

    * `checkpoints[]` - Array of tracking events:
      * `checkpoints[].checkpoint_time` - Timestamp of the event
      * `checkpoints[].tag` - Status tag at checkpoint
      * `checkpoints[].subtag` - Detailed subtag
      * `checkpoints[].subtag_message` - Status message
      * `checkpoints[].message` - Checkpoint description
      * `checkpoints[].location` - Full location string
      * `checkpoints[].city` - City
      * `checkpoints[].state` - State
      * `checkpoints[].country_region` - Country/region code
      * `checkpoints[].country_region_name` - Country/region name
      * `checkpoints[].postal_code` - Postal code
      * `checkpoints[].coordinate.latitude` - Latitude
      * `checkpoints[].coordinate.longitude` - Longitude
      * `checkpoints[].slug` - Courier slug
      * `checkpoints[].source` - Data source
      * `checkpoints[].created_at` - When checkpoint was created
      * `checkpoints[].events[]` - Events at this checkpoint

    **Customer Information:**

    * `customers[]` - Array of customers:
      * `customers[].name` - Customer name
      * `customers[].email` - Customer email
      * `customers[].phone_number` - Phone number
      * `customers[].language` - Preferred language
      * `customers[].role` - Role (buyer or receiver)

    **Notifications:**

    * `subscribed_emails[]` - Subscribed email addresses
    * `subscribed_smses[]` - Subscribed SMS numbers

    **Delivery Attempts:**

    * `first_attempted_at` - First delivery attempt timestamp
    * `failed_delivery_attempts` - Number of failed attempts
    * `delivery_type` - Type of delivery
    * `delivery_location_type` - Type of delivery location
    * `signature_requirement` - Signature requirement info
    * `pickup_location` - Pickup location (for pickup deliveries)
    * `pickup_note` - Pickup note

    **Tracking Links:**

    * `courier_tracking_link` - Direct link to carrier tracking
    * `courier_redirect_link` - Courier redirect link
    * `aftership_tracking_url` - AfterShip tracking page URL
    * `aftership_tracking_order_url` - AfterShip order tracking URL

    **Additional Fields:**

    * `title` - Tracking title (usually order number or product name)
    * `note` - Notes about the tracking
    * `source` - Source of the tracking data
    * `language` - Language of the tracking record
    * `tracked_count` - Number of times tracked
    * `tracking_account_number` - Carrier account number
    * `tracking_key` - Tracking key
    * `tracking_ship_date` - Ship date (YYYY-MM-DD)
    * `courier_connection_id` - Courier connection ID
    * `location_id` - Location ID
    * `carbon_emissions` - Estimated carbon emissions
    * `custom_fields.item_names` - Custom field for item names
    * `first_mile` - First mile information
    * `last_mile` - Last mile information
  </Accordion>
</AccordionGroup>

# Data Model

The following diagram illustrates the structure of the AfterShip data. The Trackings stream is the main entity containing all shipment tracking information including nested checkpoints and customer data.

```mermaid theme={null}
erDiagram
    Trackings {
        string id PK
        string tracking_number
        string slug
        string order_id
        string order_number
        datetime created_at
        datetime updated_at
        string tag
        string subtag
        boolean active
        string origin_country_region
        string destination_country_region
        datetime shipment_delivery_date
        string signed_by
        array checkpoints
        array customers
    }

    Checkpoints {
        datetime checkpoint_time
        string tag
        string subtag
        string message
        string city
        string state
        string country_region
        string postal_code
        number latitude
        number longitude
    }

    Customers {
        string name
        string email
        string phone_number
        string role
    }

    Trackings ||--o{ Checkpoints : "contains"
    Trackings ||--o{ Customers : "has"
```

# Use Cases for Data Analysis

This guide outlines valuable business intelligence use cases for consolidated AfterShip data, along with ready-to-use SQL queries that you can run on [Explorer](https://app.nekt.ai/explorer).

### 1. Delivery Performance Overview

Track delivery performance across carriers and identify trends in on-time delivery.

**Business Value:**

* Monitor carrier performance and reliability
* Identify delivery delays and their causes
* Optimize carrier selection based on performance data

<Accordion title="SQL query">
  <Tabs>
    <Tab title="AWS">
      ```sql theme={null}
      WITH delivery_metrics AS (
          SELECT
              slug AS carrier,
              tag AS status,
              COUNT(*) AS total_shipments,
              AVG(transit_time) AS avg_transit_days,
              SUM(CASE WHEN on_time_status = 'on_time' THEN 1 ELSE 0 END) AS on_time_count,
              SUM(CASE WHEN tag = 'Delivered' THEN 1 ELSE 0 END) AS delivered_count
          FROM
              nekt_raw.aftership_trackings
          WHERE
              created_at >= CURRENT_DATE - INTERVAL '30' DAY
          GROUP BY
              slug,
              tag
      )
      SELECT
          carrier,
          status,
          total_shipments,
          ROUND(avg_transit_days, 1) AS avg_transit_days,
          on_time_count,
          ROUND(on_time_count * 100.0 / NULLIF(delivered_count, 0), 1) AS on_time_rate_pct
      FROM
          delivery_metrics
      ORDER BY
          total_shipments DESC
      ```
    </Tab>

    <Tab title="GCP">
      ```sql theme={null}
      WITH delivery_metrics AS (
          SELECT
              slug AS carrier,
              tag AS status,
              COUNT(*) AS total_shipments,
              AVG(transit_time) AS avg_transit_days,
              SUM(CASE WHEN on_time_status = 'on_time' THEN 1 ELSE 0 END) AS on_time_count,
              SUM(CASE WHEN tag = 'Delivered' THEN 1 ELSE 0 END) AS delivered_count
          FROM
              `nekt_raw.aftership_trackings`
          WHERE
              DATE(created_at) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
          GROUP BY
              slug,
              tag
      )
      SELECT
          carrier,
          status,
          total_shipments,
          ROUND(avg_transit_days, 1) AS avg_transit_days,
          on_time_count,
          ROUND(SAFE_DIVIDE(on_time_count * 100.0, delivered_count), 1) AS on_time_rate_pct
      FROM
          delivery_metrics
      ORDER BY
          total_shipments DESC
      ```
    </Tab>
  </Tabs>
</Accordion>

<Accordion title="Sample Result">
  | carrier | status    | total\_shipments | avg\_transit\_days | on\_time\_count | on\_time\_rate\_pct |
  | ------- | --------- | ---------------- | ------------------ | --------------- | ------------------- |
  | ups     | Delivered | 1,245            | 3.2                | 1,156           | 92.9                |
  | fedex   | Delivered | 892              | 2.8                | 845             | 94.7                |
  | dhl     | InTransit | 456              | 4.1                | -               | -                   |
  | usps    | Delivered | 378              | 4.5                | 312             | 82.5                |
</Accordion>

### 2. Shipment Status Distribution

Analyze current shipment statuses to understand your delivery pipeline and identify potential issues.

**Business Value:**

* Monitor shipments in transit
* Identify stuck or problematic shipments
* Track exception rates by carrier

<Accordion title="SQL query">
  <Tabs>
    <Tab title="AWS">
      ```sql theme={null}
      SELECT
          tag AS status,
          subtag AS detailed_status,
          COUNT(*) AS shipment_count,
          ROUND(COUNT(*) * 100.0 / SUM(COUNT(*)) OVER (), 1) AS percentage
      FROM
          nekt_raw.aftership_trackings
      WHERE
          active = true
      GROUP BY
          tag,
          subtag
      ORDER BY
          shipment_count DESC
      ```
    </Tab>

    <Tab title="GCP">
      ```sql theme={null}
      SELECT
          tag AS status,
          subtag AS detailed_status,
          COUNT(*) AS shipment_count,
          ROUND(COUNT(*) * 100.0 / SUM(COUNT(*)) OVER (), 1) AS percentage
      FROM
          `nekt_raw.aftership_trackings`
      WHERE
          active = true
      GROUP BY
          tag,
          subtag
      ORDER BY
          shipment_count DESC
      ```
    </Tab>
  </Tabs>
</Accordion>

<Accordion title="Sample Result">
  | status         | detailed\_status              | shipment\_count | percentage |
  | -------------- | ----------------------------- | --------------- | ---------- |
  | InTransit      | InTransit\_PickedUp           | 523             | 35.2       |
  | InTransit      | InTransit\_Departed           | 312             | 21.0       |
  | OutForDelivery | OutForDelivery\_ToBeDelivered | 178             | 12.0       |
  | Delivered      | Delivered\_Signed             | 156             | 10.5       |
  | Exception      | Exception\_Delayed            | 89              | 6.0        |
</Accordion>

### 3. Destination Analysis

Analyze shipment destinations to understand geographic distribution and regional performance.

**Business Value:**

* Identify top shipping destinations
* Analyze regional delivery performance
* Plan logistics and carrier partnerships by region

<Accordion title="SQL query">
  <Tabs>
    <Tab title="AWS">
      ```sql theme={null}
      SELECT
          destination_country_region AS country,
          destination_state AS state,
          COUNT(*) AS total_shipments,
          AVG(transit_time) AS avg_transit_days,
          SUM(CASE WHEN tag = 'Delivered' THEN 1 ELSE 0 END) AS delivered,
          SUM(CASE WHEN tag = 'Exception' THEN 1 ELSE 0 END) AS exceptions
      FROM
          nekt_raw.aftership_trackings
      WHERE
          destination_country_region IS NOT NULL
          AND created_at >= CURRENT_DATE - INTERVAL '30' DAY
      GROUP BY
          destination_country_region,
          destination_state
      ORDER BY
          total_shipments DESC
      LIMIT 20
      ```
    </Tab>

    <Tab title="GCP">
      ```sql theme={null}
      SELECT
          destination_country_region AS country,
          destination_state AS state,
          COUNT(*) AS total_shipments,
          AVG(transit_time) AS avg_transit_days,
          SUM(CASE WHEN tag = 'Delivered' THEN 1 ELSE 0 END) AS delivered,
          SUM(CASE WHEN tag = 'Exception' THEN 1 ELSE 0 END) AS exceptions
      FROM
          `nekt_raw.aftership_trackings`
      WHERE
          destination_country_region IS NOT NULL
          AND DATE(created_at) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
      GROUP BY
          destination_country_region,
          destination_state
      ORDER BY
          total_shipments DESC
      LIMIT 20
      ```
    </Tab>
  </Tabs>
</Accordion>

<Accordion title="Sample Result">
  | country | state      | total\_shipments | avg\_transit\_days | delivered | exceptions |
  | ------- | ---------- | ---------------- | ------------------ | --------- | ---------- |
  | US      | California | 456              | 2.8                | 423       | 12         |
  | US      | Texas      | 312              | 3.1                | 287       | 8          |
  | US      | New York   | 289              | 2.5                | 271       | 5          |
  | US      | Florida    | 234              | 3.4                | 212       | 11         |
  | CA      | Ontario    | 178              | 4.2                | 156       | 7          |
</Accordion>

## Implementation Notes

### Incremental Sync Support

* The Trackings stream supports incremental sync using the `updated_at` field as the replication key
* Only trackings modified since the last sync will be fetched, improving extraction performance

### Nested Data Structures

* The `checkpoints` array contains the full tracking history with timestamps, locations, and status updates
* The `customers` array may contain multiple customers (buyer and receiver)
* Use JSON functions or UNNEST to analyze nested checkpoint and customer data
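As an illustration of what "unnesting" means here, a nested tracking record can be flattened into one row per checkpoint before analysis. The Python sketch below mirrors what a SQL UNNEST would produce; field names follow the stream schema above, and the sample record is invented:

```python
def flatten_checkpoints(tracking: dict) -> list[dict]:
    """Expand one tracking record into one flat row per checkpoint."""
    return [
        {
            "tracking_id": tracking["id"],
            "tracking_number": tracking["tracking_number"],
            "checkpoint_time": cp.get("checkpoint_time"),
            "tag": cp.get("tag"),
            "city": cp.get("city"),
        }
        for cp in tracking.get("checkpoints", [])
    ]

# Invented sample record for illustration only
sample = {
    "id": "t1",
    "tracking_number": "1Z999",
    "checkpoints": [
        {"checkpoint_time": "2024-03-01T08:00:00Z", "tag": "InTransit", "city": "Memphis"},
        {"checkpoint_time": "2024-03-02T09:30:00Z", "tag": "Delivered", "city": "Austin"},
    ],
}
rows = flatten_checkpoints(sample)
print(len(rows))       # 2
print(rows[1]["tag"])  # Delivered
```

The same shape applies to the `customers` array, keyed by `customers[].role`.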

### Tracking Status Tags

AfterShip uses standardized tags to represent shipment status:

* `Pending` - Tracking created, no updates yet
* `InfoReceived` - Carrier received shipment information
* `InTransit` - Shipment is in transit
* `OutForDelivery` - Out for delivery
* `AttemptFail` - Delivery attempt failed
* `Delivered` - Successfully delivered
* `AvailableForPickup` - Available at pickup location
* `Exception` - Exception occurred (delay, customs, etc.)
* `Expired` - Tracking expired with no delivery confirmation
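When modeling these tags, it often helps to group them into broader lifecycle buckets for reporting. The grouping below is one reasonable choice for illustration, not an AfterShip-defined taxonomy:

```python
# Illustrative grouping of AfterShip status tags into lifecycle buckets.
# The bucket names are our own, not part of the AfterShip API.
TAG_GROUPS = {
    "Pending": "pre_transit",
    "InfoReceived": "pre_transit",
    "InTransit": "in_progress",
    "OutForDelivery": "in_progress",
    "AvailableForPickup": "in_progress",
    "AttemptFail": "needs_attention",
    "Exception": "needs_attention",
    "Delivered": "closed",
    "Expired": "closed",
}

def lifecycle_bucket(tag: str) -> str:
    """Map a status tag to a coarse lifecycle bucket."""
    return TAG_GROUPS.get(tag, "unknown")

print(lifecycle_bucket("OutForDelivery"))  # in_progress
print(lifecycle_bucket("Exception"))       # needs_attention
```

A mapping like this makes dashboards robust to detailed `subtag` values you haven't seen before.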

## Skills for agents

<Snippet file="agent-skills-intro.mdx" />

<Card title="Download AfterShip skills file" icon="wand-magic-sparkles" href="/sources/aftership.md">
  AfterShip connector documentation as plain markdown, for use in AI agent contexts.
</Card>
