
Configuring AfterShip as a Source
In the Sources tab, click the “Add source” button at the top right of your screen. Then select the AfterShip option from the list of connectors. Click Next and you’ll be prompted to add your access.

1. Add account access
You’ll need an AfterShip API key to authenticate. The following configurations are available:
- API Key: Your AfterShip API key used to authenticate API requests.
- Start Date: The earliest date from which records will be synced based on the last modification date.
How to get your AfterShip API Key
Log in to AfterShip
Go to AfterShip and log in to your account.
Navigate to API Settings
- Click on your profile icon in the top-right corner
- Select Settings
- In the left sidebar, click on API Keys under the “Integrations” section
2. Select streams
Choose which data streams you want to sync. For faster extractions, select only the streams that are relevant to your analysis.

Tip: you can find a stream more easily by typing its name.

Select the streams and click Next.
3. Configure data streams
Customize how your data will appear in your catalog. Select the layer where the data will be placed, a folder to organize it inside the layer, a name for each table (which will contain the fetched data), and the type of sync.
- Layer: choose between the existing layers in your catalog. This is where you’ll find your new extracted tables once the extraction runs successfully.
- Folder: a folder can be created inside the selected layer to group all tables being created from this new data source.
- Table name: we suggest a name, but feel free to customize it. You have the option to add a prefix to all tables at once and make this process faster!
- Sync Type: you can choose between INCREMENTAL and FULL_TABLE.
  - Incremental: every time the extraction happens, we’ll get only the new or updated data, based on the updated_at field.
  - Full table: every time the extraction happens, we’ll get the current state of all data.
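The difference between the two sync types can be sketched as query predicates. This is illustrative only — the extraction runs inside the connector, and the bookmark parameter name is an assumption:

```sql
-- INCREMENTAL: fetch only rows modified since the last successful sync,
-- using updated_at as the replication key (:last_sync_at is a stored bookmark)
SELECT * FROM trackings WHERE updated_at > :last_sync_at;

-- FULL_TABLE: fetch the current state of all rows on every run
SELECT * FROM trackings;
```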
4. Configure data source
Describe your data source for easy identification within your organization, using at most 140 characters. To define your Trigger, consider how often you want data to be extracted from this source. This decision usually depends on how frequently you need the new table data updated (every day, once a week, or only at specific times). Optionally, you can define some additional settings:
- Configure Delta Log Retention to determine how long we should store old states of this table as it gets updated. Read more about this resource here.
- Determine when to execute an Additional Full Sync. This will complement the incremental data extractions, ensuring that your data is completely synchronized with your source every once in a while.
5. Check your new source
You can view your new source on the Sources page. If needed, manually trigger the source extraction by clicking the arrow button. Once executed, your data will appear in your Catalog.

Streams and Fields
Below you’ll find all available data streams from AfterShip and their corresponding fields:

Trackings
Stream containing shipment tracking information with full tracking history.

Key Fields:
- id - Unique identifier for the tracking
- legacy_id - Legacy tracking ID
- tracking_number - The carrier tracking number
- slug - Courier code (e.g., “ups”, “fedex”, “dhl”)
- active - Whether the tracking is currently active
- created_at - When the tracking was created
- updated_at - When the tracking was last updated (replication key)
- order_id - Associated order ID
- order_number - Order number
- order_id_path - Order IDs path
- order_date - Date of the order
- order_tags[] - Tags associated with the order
- order_promised_delivery_date - Promised delivery date
- origin_country_region - Origin country/region code
- origin_state - Origin state
- origin_city - Origin city
- origin_postal_code - Origin postal code
- origin_raw_location - Raw origin location string
- destination_country_region - Destination country/region code
- destination_state - Destination state
- destination_city - Destination city
- destination_postal_code - Destination postal code
- destination_raw_location - Raw destination location string
- courier_destination_country_region - Courier-reported destination
- tag - Current tracking status tag (e.g., “Delivered”, “InTransit”, “OutForDelivery”)
- subtag - Detailed status subtag
- subtag_message - Human-readable status message
- on_time_status - On-time delivery status
- on_time_difference - Difference from promised delivery time (in days)
- signed_by - Name of person who signed for delivery
- return_to_sender - Whether package was returned to sender
- shipment_type - Type of shipment
- shipment_package_count - Number of packages
- shipment_pickup_date - Pickup date
- shipment_delivery_date - Actual delivery date
- shipping_method - Shipping method used
- shipment_tags[] - Tags for the shipment
- shipment_weight.value - Weight value
- shipment_weight.unit - Weight unit (kg, lb, etc.)
- transit_time - Transit time in days
- courier_estimated_delivery_date.estimated_delivery_date - Courier’s estimated delivery
- courier_estimated_delivery_date.estimated_delivery_date_min - Minimum estimate
- courier_estimated_delivery_date.estimated_delivery_date_max - Maximum estimate
- aftership_estimated_delivery_date - AfterShip’s predicted delivery date
- custom_estimated_delivery_date - Custom estimated delivery date
- first_estimated_delivery.datetime - First estimated delivery datetime
- first_estimated_delivery.source - Source of first estimate
- latest_estimated_delivery.datetime - Latest estimated delivery datetime
- latest_estimated_delivery.revise_reason - Reason for revision
- checkpoints[] - Array of tracking events:
  - checkpoints[].checkpoint_time - Timestamp of the event
  - checkpoints[].tag - Status tag at checkpoint
  - checkpoints[].subtag - Detailed subtag
  - checkpoints[].subtag_message - Status message
  - checkpoints[].message - Checkpoint description
  - checkpoints[].location - Full location string
  - checkpoints[].city - City
  - checkpoints[].state - State
  - checkpoints[].country_region - Country/region code
  - checkpoints[].country_region_name - Country/region name
  - checkpoints[].postal_code - Postal code
  - checkpoints[].coordinate.latitude - Latitude
  - checkpoints[].coordinate.longitude - Longitude
  - checkpoints[].slug - Courier slug
  - checkpoints[].source - Data source
  - checkpoints[].created_at - When checkpoint was created
  - checkpoints[].events[] - Events at this checkpoint
- customers[] - Array of customers:
  - customers[].name - Customer name
  - customers[].email - Customer email
  - customers[].phone_number - Phone number
  - customers[].language - Preferred language
  - customers[].role - Role (buyer or receiver)
- subscribed_emails[] - Subscribed email addresses
- subscribed_smses[] - Subscribed SMS numbers
- first_attempted_at - First delivery attempt timestamp
- failed_delivery_attempts - Number of failed attempts
- delivery_type - Type of delivery
- delivery_location_type - Type of delivery location
- signature_requirement - Signature requirement info
- pickup_location - Pickup location (for pickup deliveries)
- pickup_note - Pickup note
- courier_tracking_link - Direct link to carrier tracking
- courier_redirect_link - Courier redirect link
- aftership_tracking_url - AfterShip tracking page URL
- aftership_tracking_order_url - AfterShip order tracking URL
- title - Tracking title (usually order number or product name)
- note - Notes about the tracking
- source - Source of the tracking data
- language - Language of the tracking record
- tracked_count - Number of times tracked
- tracking_account_number - Carrier account number
- tracking_key - Tracking key
- tracking_ship_date - Ship date (YYYY-MM-DD)
- courier_connection_id - Courier connection ID
- location_id - Location ID
- carbon_emissions - Estimated carbon emissions
- custom_fields.item_names - Custom field for item names
- first_mile - First mile information
- last_mile - Last mile information
Data Model
The following diagram illustrates the structure of the AfterShip data. The Trackings stream is the main entity containing all shipment tracking information, including nested checkpoints and customer data.

Use Cases for Data Analysis
This guide outlines valuable business intelligence use cases when consolidating AfterShip data, along with ready-to-use SQL queries that you can run on Explorer.

1. Delivery Performance Overview
Track delivery performance across carriers and identify trends in on-time delivery.

Business Value:
- Monitor carrier performance and reliability
- Identify delivery delays and their causes
- Optimize carrier selection based on performance data
SQL query
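A query along the following lines produces this kind of result. It is a sketch, assuming the Trackings stream landed in a table named trackings and that on_time_status uses the value 'OnTime' for on-time deliveries (both assumptions); adjust table names and SQL dialect to your warehouse (AWS Athena vs. GCP BigQuery):

```sql
-- Delivery performance by carrier and status (sketch; names are assumptions)
SELECT
  slug AS carrier,
  tag  AS status,
  COUNT(*) AS total_shipments,
  ROUND(AVG(transit_time), 1) AS avg_transit_days,
  SUM(CASE WHEN on_time_status = 'OnTime' THEN 1 ELSE 0 END) AS on_time_count,
  ROUND(100.0 * SUM(CASE WHEN on_time_status = 'OnTime' THEN 1 ELSE 0 END)
        / COUNT(*), 1) AS on_time_rate_pct
FROM trackings
GROUP BY slug, tag
ORDER BY total_shipments DESC;
```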
Sample Result
| carrier | status | total_shipments | avg_transit_days | on_time_count | on_time_rate_pct |
|---|---|---|---|---|---|
| ups | Delivered | 1,245 | 3.2 | 1,156 | 92.9 |
| fedex | Delivered | 892 | 2.8 | 845 | 94.7 |
| dhl | InTransit | 456 | 4.1 | - | - |
| usps | Delivered | 378 | 4.5 | 312 | 82.5 |
2. Shipment Status Distribution
Analyze current shipment statuses to understand your pipeline and identify potential issues.

Business Value:
- Monitor shipments in transit
- Identify stuck or problematic shipments
- Track exception rates by carrier
SQL query
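A sketch of this distribution query, assuming a table named trackings (adjust names and dialect to your warehouse):

```sql
-- Shipment status distribution with share of total (sketch; names are assumptions)
SELECT
  tag    AS status,
  subtag AS detailed_status,
  COUNT(*) AS shipment_count,
  ROUND(100.0 * COUNT(*) / SUM(COUNT(*)) OVER (), 1) AS percentage
FROM trackings
GROUP BY tag, subtag
ORDER BY shipment_count DESC;
```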
Sample Result
| status | detailed_status | shipment_count | percentage |
|---|---|---|---|
| InTransit | InTransit_PickedUp | 523 | 35.2 |
| InTransit | InTransit_Departed | 312 | 21.0 |
| OutForDelivery | OutForDelivery_ToBeDelivered | 178 | 12.0 |
| Delivered | Delivered_Signed | 156 | 10.5 |
| Exception | Exception_Delayed | 89 | 6.0 |
3. Destination Analysis
Analyze shipment destinations to understand geographic distribution and regional performance.

Business Value:
- Identify top shipping destinations
- Analyze regional delivery performance
- Plan logistics and carrier partnerships by region
SQL query
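A sketch of this regional breakdown, assuming a table named trackings and the standard status tags 'Delivered' and 'Exception' (adjust names and dialect to your warehouse):

```sql
-- Shipments and performance by destination (sketch; names are assumptions)
SELECT
  destination_country_region AS country,
  destination_state          AS state,
  COUNT(*) AS total_shipments,
  ROUND(AVG(transit_time), 1) AS avg_transit_days,
  SUM(CASE WHEN tag = 'Delivered' THEN 1 ELSE 0 END) AS delivered,
  SUM(CASE WHEN tag = 'Exception' THEN 1 ELSE 0 END) AS exceptions
FROM trackings
GROUP BY destination_country_region, destination_state
ORDER BY total_shipments DESC;
```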
Sample Result
| country | state | total_shipments | avg_transit_days | delivered | exceptions |
|---|---|---|---|---|---|
| US | California | 456 | 2.8 | 423 | 12 |
| US | Texas | 312 | 3.1 | 287 | 8 |
| US | New York | 289 | 2.5 | 271 | 5 |
| US | Florida | 234 | 3.4 | 212 | 11 |
| CA | Ontario | 178 | 4.2 | 156 | 7 |
Implementation Notes
Incremental Sync Support
- The Trackings stream supports incremental sync using the updated_at field as the replication key
- Only trackings modified since the last sync will be fetched, improving extraction performance
Nested Data Structures
- The checkpoints array contains the full tracking history with timestamps, locations, and status updates
- The customers array may contain multiple customers (buyer and receiver)
- Use JSON functions or UNNEST to analyze nested checkpoint and customer data
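For example, the checkpoints array can be flattened with UNNEST. This sketch assumes a table named trackings and uses BigQuery-style syntax; on Athena/Trino the equivalent is CROSS JOIN UNNEST(t.checkpoints) AS u(cp):

```sql
-- One row per checkpoint event (sketch; table name is an assumption)
SELECT
  t.tracking_number,
  cp.checkpoint_time,
  cp.tag,
  cp.city,
  cp.message
FROM trackings AS t,
  UNNEST(t.checkpoints) AS cp
ORDER BY t.tracking_number, cp.checkpoint_time;
```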
Tracking Status Tags
AfterShip uses standardized tags to represent shipment status:
- Pending - Tracking created, no updates yet
- InfoReceived - Carrier received shipment information
- InTransit - Shipment is in transit
- OutForDelivery - Out for delivery
- AttemptFail - Delivery attempt failed
- Delivered - Successfully delivered
- AvailableForPickup - Available at pickup location
- Exception - Exception occurred (delay, customs, etc.)
- Expired - Tracking expired with no delivery confirmation