
Configuring PostHog as a Source
In the Sources tab, click the “Add source” button at the top right of your screen. Then select the PostHog option from the list of connectors. Click Next and you’ll be prompted to add your access.

1. Add account access
The following configurations are available:

- Personal API key: Key used to authenticate against the PostHog API. See the official PostHog API docs.
- Base URL: API base URL for your PostHog region or self-hosted environment.
  - US Cloud: https://us.posthog.com
  - EU Cloud: https://eu.posthog.com
  - Self-hosted: your own PostHog domain
- Start date: Earliest date from which events are synced.
2. Select streams
Choose which data streams you want to sync. For faster extractions, select only the streams relevant to your analysis. Select the streams and click Next.

Tip: In addition to the `projects` stream, the connector creates one dynamic event stream per PostHog project using the pattern `{organization_slug}_events_{project_name}`.
3. Configure data streams
Customize how you want your data to appear in your catalog. Select the desired layer where the data will be placed, a folder to organize it inside the layer, a name for each table, and the type of sync.

- Layer: Choose the layer where extracted PostHog tables will be created.
- Folder: Optionally group all PostHog tables inside a folder.
- Table name: A default name is suggested, but you can customize it and optionally apply a prefix.
- Sync Type: Choose between INCREMENTAL and FULL_TABLE.
  - Incremental: Recommended for event streams, using `timestamp` as the replication key.
  - Full table: Suitable for `projects` metadata refreshes.
4. Configure data source
Describe your data source for easy identification within your organization, not exceeding 140 characters. To define your Trigger, consider how often your product analytics data should be refreshed. Optionally, you can define:

- Delta Log Retention: How long Nekt keeps previous table states. See Resource control.
- Additional Full Sync: Periodic full syncs to complement incremental runs.
5. Check your new source
You can view your new source on the Sources page. If needed, manually trigger the extraction by clicking on the arrow button. Once a run completes successfully, your data appears in the Catalog.

Streams and Fields
Below you’ll find the available PostHog streams and their core fields.

Projects
Project metadata for each organization available to the API token.

Key fields:
- `id` - Project identifier (primary key)
- `uuid` - Project UUID
- `organization` - Organization identifier linked to the project
- `name` - Project name
- `api_token` - Project ingestion token
- `timezone` - Project default timezone
- `is_demo` - Indicates whether the project is a demo project
- `ingested_event` - Indicates whether at least one event has been ingested
- `completed_snippet_onboarding` - Onboarding status for snippet setup
- `has_completed_onboarding_for.feature_flags` - Feature flags onboarding status
- `access_control` - Access-control enablement flag
- Stream name: `projects`
- Primary key: `id`
- Replication: full-table style (no replication key)
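As a quick sanity check after a sync, you can query the extracted project metadata. This is a sketch, not taken from the original docs: the `posthog.projects` layer/table name below is a placeholder for whatever you configured in step 3.

```sql
-- Hypothetical table name; substitute the layer/folder/table you
-- configured for the projects stream in "Configure data streams".
SELECT
    id,
    name,
    organization,
    timezone,
    is_demo
FROM posthog.projects
WHERE ingested_event = TRUE   -- only projects that have received events
ORDER BY name;
```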
Events (dynamic per project)
Event-level analytics data from each PostHog project. The connector dynamically creates one stream per project discovered during catalog discovery.

Dynamic stream naming: `{organization_slug}_events_{project_name}` (slugified with underscores)

Key fields:
- `id` - Event identifier
- `distinct_id` - User or actor distinct identifier
- `event` - Event name
- `timestamp` - Event timestamp (replication key)
- `properties` - Array of key/value event properties
- `person.is_identified` - Whether the associated person is identified
- `person.distinct_ids` - Associated person distinct IDs
- `person.properties.email` - Associated person email (when available)
- `elements` - Captured element metadata
- `elements_chain` - Serialized captured element chain
- Primary keys: `id`, `distinct_id`
- Replication key: `timestamp`
- Incremental sync uses `start_date` for the first run, then continues from saved state
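One way to confirm that incremental runs are advancing is to check the newest synced event timestamp in the destination table. A minimal sketch, assuming a dynamic event table named `posthog.acme_events_web` (a placeholder for your own `{organization_slug}_events_{project_name}` table):

```sql
-- Latest event that has landed in the catalog; after each incremental
-- run this value should advance along with the connector's saved state.
SELECT MAX(timestamp) AS latest_synced_event
FROM posthog.acme_events_web;  -- placeholder table name
```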
Data Model
The connector follows an organization/project-based model: each organization exposes a `projects` stream, and each discovered project gets its own dynamic event stream.

Use Cases for Data Analysis
This section includes practical SQL examples you can run in Explorer.

1. Event volume by day
Measure daily event volume to track product activity trends.
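The published query is not reproduced here; the following is a minimal sketch, assuming a dynamic event table named `posthog.acme_events_web` (substitute your own `{organization_slug}_events_{project_name}` table). Exact date functions vary between warehouse engines (e.g. Athena on AWS vs. BigQuery on GCP).

```sql
-- Daily event counts for product activity trends
-- (table name is a placeholder for a dynamic event stream)
SELECT
    CAST(timestamp AS DATE) AS event_date,
    COUNT(*) AS event_count
FROM posthog.acme_events_web
GROUP BY CAST(timestamp AS DATE)
ORDER BY event_date;
```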
2. Top users by event activity (last 30 days)
Identify the most active users based on tracked events.
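Again as a sketch under the same assumptions (placeholder table `posthog.acme_events_web`); the 30-day filter below uses Trino/Athena-style `INTERVAL` syntax, while BigQuery would use `DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)`:

```sql
-- Top 10 users by event volume over the last 30 days
SELECT
    distinct_id,
    COUNT(*) AS event_count
FROM posthog.acme_events_web
WHERE CAST(timestamp AS DATE) >= CURRENT_DATE - INTERVAL '30' DAY
GROUP BY distinct_id
ORDER BY event_count DESC
LIMIT 10;
```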
Skills for agents
Download PostHog skills file
PostHog connector documentation as plain markdown, for use in AI agent contexts.