Analytics Domain
The Analytics domain provides read-only endpoints for querying activity, utilization, and AI app usage data across users, devices, and apps. It is completely separate from the identity/CRUD resource domains -- those domains own the resource definitions; this domain owns the numbers.
Endpoint pattern
Every analytics resource exposes up to four endpoint roles:
| Role | Path shape | Returns |
|---|---|---|
| Activity | /{resource}/{id}/activity | Daily time-series for one resource. One data point per day, one metric per request. |
| List | /{resource}s/{kind} | Paginated table across all resources. { period, data, total_count, next_cursor } |
| Summary | /{resource}s/{kind}/summary | Flat aggregate card counts. No pagination. |
| Rankings | /{resource}s/{kind}/rankings | Named top-performer slots (leader, biggest increase, biggest decrease). No pagination. |
Not every resource has all four. The parent route (/{kind}) is always the list — never a container. /summary and /rankings are siblings to it.
Apps example:
GET /analytics/apps/{app_id}/activity ← time-series for one app
GET /analytics/apps/utilization ← paginated per-app metrics table
GET /analytics/apps/utilization/summary ← cards: apps used, discovered, dropped off
GET /analytics/apps/utilization/rankings ← top app by interactions, time, users
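The four roles compose mechanically from a collection name, a kind, and a resource id. A minimal sketch of that composition (the helper name and base prefix are illustrative, not part of the API):

```python
# Sketch: deriving the four endpoint-role paths for one analytics resource.
# role_urls is a hypothetical client-side helper, not a documented API.
def role_urls(collection: str, kind: str, resource_id: str) -> dict:
    base = f"/analytics/{collection}"
    return {
        "activity": f"{base}/{resource_id}/activity",   # time-series for one resource
        "list": f"{base}/{kind}",                        # paginated table
        "summary": f"{base}/{kind}/summary",             # flat aggregate cards
        "rankings": f"{base}/{kind}/rankings",           # named top-performer slots
    }

urls = role_urls("apps", "utilization", "123e4567-e89b-12d3-a456-426614174000")
```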
For the full rules on each role, see the Design Guidelines.
Endpoints
Apps
| Endpoint | Method | Description |
|---|---|---|
| App Activity | GET | Daily activity time-series for an app |
| App Utilization | GET | Paginated utilization table across all client apps (unique users, active time, session count, avg daily time); supports discovered/dropped_off filter |
| App Utilization Summary | GET | Card counts: total apps used, newly discovered, dropped off, plus dimension counts (categories, users, departments). Modified — see dashboard spec |
| App Utilization Rankings | GET | Top-ranked app by interactions, active time, and active users; includes biggest movers |
| App Utilization Timeseries | GET | Approved — Time-series of app counts by classification (unused, unapproved, AI, other) for stacked bar chart |
| App Utilization by Category | GET | Approved — Categories ranked by app count and user count; supports is_approved and category_ids filters |
| App Utilization by User | GET | Approved — Users ranked by distinct app count; supports is_approved and category_ids filters |
| App Utilization by Department | GET | Approved — Departments ranked by user count and app count; supports is_approved and category_ids filters |
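The list endpoints above page via the next_cursor field of the { period, data, total_count, next_cursor } envelope. A sketch of draining every page, with the HTTP call abstracted behind an injected function (the stub below is illustrative, assuming a null or absent cursor marks the last page):

```python
# Sketch: collecting all rows from a cursor-paginated list endpoint.
# fetch_page stands in for a real HTTP call returning the documented envelope.
def fetch_all(fetch_page):
    rows, cursor = [], None
    while True:
        page = fetch_page(cursor=cursor)
        rows.extend(page["data"])
        cursor = page.get("next_cursor")
        if not cursor:  # null/absent next_cursor means this was the last page
            break
    return rows

# Usage with a stubbed two-page response:
pages = {
    None: {"data": [{"app": "a"}], "next_cursor": "c1"},
    "c1": {"data": [{"app": "b"}], "next_cursor": None},
}
result = fetch_all(lambda cursor: pages[cursor])
```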
AI Apps
AI-specific analytics filtered to apps whose catalog categories include "AI". Two list endpoints remain active because they provide usage_level (heavy/medium/light), which is not available on the generic utilization endpoints. Four endpoints are deprecated — see the AI Apps Overview for details.
| Endpoint | Method | Description |
|---|---|---|
| AI App Users Summary | GET | Aggregate user adoption metrics for AI apps |
| AI App Users | GET | Paginated list of AI apps ranked by user count |
| New AI Tools Summary | GET | Count of newly discovered AI tools in the period |
| New AI Tools | GET | Paginated list of AI apps first seen in the period |
| AI By Department Summary | GET | Total AI users across all departments |
| AI By Department | GET | Paginated list of departments with AI app usage metrics |
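The usage_level field mentioned above buckets users into heavy/medium/light tiers. The cutoffs are defined server-side and are not documented in this section; the thresholds below are placeholders purely to show the shape of the classification:

```python
# Sketch of the heavy/medium/light usage_level classification.
# The minute thresholds are ILLUSTRATIVE ONLY -- the real cutoffs
# live server-side and are not specified here.
def usage_level(avg_daily_active_minutes: float) -> str:
    if avg_daily_active_minutes >= 60:
        return "heavy"
    if avg_daily_active_minutes >= 15:
        return "medium"
    return "light"
```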
Devices
| Endpoint | Method | Description |
|---|---|---|
| Device Activity | GET | Daily activity time-series for a device |
Users
| Endpoint | Method | Description |
|---|---|---|
| User Activity | GET | Daily activity time-series for a user |
Data Sources
Analytics endpoints aggregate from two tables depending on the resource:
| Resource | Source Table | Granularity |
|---|---|---|
| Users | heartbeats | One row per heartbeat per user-device pair |
| Devices | heartbeats | Same table, scoped by device_id |
| Apps | app_usage_reports | One row per user per app per day (partitioned by month) |
| AI Apps | app_usage_reports | Same table, filtered to apps with an AI catalog category |
heartbeats
The desktop agent sends periodic heartbeats. Each heartbeat is tied to a user_id, device_id, and user_device_id. Key columns:
| Column | Type | Description |
|---|---|---|
| total_activity_count | integer | Aggregate activity count (sum of all sources below) |
| chrome_activity_count | integer | Chrome extension activity |
| edge_activity_count | integer | Edge extension activity |
| firefox_activity_count | integer | Firefox extension activity |
| desktop_activity_count | integer | Desktop agent activity |
| web_accessibility_activity_count | integer | Web accessibility activity |
| generated_at_utc | timestamp | When the heartbeat was generated (used for date grouping) |
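Rolling heartbeat rows up into a daily time-series is a group-by-date plus zero-fill, as the Constraints section requires. A minimal in-memory sketch (rows as (date, total_activity_count) pairs; the real query groups in the client's timezone):

```python
from datetime import date, timedelta
from collections import defaultdict

# Sketch: aggregating heartbeat rows into the daily time-series shape,
# zero-filling every date in the requested range.
def daily_activity(rows, start: date, end: date):
    totals = defaultdict(int)
    for day, count in rows:
        totals[day] += count
    out, d = [], start
    while d <= end:  # every date in range appears, missing days get 0
        out.append({"date": d.isoformat(), "value": totals.get(d, 0)})
        d += timedelta(days=1)
    return out
```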
app_usage_reports
Daily aggregated app usage per user. Partitioned by month (app_usage_reports_2026_01, etc.).
| Column | Type | Description |
|---|---|---|
| total_active_ms | bigint | Active time in milliseconds |
| total_session_count | bigint | Number of sessions |
| application_id | uuid | The app being measured |
| user_id | uuid | The user |
| activity_date | date | The date of activity (used for grouping) |
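The App Utilization columns (unique users, active time, session count, avg daily time) all derive from these rows. A sketch for one app over one period, assuming each input dict is one user-app-day row as documented above:

```python
# Sketch: deriving the App Utilization metrics from app_usage_reports rows
# for a single app. Field names match the table above; the helper itself
# is illustrative, not a documented API.
def utilization(rows, days_in_period: int):
    users = {r["user_id"] for r in rows}
    active_ms = sum(r["total_active_ms"] for r in rows)
    sessions = sum(r["total_session_count"] for r in rows)
    return {
        "unique_users": len(users),
        "total_active_ms": active_ms,
        "session_count": sessions,
        "avg_daily_ms": active_ms / days_in_period if days_in_period else 0,
    }
```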
Shared Response DTO
ActivityTimeSeries
All /activity endpoints return this shape. See individual endpoint docs for which metric values are available.
{
"metric": "activity-count",
"label": "Total Activity",
"period": {
"start": "2024-01-01",
"end": "2024-01-03"
},
"data": [
{ "date": "2024-01-01", "value": 42 },
{ "date": "2024-01-02", "value": 38 },
{ "date": "2024-01-03", "value": 0 }
]
}
| Field | Type | Description |
|---|---|---|
| metric | string | The metric key that was requested |
| label | string | Human-readable label for the metric (e.g. "Total Activity", "Daily Users") |
| period | object | Date range for this data |
| period.start | string | Start date (YYYY-MM-DD) |
| period.end | string | End date (YYYY-MM-DD) |
| data | DailyDataPoint[] | One entry per day in the range |
| data[].date | string | Date (YYYY-MM-DD) |
| data[].value | number | Metric value for that day (0 if no activity) |
value is used instead of count because not all metrics are counts -- for example, active-time returns milliseconds.
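Because data carries exactly one entry per day of the period, a client can assert that invariant when consuming the DTO. A hedged sketch of such a check (the function name is illustrative):

```python
from datetime import date, timedelta

# Sketch: client-side check that an ActivityTimeSeries response is
# zero-filled, i.e. has exactly one data point per day of its period.
def covers_period(ts: dict) -> bool:
    start = date.fromisoformat(ts["period"]["start"])
    end = date.fromisoformat(ts["period"]["end"])
    expected, d = [], start
    while d <= end:
        expected.append(d.isoformat())
        d += timedelta(days=1)
    return [p["date"] for p in ts["data"]] == expected
```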
Shared Patterns
Granularity
All analytics endpoints accept a granularity parameter that controls how metrics are averaged within the period. See Analytics Pattern: Granularity.
Period object in responses
All aggregate analytics endpoints that accept date range + comparison parameters echo the resolved dates back in a period object. See Response Period Object.
Filter parameters: plural IDs
All filter parameters accept comma-separated UUID lists (category_ids, department_ids, vendor_ids). See Filter Parameters: Plural IDs.
Constraints
- Max date range (activity): 90 days. Requests exceeding this return 400.
- Max date range (aggregate): 366 days for both primary and comparison periods.
- Zero-fill: Days with no activity return value: 0. The response always includes every date in the requested range.
- Timezone: All dates are in the client's configured timezone.
- One metric per request (activity): Each request returns a single metric. To display multiple metrics (e.g. a chart with a metric selector), make parallel requests.
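The range caps above can be enforced in one place. A sketch, assuming the caps count days inclusively and that ValueError stands in for the 400 response:

```python
from datetime import date

# Sketch: enforcing the documented date-range caps. The cap values come
# from the constraints above; the error type is illustrative (maps to 400).
MAX_DAYS = {"activity": 90, "aggregate": 366}

def validate_range(start: date, end: date, kind: str) -> None:
    if end < start:
        raise ValueError("end before start")
    if (end - start).days + 1 > MAX_DAYS[kind]:  # inclusive day count
        raise ValueError(f"range exceeds {MAX_DAYS[kind]} days")
```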
Extensibility
New analytics types are added as sibling sub-paths. Existing endpoints are never affected:
/analytics/apps/utilization ← paginated per-app metrics table
/analytics/apps/utilization/summary ← collection-level dashboard cards
/analytics/apps/utilization/rankings ← top app rankings
/analytics/apps/utilization/timeseries ← proposed: time-series by classification
/analytics/apps/utilization/categories ← proposed: categories dimension view
/analytics/apps/utilization/users ← proposed: users dimension view
/analytics/apps/utilization/departments ← proposed: departments dimension view
/analytics/apps/{app_id}/utilization ← future: per-app utilization detail
/analytics/apps/{app_id}/utilization/users ← future: per-user breakdown for an app
/analytics/devices/{device_id}/performance ← future: CPU/RAM resource usage
/analytics/apps/ai/users ← AI app user adoption
/analytics/apps/ai/recent ← newly discovered AI tools
/analytics/apps/ai/departments ← AI usage by department
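The sibling sub-paths above coexist with the parameterized /{app_id} routes only if the router resolves literal segments before parameters, so that "utilization" is never captured as an app_id. A toy matcher illustrating that precedence (ROUTES and match are illustrative, not the real router):

```python
# Sketch: literal-before-parameter route resolution, which lets
# /apps/utilization/* siblings coexist with /apps/{app_id}/* routes.
ROUTES = [
    "/analytics/apps/utilization",
    "/analytics/apps/utilization/summary",
    "/analytics/apps/{app_id}/activity",
]

def match(path: str):
    parts = path.strip("/").split("/")
    candidates = []
    for route in ROUTES:
        rparts = route.strip("/").split("/")
        if len(rparts) == len(parts) and all(
            r.startswith("{") or r == p for r, p in zip(rparts, parts)
        ):
            candidates.append(route)
    # literal-first precedence: the candidate with fewest parameters wins
    return min(candidates, key=lambda r: r.count("{"), default=None)
```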