# Events & SSE
Every mutation fires an event. Every event tells a story.
Structured lifecycle events power webhooks, prompt hooks, audit logging, and real-time streaming across the entire platform.
## Architecture

- **Router handler → `emitAndAudit()`**
  - `INSERT → events` — structured event row written to Postgres
  - `PUBLISH → Redis` — event pushed to the `events:{org_id}` channel
  - `INSERT → audit_log` — immutable audit record alongside the event
- **SSE endpoint** — `/api/events/stream`, real-time streaming to browser clients
- **Webhook dispatcher** — webhook endpoints → HTTP POST deliveries to external servers
- **Prompt dispatcher** — prompt hooks → LLM / MCP, AI execution via configured templates
When a user creates a task, updates an employee record, or completes a workflow step, the API handler calls emitAndAudit(). This atomically writes the event to Postgres and the audit log, then publishes it to Redis for real-time consumers.
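The flow can be pictured as a small helper. Everything below — the `Db` and `PubSub` interfaces, the row shape, the signatures — is an assumption for illustration, not the platform's actual implementation:

```typescript
// Hypothetical sketch of emitAndAudit(). Db and PubSub stand in for a Postgres
// client and a Redis client; the real code wraps the two inserts in a single
// transaction to get the atomicity described above.
interface EventRow {
  org_id: number;
  aggregate_type: string;
  aggregate_id: number;
  event_type: string;
  payload: Record<string, unknown>;
}

interface Db {
  insert(table: string, row: EventRow): Promise<void>;
}

interface PubSub {
  publish(channel: string, message: string): Promise<void>;
}

async function emitAndAudit(db: Db, redis: PubSub, row: EventRow): Promise<void> {
  // 1. Structured event row written to Postgres (append-only).
  await db.insert("events", row);
  // 2. Immutable audit record alongside the event.
  await db.insert("audit_log", row);
  // 3. Publish to the org-scoped channel for real-time consumers.
  await redis.publish(`events:${row.org_id}`, JSON.stringify(row));
}
```

Publishing only after both rows are written means real-time consumers never see an event that failed to persist.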
**Append-only.** Once written, events are never modified.
## Lifecycle phases
Every entity in the system fires events at standardized lifecycle points:
| Phase | Fires when |
|---|---|
| `pre_create` | Before the INSERT -- allows pre-validation hooks |
| `post_create` | After a successful INSERT |
| `pre_update` | Before the UPDATE |
| `post_update` | After a successful UPDATE |
| `pre_archive` | Before a soft-delete or deactivation |
| `post_archive` | After a successful soft-delete |
| `pre_export` | Before an export operation begins |
| `post_export` | After export data is assembled |
The `pre_*` phases fire before the database write and are used by prompt hooks for validation and review. The `post_*` phases fire after the write commits and are used by webhooks and downstream integrations.
## Event type naming

Event types follow the pattern `{aggregate_type}.{phase}`:

- `task.post_create`
- `employee.post_update`
- `workflow.post_archive`
- `survey.post_create`
Some domain-specific events use descriptive names instead of lifecycle phases:

- `workflow.started`, `workflow.completed`, `workflow.cancelled`
- `step.passed`, `step.failed`, `step.approved`
- `auth.login`, `auth.logout`, `auth.password_changed`
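Because both lifecycle and domain-specific names share the `aggregate.action` shape, a consumer can recover the parts by splitting on the first dot. A small hypothetical helper (not part of any platform SDK):

```typescript
// Split an event type like "task.post_create" or "workflow.started" into its
// aggregate and action parts. Splitting on the FIRST dot keeps actions with
// underscores or further dots intact.
function parseEventType(eventType: string): { aggregate: string; action: string } {
  const dot = eventType.indexOf(".");
  if (dot === -1) throw new Error(`malformed event type: ${eventType}`);
  return {
    aggregate: eventType.slice(0, dot),
    action: eventType.slice(dot + 1),
  };
}

// parseEventType("task.post_create") → { aggregate: "task", action: "post_create" }
```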
## Aggregate types

Events cover every mutable entity in the platform, including: `task`, `employee`, `workflow`, `step`, `survey`, `survey_response`, `feedback`, `kudos`, `calendar_event`, `announcement`, `position`, `role`, `team`, `user`, `organization`, `api_key`, `webhook_endpoint`, and more.
## `EventContext` payload
Every event carries a standardized EventContext in its payload, giving downstream consumers enough information to act without re-querying the API:
```typescript
interface EventContext {
  org_id: number;
  actor_id: number | null;
  workflow_id?: number;
  workflow_name?: string;
  step_id?: number;
  step_name?: string;
  record_id: number;
  record_type: string;
  record_name?: string;
  action: string;
  timestamp: string;
}
```

| Field | Description |
|---|---|
| `org_id` | The organization where the event occurred |
| `actor_id` | The user who performed the action (`null` for system events) |
| `workflow_id` | The workflow context, if the action happened inside a workflow |
| `step_id` | The workflow step context, if applicable |
| `record_id` | The primary key of the affected entity |
| `record_type` | The aggregate type (e.g. `task`, `employee`) |
| `record_name` | A human-readable label for the record |
| `action` | The lifecycle phase or domain-specific action |
| `timestamp` | ISO 8601 timestamp of when the event occurred |
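Because the context is self-describing, a consumer can act on it without calling back into the API. A sketch of a hypothetical consumer that turns an `EventContext` into a human-readable activity line (the interface is repeated so the example is self-contained; the formatting is illustrative):

```typescript
interface EventContext {
  org_id: number;
  actor_id: number | null;
  workflow_id?: number;
  workflow_name?: string;
  step_id?: number;
  step_name?: string;
  record_id: number;
  record_type: string;
  record_name?: string;
  action: string;
  timestamp: string;
}

// Render one activity-feed line from the context alone, falling back to
// "type#id" when no human-readable record_name was provided.
function describe(ctx: EventContext): string {
  const actor = ctx.actor_id === null ? "system" : `user ${ctx.actor_id}`;
  const record = ctx.record_name ?? `${ctx.record_type}#${ctx.record_id}`;
  const where = ctx.workflow_name ? ` in workflow "${ctx.workflow_name}"` : "";
  return `${actor} performed ${ctx.action} on ${record}${where}`;
}
```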
## REST API

### List events

Retrieve events with pagination and filters.
```
GET /api/events
```

| Parameter | Type | Description |
|---|---|---|
| `aggregate_type` | string | Filter by entity type (e.g. `task`) |
| `event_type` | string | Filter by full event type (e.g. `task.post_create`) |
| `from` | ISO 8601 | Events after this timestamp |
| `to` | ISO 8601 | Events before this timestamp |
| `limit` | integer | Results per page (1--100, default 25) |
| `page` | integer | Page number (default 1) |
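The `limit` and `page` parameters combine with the `total` count in the response to walk the full result set. A minimal sketch of that loop — the page fetcher is injected here so the logic stands alone; in practice it would call `fetch` against `/api/events` with your token:

```typescript
// Collect every event across pages. fetchPage(page) is a hypothetical adapter,
// e.g. one that GETs /api/events?limit=100&page=${page} and parses the JSON.
async function fetchAllEvents(
  fetchPage: (page: number) => Promise<{ data: unknown[]; total: number }>,
): Promise<unknown[]> {
  const all: unknown[] = [];
  for (let page = 1; ; page++) {
    const { data, total } = await fetchPage(page);
    all.push(...data);
    // Stop once everything is collected, or the server returns an empty page.
    if (data.length === 0 || all.length >= total) break;
  }
  return all;
}
```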
```shell
curl "https://api.wrk.ing/api/events?aggregate_type=task&limit=10" \
  -H "Authorization: Bearer $TOKEN"
```

```typescript
const res = await fetch(
  "https://api.wrk.ing/api/events?aggregate_type=task&limit=10",
  { headers: { Authorization: `Bearer ${token}` } }
);
const { data, total } = await res.json();
```

```python
import requests

res = requests.get(
    "https://api.wrk.ing/api/events",
    params={"aggregate_type": "task", "limit": 10},
    headers={"Authorization": f"Bearer {token}"},
)
data = res.json()
```

```json
{
  "data": [
    {
      "id": 8401,
      "org_id": 1,
      "aggregate_type": "task",
      "aggregate_id": 234,
      "event_type": "task.post_create",
      "payload": { "..." },
      "occurred_at": "2026-04-12T14:30:00.000Z"
    }
  ],
  "total": 142
}
```

### Entity event history
Get the full event history for a specific entity, ordered by sequence number.
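Since the history is ordered, downstream consumers can fold over it to reconstruct a snapshot of the entity. A hypothetical sketch — it assumes each payload carries the changed fields, which may not match the platform's actual payload shape:

```typescript
interface StoredEvent {
  event_type: string;
  payload: Record<string, unknown>;
}

// Fold an ordered event history into a snapshot by applying each payload in
// sequence. Archive events are mapped to a flag rather than merged.
function replay(history: StoredEvent[]): Record<string, unknown> {
  return history.reduce<Record<string, unknown>>((state, ev) => {
    if (ev.event_type.endsWith(".post_archive")) {
      return { ...state, archived: true };
    }
    return { ...state, ...ev.payload };
  }, {});
}
```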
```
GET /api/events/:aggregateType/:aggregateId
```

```shell
curl https://api.wrk.ing/api/events/task/234 \
  -H "Authorization: Bearer $TOKEN"
```

```typescript
const res = await fetch(
  "https://api.wrk.ing/api/events/task/234",
  { headers: { Authorization: `Bearer ${token}` } }
);
const history = await res.json();
```

```python
res = requests.get(
    "https://api.wrk.ing/api/events/task/234",
    headers={"Authorization": f"Bearer {token}"},
)
history = res.json()
```

## Server-Sent Events
The SSE endpoint opens a persistent connection and streams events as they happen. This is useful for building real-time dashboards, live activity feeds, or monitoring tools.
```
GET /api/events/stream
```

### Connecting
```typescript
const token = "your-access-token";
const source = new EventSource(
  `https://api.wrk.ing/api/events/stream?token=${token}`
);

source.onmessage = (event) => {
  const data = JSON.parse(event.data);
  console.log(`${data.event_type}: ${data.aggregate_type}#${data.aggregate_id}`);
};

source.onerror = () => {
  console.log("Connection lost, reconnecting...");
};
```

```python
import requests
import sseclient

url = "https://api.wrk.ing/api/events/stream"
headers = {"Authorization": f"Bearer {token}"}
response = requests.get(url, headers=headers, stream=True)

for event in sseclient.SSEClient(response).events():
    print(event.data)
```

```shell
curl -N https://api.wrk.ing/api/events/stream \
  -H "Authorization: Bearer $TOKEN"
```

### Connection behavior
- The server sends a heartbeat comment every 30 seconds to keep the connection alive through proxies and load balancers.
- If the connection drops, `EventSource` automatically reconnects with exponential backoff.
- Each event is published as a `data:` line containing the full JSON event payload.
- Events are scoped to your organization. You only receive events for the org associated with your token.
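On the wire, each line is therefore either a `:`-prefixed comment (the heartbeats) or a `data:` frame. A minimal hypothetical parser for those two cases, useful in environments without `EventSource`:

```typescript
type SSELine =
  | { kind: "comment"; value: string }
  | { kind: "data"; value: string }
  | { kind: "other"; value: string };

// Classify a single raw SSE line: ":"-prefixed lines are keep-alive comments,
// "data:"-prefixed lines carry a JSON event payload; everything else passes
// through untouched.
function parseSSELine(line: string): SSELine {
  if (line.startsWith(":")) return { kind: "comment", value: line.slice(1).trim() };
  if (line.startsWith("data:")) return { kind: "data", value: line.slice(5).trim() };
  return { kind: "other", value: line };
}
```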
Output appears as a continuous stream:

```
: heartbeat
data: {"id":8401,"event_type":"task.post_create","aggregate_type":"task",...}
data: {"id":8402,"event_type":"employee.post_update","aggregate_type":"employee",...}
: heartbeat
```

## Event-driven automation
Events are the trigger for two powerful automation systems:
### Webhooks
Register HTTP endpoints to receive events as POST requests. Filter by event type to only receive what you need. See the Webhooks guide.
### Prompt hooks
Attach AI prompt templates to lifecycle events. When a matching event fires, wrk!ng renders the template with the event context and calls an LLM. See the AI integration guide.
Both systems consume events from the same Redis pub/sub channel, ensuring consistent and immediate delivery.
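That shared model can be pictured as one subscriber routing each published event to the handlers registered for its type — a hypothetical sketch, not the platform's actual dispatcher:

```typescript
type Handler = (event: { event_type: string }) => void;

// Build the callback a Redis subscriber on events:{org_id} would invoke:
// parse the published JSON and fan it out to every handler (webhook delivery,
// prompt-hook execution, ...) registered for that event type.
function makeDispatcher(routes: Map<string, Handler[]>): (raw: string) => void {
  return (raw) => {
    const event = JSON.parse(raw);
    for (const handler of routes.get(event.event_type) ?? []) {
      handler(event);
    }
  };
}
```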