Version: 0.1 (Current)

1. Introduction

1.1 The problem this specification addresses

Organisations operating across modern digital environments — multi-cloud infrastructure, SaaS platforms, AI pipelines, and cross-border data flows — face a structural compliance problem. Privacy and data governance obligations are documented, but not operationalised. The gap between what an organisation's policies say should happen and what its systems actually do when data moves is where regulatory risk, legal liability, and AI deployment failures accumulate.

Existing tools address parts of this problem. Privacy management platforms such as OneTrust, Transcend, and TrustArc serve as systems of record — they document lawful bases, maintain records of processing activities, manage consent, and track regulatory obligations. Sovereign cloud infrastructure from hyperscalers provides data residency guarantees at the infrastructure level. However, none of these tools enforce obligations at runtime, at the moment data is actually used.

The result is that compliance, for most organisations, exists in documents and dashboards — not in the execution path. When an AI model is called, when data crosses a jurisdiction, when a dataset is used for a purpose that was never consented to: there is no consistent, automatic mechanism to intercept, verify, and either permit or block that action.

The runtime gap: PCT is designed specifically to close the gap between documented obligations and enforced obligations. It does not replace policy platforms or infrastructure controls — it makes them actionable at the moment of execution.

1.2 The core concept

The Privacy Claims Token takes its conceptual model from JSON Web Tokens (JWTs), the standard mechanism for portable, signed identity claims in authentication systems. A JWT carries structured claims about a user — who they are, what they are permitted to do — and those claims travel with the user's session. Any system that receives the token can verify its authenticity and act on its claims without calling back to a central authority.
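The JWT mechanics referenced above can be sketched in a few lines. This is an illustrative reconstruction, not part of this specification: the claim names follow RFC 7519 conventions, the shared secret and issuer are invented, and a production system would typically use an asymmetric algorithm (e.g. RS256 or EdDSA) rather than the HMAC shown here.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT segments use unpadded base64url encoding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

# Illustrative signed claims about a user session (names per RFC 7519)
header = {"alg": "HS256", "typ": "JWT"}
payload = {"sub": "user-42", "iss": "auth.example.com", "scope": "read:reports"}

signing_input = b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(payload).encode())
secret = b"shared-secret"  # HS256 uses a shared key; asymmetric schemes avoid this
signature = b64url(hmac.new(secret, signing_input.encode(), hashlib.sha256).digest())
token = signing_input + "." + signature

# Any holder of the key can verify authenticity without calling the issuer
expected = b64url(hmac.new(secret, signing_input.encode(), hashlib.sha256).digest())
assert hmac.compare_digest(signature, expected)
```

The three dot-separated segments (header, payload, signature) are what make the claims both portable and tamper-evident: altering the payload invalidates the signature.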

PCT applies the same model to data rather than users. When a dataset is created, ingested, or enters a processing pipeline, a PCT is issued that encodes the obligations attached to that data: its jurisdiction of origin, the purposes for which it may be used, the lawful basis under which it was collected, consent status, transfer restrictions, and so forth. Those claims are cryptographically signed. Wherever the data travels, the PCT travels with it. Before any action — a processing operation, an AI model call, a cross-border transfer — the PCT is verified. If the claimed obligations permit the action, it proceeds. If not, it is blocked. In either case, a tamper-evident audit record is generated automatically.
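The verification gate described above can be sketched as follows. The field names, values, and function signatures here are assumptions for illustration only, not the normative PCT schema; signature checking is omitted for brevity.

```python
from dataclasses import dataclass

# Illustrative PCT claims; field names are assumptions, not the normative schema
@dataclass
class PrivacyClaims:
    jurisdiction: str          # jurisdiction of origin
    lawful_basis: str          # basis under which the data was collected
    consent_status: str
    permitted_purposes: set
    transfer_allowed_to: set   # jurisdictions the data may move to

def verify_action(claims: PrivacyClaims, purpose: str, destination: str) -> dict:
    """Gate a data action on the token's claims and emit an audit record."""
    permitted = (
        purpose in claims.permitted_purposes
        and destination in claims.transfer_allowed_to
    )
    # An audit record is produced whether the action proceeds or is blocked
    return {
        "action": {"purpose": purpose, "destination": destination},
        "decision": "permit" if permitted else "block",
    }

pct = PrivacyClaims(
    jurisdiction="EU",
    lawful_basis="consent",
    consent_status="granted",
    permitted_purposes={"analytics", "fraud-detection"},
    transfer_allowed_to={"EU", "UK"},
)

print(verify_action(pct, "analytics", "EU")["decision"])      # permit
print(verify_action(pct, "ad-targeting", "US")["decision"])   # block
```

The point of the sketch is the shape of the check, not its content: the claims travel with the data, the decision is made locally at the point of use, and every decision leaves a record.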

This model produces three outcomes that current approaches cannot deliver simultaneously: real-time enforcement, portable proof, and automatic audit.

1.3 Design principles

This specification is governed by the following principles, which take precedence in cases of ambiguity or tension:

  • Jurisdiction-neutral. The claims model must be capable of expressing obligations arising under any privacy, AI governance, or data sovereignty regulatory framework, present or future. No single jurisdiction's requirements are privileged in the base schema.

  • Extensible by design. The core schema defines a mandatory minimum. All claims fields support extension namespaces, allowing implementations to express regulatory requirements not anticipated at the time of this specification.

  • Open and implementation-agnostic. This specification describes what a PCT must contain and how it must behave. It does not prescribe a specific technology stack, programming language, or vendor implementation.

  • Cryptographically verifiable. Claims must be signed in a manner that allows any receiving system to verify authenticity and detect tampering without contacting the issuing system.

  • Human-readable. While PCT is designed for machine processing, the claims vocabulary is drawn from established legal and regulatory terminology so that the token is comprehensible to lawyers, data protection officers (DPOs), and compliance professionals without specialist technical knowledge.

  • Audit-first. Every verification event must produce a structured, tamper-evident audit record. Proof of compliance is not optional — it is a first-class output of the PCT lifecycle.
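One common construction for the tamper-evident audit records the audit-first principle requires is a hash chain, in which each record includes the hash of its predecessor. The sketch below is one possible realisation, not a mechanism mandated by this specification; the record fields are illustrative.

```python
import hashlib
import json

def append_record(log: list, event: dict) -> None:
    """Append an audit record chained to the hash of the previous record."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = {"event": event, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({**body, "hash": digest})

def chain_intact(log: list) -> bool:
    """Recompute each hash; editing any record breaks every later link."""
    prev = "0" * 64
    for rec in log:
        if rec["prev"] != prev:
            return False
        body = {"event": rec["event"], "prev": rec["prev"]}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != rec["hash"]:
            return False
        prev = rec["hash"]
    return True

log = []
append_record(log, {"decision": "permit", "purpose": "analytics"})
append_record(log, {"decision": "block", "purpose": "ad-targeting"})
assert chain_intact(log)

log[0]["event"]["decision"] = "permit"  # even a plausible edit is detectable
log[0]["event"]["purpose"] = "ad-targeting"
assert not chain_intact(log)
```

A hash chain alone proves internal consistency; anchoring the chain head in a signed statement (per the cryptographically-verifiable principle) is what lets a third party trust it.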