4 Comments
Dean Chapman:

Violeta — this is the clearest articulation of the boardroom gap I’ve seen.

The solution isn’t better committees—it’s architectures where compliance is automatic.

In Veritas Core, high-risk classification (e.g., housing, aid) triggers enforceable truth by design:

Rawlsian allocation = human oversight

ZK-verified convoys = data governance

Binding DDP pricing = technical documentation

No pen needed. The system is the signatory.

Patent: AMCZ-2515173576 | Live in Sydney

Violeta Klein, CISSP, CEFA:

Thanks Dean - glad it landed.

The "compliance by design" vision is compelling, and I see where you're going with embedded governance logic. But I'd push back gently on one point: classification itself can't be automated away.

The determination of why a system falls into high-risk - the intended purpose analysis, the Annex III mapping, the profiling override assessment - requires human judgment about organizational context. That's the signature no architecture can replace.

What systems can do is enforce the controls once that determination is made. That's where your architecture logic fits. But the upstream decision still needs a name on it.

Appreciate you reading.

Dean Chapman:

Violeta, excellent clarification, and I agree completely.

The classification decision (why a system is high-risk) must be made by a human with authority, context, and accountability. No AI should sign that.

But once that call is made?

The system should enforce it without exception.

That’s where Veritas Core operates:

Human says: “This housing AI is high-risk.”

System responds: “Then every output is now bound by Rawlsian equity, ZK-verified delivery, and DDP pricing, no override.”

The human signs the intent.

The architecture enforces the integrity.

No pen needed after the decision, because truth is no longer optional.
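One way to picture the pattern Dean describes (a human-signed classification upstream, non-overridable controls downstream) is a minimal sketch. Everything here is hypothetical: the class names, the control labels, and the gate logic are illustrations of the general pattern, not Veritas Core's actual API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Classification:
    """The upstream human determination. Hypothetical stand-in:
    a record that must carry an accountable signatory."""
    system_name: str
    risk_level: str   # e.g. "high"
    signed_by: str    # the human who owns the decision

# Controls that attach automatically to any high-risk system.
# The labels mirror the thread (equity allocation, verified delivery,
# binding pricing); their implementations are out of scope here.
HIGH_RISK_CONTROLS = ("rawlsian_equity", "zk_verified_delivery", "ddp_pricing")

class EnforcementGate:
    """Once a signed high-risk classification is registered, every
    output of that system is bound to the full control set. There is
    deliberately no method to remove or override controls."""

    def __init__(self) -> None:
        self._controls: dict[str, tuple[str, ...]] = {}

    def register(self, c: Classification) -> None:
        # The architecture refuses unsigned classifications: the
        # upstream decision still needs a name on it.
        if not c.signed_by:
            raise ValueError("classification requires a human signatory")
        if c.risk_level == "high":
            self._controls[c.system_name] = HIGH_RISK_CONTROLS

    def required_controls(self, system_name: str) -> tuple[str, ...]:
        return self._controls.get(system_name, ())

# Human signs the intent; the gate enforces the integrity.
gate = EnforcementGate()
gate.register(Classification("housing-ai", "high", signed_by="V. Klein"))
print(gate.required_controls("housing-ai"))
```

The design choice the thread turns on is visible in the sketch: `register` validates the human signature but never makes the risk call itself, and the gate exposes no path to relax controls after registration.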

Thanks for the sharp dialogue. This is how governance matures.

Comment removed (Dec 23)
Violeta Klein, CISSP, CEFA:

Neural Foundry, the procurement example is the pattern I keep seeing as well. Someone signs a SaaS contract, and suddenly the organization is a deployer under Article 26 - with obligations they never budgeted for and accountability they never assigned.

The Act doesn't care how the system entered the building. It cares who's operating it now.

That gap between "we bought a tool" and "we're a deployer with documentation obligations" is where August 2026 will hit hardest. Thanks for reading.