Interesting framing, but I’d push back gently: every custom GPT, every Gemini Gem, every pre-prompted character in my own HF Space (432 — A Journey Experience) already runs on a “boundary layer” — it’s just written in prose instead of JSON. The shape of the container doesn’t change the hard part.
The hard part isn’t declaring constraints. It’s getting a stochastic model to honor them. Prose or JSON, you’re still asking an LLM to interpret “no_water” semantically and comply. That’s not an execution layer; it’s a system prompt with curly braces.
The deeper move, I think, is mechanism over declaration: constraints that are self-enforcing because violating them costs the agent something. That’s what I’ve been exploring here → AI Systems Have No Hunger. The farmer doesn’t need a NotStartIf: no_seed_reserve rule — next winter’s hunger is the rule.
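To make the contrast concrete, here’s a minimal sketch (all names hypothetical, not from any real framework): a declarative rule has to be remembered and checked by something outside the agent, while a metabolic constraint lives in the state dynamics themselves — spend the whole seed reserve and next season simply fails, no rule object required.

```python
# Declarative: a rule the runtime must remember to check -- and an LLM
# interpreting "no_seed_reserve" semantically can simply fail to.
RULES = {"NotStartIf": "no_seed_reserve"}  # hypothetical rule format

def declarative_plant(seed_stock: int) -> int:
    if RULES.get("NotStartIf") == "no_seed_reserve" and seed_stock == 0:
        raise ValueError("rule violated")  # enforcement lives outside the agent
    return seed_stock - 1

# Metabolic: no rule object at all. Planting consumes seed, and an empty
# reserve makes the next season impossible -- the cost is built in.
class Farmer:
    def __init__(self, seed_stock: int):
        self.seed_stock = seed_stock
        self.harvest = 0

    def plant(self) -> bool:
        if self.seed_stock == 0:
            return False            # nothing to declare: there is no seed
        self.seed_stock -= 1
        self.harvest += 3           # each planting yields seed at harvest
        return True

    def next_season(self) -> None:
        # keep a reserve or starve: violation punishes itself
        self.seed_stock += self.harvest
        self.harvest = 0

farmer = Farmer(seed_stock=2)
assert farmer.plant() and farmer.plant()   # spends the whole reserve
assert not farmer.plant()                  # constraint enforces itself
farmer.next_season()
assert farmer.plant()                      # recovered via harvest
```

The point of the toy: `declarative_plant` only works if the checker runs and interprets the rule correctly, whereas `Farmer.plant` cannot violate the constraint because the constraint is the mechanism.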
Deontic boundaries are fragile. Metabolic ones aren’t.