Meta is asking some employees to install software that records clicks, keystrokes, and screen activity so it can train AI agents. In plain language, the company is turning work into training data.
What most people miss is that this is not only a labor or privacy story. It is a substrate story. The same human judgment needed to map the work is being extracted in a way that can teach people that honesty is unsafe.
That is a Crime Against Ren. Not because the data is false, but because truth can still be weaponized against the human conditions that make truth available in the first place. Once people learn that showing how the work actually gets done helps build the thing that displaces them, candor curdles into theater.
A serious operator should notice the trade being made. You can extract workflow truth by force. But if you do it by burning trust, you are training the model on this quarter's reality while destroying your ability to learn the next quarter's.