This is 100% true and moral.
Some people think that a hypothetical AGI being intelligent is enough to put it in charge of decisions; it is not. It needs to be a moral agent, and you can't really have a moral agent that is not punishable or rewardable.
Most HN comments ITT are trash, so I'll address the one with some merit:
The implication here is that unlike a computer, a person or a corporation can be held accountable. I'm not sure that's true.
Emphasis in the original. True, a corporation cannot be held accountable for its actions; by the same logic, it should not be in charge of management decisions either.