He’s referring to the fact that the Effective Altruism / Less Wrong crowd seems focused almost entirely on preventing an AI apocalypse at some point in the future. They use a lot of obscure math and logic to argue that this is far more important than dealing with war, homelessness, climate change, or any of the other issues that are causing actual humans to suffer today, or that are 100% certain to cause suffering in the near future.
It wouldn’t delay your death, but it would make it more pleasant. You would most likely pass out from low blood pressure fairly quickly, and then you wouldn’t have to worry about starving anymore.