We should go back in time and kill baby Eliezer Yudkowsky.
All of this follows from Yud's own theories. According to Yud, we should accept killing as many people as necessary to prevent rogue AI, up to the limit at which repopulation remains possible. Luckily, we would only have to kill one person: baby Yudkowsky. Despite making it his life's aim to stop the development of AI, most of the CEOs currently pursuing it directly cite his writing as what inspired them to start working on AI in the first place. It would not be far-fetched to say his work kicked off the race to build AI. Moreover, according to Yud, it is entirely permissible to kill children under the age of one, possibly even under the age of six, since they cannot talk and therefore probably lack qualia. So killing baby Yudkowsky would be both morally permissible and sufficient to stop the development of the AI that will surely kill us all. From a longtermist view, then, we should devote all of our resources from here on out to inventing time travel and murdering him as a baby.