
Want to learn about AGI safety & alignment? - Beginner resources

A list of high-quality beginner resources for learning about AGI safety & alignment
A future world where humans & AI live peacefully together (generated by AI, Lexica Aperture v2)

I've had a few people new to the world of AI alignment and AGI x-risk reach out asking for beginner resources, so here's a quick compilation.

I've kept it short to focus on the highest value content. You can find plenty more by following links and searching terms in the resources below.

From there, if you want to go deeper, I highly recommend the AGI Safety Fundamentals online course. It's the next level of depth when it comes to AGI safety & alignment.

If you'd like to follow along with the field but can't commit to a full course, I can suggest my own podcast, The AGI Show (also on YouTube), which is aimed at technical but non-expert audiences who want to better understand the state of the field and how to contribute.

Finally, Holden Karnofsky has some great resources on how to contribute, targeted at a variety of audiences.

One last note: it's very easy in this field to get caught in the "doomer" whirlpool, an endless stream of content that's negative and defeatist. Almost everyone in this field has fallen into this trap at some point. As you go through AI safety content, remember to stay positive and pragmatic. Ultimately, all we can do is contribute meaningfully by taking action and accept that the future is fundamentally uncertain. Anybody who says they know the future is wrong!

Good luck! And don't hesitate to reach out if I can help with your journey into this space 🙂.