
> e/acc types

Please expand?



Effective accelerationism: the promotion of rapid AI development and rollout, appealing to all the deaths and suffering that could be prevented if we reach the Singularity a year earlier.

They're extremely optimistic about the benefits of new tech and downplay all the risks. In my experience, self-identifying e/acc people generally assume AI alignment will happen by default or be solved in the marketplace… and the specific point where I hope they're wrong is that many seem to think this is all imminent, as in 3-5 years.

If they're right about everything else then we're all going to have a great time regardless of when it comes. But I don't see human nature being compatible with even just an LLM that can do a genuinely novel PhD's worth of research, rather than "merely" explain it or assist with it (impressive though even those much easier targets are).


TYVM. Hopefully the inability to see ways this could go wrong, or to really look at the problem, is sufficiently correlated with lacking the tools required to make progress on it.



