
Thanks for the clarification.

The CSAM bit, then, seems to be PR from at least one AI company trying to falsely quell people's concerns about their LLMs being able to generate sexualized content involving children.

I've yet to see details on the minimum server/compute requirements needed to run LLMs. Do you know of a source that's compiling a feature matrix with such details?




Large LLMs like GPT-3 and GPT-4 need very serious hardware. They have hundreds of billions of parameters (or more), all of which need to be loaded into memory at once.
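As a rough back-of-the-envelope sketch (my own illustration, not from any particular source): memory just to hold the weights is roughly parameter count × bytes per parameter, before accounting for activations or the KV cache.

  # Back-of-the-envelope: memory needed just to hold a model's weights,
  # ignoring activations, KV cache, and framework overhead.
  def weight_memory_gb(num_params: float, bytes_per_param: float = 2) -> float:
      # bytes_per_param: 2 for fp16/bf16, 4 for fp32, 1 for int8, 0.5 for 4-bit
      return num_params * bytes_per_param / 1e9

  # A 175B-parameter model (GPT-3 scale) in fp16:
  print(f"{weight_memory_gb(175e9):.0f} GB")  # ~350 GB of weights alone

That's why models at that scale are typically served sharded across multiple datacenter GPUs, while smaller quantized models can fit on one or two consumer cards.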



