Hacker News

Isn't one of the issues with local LLMs the huge amount of GPU memory they need? I'd say it will be a lot longer than 5 years before phones have 24+ GB of VRAM.
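The memory concern is easy to quantify: weights alone take roughly (parameter count × bits per weight ÷ 8) bytes, before counting the KV cache and activations. A minimal sketch of that back-of-envelope calculation (model sizes and quantization levels are illustrative, not tied to any specific model):

```python
# Rough VRAM estimate for model weights alone.
# Excludes KV cache, activations, and runtime overhead, which add more on top.
def weights_gib(params_billion: float, bits_per_weight: int) -> float:
    """Return the size of the weights in GiB."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

for params in (7, 13, 70):
    for bits in (16, 8, 4):
        print(f"{params}B @ {bits}-bit: {weights_gib(params, bits):6.1f} GiB")
```

Even aggressive 4-bit quantization leaves a 70B model needing tens of GiB for weights alone, which is why the 24+ GB figure comes up for serious local inference.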

