Ask HN: Browser-Based LLM Models?

4 points by lulzury 3 months ago

Does anyone know if there are any plans for browsers to natively integrate LLMs, LLM APIs, or models like Llama for local use by web applications?

I feel there's a large opportunity here for a more privacy-friendly, on-device solution that doesn't send the user's data to OpenAI.

Is RAM the current main limitation?
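For what it's worth, purely in-browser inference already seems possible via WebGPU libraries such as WebLLM. A minimal sketch of what a web app could do today, assuming the @mlc-ai/web-llm package and its OpenAI-style chat API (the exact model ID and option names here are illustrative):

  // Run a quantized Llama model entirely in the browser via WebGPU.
  // Weights are downloaded once and cached locally; prompts never leave the device.
  import { CreateMLCEngine } from "@mlc-ai/web-llm";

  async function main() {
    const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC", {
      initProgressCallback: (p) => console.log(p.text), // download/compile progress
    });

    const reply = await engine.chat.completions.create({
      messages: [{ role: "user", content: "Summarize this page for me." }],
    });
    console.log(reply.choices[0].message.content);
  }

  main();

A native browser API could presumably expose something similar without every site having to ship its own runtime and download its own copy of the weights.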

throwaway425933 3 months ago

Every big tech company is trying to do this: FB (through WhatsApp), Google (through Chrome/Android), Apple (through Safari/iOS/etc.). As soon as they meet their internal metrics, they will release these to the public.

FrenchDevRemote 3 months ago

"Is RAM the current main limitation?"

(V)RAM + processing power + storage (I mean, what kind of average user wants to clog half their hard drive for a subpar model that outputs 1 token a second?)
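Rough arithmetic on the storage/RAM point (a sketch; ignores tokenizer files, KV cache, and quantization overhead):

  // Back-of-envelope size of a quantized model: params * bits per weight / 8.
  function approxModelSizeGB(paramsBillions: number, bitsPerWeight: number): number {
    return (paramsBillions * 1e9 * bitsPerWeight) / 8 / 1e9;
  }

  console.log(approxModelSizeGB(8, 4));  // ~4 GB for an 8B model at 4-bit
  console.log(approxModelSizeGB(70, 4)); // ~35 GB for a 70B model at 4-bit

So even a heavily quantized 8B model costs a few GB of disk and RAM before it produces a single token.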