HN Mail
CPP
"Captions With Attitude" in the browser from local VLM using llama.cpp in Go
1 point | 0 comments
Pure Go hardware accelerated local inference on VLMs using llama.cpp
1 point | 0 comments
Show HN: Gerbil – an open source desktop app for running LLMs locally
32 points | 7 comments
SSH teletekst.nl - in Dutch, but just try it (mouse works!)
2 points | 1 comment
Ask HN: Why don't programming language foundations offer "smol" models?
1 point | 2 comments
Costs and Benefits
1 point | 0 comments
Zig / C++ Interop
108 points | 15 comments