hhmx.de

· Federation EN Mon 29.04.2024 22:26:05

@jernej__s @gabrielesvelto there are a few different answers to that
1. they can, but they'd have to retrain the model every time it fucks up, like when it started giving people recipes for IEDs
2. they can't, really, because it's a language model, not a fact model; it can always stumble into a new untruth
3. if they can, then they can also remove all the copyrighted material from their training data, and there goes their model
4. they shouldn't, because it wastes a fuckton of water