You can’t exactly feed it a manual and expect it to extrapolate or understand (and for that matter, “what manual?”).
You can do that to a degree with RLVR (reinforcement learning with verifiable rewards). They are also paying human experts. But that’s just the situation now; who knows how it will be in a couple more years. Maybe training AIs will be like writing a library, framework, …
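
Since the exchange leans on RLVR, here’s a toy, self-contained sketch of the core idea: the reward is a programmatic check rather than a human label. Everything below (the arithmetic prompts, the bandit-style policy, the learning rate and baseline) is an illustrative assumption, not a real training stack; actual RLVR applies policy-gradient updates to an LLM against verifiers like unit tests or exact-match answer checks.

```python
# Toy RLVR sketch: a softmax "policy" over candidate answers is trained
# with REINFORCE, where the reward comes from verifying the answer
# programmatically instead of from a human rater.
import math
import random

random.seed(0)

prompts = [("2+3", 5), ("4*4", 16), ("9-2", 7)]
candidates = list(range(20))                # toy "vocabulary" of answers
logits = {p: [0.0] * len(candidates) for p, _ in prompts}

def sample(p):
    # softmax sampling over candidate answers for prompt p
    mx = max(logits[p])
    probs = [math.exp(l - mx) for l in logits[p]]
    z = sum(probs)
    probs = [x / z for x in probs]
    r, acc = random.random(), 0.0
    for i, pr in enumerate(probs):
        acc += pr
        if r <= acc:
            return i, probs
    return len(probs) - 1, probs

def verify(expr, answer):
    # the "verifiable reward": check the answer by running the expression
    return 1.0 if eval(expr) == answer else 0.0

lr = 0.5
for step in range(2000):
    expr, _ = random.choice(prompts)
    a, probs = sample(expr)
    reward = verify(expr, candidates[a])
    # REINFORCE: grad of log-prob of sampled answer w.r.t. each logit,
    # scaled by (reward - baseline); baseline 0.5 centers the updates
    for i in range(len(candidates)):
        grad = (1.0 if i == a else 0.0) - probs[i]
        logits[expr][i] += lr * (reward - 0.5) * grad

for expr, truth in prompts:
    best = max(range(len(candidates)), key=lambda i: logits[expr][i])
    print(expr, "->", candidates[best], "(truth:", truth, ")")
```

The point of the toy: no human expert labels anything, yet the policy converges on correct answers because correctness itself is checkable. That’s the “to a degree” part, since it only works where a verifier exists.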

The assumption seems to be that an LLM can’t figure out a manual or source code. If it can’t, then you have to pay people. But that’s not a universally valid assumption.