LLMs need focus; attention alone isn't enough. (A thread on the Focus Chain, Cline's approach to persistent context.)