I thought AI was great at picking up context?
I don’t know why you thought that. LLMs split your question into separate words, assign scores to those words, then look up answers relevant to those words. They have no idea how those words are relevant to each other. That’s why LLMs couldn’t answer how many "r"s are in “strawberry”. They assigned the word “strawberry” a lower relevancy score in that question. The word “rescue” is probably treated the same way here.
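The "strawberry" thing is mostly about tokenization: real models don’t see individual letters, they see opaque subword token IDs. Here’s a toy sketch of that idea, with a made-up vocabulary (the vocab and IDs are illustrative assumptions, not any real tokenizer’s):

```python
# Toy illustration, NOT a real LLM tokenizer: a tiny made-up subword
# vocabulary showing why a model never "sees" individual letters.
TOY_VOCAB = {"straw": 101, "berry": 102, "res": 103, "cue": 104}

def toy_tokenize(word: str) -> list[int]:
    """Greedy longest-match split of a word into toy subword token IDs."""
    ids = []
    i = 0
    while i < len(word):
        # Try the longest remaining substring first, shrink until a match.
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in TOY_VOCAB:
                ids.append(TOY_VOCAB[piece])
                i = j
                break
        else:
            raise ValueError(f"no token covers {word[i:]!r}")
    return ids

print(toy_tokenize("strawberry"))  # [101, 102]
print(toy_tokenize("rescue"))     # [103, 104]
```

The model only ever receives `[101, 102]` for “strawberry”, so “how many r’s” asks about characters that were collapsed away before the model even saw the input.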
iAmTheTot@sh.itjust.works 2 months ago
I don’t think they are really “making excuses”, just explaining how the search came up with those steps, which is what the OP is so confused about.