It's pattern matching, but it's not matching intelligently. An intelligence should be able to optimize itself for the task at hand, even before any self-improvement. LLMs can't select which data is relevant to operate on, nor can they handle executive functions.
LLMs are cool, and I think humans have something similar for processing information, but that's just one component of a larger system, and it's not the intelligent part.
Therapists as pentesters is an interesting concept. Maybe if we framed it that way, there would be more interdisciplinary methods.