An adversary exploits inherent human psychological predispositions to influence a targeted individual or group, either to solicit information or to manipulate the target into performing an action that serves the adversary's interests. Many interpersonal social engineering techniques involve no outright deception, although some do; most are subtle ways of manipulating a target to remove barriers, put the target at ease, and produce an exchange in which the target either shares information directly or lets key information slip out unintentionally. A skilled adversary applies these techniques selectively to produce the desired outcome. Manipulation techniques range from the overt, such as pretending to be a supervisor when calling a help desk, to the subtle, such as matching the target's speech and thought patterns so the target feels comfortable with the adversary.
| ID | CAPEC-416 |
| Latest Sync Date | 11/05/25 15:15:38 |
| Original ID | 416 |
| Abstraction | Meta |
| Status | Stable |
| Alternate Terms | |
| Likelihood Of Attack | Medium |
| Typical Severity | Medium |
| Related Attack Patterns | |
| Execution Flow | |
| Prerequisites | ::The adversary must have some means of communicating with the target and the knowledge of how to do so.:: |
| Skills Required | |
| Resources Required | |
| Indicators | |
| Consequences | ::SCOPE:Confidentiality:SCOPE:Integrity:SCOPE:Availability:TECHNICAL IMPACT:Other:NOTE:Attack patterns that manipulate human behavior can result in a wide variety of consequences and potentially affect the confidentiality, availability, and/or integrity of an application or system.:: |
| Mitigations | ::An organization should provide regular, robust cybersecurity training to its employees to prevent successful social engineering attacks.:: |
| Example Instances | |
| Related Weaknesses | |
| Taxonomy Mappings | |
| Notes | |