
Preventing It Before It Happens: AI and Emotional Signals Against Aggression


Data analysis of seconds 24–29 of the video "Headbutts to journalist, Rai 2 crew attacked in Ostia by Roberto Spada" (09_16_2025_12_48_26)

A combined “risk” signal built from rising anger, higher arousal, dropping valence, more negative affects, and absence of positive affects crosses an alert threshold at ~27.12 s—about 1.9 s before the 29 s aggression moment.

In this clip, yes: there are detectable precursors that could have raised a “danger” alert a couple of seconds in advance.

Two charts inspect the trend lines around the event.

What drove the early warning

I constructed a conservative, real-time-friendly "danger score" (normalized to 0–1) from 3-second rolling z-scores of five signals: rising anger, higher arousal, dropping valence, the count of negative affects, and the absence of positive affects.

Using an alert line at 0.7, the score first crosses it at ~27.12 s, roughly 1.88 s ahead of the 29.0 s aggression; the plots show this crossing relative to the event.
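A danger score of this kind can be sketched as follows. The article only specifies the five inputs, the 3 s rolling z-scores, the 0–1 range, and the 0.7 alert line; the frame rate, the logistic squashing, and the equal-weight combination below are illustrative assumptions.

```python
import numpy as np

FPS = 25            # assumed frame rate; the article does not state one
WIN = 3 * FPS       # 3-second rolling window

def rolling_z(x, win=WIN):
    """Rolling z-score of the newest sample over the last `win` samples."""
    z = np.zeros(len(x))
    for t in range(len(x)):
        w = x[max(0, t - win + 1):t + 1]
        sd = w.std()
        z[t] = (x[t] - w.mean()) / sd if sd > 1e-6 else 0.0
    return z

def danger_score(anger, arousal, valence, neg_affects, pos_affects):
    """Combine the five cues into a 0-1 score.

    Rising anger, arousal, and negative affects push the score up;
    rising valence and positive affects pull it down. The rolling
    z-scores are averaged and squashed with a logistic so a neutral
    window sits at 0.5.
    """
    parts = [
        rolling_z(anger),
        rolling_z(arousal),
        -rolling_z(valence),        # dropping valence -> higher risk
        rolling_z(neg_affects),
        -rolling_z(pos_affects),    # absence of positive affects -> higher risk
    ]
    raw = np.mean(parts, axis=0)
    return 1.0 / (1.0 + np.exp(-raw))

def first_alert(score, threshold=0.7, fps=FPS):
    """Time (seconds) of the first threshold crossing, or None."""
    idx = np.argmax(score >= threshold)
    return idx / fps if score[idx] >= threshold else None
```

On synthetic data where all five signals stay flat and then ramp in the described directions, the score sits at 0.5 during the calm stretch and crosses 0.7 shortly after the ramp begins.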

What this implies for a live predictor

A practical on-edge rule you could run frame-by-frame:

Alert if, within a 3 s rolling window, the combined z-scores of the five signals push the normalized danger score to the 0.7 line or above.
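A minimal streaming version of such a rule could look like this. The signal names, the 3 s window, and the 0.7 threshold come from the text; the 25 fps feed, the ring buffers, and the z-score combination are hypothetical choices for the sketch.

```python
from collections import deque
import math

class EdgeAlert:
    """Frame-by-frame sketch of the 3 s rolling-window alert rule."""

    def __init__(self, fps=25, window_s=3.0, threshold=0.7):
        self.win = int(fps * window_s)
        self.threshold = threshold
        self.buf = {k: deque(maxlen=self.win)
                    for k in ("anger", "arousal", "valence", "neg", "pos")}
        # +1 if a rising signal raises risk, -1 if a falling one does
        self.sign = {"anger": 1, "arousal": 1, "valence": -1,
                     "neg": 1, "pos": -1}

    def _z(self, values):
        """Z-score of the newest sample against its window."""
        n = len(values)
        mean = sum(values) / n
        var = sum((v - mean) ** 2 for v in values) / n
        return (values[-1] - mean) / math.sqrt(var) if var > 1e-12 else 0.0

    def update(self, anger, arousal, valence, neg, pos):
        """Feed one frame of signals; return True if the alert fires."""
        frame = dict(anger=anger, arousal=arousal, valence=valence,
                     neg=neg, pos=pos)
        for k, v in frame.items():
            self.buf[k].append(v)
        raw = sum(self.sign[k] * self._z(list(self.buf[k]))
                  for k in self.buf) / len(self.buf)
        score = 1.0 / (1.0 + math.exp(-raw))
        return score >= self.threshold
```

Because each `update` call touches only five short ring buffers, the rule runs comfortably on-device at video frame rates.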

Could this outburst have been foreseen a bit earlier? Yes…

Between 20 and 27 seconds, we already see a convergence of four warning signals: growing anger, sustained anxiety, rising disgust, and persistent worry.

When such signals co-occur, the system captures not just a single emotion but a synergistic emotional cluster: growing anger, amplified by anxiety, reinforced by disgust, and locked in a state of worry. This cluster suggests that the subject is not only irritated but also losing emotional regulation—a pattern that often precedes aggressive behavior.
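The cluster idea reduces to a simple co-occurrence check: no single emotion triggers it, only all four together. The function and the 0.5 floor below are illustrative, not the article's actual detector.

```python
def emotional_cluster(anger, anxiety, disgust, worry, floor=0.5):
    """True when all four negative signals are elevated at once.

    The 0.5 floor is an assumed cutoff on normalized 0-1 intensities;
    the point is that simultaneous elevation, not any one emotion,
    marks the loss of emotional regulation.
    """
    return all(v >= floor for v in (anger, anxiety, disgust, worry))
```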

The graph from 20–28 s illustrates this trajectory clearly: anger climbing, anxiety sustained, and worry dominating the affective space, with the 27 s mark already showing heightened risk well before the 29 s aggression.

Therefore, the escalation could indeed have been predicted a few seconds earlier, because the simultaneous rise of multiple negative dimensions, coupled with the absence of balancing positive states, created a reliable early-warning signature.

Note: MorphCast is the only technology that can detect, in real time and directly on the device, both Ekman’s facial signals and the 98 emotional states defined by Russell’s model of affect.

How can this help protect us in dangerous situations?

Imagine wearing smart glasses powered by MorphCast that trigger safety alerts—just like warning systems in an airplane cockpit.
