SAI Revisit targets prolific thieves and violent offenders while respecting fundamental rights.
SAI captures evidence of theft and violence in shops and makes it available to retailers and the police. Revisit helps shop workers make informed decisions about prolific offenders.
Every theft incident in SAI-protected stores is reviewed by trained human analysts before being added to Revisit.
For individuals involved in validated violence incidents, the priority is safety. Staff will be alerted and police may be contacted if the person returns to a store.
You deserve to know exactly what data we capture, how we use it, and how long we keep it. Just as important is what SAI Revisit never stores:
• Images of innocent customers
• Unvalidated or merely suspected incidents
• General shopping behavior or preferences
• Data from outside sources
You also have the right to:
• Request access to any data we hold about you
• Challenge any decision made about you
• Ask for immediate deletion of your data
• File a complaint with our privacy officer
If someone approaches you, you have the right to understand why and to challenge that decision. Here's exactly how our process works:
When someone with a history of theft enters a SAI-protected store, their current image is sent to trained staff, along with evidence from previous incidents.
Staff compare the live image with images from those incidents to decide whether the alert is valid.
If it is not, the live image is deleted immediately.
Store staff review the evidence together and make informed decisions. They can consult with colleagues and managers to ensure fair treatment and appropriate response.
If staff approach you, it should be for a private conversation where they can show you evidence of previous incidents. The goal is resolution, not confrontation.
Human and AI interaction will continue to evolve. The question isn't whether it should exist; it's how we can use it responsibly while respecting our fundamental rights.
We believe in building technology that serves communities, not systems that surveil them. Every decision we make is guided by principles of transparency, fairness and dignity.
We're advocating for industry standards that put people first. This includes mandatory signage, clear procedures, and regular audits of algorithmic bias.