An AI-powered video analytics system has been launched in Kislovodsk to search for wanted and missing persons. According to Rostec (20 August 2025) and regional reports, the system has moved beyond a late-2024 pilot and now processes feeds from hundreds of city cameras. The key question is how to balance public safety and privacy under the “Islamic Code for the Application of AI.”
- Goal — public safety and locating people; technology — facial recognition and video analytics.
- “Kislovodsk AI facial recognition” operates as part of an integrated city surveillance system.
- Risks: biometric data collection, recognition errors, and broad access to data.
- Under the Code, permissibility depends on the benefit/harm balance, transparency, and control.
- Residents and communities need clear rules, audits, and feedback channels.
What Happened and Why It Matters
On 20 August 2025, Rostec reported that neural-network video analytics is operating in Kislovodsk; the city confirmed the launch to search for missing and wanted persons. Reports say the pilot had been running since late 2024; the AI now analyzes video streams from many cameras, and coverage may expand. For residents, this means faster searches for missing people and, at the same time, new privacy questions.
How “Kislovodsk AI Facial Recognition” Works in Practice
- Video streams from city cameras are fed into the AI module.
- The algorithm compares faces with reference records from lawful sources.
- Alerts go to an operator; final decisions are made by a human (a code sketch of this flow follows the list).
- Integration with city systems (“Safe City”) is provided, along with logging and access rights.
- Possible expansions: transport and infrastructure analytics, detection of traffic incidents.
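In engineering terms, the list above describes a human-in-the-loop pipeline. The sketch below is a minimal illustration of that pattern, not the actual Kislovodsk implementation: the threshold, identifiers, and the `route_match` function are all hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from queue import Queue

MATCH_THRESHOLD = 0.85  # hypothetical; real thresholds are tuned and audited

@dataclass(frozen=True)
class Alert:
    camera_id: str
    record_id: str   # reference record from a lawful source
    similarity: float
    timestamp: datetime

def route_match(camera_id: str, record_id: str, similarity: float,
                operator_queue: Queue, audit_log: list) -> None:
    """Queue a candidate match for human review; the system never acts on its own."""
    if similarity < MATCH_THRESHOLD:
        return  # weak matches are dropped, not stored
    alert = Alert(camera_id, record_id, similarity, datetime.now(timezone.utc))
    operator_queue.put(alert)  # the final decision stays with a human operator
    audit_log.append(alert)    # every alert is logged for later audit

# Usage: a score from the (unspecified) recognition model is routed, not acted on.
queue, log = Queue(), []
route_match("cam-042", "rec-107", 0.91, queue, log)
print(queue.get())  # an operator reviews this alert before any action
```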
What Data Are Processed
- Facial images (biometric personal data) and event metadata.
- Access logs and service records for audit.
- Under the Code, a lawful basis, data minimization, and defined deletion periods are required (see the retention sketch below).
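The Code's requirements for minimization and deletion map naturally onto an explicit retention policy. A minimal sketch with entirely hypothetical periods (the operator has not published the real ones):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass(frozen=True)
class RetentionPolicy:
    """Illustrative retention rules; the actual periods are not public."""
    raw_video_days: int = 7         # hypothetical: footage with no confirmed match
    confirmed_alert_days: int = 90  # hypothetical: material tied to a confirmed case
    audit_log_days: int = 365       # hypothetical: access logs kept longest, for audit

def is_expired(created_at: datetime, days: int) -> bool:
    """Deletion by default once the minimum period has passed."""
    return datetime.now(timezone.utc) - created_at > timedelta(days=days)

policy = RetentionPolicy()
uploaded = datetime(2025, 8, 1, tzinfo=timezone.utc)
print(is_expired(uploaded, policy.raw_video_days))  # True once 7 days have passed
```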
Risks and Safeguards
- False matches. Recognition errors may cause inconvenience and reputational harm.
- Excessive data collection. Unnecessary retention of video raises leak risks.
- Insufficient transparency. If residents do not know the rules, trust suffers.
- Purpose limitation. Use beyond the stated aims is unacceptable.
Assessment Under the “Islamic Code for the Application of AI”
Conformities
- Protecting life and aiding searches aligns with مَقَاصِدُ ٱلشَّرِيعَةِ (objectives of the Sharia) on preserving life and property (secs. 2.2.23; 2.2.25).
- Priority of preventing evil and harm — permissible with proven benefit and control (secs. 2.2.12; 2.2.18).
- Human responsibility for decisions — the operator confirms the final action (secs. 1.4.2; 2.2.9).
Potential Divergences
- Risk of infringing honor and privacy without strict minimization (sec. 2.2.26).
- Opaque contracts and vague conditions — unacceptable (sec. 2.2.30).
- Possible algorithmic bias — regular checks required (secs. 2.2.8; 2.3.9).
Conditional Conformity
- Permissible with a clear purpose, limited scenarios, reporting, and right of appeal (secs. 2.2.10; 2.2.33; 2.3.6; 2.3.10).
Conformity Matrix
| Statement/fact | Code clause(s) | Status | Comment/control |
|---|---|---|---|
| Purpose: search for missing persons; protect life | 2.2.23; 2.2.12 | Conformity | Benefit outweighs harm if effectiveness is real and risks are managed. |
| Human confirmation of alerts | 1.4.2; 2.2.9 | Conformity | Keep decisions with a human; log actions. |
| Processing of biometrics | 2.2.26; 2.2.10 | Conditional | Minimization, a lawful basis, and retention/deletion periods are needed. |
| Transparency and logs | 2.2.30; 2.3.6 | Conformity if ensured | Publish rules, reports, and metrics; arrange an external audit. |
| Non-discrimination | 2.2.8; 2.3.9 | Conditional | Run regular bias tests; adjust models. |
| Expansion of scenarios | 2.2.20; 2.2.10 | Risk of divergence | Any new use only after a renewed benefit/harm assessment. |
| Accounting for local norms | 2.2.34 | Conformity | Adapt rules to a resort setting without violating the Sharia. |
| Supplier responsibility | 2.2.16; 2.2.31 | Conformity | Compensate harm; bear responsibility for built-in scenarios. |
Practical Takeaways / Checklist
For residents
- Find out where cameras are located and who the operator is.
- Check how to submit a request about your data and how to have it deleted if there is an error.
- Report incidents via official channels: camera number, time, and facts are needed.
- Support searches for the missing: volunteer patrols and hotlines.
For parents and communities
- Discuss with the administration an agreed list of high-sensitivity zones (mosques, children’s institutions) where recognition is restricted.
- Request reports: accuracy, false alerts, retention periods.
- Check for a right of appeal and compensation in case of errors.
For developers and operators
- Implement minimization policies and retention time limits; log all accesses.
- Conduct regular bias audits and publish model limitations (see the audit sketch after this list).
- Use high-quality, lawful data sources; exclude dubious datasets.
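A bias audit (secs. 2.2.8; 2.3.9) can start from a simple stratified error count: compare how often operators reject machine matches across conditions such as lighting, camera, or demographic proxies. A minimal sketch with hypothetical field names; a real audit would also need adequate sample sizes and confidence intervals:

```python
from collections import defaultdict

def false_match_rates(events: list[dict]) -> dict[str, float]:
    """Per-group false-match rate from operator review outcomes.

    Each event is {"group": ..., "confirmed": bool}; "group" is whatever
    stratification the audit uses (lighting, camera, demographic proxy).
    """
    totals, errors = defaultdict(int), defaultdict(int)
    for e in events:
        totals[e["group"]] += 1
        if not e["confirmed"]:  # the operator rejected the machine's match
            errors[e["group"]] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit slice: diverging rates across groups signal bias.
sample = [{"group": "daylight", "confirmed": True},
          {"group": "daylight", "confirmed": True},
          {"group": "night", "confirmed": False},
          {"group": "night", "confirmed": True}]
print(false_match_rates(sample))  # {'daylight': 0.0, 'night': 0.5}
```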
Risks and Limitations
- Recognition errors and a “chain reaction” of consequences.
- Data leaks due to weak protection.
- Mission creep: a regulatory drift from searching for the missing toward blanket monitoring.
- Insufficient public oversight and reporting.
Check Outside the Code
- In places of worship, preserve سِتْرٌ (protection of private life) and حُرْمَةٌ (inviolability/sanctity). Cameras near mosques should avoid excessive close-ups of worshippers’ faces. This supports تَقْوَى (God-consciousness) and respect for the sanctity of the space.
Candidates for Code Updates
- Introduce a rule for “religiously sensitive zones” and geo-blurring of faces near mosques (a sketch follows this list).
- Set minimum quality metrics for biometrics in public places and review thresholds.
- Require publication of a system “passport”: purposes, legal bases, retention periods, accuracy indicators, and appeal process.
- Establish a rule of “deletion by default” for records of non-involved persons after the minimum period.
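The geo-blurring proposal is easy to operationalize: the operator needs only each camera's coordinates and a published list of sensitive zones. A minimal sketch; the coordinates, zone format, and radius below are illustrative assumptions, not published data:

```python
import math

# Hypothetical sensitive zones: (latitude, longitude, radius in meters).
SENSITIVE_ZONES = [(43.905, 42.720, 150.0)]  # illustrative coordinates only

def within_zone(lat: float, lon: float,
                zone: tuple[float, float, float]) -> bool:
    """Rough flat-earth distance check; adequate at city scale."""
    zlat, zlon, radius = zone
    dy = (lat - zlat) * 111_320  # meters per degree of latitude
    dx = (lon - zlon) * 111_320 * math.cos(math.radians(zlat))
    return math.hypot(dx, dy) <= radius

def must_blur(camera_lat: float, camera_lon: float) -> bool:
    """Blur faces by default for cameras inside a sensitive zone."""
    return any(within_zone(camera_lat, camera_lon, z) for z in SENSITIVE_ZONES)

print(must_blur(43.9051, 42.7202))  # True: inside the illustrative zone
```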
What’s Next: Monitoring and Preparation
- Request from the operator: the processing policy, a DPIA analogue (data protection impact assessment), and error metrics.
- Agree with communities on a feedback procedure and priorities for assisting in searches.
- Arrange regular external audits and publish reports every six months.
Section Disclaimer
This material is prepared for the “AI News” section. The assessment is informational and does not replace legal advice.