I hope neither you nor anybody else in your life ever has to deal with cancer. I've not been so fortunate, and you're a callous dickhead.
You are not special, and you will not gain sympathy from me with name-calling and personal attacks. If you are offended, deal with it yourself. I rarely go out of my way just to offend people, but I am not interested in sugar-coating and censoring myself just to cater to those who might be offended by something I say, especially if it isn't aimed at them.
Do the "black/poor/immigrant neighbourhoods" have significantly higher crime rates because those people are really more prone to crime, or because those areas are over-policed, so that you can get arrested even for trivial stuff that would ordinarily be let slide elsewhere? Especially if you take into account the "broken windows" policing doctrine (i.e. harshly punishing everything, no matter how tiny and insignificant, on the theory that it will discourage more serious crime), which has been widely discredited but is still widely applied.
Now take such data and feed it into a neural network or some other machine learning system. The problem is that if you train that AI on a crime history that reflects pre-existing biases (immigrant/poor/black/muslim/whatever ...), you will only aggravate the situation. Basically machine-reinforced confirmation bias.
http://www.sciencemag.org/news/2017/04/even-artificial-intelligence-can-acquire-biases-against-race-and-gender
https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
https://www.nytimes.com/2016/06/26/opinion/sunday/artificial-intelligences-white-guy-problem.html?_r=0
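The feedback loop described above can be shown with a toy simulation (invented numbers, purely illustrative, no real data): two neighbourhoods with identical underlying offence rates, where a naive "predictive" model allocates next year's patrols in proportion to last year's arrest counts.

```python
# Illustrative simulation: two areas with the SAME true offence rate,
# but area 0 starts out over-policed. Arrests scale with patrol
# presence, not with actual offending, so a model trained on arrest
# data never discovers that the areas are equally criminal.

true_offence_rate = [0.05, 0.05]   # identical in both areas
patrol_share = [0.7, 0.3]          # area 0 starts out over-policed
population = 10_000

history = []
for year in range(10):
    # arrests are driven by where the police are looking
    arrests = [true_offence_rate[i] * patrol_share[i] * population
               for i in range(2)]
    history.append(arrests)
    total = sum(arrests)
    # "retrain": next year's patrols follow last year's arrest data
    patrol_share = [a / total for a in arrests]

print(history[0])   # [350.0, 150.0] -- initial 70/30 disparity
print(history[-1])  # [350.0, 150.0] -- disparity locked in forever
```

The point of the sketch: the disparity never self-corrects, because the model's only input (arrests) is itself a product of the biased allocation it produces.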
It actually gets worse - there are similar software black boxes that evaluate your likelihood of re-offending and suggest sentences to judges. Guess who gets hit with much tougher sentences for the same crimes.
Knowing how police departments typically operate, and having seen the shoddy state of many immigrant/muslim/poor areas, I find it very unlikely that over-policing is responsible for the higher crime rate; it is more likely that these areas are simply more prone to criminal activity. It's not "racism" or "confirmation bias" to acknowledge and utilise facts and correlations such as this, and despite what divisive "minority" groups like BLM want everyone to believe, it isn't all the state's fault that they are over-represented in crime statistics.
As for harsh punishments, what would you consider a minor/trivial crime (other than the obvious ones like smoking pot and saying things that offend others)? An incident can erupt from what was meant to be a response to a minor misdemeanour but escalated to the point where the offender now has to be restrained with considerable force, or because they have a very bad record. I see nothing wrong with imposing a harsher sentence on someone who is more likely to re-offend, so long as this is done fairly and without unjustified prejudice.
There is a good book on why these black-box models, where we don't know how the system actually arrives at its result, are a huge problem:
https://weaponsofmathdestructionbook.com/
And therefore, any such AI/neural-network system should not be used in legal proceedings against anyone without its output being scrutinised by appropriate human personnel. If the output is used without enough scrutiny, there is a serious issue with how that jurisdiction operates. I do share your concern to some extent, as I can see the potential for dependence on AI in legal matters to escalate beyond control.
These behaviour-based approaches have been widely discredited as ineffective voodoo; unfortunately they are still being sold by technology vendors. At best, such a system is good for alerting the operator that something is going on; otherwise it produces far too many false positives.
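The false-positive problem is largely a base-rate effect, which a quick back-of-the-envelope Bayes calculation makes concrete (all the numbers below are assumed for illustration, not measured figures for any real system):

```python
# Even a surprisingly accurate behaviour detector drowns operators in
# false alarms when the behaviour it looks for is rare.

base_rate = 1 / 100_000      # assume 1 in 100,000 people is a genuine threat
sensitivity = 0.99           # detector flags 99% of real threats
false_positive_rate = 0.01   # and wrongly flags 1% of innocent people

# P(actual threat | flagged), via Bayes' theorem
p_flagged = (sensitivity * base_rate
             + false_positive_rate * (1 - base_rate))
precision = sensitivity * base_rate / p_flagged
print(f"{precision:.4%}")  # roughly 0.1% -- over 99.9% of alerts are false
```

With these assumptions, only about one flagged person in a thousand is a real hit, which matches the observation that such systems are useful as an alert at best.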
Sure, I'm not saying they are effective or useful, only that they exist. In reality, high-res cameras are mostly useful for identifying people in a crowd, not for identifying behaviour. But as you say, many security tech companies love touting their stuff and pretending it's useful. It's a major issue in the industry: "oh, everyone else is just a dinosaur, you need OUR tech because it's NEW!" This confuses people (including legislators) and leaves many others (including those in office) with something of an aversion to security systems. It's a dream for those clients who get their security advice from Hollywood crime blockbusters and top-secret-agent policing TV shows.
Actually, most applications of CCTV are for ex-post investigations - once something happens, the footage is reviewed.
Yes, video analytics are still used in this case, to search for the object or evidence one may be looking for - even simple analytics such as object searches based on visible movement.
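A movement-based search of this kind can be as simple as frame differencing. The sketch below uses synthetic frames and an invented helper name (no real VMS product API): flag only the frames where the scene changed, so the operator skips the static footage.

```python
import numpy as np

def flag_motion_frames(frames, threshold=5.0):
    """Return indices of frames whose mean pixel change versus the
    previous frame exceeds the threshold."""
    flagged = []
    for i in range(1, len(frames)):
        diff = np.abs(frames[i].astype(float) - frames[i - 1].astype(float))
        if diff.mean() > threshold:
            flagged.append(i)
    return flagged

# synthetic 8x8 greyscale clip: static scene, then an object appears
clip = [np.zeros((8, 8), dtype=np.uint8) for _ in range(5)]
clip[3][2:6, 2:6] = 255  # "object" enters in frame 3
clip[4][2:6, 2:6] = 255  # and stays put in frame 4

print(flag_motion_frames(clip))  # [3] -- only the frame where change occurred
```

Real systems add background modelling and noise suppression on top of this idea, but the operator-facing result is the same: jump straight to the segments with visible movement.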
And re behavior based stuff:
http://www.law.uchicago.edu/files/files/Harcourt%20OpEd%20Behavioral%20Profiling%20Longer%20Version.pdf
https://www.rt.com/usa/tsa-behavior-profiling-program-685/
And that is done by humans - it is very unlikely that some sort of AI would perform any better, given that even humans have difficulty predicting the problem at better than random chance.
Of course you still need a human; all the video-analytics AI does is make the job easier. There will be false alarms that humans have to be around to filter. I am not suggesting the AI should be the be-all and end-all of video analytics, but it can reduce the time an investigation takes.
In most applications of CCTV, such advanced behaviour-identifying analytics aren't even necessary, as the system is composed of many low-resolution cameras in specific positions, and the rough time and location of occurrence is often already known. The only thing the video analytics does here is help the operator find what he is looking for without spending a lot of time sifting through footage manually.
It is concerning how advanced this can get, knowing that dependence on technology will only increase and that failures of even supposedly non-essential components will result in significant losses.