Camera surveillance in Hamburg is already extensive. Large parts of the city center and the area around the main station are blanketed with cameras, every subway station is monitored, and there are countless cameras in so-called dangerous places. From mid-July, however, the city is planning a new surveillance project: near the main station, at Hansaplatz in St. Georg, surveillance cameras based on artificial intelligence will be deployed.

The AI cameras are supposed to detect "suspicious behavior patterns": based on gestures and movement patterns, alleged criminals are to be identified and apprehended. However, records from Mannheim, where the system has been in use for some time, show that the technology is far from mature and quite error-prone. According to these records, the error rate of the AI-based software is over 10 percent, meaning that more than one in ten people flagged by the system ends up in the crosshairs of the police for no reason. This naturally opens the door to police violence and arbitrariness, since people picked out by the AI cameras are treated by the police as identified criminals or violent offenders rather than mere suspects.

Residents of the St. Georg neighborhood are also critical of the new surveillance measures. The St. Georg residents' association, for example, is calling for money to be invested in social drop-in centers rather than in surveillance of the neighborhood.

The new cameras, to be installed at Hansaplatz next week, are part of the increasing militarization of the area around Hamburg's main station that the Senate has been pushing recently. More police and more surveillance mean more repression and harassment for people of our class, while little is actually done about problems like prostitution and drug dealing; instead, people with dark skin and black hair are unfairly targeted by the police. On top of this comes the double standard of the ruling class in the FRG, which cries foul whenever so-called "authoritarian states" like China use similar software, while exactly the same technology is being pushed here and deployed ever more extensively.