Police Technology Sees Landscapes as Pornography

in #science · 7 years ago

Over the last few years, there has been a boom in AI technology, but success is, of course, accompanied by a number of mistakes that leave mixed feelings.

The latest AI news comes from London, where the Metropolitan Police announced plans to use artificial intelligence to scan electronic devices for child pornography images. According to the statements, the technology should be ready for use within two to three years.

[Image: desert-body.jpg]

For the moment, however, there is one problem: the software mistakes pictures of deserts for nude photography, apparently because of the color.

The Metropolitan Police Service currently uses a less precise form of image recognition software. While it can detect evidence of certain crimes (weapons, drugs, and money), it has great difficulty identifying pornographic images and videos, according to Mark Stokes, the head of the London force's computer crime department.

This means that police officers themselves have to look at obscene images and decide whether a crime has been committed: psychologically distressing and stressful work, especially given the scale of the task. The London police had to review 53,000 devices for incriminating imagery in 2016. In February, the lead police officer on the issue called the scale of sexual crimes committed against children in the country "staggering".

Fortunately, thanks to technological developments, machines can take over the review of inappropriate material, and they are not mentally affected by the work. The British police are currently working with Silicon Valley vendors (such as Google and Amazon) to create artificial intelligence technology sophisticated enough to identify offensive images.

But there are still a few problems to smooth out. "Sometimes it comes up with a desert and it thinks it's an indecent image or pornography," Stokes said. This is no small problem, as desert landscapes are popular as screensavers and desktop wallpapers.
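
To see how a purely color-based filter can end up confusing sand with skin, here is a minimal sketch in Python. It is my own illustration, not the Met's actual software: it applies a classic rule-based RGB skin heuristic (Kovac, Peer, and Solina, 2003), and the sand color is an assumed typical value for a dune photo.

```python
import numpy as np

def looks_like_skin(rgb):
    """Classic rule-based RGB skin test (Kovac/Peer/Solina, 2003)
    on a single pixel. Purely illustrative of color-based detection."""
    r, g, b = (int(c) for c in rgb)
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def skin_fraction(image):
    """Fraction of pixels a naive color filter would call 'skin'."""
    pixels = np.asarray(image, dtype=np.uint8).reshape(-1, 3)
    hits = sum(looks_like_skin(p) for p in pixels)
    return hits / len(pixels)

# An assumed typical desert-sand tone, RGB (194, 178, 128), satisfies
# every condition of the skin rule, so a dune photo scores as "skin".
sand = np.full((32, 32, 3), (194, 178, 128), dtype=np.uint8)
print(skin_fraction(sand))  # -> 1.0: every sand pixel is flagged
```

Since a typical sand tone passes every condition of such a rule, an image made up mostly of dunes can score as almost entirely "skin-colored", which is one plausible way a color-driven system ends up flagging landscapes.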
