Every summer for the last 14 years, the Welsh town of Porthcawl has played host to a strange festival.

Over the course of four days the population of the town doubles as thirty thousand Elvis fans arrive from all over the world.

The pubs swell with crowds – men and women with slicked-back hair, sunglasses and white jumpsuits.

And at last year’s festival, the South Wales police decided to use the event for an unusual experiment.

Vans with cameras mounted on their roofs were positioned throughout the town. And CCTV cameras were trained on the crowds, feeding pictures to officers watching close by.

This was a trial of facial recognition technology.

Thousands of pictures were taken of Elvis fans and each was compared against a large database of wanted criminals.

It wasn’t a great success.

The cameras picked up 17 faces they believed matched those stored in criminal databases.

Ten were correct, seven were wrongly identified.

But that didn’t stop the South Wales police from conducting another experiment at the Champions League final in Cardiff last June.

Two systems were put to use.

One, called “Locate”, used live feeds from CCTV cameras mounted on vans to take detailed measurements of facial features in the crowds.

The other, called “Identify”, matched faces against suspects from a database of past crimes – a database of about 450,000 images.
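To make the idea concrete, here is a rough sketch of how an “Identify”-style system works. This is a generic illustration, not NEC’s actual pipeline: real systems use learned embeddings with hundreds of dimensions, while the tiny vectors, names and threshold below are made up.

```python
import math

# Toy face-matching sketch: each face is reduced to a fixed-length
# feature vector, then compared against every vector on a watchlist.
# These 3-D vectors are illustrative stand-ins for real embeddings.
watchlist = {
    "suspect_a": [0.1, 0.9, 0.3],
    "suspect_b": [0.8, 0.2, 0.5],
}

def euclidean(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def identify(face, threshold=0.25):
    """Return the closest watchlist entry within the threshold, else None."""
    best_name, best_dist = None, float("inf")
    for name, vec in watchlist.items():
        d = euclidean(face, vec)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

print(identify([0.12, 0.88, 0.28]))  # close to suspect_a -> raises an alert
print(identify([0.5, 0.5, 0.5]))     # no one close enough -> None
```

The threshold is the crucial tuning knob: set it too loose and innocent faces trigger alerts; too strict and genuine suspects walk past unnoticed.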

How did this experiment go?

It seems that they managed to wrongly identify 2,000 football fans as possible criminals.

In total, 92% of the profile matches were incorrect.
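That headline figure is less surprising than it sounds: it is the base-rate problem. Even a small per-face error rate, applied to a huge crowd containing only a handful of wanted people, produces far more false alerts than true ones. A minimal sketch – the crowd size, watchlist count and error rates below are illustrative assumptions, not figures from the trial:

```python
# Base-rate sketch: why most alerts from crowd scanning are false.
attendees = 170_000          # assumed crowd size (illustrative)
watchlist_hits = 180         # assumed wanted people in the crowd (illustrative)
false_positive_rate = 0.011  # assumed per-face error rate (illustrative)
true_positive_rate = 0.90    # assumed chance a wanted face is flagged (illustrative)

false_alerts = (attendees - watchlist_hits) * false_positive_rate
true_alerts = watchlist_hits * true_positive_rate
share_wrong = false_alerts / (false_alerts + true_alerts)

print(f"false alerts: {false_alerts:.0f}")
print(f"true alerts: {true_alerts:.0f}")
# With these assumed numbers, roughly 92% of all alerts are wrong.
print(f"share of alerts that are wrong: {share_wrong:.0%}")
```

A scanner that is 99% accurate per face can still be wrong nine times out of ten per alert, simply because almost everyone in the crowd is innocent.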

South Wales Police issued a defensive press release that began with the line: “Of course no facial recognition system is 100% accurate under all conditions.”

Remove Glasses, Keep Looking Straight Ahead, And Don’t Move

Still, the police have continued to experiment with face recognition.

South Wales Police is one of three UK forces, along with the Met in London and Leicestershire Police, trialling software developed by the Japanese company NEC.

The technology has been used at Notting Hill Carnival and Remembrance Sunday services, and there have been unwitting experiments on thousands of shoppers at Westfield in Stratford.

And it seems the tech is improving.

By March of this year, the Met claims, the Locate system was able to correctly identify a person 76% of the time.

And the technology is spreading too.

You will have used face recognition technology yourself when you’ve travelled through airports.

You’ll remember standing perfectly still for 10 to 20 seconds while the machine scans your face and checks your features against a criminal database.

That experience gives you a good indication of just how developed this technology is.

This technology has difficulty when the target is moving.

It also struggles with glasses, suntans and the skin tones of several ethnicities.

That hasn’t stopped stores using it to spot shoplifters.

Apple relies on it to enable iPhone X owners to unlock their phones.

Carmakers are starting to use it to allow drivers to unlock their vehicles.

Two years ago, a study by researchers at Georgetown Law estimated that half of all American adults are in law enforcement facial recognition databases.

And there is a good chance that your face shows up on a database in this country.

In the next few years, we could see face recognition technology becoming one of the most important ways that people are identified – whether in shops, car parks, banks, bars, offices or any number of other public spaces.

And that worries me, I have to say.

I mean… let’s just take a look at a few ways this technology has backfired in the last year.

1. Mistaken Identity

Take the example of Rekognition.

This is face recognition software that Amazon has been aggressively selling to US law enforcement, as well as US Immigration and Customs Enforcement.

When Rekognition was tasked with matching photos of the 535 members of Congress against 25,000 publicly available mugshots, it made 28 false matches.

That’s quite disturbing.

Especially when you consider how this technology may be used by stretched police forces.

Just by way of illustration…

In a recent incident in San Francisco, police stopped a car, handcuffed an elderly woman and forced her to kneel at gunpoint, all because an automatic license plate reader wrongly identified her car as a stolen vehicle.

Meanwhile, Det Supt Galopin said the Met was assessing how effective facial recognition was at tackling different challenges in British policing at a time of “budget cuts, falling officer numbers, rising demand and the terror threat”.

2. Public Shaming

Or take the example of Dong Mingzhu.

Last week the chairwoman of a leading manufacturer in China had her face displayed on a giant billboard-sized screen in Ningbo, to publicly shame her for breaking a traffic law.

The Chinese have been doing this sort of thing for several years now.

People who violate traffic laws, fall behind on debts, don’t pay parking tickets, or get caught for being drunk and disorderly, are publicly shamed on screens throughout the city.

The problem was that Dong Mingzhu was innocent.

Face recognition technology had identified her photo on the side of a bus advertisement, and concluded she was jaywalking.

3. Blacklisted

The retail industry is investing heavily in face recognition technology.

Mastercard has been introducing ways for customers to pay using a selfie. Many stores in China and Japan have introduced “smile as you pay” technology, which is exactly what it sounds like.

These are being introduced to prevent fraud and identity theft.

In Japan, images of shoppers’ faces captured by security cameras have been automatically shared among 115 supermarkets and convenience stores as an anti-shoplifting measure.

At these shops, security cameras film all customers’ faces.

If a person shoplifts or makes an unreasonable complaint, security camera footage of the person is processed into facial data and placed on a digital blacklist.

Categories include: “shoplifter” and “complainer”.

4. Face Hacking

The disturbing thing about this technology is the scope for hacking.

Apple’s Face ID works by projecting a grid of 30,000 invisible dots onto your face, creating a 3-D map of it.

Unlike similar features on earlier phones, the 3-D mapping apparently makes Face ID pretty hard to hack.

But for how long?

And will surveillance systems in shops, car parks, offices and hospitals be as secure as Apple?

The technology to map your face and graft it onto a compromising video already exists.

Could this open us up to blackmail?

We’re Caught in a Trap

Meanwhile, Amazon continues to promote the use of Rekognition by US police departments.

And as I’ve pointed out in a recent issue of Monkey Darts, the Chinese are exporting their model of surveillance – which relies heavily on face recognition and public shaming – to cities across Asia and Africa.

It will be interesting to see where else it shows up.

Hotels, schools, universities and healthcare facilities are already experimenting with face recognition.

Let’s hope the tech keeps improving.

The technology industry has certainly invested wholesale in this idea of surveillance.

Social scientist Shoshana Zuboff calls it “Surveillance Capitalism”: an ideology that puts everything on a database so that our lives can be mined for useful information.

The natural end point is a city with cameras and sensors that track us on an individual level, to create real-life versions of the profiles that Google and Amazon already use to track us online.

Without anonymisation, every activity can be connected across multiple databases to track our movements and behaviour over the course of a day.

We have between 4 and 6 million CCTV cameras in this country – many of which record behaviour that is simply ignored.

But with facial recognition technology, there is huge scope for abuse.

My advice? Learn from the crowds at Porthcawl.

Slick back the hair. Wear sunglasses. Walk at a decent pace. Travel in small groups if possible.

And never stay in the same pub or shop for too long.