
COVID-19 will last longer than some health experts had predicted. The novel coronavirus is likely to become endemic due to slow vaccine rollouts, rapidly spreading new strains, and politically charged rhetoric around social distancing.

In brick-and-mortar retail stores, touch surfaces like clothing, countertops, cash, credit cards, and bags are all potential vectors for viral spread. The pandemic appears to have renewed interest in cashierless technology like Amazon Go, which lets shoppers pick up and purchase items without interacting with a store clerk. Walmart, 7-Eleven, and a crop of cashierless startups have expanded their presence over the past year.

As cashierless technology becomes normalized, there is a risk it will be repurposed for tasks like shoplifting detection. While detecting shoplifters is not problematic on its face, case studies show that such systems are susceptible to bias and other flaws that could result in false positives.

Synthetic datasets

The majority of cashierless platforms rely on cameras to monitor the individual behaviors of customers in stores. Machine learning classifiers ingest the camera footage to identify when a shopper picks up an item and places it in a shopping cart. During a session at the re:Mars conference in 2019, Amazon Go’s VP explained that Amazon engineers use errors like missed item detections to train the machine learning models that power its Go stores. Synthetic datasets increase the diversity of the training data, and the models use both geometry and deep learning to ensure transactions are associated with the right customer.
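As a rough illustration of the geometry side of that association step — not Amazon’s actual pipeline, and with hypothetical coordinates — a pickup event detected at a shelf can be attributed to whichever tracked shopper is physically closest:

```python
import math

def nearest_shopper(shelf_xy, shoppers):
    """Attribute a detected pickup event to the closest tracked shopper.

    shelf_xy: (x, y) floor position where the pickup was detected.
    shoppers: dict mapping shopper ID -> current tracked (x, y) position.
    """
    return min(
        shoppers,
        key=lambda sid: math.dist(shelf_xy, shoppers[sid]),
    )

# A pickup detected at shelf (2.0, 3.0) is attributed to shopper "A",
# who is standing right next to it.
tracked = {"A": (1.8, 3.1), "B": (6.0, 0.5)}
print(nearest_shopper((2.0, 3.0), tracked))  # → A
```

Real systems combine this kind of geometric reasoning with learned models precisely because the nearest person is not always the one who reached for the item.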

The problem with this approach is that synthetic datasets, like any training data, can be poorly audited. In 2015, a software engineer pointed out that the image recognition algorithms in Google Photos were labeling his Black friends as “gorillas,” and Google’s Cloud Vision API has mislabeled thermometers held by people with darker skin as guns. Numerous experiments have also shown that image-classifying models trained on ImageNet automatically learn human-like biases about race, gender, and weight.

A professor at Rutgers University told NBC that a theft-detection system might unfairly target people of color, who are already stopped by store personnel more often than white shoppers. One study found that middle-class white women were treated more favorably than other shoppers in toy stores, with police never being called on them even when they were aggressive. And a recent survey of Black shoppers found that 80% had experienced racial stigma and stereotyping while shopping.

The people who get caught stealing are not an indication of who is stealing, Williams argued. Black shoppers who feel they are being scrutinized in stores may be more likely to appear nervous while shopping, which a system might perceive as suspicious behavior. The outcome rests on discrimination: who is being watched, and therefore who is being caught.

Some solutions are designed to detect suspicious patterns of limb movement. That is a potentially problematic measure, considering that disabled shoppers might have gaits that appear suspicious to an algorithm trained on footage of able-bodied shoppers — much as people with disabilities whose speech is slurred are sometimes mistaken for being intoxicated.

Vaak’s anti-theft product, VaakEye, was trained on more than 100 hours of closed-circuit television footage to monitor shoppers’ facial expressions, movements, hand movements, clothing choices, and over 100 other attributes. Japanese telecom NTT East and tech startup Earth Eyes collaborated on a similar project, AI Guardsman, which scans live video for cues that a shopper might be about to steal.

NTT East doesn’t claim its algorithm is perfect. A company spokesman told The Verge that it sometimes flags honest customers who pick up items and put them back, and that the system doesn’t identify pre-registered individuals.

Walmart’s anti-shoplifting technology came under scrutiny last May over its poor detection rates. Walmart workers said their top concern was false positives at self-checkout; the employees believe the tech misinterprets innocent behavior as potential theft.

Industry practices

Trigo emerged from stealth in July of last year, and is trying to bring checkout-less experiences to existing convenience stores. The company supplies both high-resolution, ceiling-mounted cameras and an on-premises processing unit that runs machine learning-powered tracking software for a monthly subscription fee. Data is beamed from the unit to the cloud, where it is analyzed and used to improve Trigo’s algorithms.

Trigo claims it can’t identify individual shoppers beyond the products they’ve purchased, that it anonymizes the data it collects, and that its system is 99.5% accurate at identifying purchases. When VentureBeat asked about the product’s anti-shoplifting detection features, however, the company declined to comment.

Grabango, founded by Will Glaser, also declined to comment for this article. The company says shoppers have to check in with a payment method, and that staff are only notified when malicious actors sneak in without doing so. Standard Cognition, which claims its technology can account for actions like a customer putting back an item they initially considered purchasing, says it doesn’t offer theft detection to its customers at all.

“Standard doesn’t monitor for theft, and we never have to,” Jordan Fisher told VentureBeat, adding that the company does this without biometric “fingerprints.” An AI system trained responsibly should be able to detect theft without bias, he argued, but Standard won’t be the one doing it — the company is focused on the checkout-free aspects of the technology.

In earlier interviews with The New York Times and Fast Company, Michael Suswal, Standard’s COO, said the platform could analyze a shopper’s trajectory, gaze, and speed to detect theft and alert a store attendant via text message. Standard does collect information about certain body features, he said, but it doesn’t collect biometric “fingerprints.” To train its software to recognize theft, the company hired 100 actors to shop for hours in its San Francisco demo store.

The system learns the behaviors shoppers exhibit as they prepare to leave, Suswal explained; those intending to steal tend to look at the door.

A patent filing supports the idea that Standard developed a system to track body movements. The application describes technology that can recognize customers’ physical features in a store aisle, identifying on-body points including the neck, nose, eyes, ears, shoulders, elbows, wrists, hips, and knees.
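To make the tracking idea concrete, here is a minimal sketch — purely hypothetical, not Standard’s implementation — of how per-frame keypoint positions like those named in the patent could yield one of the signals Suswal described, a shopper’s walking speed:

```python
import math

# On-body points named in the patent filing (left/right split is an
# assumption for illustration): neck, nose, eyes, ears, shoulders,
# elbows, wrists, hips, knees.
KEYPOINTS = [
    "neck", "nose", "left_eye", "right_eye", "left_ear", "right_ear",
    "left_shoulder", "right_shoulder", "left_elbow", "right_elbow",
    "left_wrist", "right_wrist", "left_hip", "right_hip",
    "left_knee", "right_knee",
]

def walking_speed(frames, fps=30.0, anchor="neck"):
    """Estimate speed (distance units per second) from the anchor
    keypoint's displacement across consecutive frames.

    frames: list of dicts mapping keypoint name -> (x, y) position.
    """
    if len(frames) < 2:
        return 0.0
    total = sum(
        math.dist(a[anchor], b[anchor])
        for a, b in zip(frames, frames[1:])
    )
    return total / (len(frames) - 1) * fps
```

For example, a neck keypoint that moves 0.1 units per frame at 30 frames per second corresponds to a speed of 3.0 units per second. A real system would smooth noisy detections and fuse many such signals rather than rely on one keypoint.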

Standard’s patent covers a system that is anonymous and unrelated to intent recognition or theft detection, according to a statement the company sent to VentureBeat. The company said it doesn’t use any other types of biometrics and was glad to clear up the discrepancy with previous media stories. Its computer vision system cannot identify people on its own, the statement added; instead, it relies on shoppers checking in with their phones to associate purchases with payment information.

AiFi says its cashierless solution can recognize suspicious behavior inside stores. The company uses synthetic datasets to generate training and testing data without relying on customer data. A spokesperson told VentureBeat that its simulations can randomly change hairstyle, hair color, clothing, and body shape to ensure a diverse and unbiased dataset, and that the company does not use facial recognition or personally identifiable information. Its stated goal is to make shopping more privacy-conscious and inclusive.
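The randomization AiFi describes is a form of domain randomization. A minimal sketch of the idea — with hypothetical attribute lists, since AiFi’s simulator and attribute set are not public — looks like this:

```python
import random

# Hypothetical appearance attributes to randomize per synthetic shopper.
ATTRIBUTES = {
    "hairstyle": ["short", "long", "curly", "braided", "bald"],
    "hair_color": ["black", "brown", "blond", "red", "gray"],
    "clothing": ["t-shirt", "hoodie", "coat", "dress", "suit"],
    "body_shape": ["slim", "average", "broad", "tall", "petite"],
}

def sample_shopper(rng):
    """Draw each attribute uniformly at random, so that no single
    appearance dominates the generated training set."""
    return {attr: rng.choice(options) for attr, options in ATTRIBUTES.items()}

# Generate a synthetic "population" of shoppers for training/testing.
rng = random.Random(0)  # seeded for reproducibility
dataset = [sample_shopper(rng) for _ in range(10_000)]
```

Uniform sampling is the simplest policy; whether it actually yields an unbiased model also depends on how faithfully the simulated appearances and behaviors cover the real shopper population.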

A patent filed by the startup sketches its proposed anti-shoplifting solution, which uses anonymous tags to conceal a person’s identity. A server analyzes camera images to infer whether a person took items from a shelf with malicious intent. But if distinguishing characteristics are saved and retrieved for each visitor, the patent notes, the system can also identify shoplifters who have previously stolen from the store.

The system can be configured to detect when someone leaves the store without paying for their purchases. According to the patent description, a human cashier at a traditional register may be able to see the person’s shopping cart list and use it to verify that the shopper hasn’t taken anything without paying — if a customer has taken two items from the store, for example, the cashier can confirm they pay for both.
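The “anonymous tag” idea above can be sketched as follows — a hypothetical illustration of the concept, not AiFi’s code. Each visit gets a random, meaningless identifier so the system can follow a shopper through the store without knowing who they are, and (in this privacy-preserving variant) everything is discarded when the visit ends:

```python
import uuid

class VisitTracker:
    """Track shoppers by a random per-visit tag rather than identity."""

    def __init__(self):
        self.active = {}  # tag -> list of observed (x, y) positions

    def new_visit(self):
        tag = uuid.uuid4().hex  # random; carries no identity information
        self.active[tag] = []
        return tag

    def update(self, tag, position):
        self.active[tag].append(position)

    def end_visit(self, tag):
        # Discard the track when the shopper leaves. Because nothing
        # persists, visits cannot be linked across days -- which is
        # exactly the property the patent's re-identification option
        # (saving distinguishing characteristics) would give up.
        del self.active[tag]
```

The privacy guarantee hinges entirely on the `end_visit` deletion step: the same machinery, with saved characteristics keyed to the tags, becomes a re-identification system.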

Lack of transparency

Tech startups are usually reticent to reveal technical details for competitive reasons, but this does shoppers no favors. Without transparency, it will be difficult to engender trust among shoppers or to scrutinize the capabilities of these platforms.

The only company that volunteered information was Zippin, which said the size of its datasets varies from a few thousand to a few million video clips, depending on the particular algorithm being trained. The company did not say what steps it takes to ensure the datasets are diverse and unbiased, or whether it continuously retrains its software to correct for errors.

Systems like AI Guardsman allow stores to flag false positives. That is a step in the right direction, but without more information about how these systems work, it is unlikely to allay shoppers’ concerns about bias and surveillance.

Christopher Eastham, a specialist in artificial intelligence at the law firm Fieldfisher, is among the experts who have weighed in on the need for oversight. And Ryo Tanaka, the founder of Vaak, believes customers should be given notice before entering stores so they can opt out; he told CNN that governments should require stores to disclose the information they collect.
