U.S. employers wrestle with COVID vaccine requirements in regulatory “hairball”

By Tom Hals

(Reuters) - America’s largest garlic farm needs 1,000 workers to harvest its annual crop, but faces an unexpected hurdle in this year’s recruitment drive: it now must document and track the COVID-19 vaccine status of these seasonal laborers.

Employers in California’s Santa Clara County, including Christopher Ranch, are required as of June 1 to ascertain if their workers have been vaccinated and check in every 14 days on those who say they have not or who decline to answer.

The timing of the order, in the middle of the busy harvest season, couldn’t be worse.

Ken Christopher, the farm’s executive vice president, said the company has to develop a system to check who has been vaccinated while observing privacy laws and monitoring workers’ adherence to safety protocols and testing.

“If the government wants to mandate (a vaccine), that’s one thing,” Christopher said. “But then requiring us to police it, that feels very unconventional.”

Workers in the Silicon Valley county who aren’t vaccinated or refuse to reveal their status to their employer must remain masked and should follow other protocols, such as limiting long-distance work travel and submitting to regular COVID-19 testing.

Employment lawyers said companies are watching closely how such rules play out nationally as they look to bring workers back safely and dispense with mask protocols. But doing so may require using badges or bracelets to identify those who have had a COVID-19 shot, raising discrimination issues and complicating hiring in a tightening labor market as the pandemic eases.

Several states, including California, Michigan and Oregon, have their own rules or guidance on documenting vaccination status for workers, but they are generally less strict than those in Santa Clara County.

In Montana, however, a recently enacted law discourages employers from asking about vaccination status because it could lead to discrimination claims, according to employment lawyers.

“It’s a hairball,” said Eric Hobbs, an employment attorney with Ogletree Deakins in Milwaukee. “It’s all very confusing.”

Christopher said he is considering a mask-free shift for vaccinated workers and another shift for workers who haven’t gotten their shot to avoid discrimination and tension.

But asking farm laborers about their vaccination status and entering their details in a database could hurt recruitment efforts, he said.

“It’s the additional information being offered to the government,” said Christopher. “The more layers added on top, the more uncomfortable they are in seeking jobs here.”

The U.S. workplace safety regulator, the Occupational Safety and Health Administration, or OSHA, has not provided clear guidance on the issue.

“We continue to let the employer make the determination how to properly do this for their workplace,” OSHA’s acting director, Jim Frederick, told Reuters.

‘SCARLET LETTER’

The U.S. Centers for Disease Control and Prevention, which said last month that inoculated people can go without face coverings indoors in most places, has not addressed the thorny issue of how to establish whether someone has been vaccinated.

“What companies are debating right now, and we are too, is: is it necessary to specify on someone’s badge or wear something around their neck that, yes, they are vaccinated and therefore if they don’t have a mask on there’s nothing to worry about?” said Peter Hunt, vice president of brand protection and security at Flex Ltd, a product design and manufacturing company.

That troubles Alix Mayer, the president of the California chapter of Children’s Health Defense, which is skeptical of the vaccination effort.

Requiring employers to ask about inoculation status is in essence a vaccine mandate, she said, because the unvaccinated will have to wear masks, which amount to a “scarlet letter.”

In Santa Clara County, ServiceNow Inc, a cloud computing platform developer, told Reuters it is marketing an app that lets workers report their vaccination status to employers and, if required, document it.

In communications with its own employees, ServiceNow emphasizes it does not require vaccines to return to work and leaves it to employees to decide whether to reveal their vaccination status.

“We encourage you to share if you are comfortable doing so,” say the instructions.

The company does require masks to be worn in its offices, however.

Helen Cleary, director of the Phylmar Regulatory Roundtable, an environmental health and safety forum for large employers, said companies should be allowed to trust employees to follow mask rules rather than require them to prove or disclose whether they have been vaccinated.

“We trust employees to do a lot of things. We trust them not to steal from the till,” said Cleary. “We support the honor system, and think that could alleviate a lot of these issues.”

(Reporting by Tom Hals in Wilmington, Delaware, Stephen Nellis and Jane Lanhee Lee in San Francisco and Elizabeth Dilts in New York; Editing by Noeleen Walder and Sonya Hepinstall)

Strip searches and ads: 10 tech and privacy hot spots for 2020

By Umberto Bacchi

TBILISI (Thomson Reuters Foundation) – From whether governments should use facial recognition for surveillance to what data internet giants should be allowed to collect, 2019 was marked by a heated global debate around privacy and technology.

The Thomson Reuters Foundation asked 10 privacy experts what issues will shape the conversation in 2020:

1. CALIFORNIA DIGITAL PRIVACY LAW – Cindy Cohn, executive director, Electronic Frontier Foundation

“A California law giving consumers more control over their personal information, like the right to know what data businesses have collected about them, to delete it and to opt out of its sale, comes into effect on Jan. 1, 2020.

The legislation could have a ripple effect across the United States, or lead to the passage of a federal law.

This could be good news, if a federal law was to mandate some basic privacy guarantees that states could improve on – or bad news, if it was to instead block stronger state laws.”

2. DIGITAL STRIP SEARCHES – Silkie Carlo, director, Big Brother Watch

“From where we have been to who we have spoken to, our phones contain mountains of data that is increasingly sought after by police during investigations. So-called “digital strip searches”, where crime victims are asked to hand over their phones, are becoming commonplace all around the world.

In Britain, victims of rape are now routinely required to give police full downloads of their phones, and police can keep the data for 100 years. It’s no coincidence that almost 50% of victims are dropping their cases.

There’s no law in the UK around this and it’s likely we’ll see a showdown between police, data regulators and privacy advocates in 2020.”

3. FACIAL RECOGNITION – Jameson Spivack, policy associate, Center on Privacy & Technology, Georgetown Law

“In 2019, face recognition technology became an integral part of the public debate about privacy, as people realized just how much of a risk this technology poses to civil rights and liberties.

Public officials have responded, with bans and proposed regulation at all levels of government. These conversations will come to a head in 2020.

In the U.S. this could mean new federal, state, or local policies around how law enforcement is allowed to use (or not use) face recognition; rules for companies developing the technology; and/or increased enforcement action from entities like the Federal Trade Commission or state attorneys general.”

4. BEHAVIOURAL ADVERTISING – Karolina Iwanska, lawyer, Panoptykon Foundation

“A wave of complaints against the use of personal information to target advertising online has been filed with data authorities across the European Union over the past two years.

The Irish data protection authority – which is a lead authority for Google – started an investigation into the company’s advertising business and the British ICO has published a damning report on the ad-tech industry.

2020 should bring much needed decisions in these cases, potentially leading to fines and further restrictions on companies’ use of people’s data.”

5. EU BUDGET – Edin Omanovic, advocacy director, Privacy International

“Next year, the EU will decide its budget for the years 2021-2028. How it will spend what is likely to be in excess of 1 trillion euros ($1.10 trillion) will have a transformative impact not just on its residents but on people around the world.

For the first time, it will spend more on migration control than on developing Africa, spending that often involves some form of surveillance and could pose huge threats to privacy and other human rights.”

6. AI TECHNOLOGIES – Diego Naranjo, head of policy, European Digital Rights

“A 2019 report on facial recognition by the EU’s rights agency represented a crucial step in the debate that we as societies need to have prior to deploying such technologies, which affect privacy, data protection, and other rights.

We could end up implementing practices in Europe which horrify us when they are implemented elsewhere, for example in China.

This conversation, as well as examining the impact of other technologies, like the potential discriminatory impact of “AI-based lie detectors” on vulnerable groups, such as migrants, will be an important part of the debate in 2020.”

7. ALGORITHMIC DECISION-MAKING – Sandra Wachter, professor, Oxford Internet Institute

“The EU’s General Data Protection Regulation (GDPR) currently focuses on things like transparency, consent and notification of data collection, but not on how we are evaluated after data is collected.

This means users have few rights to challenge or contest how they are assessed by algorithms processing their information, which is worrisome since our digital identity steers our paths in life and impacts our opportunities.

In 2020, the EU’s data watchdog will publish several recommendations on how to improve data rights. This is a great opportunity to give guidance to transform the GDPR, introducing more controls over how algorithms evaluate us.”

8. TARGETED POLITICAL ADS – Matthew Rice, Scotland director, Open Rights Group

“Personal data is becoming ever more central to the operations of political campaigns, as parties buy up commercial data sets in an attempt to derive voters’ opinions and decide whether, and how, to target them online.

This practice stretches the limits of data protection laws and strains trust in democratic systems.

With the U.S. presidential elections taking place in 2020, expect to see a huge amount of attention paid to what personal data parties are using and how they are using it.”

9. BIOMETRICS TECHNOLOGIES – Carly Kind, director, Ada Lovelace Institute

“In 2020 biometrics technologies are likely to come under the serious scrutiny of regulators in Europe (and possibly beyond).

We’re approaching a tipping point in public concern about the increasing ubiquity of facial recognition. In China, 84% of people surveyed want the opportunity to review or delete facial data collected about them.

EU authorities have promised facial recognition regulation will be forthcoming in 2020. It is critical that it looks beyond facial recognition to the entire gamut of AI-enabled biometric technologies that will be rolled out in the years to come.”

10. IRELAND’S DATA AUTHORITY – Paul-Olivier Dehaye, co-founder, Personaldata.io

“In 2020, Ireland is likely to come under increased pressure from other European countries to take a stronger stance on data protection after years of lax enforcement.

Thanks to the EU’s harmonization mechanisms, the Irish data authority could be compelled to adjust to the stricter parameters used by its EU counterparts when deciding on the growing number of privacy complaints filed by EU citizens.

As Ireland hosts the European headquarters of U.S. technology firms like Facebook and Google, this would have far-reaching consequences across the bloc.”

($1 = 0.9073 euros)

(Reporting by Umberto Bacchi @UmbertoBacchi, Editing by Belinda Goldsmith; Please credit the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, that covers humanitarian news, women’s and LGBT+ rights, human trafficking, property rights, and climate change. Visit http://news.trust.org)

Facial recognition at Indian cafe chain sparks calls for data protection law

A visitor drinks coffee at the 'International Coffee Festival 2007' in the southern Indian city of Bangalore, February 25, 2007. REUTERS/Jagadeesh Nv

By Rina Chandran

BANGKOK (Thomson Reuters Foundation) – The use of facial recognition technology at a popular Indian cafe chain triggered a backlash among customers and led to calls from human rights advocates on Monday for the government to speed up the introduction of laws to protect privacy.

Customers at Chaayos took to social media over the past week to complain about the camera technology, which they said captured images of them without their consent, with no information on what the data would be used for and no option to opt out.

While the technology is marketed as a convenience, the lack of legislative safeguards to protect against the misuse of data can lead to “breaches of privacy, misidentification and even profiling of individuals”, said Joanne D’Cunha, associate counsel at Internet Freedom Foundation, a digital rights group.

“Until India introduces a comprehensive data protection law that provides such guarantees, there needs to be a moratorium on any technology that would infringe upon an individual’s right to privacy and other rights that stem from it,” she told the Thomson Reuters Foundation from New Delhi.

A statement from Chaayos said the technology was being tested in select cafes and was aimed at reducing purchase times for customers.

The data was encrypted, would not be shared, and customers could choose to opt out, it added.

“We are extremely conscious about our customers’ data security and privacy and are committed to protecting it,” the statement said.

A Personal Data Protection Bill is scheduled to be introduced by lawmakers in the current parliamentary session, which runs to Dec. 13.

The draft of the bill proposed strict conditions for requiring and storing personal data, and hefty penalties for misuse of such data.

But digital rights activists had criticised a recent consultation on the bill they said was “secret and selective”.

The ministry for information technology did not respond to a request for comment.

Worldwide, the rise of cloud computing and artificial intelligence technologies has popularised the use of facial recognition for a range of applications, from tracking criminals to catching truant students.

In India, facial recognition technology was installed in several airports this year, and the government plans to roll out a nationwide system to stop criminals and find missing children.

But digital rights experts say it could breach privacy and lead to increased surveillance.

India’s Supreme Court, in a landmark ruling in 2017 on the national biometric identity card programme Aadhaar, said individual privacy is a fundamental right.

There is a growing backlash elsewhere: San Francisco and Oakland have banned the use of facial recognition technology, and “anti-surveillance fashion” is becoming popular.

(Reporting by Rina Chandran @rinachandran; Editing by Michael Taylor. Please credit the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, that covers humanitarian news, women’s and LGBT+ rights, human trafficking, property rights, and climate change. Visit http://news.trust.org)