In brief: Contract lawyers are increasingly working under facial recognition software as they continue to work from home during the COVID-19 pandemic.
The technology is hit and miss, judging by interviews with more than two dozen U.S. lawyers conducted by The Washington Post. To ensure that these contract lawyers, who take on short-term assignments, work as planned and handle sensitive information appropriately, their every move is tracked by webcams.
The monitoring software is commissioned by their employers and is used to control access to the legal documents under review. If the system thinks that someone else is viewing the files on the computer, or that a device has been set up to record what's on screen, the user is locked out.
For some legal eagles, especially those with darker skin, this work environment is more than tedious. Algorithms cannot reliably recognize their faces, or are confused by their room lighting, webcam quality, or small facial movements. These failures trick the monitoring software into thinking that an unauthorized person is present, or that some other violation has taken place, and an alert is generated.
One lawyer said the twists in her hair were mistaken for “unauthorized recording devices,” and she was frequently locked out of the system – on some days she had to log back in more than 25 times.
Many said they felt dehumanized and hated being “treated like a robot”. Others, however, said they didn’t mind being watched that much, and that the scrutiny actually made them more productive. We have more about this type of surveillance technology here.
Skin Cancer AI Datasets Lack Darker-Skinned Patients
The public datasets used to train and test AI skin cancer algorithms lack racial diversity, which could lead to models that perform worse when analyzing darker skin tones.
A paper published this week in The Lancet Digital Health, and presented at a National Cancer Research Institute conference, found that 21 open-source skin cancer datasets mostly contained images of fair skin.
There were 106,950 images in total, and only 2,436 of them were labelled with a skin type. Of those 2,436 images, just ten showed people with brown skin, and only one was labelled as dark brown or black skin.
“We found that for the majority of the datasets, a lot of important information about the images and the patients in those datasets was not being reported,” said David Wen, study co-author and dermatologist at the University of Oxford. Research has shown that programs trained only on images of people with lighter skin types might not be as accurate for people with darker skin, and vice versa.
Although these datasets are geared towards academic research, it is difficult to say whether any commercial medical systems have been affected by their limitations.
“Assessing whether or which commercial algorithms were developed from the datasets was beyond the scope of our review,” he told The Register. “This is a relevant question and may indeed form the basis for future work.”
Enter Cohere – can it talk the talk?
GPT-3 isn’t the only large commercial language model in town. Customers now have more choice than ever after startup Cohere launched its AI text-generation API and announced a multi-year contract to run on Google’s TPUs.
These contracts are lucrative for cloud providers. Cohere will pay Google large sums of money for its compute resources. And in turn, Google will help Cohere sell its API, according to TechCrunch. It’s a win-win situation for both companies.
Developers only need to add a few lines of code to their applications to access Cohere’s models through the API. They can also fine-tune the models on their own datasets for all kinds of tasks, such as generating or summarizing text.
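To make that concrete, here is a minimal sketch of the kind of request a text-generation API like Cohere's accepts. The endpoint URL, model name, and parameter names below are illustrative assumptions for this sketch, not taken from Cohere's documentation; consult the vendor's API reference for the real interface.

```python
import json

# Hypothetical endpoint for illustration only -- not Cohere's real URL.
API_URL = "https://api.example-nlp.com/v1/generate"

def build_generate_request(prompt, max_tokens=50, temperature=0.8):
    """Serialize a text-generation request body as JSON.

    The parameter names (prompt, max_tokens, temperature) are common to
    many text-generation APIs, but are assumptions here.
    """
    payload = {
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }
    return json.dumps(payload)

# An application would POST this body, with an API key header, to API_URL.
body = build_generate_request("Summarize the following contract clause: ...")
print(body)
```

In practice a vendor SDK hides this plumbing behind a single client call, which is why "a few lines of code" is enough to get going.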
“Until now, high-quality NLP models have been the exclusive domain of large companies,” said Aidan Gomez, Cohere co-founder and CEO. “Through this partnership, we are giving developers access to one of the most important technologies to emerge from the modern AI revolution.”
Other commercial models include Nvidia’s Megatron and AI21 Labs’ Jurassic-1.
OpenAI’s GPT-3 API is now available to everyone
OpenAI announced that its GPT-3 API is now available to everyone: users in supported countries can sign up and play with the model immediately.
“Our progress in safeguards eliminates the waiting list for GPT-3,” the company said this week.
“Tens of thousands of developers are already taking advantage of powerful AI models through our platform. We believe that by opening up access to these models through an easy-to-use API, more developers will find creative ways to apply AI to a large number of useful applications and open problems.”
Previously, developers had to wait to be approved by the company before they could use the tool. Although OpenAI says it has relaxed some of its usage restrictions, developers still cannot use the AI text-generation model for certain applications, and in some cases may need to implement a content filter.
Things like general-purpose chatbots that can spew hate speech or NSFW text are definitely off limits.
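As a rough illustration of what "implement a content filter" means in practice, here is a minimal keyword-blocklist sketch a developer might put in front of a model's output. This is purely illustrative: OpenAI's own content filtering uses a trained classifier, not a hand-written blocklist, and the terms below are placeholders.

```python
# Placeholder terms; a real deployment would use a curated list or,
# better, a trained moderation classifier.
BLOCKLIST = {"badword1", "badword2"}

def is_safe(text):
    """Return False if any blocklisted term appears in the text.

    Normalizes by lowercasing and stripping common punctuation so that
    "Badword1!" is still caught.
    """
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return BLOCKLIST.isdisjoint(words)

print(is_safe("hello world"))          # safe text passes
print(is_safe("this has badword1"))    # blocked text fails
```

A filter this naive is easy to evade (misspellings, spacing tricks), which is exactly why providers push developers toward model-based moderation instead.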
What it’s like to be an “Amazonian” constantly watched by AI cameras
A man went undercover at an Amazon fulfillment center in Montreal and said its AI cameras were “the most insidious form of surveillance” of workers.
Mostafa Henaway, a community organizer at the Immigrant Workers Centre, an organization that fights for immigrant rights, and a doctoral student at Concordia University, decided to work as an “Amazonian” for a month. He described what it was like to pull the graveyard shift, from 1:20 a.m. to 12:00 p.m. on weekdays.
Workers have to strap a device to their arm that tells them their tasks for the day and records their hours. AI cameras, installed during the COVID-19 pandemic to ensure colleagues stay six feet apart, scan their every move. Even supervisors cannot escape their unblinking gaze.
“The AI cameras only ensured our obedience,” he wrote in The Breach, a Canadian media outlet.
“Every six minutes, the AI cameras analyze each worker and the distance between them, generating a report at the end of the shift. The use of Big Data and artificial intelligence shows that even management is not really in control – it is simply there to enforce algorithms and predetermined tasks.”
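The article doesn't describe how the distance checks are implemented, but the core computation is simple to sketch: given estimated positions for each worker, flag every pair closer than the six-foot threshold. The function and data below are hypothetical, assuming the cameras yield 2D floor coordinates in metres.

```python
import itertools
import math

SIX_FEET_METRES = 1.83  # six feet, converted to metres

def too_close_pairs(positions, threshold=SIX_FEET_METRES):
    """Return all pairs of workers closer together than the threshold.

    `positions` maps a worker ID to an (x, y) floor coordinate in metres.
    """
    violations = []
    for (a, pa), (b, pb) in itertools.combinations(positions.items(), 2):
        if math.dist(pa, pb) < threshold:
            violations.append((a, b))
    return violations

# Hypothetical snapshot: w1 and w2 are ~1 m apart, w3 is far away.
workers = {"w1": (0.0, 0.0), "w2": (1.0, 0.0), "w3": (5.0, 5.0)}
print(too_close_pairs(workers))  # [('w1', 'w2')]
```

Running this check every six minutes and accumulating the violations would yield exactly the kind of end-of-shift report the article describes.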
But hey, at least the guy responsible for all of this made his trip to space. ®