Robots can develop prejudices just like humans

In a fascinating study by researchers at Cardiff University and MIT, we learn that robots can develop prejudices when working together. The robots, which ran inside a teamwork simulator, expressed prejudice against other robots not on their team. In short, write the researchers, “groups of autonomous machines could demonstrate prejudice by simply identifying, copying and learning this behavior from one another.”

To test the theory, researchers ran a simple game in a simulator. The game involved donating to parties outside or inside the robot’s personal group based on reputation as well as donation strategy. They were able to measure the level of prejudice against outsiders. As the simulation ran, they saw a rise in prejudice against outsiders over time.
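The dynamic described above — agents that favor their in-group and copy the strategies of higher-scoring agents — can be sketched in a few lines. The following toy simulation is only an illustration of that mechanism, not the researchers' actual model; the population size, payoffs and imitation rule here are all assumptions.

```python
import random

# Toy illustration (NOT the Cardiff/MIT model): agents play a donation
# game and imitate higher-scoring strategies, so reluctance to donate
# to out-group members can spread without being explicitly programmed.

random.seed(0)

N_AGENTS = 100
N_GROUPS = 2            # number of subpopulations
GENERATIONS = 50
DONATION_COST = 1.0     # cost paid by the donor
DONATION_BENEFIT = 2.0  # benefit received by the recipient

# Each agent: a group id and a probability of donating to outsiders.
agents = [
    {"group": i % N_GROUPS, "out_rate": random.random()}
    for i in range(N_AGENTS)
]

def play_generation(agents):
    """Each agent meets one random partner; in-group donation is
    automatic, out-group donation happens with probability out_rate."""
    scores = [0.0] * len(agents)
    for i, a in enumerate(agents):
        j = random.randrange(len(agents))
        if i == j:
            continue
        same_group = a["group"] == agents[j]["group"]
        if same_group or random.random() < a["out_rate"]:
            scores[i] -= DONATION_COST
            scores[j] += DONATION_BENEFIT
    return scores

for gen in range(GENERATIONS):
    scores = play_generation(agents)
    # Imitation step: each agent copies a random other agent's strategy
    # if that agent scored higher, with a little mutation noise.
    new_agents = []
    for i, a in enumerate(agents):
        j = random.randrange(len(agents))
        model = agents[j] if scores[j] > scores[i] else a
        rate = min(1.0, max(0.0, model["out_rate"] + random.gauss(0, 0.02)))
        new_agents.append({"group": a["group"], "out_rate": rate})
    agents = new_agents

mean_out_rate = sum(a["out_rate"] for a in agents) / N_AGENTS
print(f"mean out-group donation rate after {GENERATIONS} generations: "
      f"{mean_out_rate:.2f}")
```

Because agents who rarely donate to outsiders pay fewer costs, imitation tends to pull the average out-group donation rate down over time — the same drift toward prejudice the study measured.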

The researchers found the prejudice was easy to grow in the simulator, a fact that should give us pause as we give robots more autonomy.

“Our simulations show that prejudice is a powerful force of nature and through evolution, it can easily become incentivised in virtual populations, to the detriment of wider connectivity with others. Protection from prejudicial groups can inadvertently lead to individuals forming further prejudicial groups, resulting in a fractured population. Such widespread prejudice is hard to reverse,” said Cardiff University Professor Roger Whitaker. “It is feasible that autonomous machines with the ability to identify with discrimination and copy others could in future be susceptible to prejudicial phenomena that we see in the human population.”

Interestingly, prejudice fell when there were “more distinct subpopulations being present within a population,” an important consideration in human prejudice as well.

“With a greater number of subpopulations, alliances of non-prejudicial groups can cooperate without being exploited. This also diminishes their status as a minority, reducing the susceptibility to prejudice taking hold. However, this also requires circumstances where agents have a higher disposition towards interacting outside of their group,” Professor Whitaker said.

Tall Poppy aims to make online harassment protection an employee benefit

For the nearly 20 percent of Americans who experience severe online harassment, there’s a new company launching in the latest batch of Y Combinator called Tall Poppy that’s giving them the tools to fight back.

Co-founded by Leigh Honeywell and Logan Dean, Tall Poppy grew out of the work that Honeywell, a security specialist, had been doing to hunt down trolls in online communities since at least 2008.

That was the year that Honeywell first went after a particularly noxious specimen who spent his time sending death threats to women in various Linux communities. Honeywell cooperated with law enforcement to try and track down the troll and eventually pushed the commenter into hiding after he was visited by investigators.

That early success led Honeywell to assume a not-so-secret identity as a security expert by day for companies like Microsoft, Salesforce, and Slack, and a defender against online harassment when she wasn’t at work.

“It was an accidental thing that I got into this work,” says Honeywell. “It’s sort of an occupational hazard of being an internet feminist.”

Honeywell started working one-on-one with victims of online harassment who were referred to her directly.

“As people were coming forward with #metoo… I was working with a number of high profile folks to essentially batten down the hatches,” says Honeywell. “It’s been satisfying work helping people get back a sense of safety when they feel like they have lost it.”

As those referrals began to climb (eventually numbering in the low hundreds of cases), Honeywell began to think about ways to systematize her approach so it could reach the widest number of people possible.

“The reason we’re doing it that way is to help scale up,” says Honeywell. “As with everything in computer security it’s an arms race… As you learn to combat abuse the abusive people adopt technologies and learn new tactics and ways to get around it.”

Primarily, Tall Poppy will provide an educational toolkit to help people lock down their own presence and do incident response properly, says Honeywell. The company will work with customers to gain an understanding of how to protect themselves, but also to be aware of the laws in each state that they can use to protect themselves and punish their attackers.

The scope of the problem

Based on research conducted by the Pew Foundation, there are millions of people in the U.S. alone who could benefit from the type of service that Tall Poppy aims to provide.

According to a 2017 study, “nearly one-in-five Americans (18%) have been subjected to particularly severe forms of harassment online, such as physical threats, harassment over a sustained period, sexual harassment or stalking.”

The women and minorities who bear the brunt of these assaults (and, let’s be clear, it is primarily women and minorities) face very real consequences from these virtual attacks.

Take the case of the New York principal who lost her job when an ex-boyfriend sent stolen photographs of her to the New York Post and her boss. In a powerful piece for Jezebel she wrote about the consequences of her harassment.

As a result, city investigators escorted me out of my school pending an investigation. The subsequent investigation quickly showed that I was set up by my abuser. Still, Mayor Bill de Blasio’s administration demoted me from principal to teacher, slashed my pay in half, and sent me to a rubber room, the DOE’s notorious reassignment centers where hundreds of unwanted employees languish until they are fired or forgotten.

In 2016, I took a yearlong medical leave from the DOE to treat extreme post-traumatic stress and anxiety. Since the leave was almost entirely unpaid, I took loans against my pension to get by. I ran out of money in early 2017 and reported back to the department, where I was quickly sent to an administrative trial. There the city tried to terminate me. I was charged with eight counts of misconduct despite the conclusion by all parties that my ex-partner uploaded the photos to the computer and that there was no evidence to back up his salacious story. I was accused of bringing “widespread negative publicity, ridicule and notoriety” to the school system, as well as “failing to safeguard a Department of Education computer” from my abusive ex.

Her story isn’t unique. Victims of online harassment regularly face serious real-world consequences.

According to a 2013 study reported by Science Daily, cyber stalking victims routinely need to take time off from work, or change or quit their jobs or schools. The stalking also costs victims $1,200 on average just to attempt to address the harassment, the study said.

“It’s this widespread problem and the platforms have in many ways dropped the ball on this,” Honeywell says.

Tall Poppy’s co-founders

Creating Tall Poppy

As Honeywell heard more and more stories of online intimidation and assault, she started laying the groundwork for the service that would eventually become Tall Poppy. Through a mutual friend she reached out to Dean, a talented coder who had been working at Ticketfly before its Eventbrite acquisition and was looking for a new opportunity.

That was in early 2015. But, afraid that striking out on her own would affect her citizenship status (Honeywell is Canadian), she and Dean waited before making the move to finally start the company.

What ultimately convinced them was the election of Donald Trump.

“After the election I had a heart-to-heart with myself… And I decided that I could move back to Canada, but I wanted to stay and fight,” Honeywell says.

Initially, Honeywell took on a year-long fellowship with the American Civil Liberties Union to pick up work around privacy and security that had been handled by Chris Soghoian, who had left to take a position with Senator Ron Wyden’s office.

But the idea for Tall Poppy remained, and once Honeywell received her green card, she was “chomping at the bit to start this company.”

A few months in, the company already has businesses that have signed up for the services and tools it provides to help companies protect their employees.

Some platforms have taken small steps against online harassment. Facebook, for instance, launched an initiative to get people to upload their nude pictures so that the social network can monitor when similar images are distributed online and contact a user to see if the distribution is consensual.

Meanwhile, Twitter has made a series of changes to its algorithm to combat online abuse.

“People were shocked and horrified that people were trying this,” Honeywell says. “[But] what is the way [harassers] can do the most damage? Sharing them to Facebook is one of the ways where they can do the most damage. It was a worthwhile experiment.”

To underscore how pervasive a problem online harassment is: of the four companies where Tall Poppy is doing business or could do business, one already had an issue for the company to address within its first month and a half.

“It is an important problem to work on,” says Honeywell. “My recurring realization is that the cavalry is not coming.”