Tech has become another way for men to subjugate females | Lizzie O’Shea
August 21, 2017
We act as if technology were neutral, but it's not. The challenge now is to remove the gender bias, says human rights lawyer and novelist Lizzie O'Shea
"Most women in the Bay Area are soft and weak, cosseted and naive, despite their claims of worldliness, and generally full of shit," wrote former Facebook product manager Antonio García Martínez in 2016. "They have their self-regarding entitlement feminism, and continuously vaunt their freedom. But the reality is, come the epidemic plague or foreign invasion, they'd become precisely the sort of useless baggage you'd trade for a box of shotgun shells or a jerry can of diesel." This is from his insider account of Silicon Valley, Chaos Monkeys. The book was a bestseller; the New York Times called it "an irresistible and essential 360-degree guide to the new technology establishment". Anyone who is surprised by the recent revelations of sexism spreading like wildfire through the technology industry has not been paying attention.
When Susan Fowler wrote about her experience of being sexually harassed at Uber, it prompted a chain of events that would have seemed unimaginable months earlier, including an investigation led by former US attorney general Eric Holder and the departure of a number of key members of the company's leadership team. Venture capitalist Justin Caldbeck faced allegations of harassing behaviour, and when he offered an unconvincing denial, companies funded by his firm banded together to condemn his conduct. He subsequently resigned, and the future of his former firm is unclear. Since then, dozens of women have come forward to reveal the sexist culture in numerous Silicon Valley technology and venture capital firms. It is increasingly clear from these accounts that the problem for women in the tech industry is not a failure to "lean in"; it is a culture of harassment and discrimination that makes many of their workplaces unsafe and unpleasant.
At least this issue is now being discussed in ways that open up the possibility that it will be addressed. But the problem of sexism in the tech industry runs much deeper and wider, and technological development is undermining the cause of women's equality in other ways.
American academic Melvin Kranzberg's first law of technology tells us that technology is neither good nor bad, nor is it neutral. Like a black mirror, it reflects the problems that exist in society, including the oppression of women. Millions of people bark orders at Alexa every day, but rarely are we encouraged to wonder why the domestic organiser is voiced as a woman. The entry system for a women's locker room in a gym recently refused entry to a female member because her title was "Dr", and it categorised her as male.
But the issue is not just that technology products reflect a backward view of the role of women. They often also seem ignorant of, or indifferent to, women's lived experience. As the internet of things expands, more devices in our homes and on our bodies are collecting data about us and sending it to networks, a process over which we often have little control. This presents profound problems for vulnerable members of society, including survivors of domestic violence. Wearable technology can be hacked, cars and phones can be tracked, and data from a thermostat can reveal whether someone is at home. This potential is frightening for people who have experienced rape, violence or stalking.
Unsurprisingly, technology is used by abusers: in a survey of domestic violence services organisations, 97% reported that the survivors they work with have experienced harassment, monitoring and threats by abusers through the misuse of technology. This often happens on phones, but 60% of those surveyed also reported that abusers have spied or eavesdropped on survivors or their children using other forms of technology, including toys and other gifts. Many shelters have resorted to banning the use of Facebook because of fears about disclosing location information to stalkers. There are ways to make devices that give control to users and limit the opportunities for abuse, but there is little evidence that this has been a priority for the technology industry.
Products that are more responsive to the needs of women would be a great start. But we should also be thinking bigger: we should avoid reproducing sexism in system design. The word-embedding models used in things like conversation bots and word searches offer an instructive example. These models work by feeding huge amounts of text into a computer so it learns how words relate to each other in space. They are based on the premise that words which appear near one another in texts share meaning. These spatial relationships are used in natural language processing so that computers can engage with us conversationally. By reading a lot of text, a computer can learn that "Paris" is to "France" as "Tokyo" is to "Japan". It builds a dictionary by association.
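The analogy mechanism described above can be sketched in a few lines. The vectors below are toy values invented for illustration; a real model like Word2vec learns hundreds of dimensions from millions of words of text, and the `analogy` helper is a simplified stand-in for the nearest-neighbour lookup those models perform.

```python
from math import sqrt

# Hand-crafted toy vectors: one axis loosely for gender, one for royalty.
# Real embeddings are learned, not designed like this.
VECS = {
    "man":   [1.0, 0.0, 0.1],
    "woman": [0.0, 1.0, 0.1],
    "king":  [1.0, 0.0, 1.0],
    "queen": [0.0, 1.0, 1.0],
    "paris": [0.5, 0.5, 0.0],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

def analogy(a, b, c):
    """Answer 'a is to b as c is to ?' via vector arithmetic: b - a + c."""
    target = [vb - va + vc for va, vb, vc in zip(VECS[a], VECS[b], VECS[c])]
    candidates = {w: v for w, v in VECS.items() if w not in (a, b, c)}
    # The answer is the remaining word whose vector lies closest to the target.
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "king", "woman"))  # queen
```

The same arithmetic that recovers "queen" here is what lets a model trained on news text recover stereotyped answers: the geometry faithfully encodes whatever associations the training corpus contains.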
But this can create problems when the world is not as it ought to be. For instance, researchers have experimented with one of these word-embedding models, Word2vec, a popular and freely available model trained on three million words from Google News. They found that it produces highly gendered analogies. When asked "man is to woman as computer programmer is to ?", the model answers "homemaker". For "father is to mother as physician is to ?", the answer is "nurse". Of course the model reflects a certain reality: it is true that there are more male computer programmers, and more nurses are women. But this bias, reflecting social discrimination, is then reproduced and reinforced when we engage with computers using natural language that relies on Word2vec. It is not hard to imagine how such a model could also be racially biased, or biased against other groups.
These biases can be amplified during the process of machine learning. As the MIT Technology Review points out: if the phrase "computer programmer" is more closely associated with men than women, then a search for computer programmer CVs might rank men more highly than women. When this kind of machine learning has applications across fields including medicine, education, employment, policymaking and criminal justice, it is not hard to see how much damage such biases can cause.
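The CV-ranking amplification can be made concrete with a minimal sketch. Everything here is a toy assumption: the vectors are invented so that "programmer" sits nearer to "he" than to "she", mimicking the skew a real embedding picks up from news text, and documents are scored by the common averaged-word-vector approach.

```python
from math import sqrt

# Toy 2-D vectors with a deliberate gender skew: "programmer" is
# closer to "he" than to "she", as biased training data would make it.
VECS = {
    "programmer": [0.9, 0.1],
    "he":         [1.0, 0.0],
    "she":        [0.0, 1.0],
    "code":       [0.8, 0.2],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

def doc_vector(words):
    """Represent a document as the average of its word vectors."""
    n = len(words)
    return [sum(VECS[w][i] for w in words) / n for i in range(2)]

def rank(query, docs):
    """Order documents by cosine similarity to the query word's vector."""
    return sorted(docs, key=lambda d: cosine(doc_vector(d), VECS[query]),
                  reverse=True)

cv_he = ["he", "code"]
cv_she = ["she", "code"]
print(rank("programmer", [cv_she, cv_he]))  # the "he" CV ranks first
```

Both CVs mention the same skill word; only the pronoun differs. Yet the gendered geometry of the embedding is enough to push one CV above the other, which is exactly the amplification the article describes.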
Removing such gender bias is a challenge, in part because the problem is inherently political: Word2vec entrenches the world as it is, rather than as it could or should be. And if we are to alter the models to reflect our aspirations, how do we decide what kind of world we want?
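One technical response researchers have proposed (the "neutralise" step of Bolukbasi et al.'s debiasing work, not something the article itself details) is to estimate a gender direction and project it out of occupation words. The sketch below uses invented 2-D vectors and approximates the gender direction as vec("he") − vec("she"); it shows the mechanics only, not a solution to the political question of which words to neutralise.

```python
# Toy vectors for illustration; real debiasing operates on learned
# high-dimensional embeddings and a subspace built from many word pairs.
VECS = {
    "he":         [1.0, 0.0],
    "she":        [0.0, 1.0],
    "programmer": [0.9, 0.1],
}

def sub(u, v):
    return [a - b for a, b in zip(u, v)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def neutralise(word):
    """Remove a word vector's component along the gender direction."""
    g = sub(VECS["he"], VECS["she"])       # crude gender direction
    coeff = dot(VECS[word], g) / dot(g, g) # projection coefficient onto g
    return [wi - coeff * gi for wi, gi in zip(VECS[word], g)]

neutral = neutralise("programmer")
# After neutralising, "programmer" has no component along he-she:
print(dot(neutral, sub(VECS["he"], VECS["she"])))  # 0.0
```

Note what the projection does and does not do: it zeroes out one chosen direction, but choosing that direction, and deciding which words should be gender-neutral at all, is precisely the normative judgement the paragraph above says cannot be delegated to the model.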
Digital technology offers myriad ways to put these insights to work. Technology is not inherently bad, but we have to challenge the presumption that it is neutral. Its potential is being explored in ways that are sometimes promising, often frightening, always astounding. To make the most of this moment, we need to imagine a future without the oppressions of the past. We need to allow women to reach their potential in workplaces where they feel safe and respected. But we also need to look into the black mirror of technology and find the cracks of light shining through.
Read more: www.theguardian.com
The post Tech has become another way for men to subjugate females | Lizzie O’Shea appeared first on Artificial Intelligence News.