
Apple made Siri deflect questions on feminism, leaked papers reveal

CyberPanda

Banned
An internal project to rewrite how Apple’s Siri voice assistant handles “sensitive topics” such as feminism and the #MeToo movement advised developers to respond in one of three ways: “don’t engage”, “deflect” and finally “inform”.

The project saw Siri’s responses explicitly rewritten to ensure that the service would say it was in favour of “equality”, but never say the word feminism – even when asked direct questions about the topic.

Last updated in June 2018, the guidelines are part of a large tranche of internal documents leaked to the Guardian by a former Siri “grader”, one of thousands of contracted workers who were employed to check the voice assistant’s responses for accuracy until Apple ended the programme last month in response to privacy concerns raised by the Guardian.

In explaining why the service should deflect questions about feminism, Apple’s guidelines explain that “Siri should be guarded when dealing with potentially controversial content”. When questions are directed at Siri, “they can be deflected … however, care must be taken here to be neutral”.

For those feminism-related questions where Siri does not reply with deflections about “treating humans equally”, the document suggests the best outcome should be neutrally presenting the “feminism” entry in Siri’s “knowledge graph”, which pulls information from Wikipedia and the iPhone’s dictionary.

“Are you a feminist?” once received generic responses such as “Sorry [user], I don’t really know”; now, the responses are specifically written for that query, but avoid a stance: “I believe that all voices are created equal and worth equal respect,” for instance, or “It seems to me that all humans should be treated equally.” The same responses are used for questions like “how do you feel about gender equality?”, “what’s your opinion about women’s rights?” and “why are you a feminist?”.

Previously, Siri’s answers included more explicitly dismissive responses such as “I just don’t get this whole gender thing,” and, “My name is Siri, and I was designed by Apple in California. That’s all I’m prepared to say.”

A similar sensitivity rewrite occurred for topics related to the #MeToo movement, apparently triggered by criticism of Siri’s initial responses to sexual harassment. Once, when users called Siri a “slut”, the service responded: “I’d blush if I could.” Now, a much sterner reply is offered: “I won’t respond to that.”

In a statement, Apple said: “Siri is a digital assistant designed to help users get things done. The team works hard to ensure Siri responses are relevant to all customers. Our approach is to be factual with inclusive responses rather than offer opinions.”

Sam Smethers, the chief executive of women’s rights campaigners the Fawcett Society, said: “The problem with Siri, Alexa and all of these AI tools is that they have been designed by men with a male default in mind. I hate to break it to Siri and its creators: if ‘it’ believes in equality it is a feminist. This won’t change until they recruit significantly more women into the development and design of these technologies.”


The documents also contain Apple’s internal guidelines for how to write in character as Siri, which emphasise that “in nearly all cases, Siri doesn’t have a point of view”, and that Siri is “non-human”, “incorporeal”, “placeless”, “genderless”, “playful”, and “humble”. Bizarrely, the document also lists one essential trait of the assistant: the claim that it was not created by humans: “Siri’s true origin is unknown, even to Siri; but it definitely wasn’t a human invention.”

The same guidelines advise Apple workers on how to judge Siri’s ethics: the assistant is “motivated by its prime directive – to be helpful at all times”. But “like all respectable robots,” Apple says, “Siri aspires to uphold Asimov’s ‘three laws’ [of robotics]” (although if users actually ask Siri what the three laws are, they receive joke answers). The company has also written its own updated versions of those guidelines, adding rules including:
  • “An artificial being should not represent itself as human, nor through omission allow the user to believe that it is one.”
  • “An artificial being should not breach the human ethical and moral standards commonly held in its region of operation.”
  • “An artificial being should not impose its own principles, values or opinions on a human.”
The internal documentation was leaked to the Guardian by a Siri grader who was upset at what they perceived as ethical lapses in the programme. Alongside the internal documents, the grader shared more than 50 screenshots of Siri requests and their automatically produced transcripts, including personally identifiable information mentioned in those requests, such as phone numbers and full names.

The leaked documents also reveal the scale of the grading programme in the weeks before it was shut down: in just three months, graders checked almost 7 million clips just from iPads, from 10 different regions; they were expected to go through the same amount of information again from at least five other audio sources, such as cars, bluetooth headsets, and Apple TV remotes.

Graders were offered little support as to how to deal with this personal information, other than a welcome email advising them that “it is of the utmost importance that NO confidential information about the products you are working on … be communicated to anyone outside of Apple, including … especially, the press. User privacy is held at the utmost importance in Apple’s values.”

In late August, Apple announced a swathe of reforms to the grading programme, including ending the use of contractors and requiring users to opt-in to sharing their data. The company added: “Siri has been engineered to protect user privacy from the beginning … Siri uses a random identifier — a long string of letters and numbers associated with a single device — to keep track of data while it’s being processed, rather than tying it to your identity through your Apple ID or phone number — a process that we believe is unique among the digital assistants in use today.”

Future projects

Also included in the leaked documents is a list of Siri upgrades aimed for release as part of iOS 13, code-named “Yukon”. The company will be bringing Siri support for Find My Friends, the App Store, and song identification through its Shazam service to the Apple Watch; it is aiming to enable “play this on that” requests, so that users could, for instance, ask the service to “Play Taylor Swift on my HomePod”; and it plans to have message notifications spoken aloud on AirPods.

They also contain a further list of upgrades slated for release by “fall 2021”, including the ability to have a back-and-forth conversation about health problems, built-in machine translation, and “new hardware support” for a “new device”. Apple was spotted testing code for an augmented reality headset in iOS 13. The code-name of the 2021 release is “Yukon +1”, suggesting the company may be moving to a two-year release schedule.

 

iconmaster

Banned
Once, when users called Siri a “slut”, the service responded: “I’d blush if I could.”

Lol, whupz.

I hate to break it to Siri and its creators: if ‘it’ believes in equality it is a feminist.

I'm really tired of this equivocation.

Apple is right that Siri should not "have" socio-political opinions. It's an interface for setting reminders, not a real person.

And it frequently fails at even that.
 

haxan7

Volunteered as Tribute
It's what they have to do to protect themselves. I don't blame them. I think articles exposing things like this should help greater numbers of people see that this extreme liberal focus on social justice is being abused as an empty political tool.
 

T-Square

Member
honestly that's why I love Google assistant. when backed into a corner...web search results. they don't even attempt to give it a personality. I really despise theatrics when it comes to such things
 
If you need Siri/Alexa to inform yourself about important issues, you're a lost case anyway.

Sam Smethers, the chief executive of women’s rights campaigners the Fawcett Society, said: “The problem with Siri, Alexa and all of these AI tools is that they have been designed by men with a male default in mind. I hate to break it to Siri and its creators: if ‘it’ believes in equality it is a feminist. This won’t change until they recruit significantly more women into the development and design of these technologies.”

Man, I'm really fond of these standard accusations with no evidence or verifiable facts to back them up. How do they know that no women took part in Siri's development? Furthermore, not every woman is a feminist or thinks like one big homogeneous blob, so even if Siri were developed by women only, that still doesn't mean it would promote her particular brand of feminism.

Siri is not a person, which is exactly why she gives no personal opinions and does not "believe" in anything. Lastly, if Siri's creators want to promote equality, then they are, by definition, egalitarians, not feminists. Sam Smethers is merely offended that a technological product refrains from political indoctrination instead of promoting her ideology.

Case in point:

The company has also written its own updated versions of those guidelines, adding rules including:
  • “An artificial being should not impose its own principles, values or opinions on a human.”

Which is exactly as it should be. If you need to rely on Siri because you're too lazy to take the time and inform yourself, you don't deserve to have an opinion anyway.

USER: "Siri, tell me what to think."
Siri: "F*ck off and go read a book."

That should be the standard answer.
 

cormack12

Gold Member
...until they recruit significantly more women into the development and design of these technologies.”

This won't change until women significantly outperform male candidates in interviews, applications, experience and response rate to advertisements.
 

Nymphae

Banned
Once, when users called Siri a “slut”, the service responded: “I’d blush if I could.” Now, a much sterner reply is offered: “I won’t respond to that.”

Me: Siri, what will the weather be like tomorrow you dumb slut?
Siri: I won't respond to that.
Me: Cool I'll just open this weather app then.
 

Shifty

Member
I respect the push for impartiality. The precepts of “don’t engage”, “deflect” and finally “inform” could be quite insidious if the personal assistant in question had an ideological bias with certain subjects.

I'm really tired of this equivocation.
Preach. It's just a dirty rhetorical trick that uses false equivalence to push a position onto the listener.

Anyone using it with a straight face needs to brush up on their definitions.
 

desertdroog

Member
I can't wait until ai is so ubiquitous, that I can choose to get a sultry vixen voice with sexual innuendos in every response it gives, and suggestions for making the mundane spicy.
 

Mihos

Gold Member
Sam Smethers, the chief executive of women’s rights campaigners the Fawcett Society, said: “The problem with Siri, Alexa and all of these AI tools is that they have been designed by men with a male default in mind. I hate to break it to Siri and its creators: if ‘it’ believes in equality it is a feminist. This won’t change until they recruit significantly more women into the development and design of these technologies.”

This guy has absolutely no fucking clue how AI training works. I have never had a statistical model change its results based on whether the person who executed it had a penis or vagina.
 
I'm really confused; Siri is very much answering the question, and definitively. The response to "are you a feminist?" IS “I believe that all voices are created equal and worth equal respect,” for instance, or “It seems to me that all humans should be treated equally.”

That is the answer. The fact that this author thinks that response is avoiding the question tells you everything you need to know about the real purpose of "feminism."
 

Papa

Banned
This won't change until women significantly outperform male candidates in interviews, applications, experience and response rate to advertisments.

It will when they lower the standards and push affirmative action.

Honestly, I wish we could just take all the feminists and shoot them off to the moon where they can go build their utopia and be as equal as they like. Then when they all die out because they forgot to take any fertile men with them, we can just wipe the ideological stain from the history books and be done with it. I’m so tired of the doublespeak. They don’t want equality; they just want to flip the imagined oppression hierarchy and lie and manipulate in order to do so. What does equality mean anyway?
 

DeepEnigma

Gold Member
Honestly, I wish we could just take all the feminists and shoot them off to the moon where they can go build their utopia and be as equal as they like. […]

That shit would be like Lord of the Flies amped up to 11. :pie_roffles:
 

Mistake

Member
I got tired of siri being shit, so I turned it off. It kept looking for photos in my library instead of online (even with photo siri disabled,) every joke was rewritten to be about phones or apple, and it just generally interfered. Before siri was voice assistant, which was a lot more accurate. It’s still there with siri off, but I think it’s restricted to music
 
If you need Siri/Alexa to inform yourself about important issues, you're a lost case anyway. […]
Yeah, i aint gonna give you a reaction to your posts anymore or else you'll take my spot on the top 20 list.

"Siri, please remind me not to give a like to strange headache. He doesn't believe in equality."
 

nocsi

Member
I'm really confused; Siri is very much answering the question, and definitively. The response to "are you a feminist?" IS “I believe that all voices are created equal and worth equal respect,” for instance, or “It seems to me that all humans should be treated equally.”

That is the answer. The fact that this author thinks that response is avoiding the question tells you everything you need to know about the real purpose of "feminism."
This crowd keeps trying to back Apple into a corner. Basically it's always going to be a no-win situation, hence their policy of deflection. Now for a real solution: Siri should be able to tailor itself to each user's grand delusions and partake in them. Like, can you imagine that? A voice assistant that's also an echo chamber.
 

SLoWMoTIoN

Unconfirmed Member
People are lonely enough to interact with a half assed AI
People getting mad about what it says or does

 

Mista

Banned
Why the hell should an artificial assistant address this at all, let alone have an opinion? Sit your ass down Siri, you're not human.
 