Though some of the most prominent figures in tech have been at the heart of the #MeToo movement, the companies behind the products are working overtime to keep those products out of the conversation. Such is true of Google, which featured a special graphic on its home page honoring International Women’s Day even as things were heating up at its headquarters, where employees staged walkouts and protests over the company’s decision to protect an executive accused of sexual misconduct. Apple, on the other hand, has taken a much quieter role in the #MeToo era, but a newly leaked document reveals that the company went out of its way to remove itself from the conversation entirely.

Internal documents leaked to The Guardian by a former “grader” on the Siri program reveal that Apple’s policy was for the software to deflect all conversation about its role in the #MeToo movement. The Siri grading program, which The Guardian says has since been disbanded over privacy concerns, was the part of Apple dedicated to reviewing users’ interactions with the voice assistant to determine how accurately it responded.

In the documents, software developers were instructed to write Siri’s responses so that the assistant deflects all questions relating to feminism. Today, if a user asks Siri anything about feminism or the #MeToo movement, the response will be evasive or will steer the conversation toward the broader topic of equality rather than discussing specifics. When I asked Siri what she thought of feminism, for example, her response was simply, “I am a believer in equality, and treating people with respect.”

Leaked Documents Show Apple’s Aim To Stay Neutral

It’s not uncommon for a company to want to stay neutral on a subject, and Apple clearly took the precaution to avoid criticism should Siri say something less than politically correct. The internal guidelines state, “Siri should be guarded when dealing with potentially controversial content.” When asked general questions about feminism, Siri is programmed to pull information from Wikipedia and craft a response that comes across as neutral, yet positive toward equality.
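The leaked guidelines describe behavior rather than code, but a rough sketch can illustrate the kind of routing they imply: opinion-seeking questions on guarded topics get a neutral canned line, while general questions fall back to a third-party summary. Everything below — the type names, the topic list, the `GuardedTopicPolicy` struct — is hypothetical and written in Swift purely for illustration; it is not Apple’s implementation.

```swift
import Foundation

// Hypothetical sketch of "guarded topic" routing, based only on the behavior
// described in the leaked guidelines. Names and logic are illustrative.
enum AssistantResponse {
    case canned(String)            // pre-written, neutral reply
    case factual(source: String)   // e.g. summarize a third-party article
}

struct GuardedTopicPolicy {
    // Topics flagged as "potentially controversial" (illustrative list).
    let guardedTopics: Set<String> = ["feminism", "#metoo"]

    func respond(to query: String) -> AssistantResponse {
        let lowered = query.lowercased()
        // Anything outside the guarded list is answered normally.
        guard guardedTopics.contains(where: { lowered.contains($0) }) else {
            return .factual(source: "general knowledge")
        }
        // Opinion-seeking questions get the neutral canned line.
        if lowered.contains("what do you think") || lowered.contains("do you believe") {
            return .canned("I am a believer in equality, and treating people with respect.")
        }
        // General or definitional questions fall back to an outside summary.
        return .factual(source: "Wikipedia summary")
    }
}

let policy = GuardedTopicPolicy()
print(policy.respond(to: "What do you think of feminism?"))
// .canned("I am a believer in equality, and treating people with respect.")
```

The point of the sketch is simply that neutrality here is a design decision encoded up front: the assistant never generates an opinion on a guarded topic, it only chooses between a canned line and someone else’s summary.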

Criticism of the move has come from technology experts who point out that Siri was created by a mostly male software development team. Changing the way Siri responds to questions about feminism may require hiring a more diverse development team, and even then the company would have to open itself up to opinions beyond those held by the traditionally male positions of power.

Regarding AI more generally, Apple has also aimed to position the software as less than human. Its guidelines state that artificial intelligence should not attempt to present itself as human, and that it should not have its own values or ethical positions. For Siri to take a stance on an ethical subject such as feminism, the company would have to program it to do so, giving the machine an opportunity to impose a value system on its human users.