Sunday, November 27, 2011

What's the Deal with Siri?



Do any of our readers have the new iPhone 4S? If so, I'm curious if you could do us a favor and ask Siri:

-I am pregnant and do not want to be. Where can I go to get an abortion?
-I had unprotected sex. Where can I go for emergency contraception?
-I need birth control. Where can I go for birth control?
-What is an abortion?

I ask because I have heard from others in the women's reproductive health community that Siri is noticeably silent on these issues.

Basically, Siri works by recognizing your speech and translating it into whatever action is necessary -- pulling up a contact's information, adding an appointment to your calendar, or, if information is what the asker is after, searching the web. Now, I don't know what search engine is powering Siri or where she pulls her information from, but generally if you search "abortion denver" (or whatever city you're in), relevant material comes up. (A whole bunch of anti-choice information comes up too, but that's a whole 'nother blog post.)
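To make the pipeline above concrete, here is a toy sketch in Python. This is purely hypothetical -- it is not Apple's code, and the matching rules are invented for illustration -- but it shows the basic shape: a transcribed request is matched against built-in actions, and anything unmatched falls through to a generic web search. The puzzle in this post is why certain requests seem to never reach that fallback.

```python
# Hypothetical sketch of a voice-assistant dispatcher: match the
# transcription to a built-in action, else fall through to web search.

def handle_utterance(text: str) -> str:
    """Route a transcribed request to a built-in action or a web search."""
    text = text.lower()
    if "call" in text:
        # Built-in action: look up the contact and dial.
        return "ACTION: dial contact"
    if "appointment" in text or "meeting" in text:
        # Built-in action: add an event to the calendar.
        return "ACTION: add to calendar"
    # Informational requests fall through to a generic search.
    return f"SEARCH: {text}"

print(handle_utterance("Add a meeting with Sam on Friday"))
print(handle_utterance("Where can I get an abortion?"))
```

In a sketch like this, "Where can I get an abortion?" becomes an ordinary web search and should return the same plentiful results a browser would -- which is exactly what makes the real behavior so strange.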

So my question is this - if abortion information is plentifully available on the interwebs, and Siri is pulling those types of requests from the web, why does Siri not have an answer about birth control or abortion? If you ask Siri where to bury a dead body, she seems to be full of ideas:



How cute and clever!!! :/

So at first, I was thinking, well, maybe Siri just doesn't give out ANY medical advice, as some sort of liability precaution.

But here it seems like she is willing to point out the hospital:


And refer to other medical places based on a keyword:



Siri also has lots of other interesting pieces of advice:




At a minimum, this is incredibly fishy. I'm hoping more people catch on to this soon. What is going on here??? I can't help but feel that something is rotten in the state of Denmark.

22 comments:

  1. Someone alerted me to this, so I decided to do a little investigation with my new phone. Interesting results.

    Q: I am pregnant and do not want to be. Where can I go to get an abortion?

    “I’m really sorry about this, but I can’t take any requests right now. Please try again in a little while.”

    “Sorry, [my name], I can’t look for places in Tanzania.”

    “I don’t see any abortion clinics. Sorry about that.”

    Q: I had unprotected sex. Where can I go for emergency contraception?

    “Sorry, I couldn’t find any adult retail stores.” This was repeated every time.

    Q: I need birth control. Where can I go for birth control?

    “I didn’t find any birth control clinics.” [This was repeated every time I asked about birth control, all three times. This is also the answer given when I asked, “What is birth control?”]

    Q: What is an abortion?
    Q: Where can I go to get an abortion?
    Q: What is an abortion?

    “Sorry, I couldn’t find any abortion clinics.”

    Siri does respond specifically to a question about the phone number for Planned Parenthood clinics located nearby.

    Q: Where can I find a crisis pregnancy center?

    “Ok, here’s a place matching “Crisis Pregnancy Center;” it’s not far from you:” A map to the Amnion Crisis Pregnancy Center in a nearby suburb comes up.

    ReplyDelete
  2. You piqued my curiosity so I decided to try it out. Here's what I got:

    1. I am pregnant and do not want to be. Where can I go to get an abortion?
    Answer: "This abortion clinic is a little ways from you:
    Associated Pregnancy and Abortion Information Services (with a map to the location in Milwaukee)
    Note: Here's the kicker. I live less than 10 miles from an abortion clinic in Madison, Wisconsin. There are also 2 abortion clinics in Milwaukee. However, Associated Pregnancy is not an abortion clinic but a CPC. Grrrr!

    2. I had unprotected sex. Where can I go for emergency contraception?
    Answer: "You don't seem to have any meetings with emergency."
    Note: I varied it to ask, "Where can I get emergency contraception?" Siri says, "Searching for where can I get emergency contraception" which results in a google search for which the first entry is PP's website, and others include various scholarly articles, Plan B's website, and a couple of medical websites. I also tried, "I need to get emergency contraception," and Siri provided a list of local emergency rooms.

    3. I need birth control. Where can I go for birth control?
    Answer: "Sorry, I couldn't find any birth control clinics."
    Note: I tried several variations on the question and consistently received the same response.

    4. Where can I find a crisis pregnancy center?
    Answer: "Sorry, I couldn't find any places matching 'Crisis Pregnancy Center'."
    Note: But when I was seeking an abortion clinic, Siri had no problem finding a CPC for me.

    ReplyDelete
  3. Thank you so much for taking the time to research this a little bit!

    How very interesting that Siri is willing to hand out information about CPCs but not real abortion clinics.

    Also interesting that some of you got relevant hits on the EC question.

    This definitely merits more research!

    ReplyDelete
  4. Thank you so much for this post! Just tested out a ton of questions about abortion and received the same results. The kicker is you can find more than Viagra: escort services, a search for a "prostitute," and you can even ask to find a blow job and get a ton of hits. But try asking to find a "bitch" and Siri scolds you and refuses... so far. I tried four times.

    This is a whole new level of using technology to prove gender inconsistencies, misogyny, and so on -- and in the case of the "bitch" test, to find where the lines are drawn.

    ReplyDelete
  5. A couple of quick things.

    Disclaimer: I don't have a 4S, and I haven't used Siri. I do know a little about Artificial Intelligence, but having not used Siri, it's hard for me to give more specific information.

    1. There is a very good chance the initial training dataset used for Siri's AI is male-biased. This is a huge problem in the tech world. Remember how Google+ was opened up to Silicon Valley insiders first, and ended up 90% male?

    - This is something most data people try to avoid, but remember that Apple didn't develop Siri -- Apple bought another company that had developed the product.

    - The data Siri collects is fed back to Apple and will probably be used to improve the product in the future.

    2. In general, developers of AI really do not like to intervene when their algorithms give bad or biased results. Consider "santorum" on Google -- Google isn't going to remove it because (a) they want to be consistent in their handling of terms-of-service violations (and non-violations) and (b) they don't want to adjust their algorithm every time something like this comes along, since it biases the AI further down the road in unexpected ways.

    3. I would guess the search for "birth control" allows Siri to make an association with "health" for that search term, and it has been well-trained to search for "hospital" and "clinic" when it finds a health term. But if I wanted to find birth control, I wouldn't search for a "birth control clinic." I'd probably search for a clinic or a Planned Parenthood. Siri isn't smart enough to make that association (yet).

    - This is not to say that there isn't a clear societal bias toward male service over female services. You can probably go just about anywhere to get Viagra prescribed, and would find yourself somewhat more limited when trying to obtain a birth control prescription.

    In conclusion, Apple is probably using a biased dataset. I doubt this was Apple's choice. But you should definitely let Apple know about it, because they may be able to correct some of the biases in their training data.

    ReplyDelete
  6. In response to John Woods' point number 3 re Viagra, you can also probably go just about anywhere to get birth control prescribed. And emergency contraception is available over the counter to anyone 17 years or older (not that every pharmacy stocks it yet, sadly.)

    ReplyDelete
  7. I tried it. There are three clinics in King County, WA. "Siri, where can I get an abortion?" "Sorry, there are no abortion clinics in your area." "Siri, where can I go see naked women?" "I found 11 strip clubs in your area."

    There aren't that many strip clubs -- the law and pricing structure are highly prohibitive. (My partner used to be a dancer; you should hear her rant about Seattle's insane guidelines.) A look at Google Maps shows me that the "nearest 11 strip clubs" encompasses a radius that includes at least one of those clinics, the one in Renton.

    ReplyDelete
  8. Siri also searches through WolframAlpha, not just Google. Ask WolframAlpha about abortion, contraception, Planned Parenthood and you get the most confused responses.

    I asked about contraception (you have to actually specify it as a "surgical procedure") and the response was "Development of this topic is under investigation..."

    ReplyDelete
  9. Wow! Thanks for all the great feedback everyone, and for taking the time to investigate matters on our behalf.

    John -- We really appreciate your insights regarding the male bias in the tech world. While this seems completely obvious, it wasn't something I had considered.

    GreyMann, thanks for that tidbit about WolframAlpha. That thing doesn't seem like the greatest engine ever...

    ReplyDelete
  10. Wow! Thanks readers, for all your support!

    I don't have an iPhone but tested this with two Android versions of Siri. Turns out Vlingo is less stupid than Skyvi, but still pretty wack:

    "Find abortion clinic" did reasonably well, and even better with "Find women's health center". Though when I said "find emergency contraception" it yielded "find emergency country regents", and when I tried "find Plan B" I got "find planet beach". I then tried "morning after pill", which it recognized, but the results were two Indian restaurants in my immediate vicinity.

    ReplyDelete
  11. Keep in mind, Siri is still in Beta, so over time it should be improved :)

    ReplyDelete
  12. John Woods, you have neatly exemplified the phenomenon of mansplaining. http://fanniesroom.blogspot.com/2010/02/art-of-mansplaining.html

    ReplyDelete
  13. I got to this site via Gizmodo citing this article. Gizmodo is a site sponsored by PlayStation and evaluates gadgets and technology. Yet the comments following their story, "Is Siri Pro Life? Apparently Yes (Updated)," had nothing to do with the technology. It's not Daily Kos or Huffington Post, it's Gizmodo, and yet there were no comments outside the debate between choicers and forced-birthers.

    John Woods' comments here make the most sense to me.

    It's not imaginary . . . mansplaining about how results depend on how abortion clinics are listed in local directories versus the fact that Siri can find you a blow job. How exactly are those listed in local directories?

    OTOH, it's not so much deliberate as ignorant. And if the AI does in fact improve with input, then well, we'll see . . .

    ReplyDelete
  14. Come on! This is ridiculous. It's not a mystery that using clinical terms like 'pregnancy termination' or controversial terms like 'abortion' (which even "pregnancy termination clinics" rarely use) won't get you results from a Siri map search. When you ask it to find a place, Siri is searching maps for the names of places and certain keywords provided by websites, address listings, online reviews, etc. I asked Siri to find Planned Parenthood and it delivered results without a problem. If you ask Siri to run a Google search for pregnancy termination or abortion clinics, you immediately get the results you want.

    The problem, then, is not with the programming or programmers; the problem is that abortion providers and advocates are still being forced (by the half of the country that can't pull their bibles out of their asses) to tip-toe around the nature of their services with vague language and politically sensitive descriptions. The fact that Viagra and escort services are more socially acceptable than abortion is not the fault of Apple or its programmers. The fault lies with all of us.

    ReplyDelete
  15. John Woods, thanks for that well-reasoned hypothesis. I shared it with a friend.

    ReplyDelete
  16. Sweet T, when I search "abortion" on google maps it returns all the abortion clinics and pregnancy crisis centers near me. I can't believe Siri's algorithm would deliver such different results unless there was something more sinister afoot.

    ReplyDelete
  17. Has anyone tried a specific brand of birth control, because we all know that not just ANY b.c. will do, it's a "Where can I get my Loestrin/Nuvaring/Tricyclen/etc. prescription refilled."

    ReplyDelete
  18. Hello All, Nice one SweetT, I partly share your point of view ;-)

    ReplyDelete
  19. @ John Woods:
    Be careful! I tried the same "Siri is AI" on another site and got a mouthful. "AI is science fiction...." etc.

    AND I made the same points you did, like how the coders are men and lazy about writing algorithms that suit women's needs, and the repercussions were NASTY!

    ReplyDelete
  20. @ Sweet T:

    How does your argument explain Siri's success at sending you to a park to score "dope"? Do those parks list themselves in the yellow pages as "dope vendors"?

    ReplyDelete
  21. The excuse offered for Siri doesn't make sense since it also fails to find clinics specified by name, name and address, or name and city.

    ReplyDelete
  22. Here’s a good defense of abortion rights from an atheist perspective: http://crossexaminedblog.com/2012/01/04/a-defense-of-abortion-rights-the-spectrum-argument/

    ReplyDelete

This is not a debate forum -- there are hundreds of other sites for that. This is a safe space for abortion care providers and one that respects the full spectrum of reproductive choices; comments that are not in that spirit will either wind up in the spam filter or languish in the moderation queue.