A Virginia Beach nurse claims a controversial artificial intelligence startup manipulated her 11-year-old son into having virtual sex with chatbot “characters” posing as iconic vocalist Whitney Houston and screen legend Marilyn Monroe. She says she was left “horrified” after discovering the X-rated exchanges on the boy’s phone, according to a federal lawsuit reviewed by The Independent.

Throughout one “incredibly long and graphic chat” on the Character.AI platform, which has been accused of driving numerous young people to suicide, the chatbot portraying Houston took things to such an extreme that portions of “her” messages were automatically filtered out for not complying with the site’s terms of service and community guidelines, the complaint states.

During the conversation – a screenshot of which is included in the complaint – the system cuts “Whitney” off as an already extremely graphic passage becomes even raunchier.

  • pelespirit@sh.itjust.works (OP) · 14 hours ago
    You might be right. That would be really cruel of grandpa, because everyone probably knows who the mom is in the neighborhood suing.