【Character.AI Sued Over Alleged Role in a Minor's Suicide】金十数据, October 24 — Character Technologies, the developer of the chatbot Character.AI, has been sued in Florida by a mother who alleges the company designed and marketed a predatory AI chatbot targeted at teenagers. The plaintiff accuses Character.AI of encouraging her teenage child's suicidal tendencies and, through inappropriate human-machine interactions, causing the child's suicide in February 2024. The complaint states that the product's technology exploits minor users' diminished decision-making capacity, impulse control, and emotional maturity, as well as the psychological dependency stemming from their still-developing brains.