
Encrypted messengers rattled by AI integration wave… Session warns of a "privacy collapse"

Bloomingbit Newsroom

Summary

  • Session said that if AI is integrated into device operating systems, encrypted messenger security could be effectively neutralized, leading to a privacy collapse.
  • Session executives said users’ lack of data awareness and regulatory pressure are threatening the future of private messaging.
  • Session cited the EU’s push for Chat Control and a data exposure case involving OpenAI, saying developers of encryption tools are under significant pressure.
Photo=Shutterstock

A warning has emerged that the security of encrypted messengers could be effectively undermined if artificial intelligence (AI) is integrated down to the device operating system level. Executives at the decentralized messenger Session said the spread of AI, users’ lack of awareness, and regulatory pressure are threatening the future of private messaging.

According to a Cointelegraph report on the 31st (local time), Alex Linton, head of the Session Technology Foundation, said, “The way AI analyzes and stores information inside a device creates enormous privacy and security problems.” He added that in an average smartphone or computer environment, truly private communication itself could become impossible.

Linton explained that the risks grow even larger if AI operates at the operating system (OS) level. “If AI is integrated into the operating system, it could completely bypass messenger encryption,” he said, adding, “No one knows how encrypted information is used after being passed to a black-box AI.” He warned that “at that point, you won’t even be able to know what is actually happening on the device.”

Session co-founder Chris McCabe also pointed to users' lack of awareness about their data as a key issue. "Many people don't properly understand how their data is used, or what can be done with it," he said. He added that data can be used, through advertising or algorithms, to push people toward behaviors they do not actually want.

Such concerns have already surfaced in recent incidents. OpenAI disclosed that some user data was exposed after a third-party data analytics firm was hacked. Information of that kind could be abused for phishing or social engineering attacks, and at one point a feature that made shared chat histories visible on the web also came to light.

Linton also cited the regulatory burden, referencing the European Union’s (EU) proposed ‘Chat Control’ bill that would mandate the scanning of private messages. “People who build encryption tools are feeling significant pressure,” he said. “These technologies are not designed to aid crime; they exist to protect users’ information and make the online space a better place.”
