bloomingbit

Encrypted messengers shaken by the spread of AI integration… Session: "Concerns over privacy collapse"

YM Lee
  • Alex Linton, head of the Session Technologies Foundation, said that if AI is integrated into operating systems, the security of encrypted messengers could be neutralized.
  • Chris McCabe, co-founder of Session, pointed out that users' lack of awareness about their data creates additional risks related to privacy.
  • Regulatory pressure, such as the European Union's proposed 'Chat Control' law, is weighing on developers of encryption tools.
Photo=Shutterstock

A warning has emerged that if artificial intelligence (AI) is integrated down to the device operating system level, the security of encrypted messengers could be effectively neutralized. Executives of the decentralized messenger Session said that the spread of AI, users' lack of awareness, and regulatory pressure are threatening the future of private messaging.

According to a Cointelegraph report on the 31st (local time), Alex Linton, head of the Session Technologies Foundation, said, "The way AI analyzes and stores information inside devices creates huge privacy and security problems." He assessed that on an ordinary smartphone or computer, genuinely private communication could become impossible.

Linton explained that the danger grows if AI operates at the operating system (OS) level. He said, "If AI is integrated into the operating system, it could completely bypass messenger encryption," adding, "No one can know how encrypted information passed to a black-box AI will be used." He warned, "At that point, it would be impossible to even know what is actually happening on the device."

Session co-founder Chris McCabe pointed out that users' lack of awareness about their data is also a key issue. He said, "Many people do not properly understand how their data is used or what can be done with that data." He added, "Through advertising or algorithms, data can be used to push people toward actions they do not want to take."

These concerns have been underscored by recent incidents. OpenAI disclosed that some user data was exposed in a hack of a third-party data analytics firm. Such information can be exploited for phishing or social engineering attacks, and at one point a feature that exposed shared chat histories on the open web also came to light.

Linton also pointed to pressure from the regulatory environment, citing 'Chat Control,' the European Union (EU)'s proposed law that would mandate the scanning of private messages. He said, "People who build encryption tools are under significant pressure," adding, "These technologies are not meant to aid crime but to protect users' information and make the online space a better place."

YM Lee

20min@bloomingbit.io · Crypto Chatterbox_ · tlg @Bloomingbit_YMLEE
