The internet is growing more hazardous as online scams multiply. Social media giants are now warning users about fraudulent ChatGPT apps circulating online. Users should stay vigilant and download ChatGPT apps only from reliable sources.
Meta has identified scammers exploiting people’s interest in ChatGPT by enticing users to download harmful apps and browser extensions. Cybercriminals are taking advantage of people’s curiosity and trust in ChatGPT to launch attacks, using tactics similar to those used in cryptocurrency scams.
Meta has discovered about ten malware families that pose as ChatGPT and similar tools. These malware strains are used to compromise accounts across the internet. Once a user installs the malware, the attackers can launch an attack and keep updating their methods to evade security controls.
Meta’s Q1 2023 security report states, “Over the past several months, we’ve investigated and taken action against malware strains taking advantage of people’s interest in OpenAI’s ChatGPT to trick them into installing malware pretending to provide AI functionality.” The company has detected and blocked more than 1,000 unique malicious URLs from being shared on its apps, protecting unsuspecting users from falling prey to these cyberattacks.
Furthermore, Meta has reported these malicious URLs to its industry peers at the file-sharing services where the malware is hosted, enabling them to take appropriate action to protect their own users and networks. These are significant steps toward combating the threat posed by malware strains impersonating ChatGPT and similar tools.
Cybercriminals are also deceiving people through other platforms, such as LinkedIn, and through malicious extensions for browsers including Chrome, Edge, Brave, and Firefox. Meta has taken action against nine networks worldwide that were attempting to covertly influence people and steal information.
Meta advises people to be cautious and to verify that what they download is safe, recommending that downloads come only from trusted sources.