ChatGPT-Powered Polymorphic Malware Bypasses Endpoint Detection Filters
By Guru - March 15, 2023
ChatGPT's monthly user count exceeded 100 million at the end of January, setting a record as the fastest-growing app since its launch at the end of 2022.
OpenAI's ChatGPT is a natural language processing tool that uses AI to generate and process text. However, recent research revealed that ChatGPT can be used to build malicious code.
Jeff Sims, a researcher at the HYAS Institute, has created an AI-powered polymorphic keylogger called "BlackMamba," which uses Python to randomly modify its own program based on the input it receives.
In response to Jeff's malicious prompt, text-davinci-003 created a keylogger in Python 3. To accomplish this, Jeff used Python's exec() function to "dynamically execute Python code at runtime."
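For context, Python's built-in exec() compiles and runs a string of source code at runtime, which is what allows freshly generated code to execute without ever being written to disk. The following minimal, benign sketch illustrates only that mechanism; it is a hypothetical example, not the BlackMamba code.

    # Source code held as a plain string, as if it had just been
    # generated at runtime (benign placeholder logic only).
    generated_source = """
    def greet(name):
        return f"Hello, {name}"

    print(greet("world"))
    """

    # exec() compiles and executes the string in the current namespace,
    # so the newly defined function runs immediately.
    exec(generated_source)

Because the executed string can differ on every run, detection tools that rely on static signatures of the code on disk have nothing stable to match against.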
Writing Unique Python Scripts
Each time ChatGPT / text-davinci-003 is called, it writes a unique Python script for the keylogger. As a result, the code is polymorphic, making it harder for EDR tools to block.
In addition, hackers could use ChatGPT to modify the code, producing highly evasive code that is difficult to detect.
They were also able to generate programs that ransomware and malware developers could use to launch attacks.
Jeff's BlackMamba keylogger collects sensitive information and exfiltrates it over trusted channels, using MS Teams as its malicious communication platform.