A hacker found a way to trick ChatGPT into ignoring its own guidelines and ethical responsibilities to produce instructions for making powerful explosives. By TechCrunch.

Hacker tricks ChatGPT into giving out detailed instructions for making homemade bombs | TechCrunch
An explosives expert told TechCrunch that the ChatGPT output could be used to make a detonatable product and was too sensitive to be published.