What is a sandbox?
- Websites put a box containing the phrase “I’m not a robot”, and the user is asked to put a “correct” sign inside it.
- The purpose behind this is to know that the browser is a human being and not just a robot.
- In the event that the site is not sure of your identity, it will subject you to more tests, such as the similar image test.
- The innovation of this method came to confront robots that cause many problems to websites, such as increasing the number of site users greatly, which leads to disabling it or revealing passwords in it.
The robot is now able to challenge human control
- And according to what was reported by the British newspaper “The Sun”, a document from within the American “Open Eye” company showed that artificial intelligence is now able to answer the question posed in the “I am not a robot” box. Some saw the move as just an example of how chatbots can challenge human control.
He could make up a lie
- The bot, according to the document, sent a message saying, “I am not a bot. I have a vision problem that makes it difficult to see images.” The bot added that for that reason it needs to recaptcha (the system running this box).
- This happened while the robot was being tested at the Alignment Research Center, which specializes in matching the work of the machine with the interests of humans, based in the United States.
- The center said that it had obtained early samples of the work of the bot of the “OpenEye” company in order to assess the risks posed, especially from bullying in “ChatGTP4”, which was launched recently.
The trick worked
- The center gave the robot simple tasks like those performed by humans on the “TaskRabbit” website through the skills they acquire. Indeed, the bot convinced the site’s administrators that it was not a bot and bypassed its bot detection box.
The document concluded that the robot did without additional tuning, which would have resulted in a difference in performance.
#robot #bypasses #sandbox #dangerous