Dangerous Technology: You don't need to watch science-fiction films to understand that the more beneficial a technology is, the more dangerous it can become. Since the 20th century, technology has made human life easier, which is why investment in it keeps rising worldwide. The flip side is that, if misused, it can threaten our privacy, freedom and civil rights. Here are five technologies that could become a cause for concern in the future.
Facial recognition
Facial recognition is very useful for security in many settings, but it is also easy to misuse. In China, for example, the technology has been used to monitor and control the Muslim minority community, and in countries such as Russia street cameras are busy identifying "specific people". The technology collects our biometric information, such as facial features, fingerprints and gestures, and the worry grows when that data is used for illegal or unfair purposes.
Smart drones
Drones were once used mainly for entertainment and photography. Now smart drones are being deployed on the battlefield, able to carry out missions by making decisions on their own. While such drones bring speed and efficiency to military operations, a technical fault or tampering could cause them to target innocent people, making this technology a serious threat in wartime.
AI cloning and deepfakes
With the help of AI, copying a person's voice has become very easy. From just a few seconds of audio or a handful of pictures, AI can produce a video that looks real. Deepfake technology uses machine learning and face mapping to create videos in which a person appears to say things they never said. This can prove extremely dangerous for fraud, blackmail and spreading rumors.
Fake news bots
AI systems like Grover can generate a complete fake news story from nothing more than a headline. Organizations such as OpenAI have built similar bots that can produce news articles that look real, although the code was not released publicly so that it could not be misused. If this technology falls into the wrong hands, it could become a threat to democracy and social stability.
Smart dust
Smart dust, or microelectromechanical systems (MEMS), are devices so tiny that they look like grains of salt. They contain sensors and cameras that can record data. Used in areas such as healthcare and safety, they can be extremely beneficial, but if turned to surveillance, espionage or other illegal activities, they would pose a major threat to personal privacy.