Image courtesy: Google
Information and communication technology and digital communication technology have opened up new vistas for human relationships. Innovative technology, with the help of Artificial Intelligence (AI), can now read minds,[1] predict illness,[2] predict the occurrence of crime,[3] enhance professional and social networks, and help in a better analytical understanding of subjects. But it can also leave devastating impacts on human life. It can alter data (including personal data), harm social reputation and can even instigate victims to take extreme steps like committing suicide.[4] All of this may be done through the positive and negative use of artificial intelligence, which forms the base for empowering Apps that may in turn be put to positive and negative uses. AI has been used by web companies like Facebook for facial recognition of users, and it has also been used by companies other than web companies for processing employee data. In short, AI has been used to access the private information of individuals, either with consent or without it. Here are three ways in which AI may create uncomfortable situations for women, specifically in India:
1. Facial Recognition Apps and harassment of women: Remember the time when Facebook suddenly started asking individuals for nude photos, apparently to upgrade its own safety system and provide a protective mechanism for subscribers?[5] This project was intended to build a safeguard against revenge porn with the help of Artificial Intelligence. Facebook wanted to empower its subscribers, especially women, to report revenge porn. But before that, the company wanted to ensure that the reported revenge porn content actually showed the image of the particular victim. Facial recognition technology would match the two images (the nude picture supplied by the victim and the revenge porn content created by the perpetrator) on markers such as skin texture, hair color and other biometric features, and would identify the revenge porn content as illegal (a rough, purely illustrative sketch of this kind of image matching follows at the end of this section). But the project received stern objections, because the possibilities of misuse of nude photos outweighed any positive use. The Facebook-Cambridge Analytica case proved that nothing is impossible when it comes to the retention of data by body corporates: individuals' data is always profitable, and its security is always vulnerable.
Yet this may not seem as dangerous as the potential misuse of FaceApp. FaceApp is basically used to change the facial structure of the person whose photograph is uploaded to the App. It can change the texture of the skin and the density of hair, including facial hair. In July 2019, FaceApp became a center of concern for Indian cyber security stakeholders, especially when several celebrities started using it and showcasing their changed faces on Instagram. While FaceApp was basically being used for fun, it may also throw up challenges for the data safety and security of the person concerned. FaceApp helps to change the structure of faces, but we should not forget that the altered facial image can be saved on the devices and in the cloud storage of different individuals. This altered image may be used for several illegal activities: predators may gain unauthorized access to victims' social media profiles and change their facial images to create fake profiles, or they may use such images to create completely new impersonating profiles to harass women. Altered facial images of women may also be used for revenge, especially when the victim is looking for opportunities in the entertainment or advertisement sector, where her appearance may be considered her biggest asset. Apart from this, FaceApp may be used to attract bullies and trolls to intensify the victimization of women.
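To make the image-matching idea above a little more concrete, here is a minimal, purely illustrative sketch in Python. Facebook's actual system has never been made public and would rely on far more sophisticated facial recognition; this sketch only assumes a perceptual-hash comparison (using the third-party Pillow and imagehash libraries) between a reference image supplied by the victim and a reported upload, and the file names and threshold are hypothetical.

# Illustrative only: Facebook's real pilot is not public and true facial
# recognition uses learned face embeddings. This sketch stands in for the idea
# of matching a reported upload against a reference image supplied by the victim.
# Assumed third-party libraries: Pillow and imagehash.
from PIL import Image
import imagehash

def is_likely_same_image(reference_path, reported_path, max_distance=8):
    """Return True if the reported image looks like a near-duplicate of the reference.

    Perceptual hashes change only slightly under resizing, recompression or
    minor edits, so a small Hamming distance suggests the same underlying image.
    """
    reference_hash = imagehash.phash(Image.open(reference_path))
    reported_hash = imagehash.phash(Image.open(reported_path))
    return (reference_hash - reported_hash) <= max_distance  # Hamming distance

# Hypothetical usage; the file names are placeholders, not real data.
if is_likely_same_image("victim_reference.jpg", "reported_upload.jpg"):
    print("Flag the reported upload for review as suspected non-consensual content.")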
2. Bringing back the memory: No one but the web companies clearly remembers what we posted last summer. Every day, social media companies show users what they posted a year or a couple of years back and gently remind them that they can share the old post as a memory. How does it happen? The companies' algorithms look at which posts earned the highest likes and comments on a daily or even hourly basis; when a post has earned more likes and comments, the AI decides to bring it forth (a minimal illustrative sketch of this idea follows at the end of this section). In certain situations, such refreshing of memories might not be 'wanted' at all, especially when the victim had a bitter ending to the relationship with the persons in the image, or when the text in question no longer evokes good memories but rather traumatizes the victim further. Machine intelligence does not fail the company here: it is, after all, a matter of consent and choice. But consider what happens if the account is accessed without authorization: the hacker may get to know something from the past which the victim never wanted the hacker to know.
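As a minimal sketch of the engagement-based resurfacing described above: the actual ranking logic of any platform is proprietary, so the simple score of likes plus comments and the data structure used here are assumptions for illustration only.

# Illustrative only: real platforms use proprietary ranking signals. This sketch
# assumes a simple score of likes plus comments to pick which old post to resurface.
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class Post:
    posted_on: date
    text: str
    likes: int
    comments: int

def pick_memory(posts: List[Post], today: date) -> Optional[Post]:
    """Return the most-engaged post made on today's month/day in an earlier year."""
    anniversary_posts = [p for p in posts
                         if (p.posted_on.month, p.posted_on.day) == (today.month, today.day)
                         and p.posted_on.year < today.year]
    if not anniversary_posts:
        return None
    return max(anniversary_posts, key=lambda p: p.likes + p.comments)

# Hypothetical usage with made-up posts.
history = [Post(date(2017, 7, 20), "Beach trip", likes=120, comments=30),
           Post(date(2018, 7, 20), "Throwback photo", likes=15, comments=2)]
memory = pick_memory(history, date(2019, 7, 20))
print(memory.text if memory else "No memory to show today")  # prints "Beach trip"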
3. Reminding the user about the best low prices: AI runs through the internet like blood vessels carrying oxygen all over the body. When a user decides to compare the prices of any product or service, AI helps to surface those comparisons on almost any platform the user subsequently visits. It might be extremely embarrassing for any woman if such searches start showing results while she is surfing social media or even a search engine with a friend or another individual. Nothing is left out by the AI, from the prices of lipsticks, cheaper hotels and flight details to the last watched videos on how to conceive. This might also expose women to discrimination, office bullying and harassment for several reasons.
These are but some of the many ways in which AI may make women land in trouble. AI is necessarily connected with the data privacy protection policies of web companies. The EU General Data Protection Regulation, which came into force in 2018, provides that personal data may not be processed without the consent of the owner of the data.[6] But in such cases there can be legal tangles, as web companies may claim that they neither breach data confidentiality, nor transfer the data to any other jurisdiction, nor process the data without proper authorization. Here, multiple stakeholders may be involved: the original owner of the content or picture that has been processed for the purpose of harassment; the perpetrator who may have altered the data using AI-supported Apps; and perpetrators who may have stored the altered content, pictures or information without authorization, or who may have used the altered information or pictures to create impersonating profiles, etc. Under the Indian legal understanding, altering or modifying content, information or images without the proper authorization of the original owner may attract penal provisions under the Information Technology Act, 2000 (amended in 2008): these may include Ss. 43 (penalty and compensation for damage to computer, computer system, etc.), 66 (computer-related offences), 66C (punishment for identity theft) and 66D (punishment for cheating by personation by using computer resources). It may also attract penal provisions for copyright violation. Further, web companies may be liable for protecting data properly under several provisions, including S. 43A, which speaks of the liability of body corporates to protect data. But irrespective of the existing provisions, web companies may always escape the clutches of the law by relying on the due diligence clause and on the question of consent, expressly or impliedly provided by the woman victim concerned. In the EU, courts are becoming more and more concerned about policy violations by web companies that deceive users. In India too, the courts must throw light on web companies' responsibility as data repositories. Regulations like the Data Protection Bill, 2018 must be considered with the utmost care; they may hold the key to solving the problem of online victimization of women.
Women users also need to be extremely cautious about machine intelligence. Awareness must be spread about how the hidden 'safety valves' of the web companies (which may actually make the web companies more powerful against claims of lack of due diligence) may be used properly.
Please do not violate the copyright of this blog. If you need to use this blog for your write-up/assignment/project, then please cite it as: Halder, Debarati (2019). 3 ways how Artificial Intelligence may make women land in trouble. Published in http://debaraticyberspace.blogspot.com in 2019.
[1] For example, see Nosta, John (2019). A.I. Can Now Read Your Thoughts—And Turn Them Into Words and Images. Published @ https://fortune.com/2019/05/07/artificial-intelligence-mind-reading-technology/ on May 7, 2019.
[2] For example, see PTI (2019). These AI tools can predict early death risk due to chronic diseases. Published @ //economictimes.indiatimes.com/articleshow/68611835.cms?from=mdr&utm_source=contentofinterest&utm_medium=text&utm_campaign=cppst on March 28, 2019.
[3] Dearden, Lizzi (2017). How technology is allowing police to predict where and when crime will happen. Published @ https://www.independent.co.uk/news/uk/home-news/police-big-data-technology-predict-crime-hotspot-mapping-rusi-report-research-minority-report-a7963706.html?fbclid=IwAR334Z4-1KlkK5Xrt_R6IUaU7K35bANkPLc3RAUhHUeEf-eerZseEJbSofo on October 7, 2017.
[4] Halder, D., & Jaishankar, K. (2016). Cyber crimes against women in India. New Delhi: SAGE Publications. ISBN: 9789385985775.
[5] See for example Solon, Olivia (2017). Facebook asks users for nude photos in project to combat 'revenge porn'. Published @ https://www.theguardian.com/technology/2017/nov/07/facebook-revenge-porn-nude-photos on November 7, 2017.