Digital Drugs



‘Digital drug’ is an apt moniker for products and services designed to keep users online, hooked by heaps of content, barely lifting a thumb as auto-play saves them the trouble of scrolling. Social media is so addictive that even adults grapple with controlling their excessive use. If well-educated adults are vulnerable to threats in the digital realm, one can only imagine the impact on still-developing brains.

‘Like our stomachs, our minds are hurt more often by over-eating than hunger.’ As children absorb an endless stream of instant gratification online, the so-called ‘TikTok brain’ takes shape: stress levels rise and attention spans shrink. Overstimulated reward centres deliver a dopamine rush, and the crash that follows leaves a child wanting more.

Besides quantity, the nature of the content is also a cause for concern. Algorithms encroach on intellectual integrity even before a child’s intellectual faculties have fully developed. The feedback loop crafted by Machine Learning (ML) can form ‘echo chambers’, repeatedly presenting the same kind of information and leaving little room for new knowledge. The problem worsens if the content reinforces poor judgement. For example, unhealthy food marketing can distort children’s perception of a regular diet, according to Dr Monique Kent.
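To make the feedback loop concrete, the sketch below shows how a naive engagement-maximising recommender can narrow what a child sees over time. The content categories, interest scores and update rule are illustrative assumptions, not any platform’s actual algorithm.

```python
import random

# Hypothetical content categories; scores stand in for a child's inferred interest.
interest = {"cartoons": 1.0, "junk-food ads": 1.0, "science": 1.0, "sports": 1.0}

def recommend(interest):
    """Show the category the model currently rates highest (ties broken at random)."""
    top = max(interest.values())
    return random.choice([c for c, s in interest.items() if s == top])

def watch_and_update(interest, category, boost=0.3):
    """Watching an item raises the inferred interest in that category,
    making the same category even more likely to be shown next time."""
    interest[category] += boost

for _ in range(50):
    watch_and_update(interest, recommend(interest))

# After a few dozen rounds, one category dominates the feed: an 'echo chamber'.
print(interest)
```

After the first random pick, every subsequent recommendation reinforces the same category, which is the narrowing effect described above.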

Viewing explicit content can have detrimental impacts on a child. Explicit images of children themselves perpetuate sexual violence against them; in 2021, 2 million such images from Pakistan were found on Meta (then Facebook). Moreover, children who publish personal details online leave themselves and their parents vulnerable to ‘doxxing’, i.e. the use of their private information with malicious intent.

Furthermore, attempts to impart financial education to children through video games have raised concerns about gambling addiction in youngsters, who buy random digital treasure chests called ‘lootboxes’ with real money and no guarantee of the items inside. Dr Nizan Packin questions such designs as well: in games where children learn to invest and suffer a ‘fake’ loss, they may underestimate the real-life repercussions of such decisions; in games where this is attached to real money, it risks putting parents, guardians, or the children themselves in debt.
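To illustrate why lootboxes resemble gambling, the short sketch below compares what a player spends with the expected value of what a box returns. The price, drop rates and item values are made-up assumptions for illustration only.

```python
import random

# Hypothetical lootbox: price and drop table are illustrative assumptions.
BOX_PRICE = 2.99              # real money per box
DROP_TABLE = [                # (item name, value to the player, probability)
    ("common skin",  0.50, 0.80),
    ("rare skin",    3.00, 0.18),
    ("legendary",   25.00, 0.02),
]

def expected_value():
    """Average value returned per box under the assumed drop rates."""
    return sum(value * prob for _, value, prob in DROP_TABLE)

def open_boxes(n):
    """Simulate opening n boxes; return total money spent and total item value received."""
    spent, won = 0.0, 0.0
    for _ in range(n):
        spent += BOX_PRICE
        roll, cumulative = random.random(), 0.0
        for _, value, prob in DROP_TABLE:
            cumulative += prob
            if roll < cumulative:
                won += value
                break
    return spent, won

print(f"Expected value per box: {expected_value():.2f} vs price {BOX_PRICE}")
spent, won = open_boxes(100)
print(f"After 100 boxes: spent {spent:.2f}, received items worth {won:.2f}")
```

Under these assumed odds, every box is worth less on average than it costs, yet the small chance of a ‘legendary’ drop keeps a child buying, a mechanic closely analogous to gambling.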

The well-being of children is rarely a concern for profiteers looking to expand their clientele; even the suitability criteria of educational apps do not consider safety. Safety mechanisms that depend on users honestly declaring their age are ineffective and allow children to evade age restrictions. Britain’s watchdog fined TikTok USD 16 million for using children’s data without parental consent; against the massive revenue such companies generate, they can treat a fine like this merely as the cost of doing business. Profiteers deliberately siphon data from children: by the time a child turns 13, a company may hold 72 million data points about them, which can then be used to target them as consumers.

Given that millions of dollars go into designing these apps to be addictive, expecting children to make better choices is like leaving them in a candy store and expecting them to eat vegetables. Technologies certainly have good uses, but to ensure that their positive potential is not lost to misuse, children cannot be left unsupervised.

If a nation’s intellectual resource is, from its most nascent stage, raised by algorithms, exploited for profit, and deliberately addicted to digital drugs by profiteers, that is reason enough to sound the alarm. Protecting children is, then, a national task. So, what should an Anti-Digital-Narcotics Force look like? An alliance of parents, experts, civil society actors and legislators.

The first task is to identify the most impressionable ages. Social media access should be restricted outright until the age of 12 and monitored from 13 to 18. The role of parents is indispensable: they can supervise other digital activity so that children do not lag behind in the digital age. Nevertheless, not all can be helicopter parents who hover over their children and constantly monitor content and screen time. As a workable solution, children should use shared gadgets in living rooms or other common spaces, making it easier to keep a watchful eye on them. Parents can also set an example by regulating their own use of social media.

Institutional mechanisms and international regimes should also have sharper teeth. Radical transparency can stop companies from sweeping age-inappropriate content and sexual exploitation, online and offline, under the rug.

These threats deserve more attention in Pakistan, where physiological shortcomings, amplified by psychological ones, can compound into a crisis. With 53.7% of children anaemic and active Internet use at 36.5%, even those who may not have nutritious food or medicine are likely to have a phone and an Internet connection.

For technology to be an enabler, digital spaces must be safe and beneficial. The Personal Data Protection Bill 2021 does not explicitly mention children, and the Prevention of Electronic Crimes Act (PECA) 2016 addresses online child sexual abuse but not the other digital threats posed to youngsters. The Pakistan Telecommunication Authority has useful guidelines for Child Online Protection that can inform formal policy. Thus, national legislation for ‘Online Child Security’ is just as important for Pakistan as efforts to increase Internet access and bridge digital divides.

Bakhtawar Iftikhar is a Research Assistant at the Centre for Aerospace & Security Studies (CASS), Islamabad, Pakistan. She can be contacted at: cass.thinkers@casstt.com.
